LLMs Are Cheap
a year ago
- #LLMs
- #Generative AI
- #Cost Analysis
- Generative AI is relatively cheap to operate, contrary to common belief.
- Inference costs for Large Language Models (LLMs) have decreased significantly, making them much cheaper than web search APIs.
- Comparison shows LLMs can be up to 25 times cheaper than search APIs like Bing or Google.
- Common objections — that LLM prices are subsidized by investors, or that cost comparisons ignore backend services — are addressed; neither undermines the affordability argument.
- Implications: AI companies may be more financially viable than commonly assumed, though backend service costs (such as search) could become a meaningful expense for AI agents.
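The cost comparison above can be sketched as simple per-query arithmetic. The prices below are illustrative assumptions, not current quotes (LLM rates are in the range published for small models, and the search rate is in the range historically charged by paid search APIs); the exact multiple depends entirely on which prices and token counts you plug in.

```python
# Back-of-envelope cost comparison: one LLM call vs. one search API call.
# All prices are assumptions for illustration -- check provider pricing pages.

LLM_INPUT_PRICE_PER_M = 0.15    # USD per 1M input tokens (assumed)
LLM_OUTPUT_PRICE_PER_M = 0.60   # USD per 1M output tokens (assumed)
SEARCH_PRICE_PER_1000 = 15.00   # USD per 1,000 search queries (assumed)

def llm_cost_per_query(input_tokens: int, output_tokens: int) -> float:
    """Cost in USD of a single LLM call at the assumed rates."""
    return (input_tokens * LLM_INPUT_PRICE_PER_M
            + output_tokens * LLM_OUTPUT_PRICE_PER_M) / 1_000_000

def search_cost_per_query() -> float:
    """Cost in USD of a single search API query at the assumed rate."""
    return SEARCH_PRICE_PER_1000 / 1000

llm = llm_cost_per_query(input_tokens=500, output_tokens=500)
search = search_cost_per_query()
print(f"LLM call:    ${llm:.6f}")     # fractions of a tenth of a cent
print(f"Search call: ${search:.6f}")  # about 1.5 cents
print(f"Search is roughly {search / llm:.0f}x more expensive")
```

With these assumed numbers the search query comes out tens of times more expensive than the LLM call, which is the shape of the argument the post makes; the post's specific 25x figure presumably reflects the prices current when it was written.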