Hasty Briefs

Why blocking LLMs from your website is dumb

18 days ago
  • #LLMs
  • #Content Distribution
  • #SEO
  • Perplexity was accused of scraping sites that disallowed LLM crawlers in their robots.txt files (the mechanism is sketched after this list).
  • Blocking LLMs is criticized as being motivated by moral objections and a general distaste for AI.
  • LLMs are the next generation’s search layer, generating traffic for websites.
  • Blocking LLMs cuts off a fast-growing distribution channel.
  • Adapting to LLMs and providing high-quality content is more beneficial than blocking them.
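
For context on the first bullet, here is a minimal sketch of the robots.txt mechanism involved: a site disallows known LLM crawlers by user agent, and a well-behaved crawler checks those rules before fetching a page. GPTBot and PerplexityBot are publicly documented crawler user agents, but the robots.txt contents and the example URL below are illustrative assumptions, not taken from the article.

```python
# Sketch: how robots.txt rules that block LLM crawlers are expressed,
# and how a compliant crawler would honor them using Python's stdlib.
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks common LLM crawlers site-wide
# while leaving the rest of the site open to other bots.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A compliant crawler calls can_fetch() before requesting a page;
# the accusation in the article is that this check was ignored.
for agent in ("GPTBot", "PerplexityBot", "SomeOtherBot"):
    allowed = parser.can_fetch(agent, "https://example.com/article")
    print(f"{agent}: {'allowed' if allowed else 'blocked'}")
```

Running this prints that GPTBot and PerplexityBot are blocked while other bots are allowed; note that robots.txt is purely advisory, so this only works when the crawler chooses to respect it.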