Cloudflare launched 'Agent-Ready Scanner' to audit whether your website can handle AI agents. Are we building the web for machines now? Let's dive into the drama.

Everywhere you look these days, tech bros are preaching about how "AI Agents are the future." But here’s the million-dollar question: can these hyped-up bots even read the spaghetti code your site is serving? Probably not.
Over on Product Hunt, Cloudflare just dropped a tool called Agent-Ready Scanner (scoring a solid 200+ upvotes). Basically, it runs a health check on your website to see if it plays nice with AI agents.
Instead of checking standard SEO or Lighthouse scores, it audits the agent-facing plumbing: robots.txt, sitemaps, MCP, OAuth, and agent skills. The goal? To see if AI tools and agents can actually browse, interact, and transact on your site without throwing a 403 Forbidden tantrum. Identifying these gaps is the first step to making your site ready for the impending bot invasion.
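For context, the lowest-hanging fruit in that audit is plain old robots.txt. Here's a minimal sketch of an agent-friendly setup; GPTBot and ClaudeBot are real published crawler tokens, but whether the scanner checks these exact tokens is my assumption:
```
# Hypothetical agent-friendly robots.txt (paths and domain are made up)
# Explicitly welcome known AI crawlers rather than leaving them in limbo
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

# Default rules for everyone else
User-agent: *
Allow: /
Disallow: /admin/

# Hand every crawler, human-built or not, a map of the site
Sitemap: https://example.com/sitemap.xml
```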
Diving into the comments, the community is torn between praising the tool and having existential crises:
One of the sharper questions: does the scanner check for the emerging llms.txt convention? And does it differentiate between actively blocking AI crawlers versus simply not explicitly allowing them? Those are two very different developer intents (more on llms.txt in the sketch below).
Whether you think AI agents are the ultimate paradigm shift or just VCs blowing hot air, the reality is that bot traffic (often running behind massive proxy networks) is the new normal. "Agentic SEO" is going to be the next buzzword PMs shove down your throat.
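About that llms.txt convention: it's a proposed plain-Markdown file served at /llms.txt (see llmstxt.org) that hands LLMs a curated index of your site instead of making them scrape it. A minimal sketch, with hypothetical paths:
```
# Example Site

> One-paragraph summary of what this site does, written for machines in a hurry.

## Docs

- [Quick start](https://example.com/docs/quickstart.md): get up and running
- [API reference](https://example.com/docs/api.md): endpoints and auth details
```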
Run the scan. It’s better to know your site’s blind spots now. That way, when your CEO inevitably asks if the platform is "AI-ready," you can smugly reply that you've already handled it.