Is your site agent-ready?

Scan any website to check if it supports the standards AI agents rely on — robots.txt rules, llms.txt, MCP, OAuth discovery, and more.

What do we check?
  • Discoverability: robots.txt, Sitemap, Link headers
  • Content Accessibility: Markdown content negotiation, llms.txt
  • Bot Access Control: AI bot rules, Content Signals, Web Bot Auth
  • Protocol Discovery: MCP Server Card, Agent Skills, WebMCP, API Catalog, OAuth discovery, OAuth Protected Resource
  • Commerce: x402 Payment, MPP, UCP, ACP
What's the easiest way to improve?

Start with the easy wins: publish a valid robots.txt with AI bot rules and a sitemap directive, then add an /llms.txt file describing your site for language models.
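For example, a minimal robots.txt covering those two wins might look like the sketch below. GPTBot stands in for one AI crawler's user agent, and the sitemap URL is a placeholder to replace with your own:

```text
# Allow one AI crawler by its user agent (GPTBot is just an example)
User-agent: GPTBot
Allow: /

# Default rule for all other bots
User-agent: *
Allow: /

# Sitemap directive pointing crawlers at your sitemap
Sitemap: https://example.com/sitemap.xml
```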

What is llms.txt?

An llms.txt file (at /llms.txt) provides a curated, LLM-friendly overview of your site — similar to what robots.txt does for crawlers, but optimised for language models.
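There is no single required layout, but the llms.txt proposal suggests a short Markdown file: a top-level heading with the site name, a brief summary, then sections of annotated links. A hypothetical example, with all names and URLs as placeholders:

```markdown
# Example Co

> Example Co makes project-tracking software. The links below cover the pages most useful to a language model answering questions about the product.

## Docs

- [Quickstart](https://example.com/docs/quickstart): Installation and first steps
- [API reference](https://example.com/docs/api): Endpoints and authentication

## Optional

- [Blog](https://example.com/blog): Release notes and announcements
```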

What is MCP?

The Model Context Protocol (MCP) is an open standard that lets AI assistants connect to external data sources and tools. A server card at /.well-known/mcp.json advertises your MCP server to agents.
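The quickest way to see whether a site publishes a server card is to request that path directly. The sketch below does only that check; it assumes the card is served as JSON and deliberately does not validate any particular schema:

```typescript
// Sketch: probe a site for an MCP server card at the well-known path mentioned above.
async function fetchMcpServerCard(origin: string): Promise<unknown | null> {
  const url = new URL("/.well-known/mcp.json", origin);
  const res = await fetch(url, { headers: { Accept: "application/json" } });
  if (!res.ok) return null; // nothing published (404, 403, ...)
  return res.json();        // card contents; schema not validated here
}

// Example usage (Node 18+, Deno, or a browser):
fetchMcpServerCard("https://example.com").then((card) => {
  console.log(card ? "MCP server card found" : "No MCP server card");
});
```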

Results are informational. Accuracy may vary.