See if AI systems can actually find and cite your site.

Beaconly checks your robots.txt, llms.txt, structured data, and page signals against the configuration patterns that AI crawlers expect. Enter any URL to get a full report with specific fixes.

beaconly.orygn.tech

Beaconly audit tool interface

What we check

Three layers of AI discoverability, audited in seconds.

Beaconly runs checks across the three areas AI crawlers actually look at when deciding whether to index and cite a site. Each check returns a specific pass or fail with a fix you can act on immediately.

Tier 1: AI Crawler Access

Checks your robots.txt for explicit AI crawler permissions and validates your llms.txt structure and sitemap. Covers GPTBot, ClaudeBot, PerplexityBot, Google-Extended, and six other crawlers by name.
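
For illustration, a robots.txt that explicitly permits the major AI crawlers and declares a sitemap might look like this (example.com is a placeholder; extend or restrict the list to match your own policy):

```txt
# Allow named AI crawlers (illustrative policy, not a recommendation)
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

Sitemap: https://example.com/sitemap.xml
```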

Tier 2: Schema and Structured Data

Inspects your JSON-LD for the entity signals, freshness markers, and content types AI systems use when deciding what to cite. Checks Organization identity, sameAs links, dateModified, FAQPage schema, and more.
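
A minimal JSON-LD block carrying these signals might look like the following sketch (all names and URLs are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebPage",
  "dateModified": "2025-06-01",
  "publisher": {
    "@type": "Organization",
    "name": "Example Co",
    "url": "https://example.com",
    "sameAs": [
      "https://github.com/example",
      "https://www.linkedin.com/company/example"
    ]
  }
}
</script>
```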

Tier 3: Page Structure

Reviews your meta tags, heading structure, Open Graph data, canonical URL, HTTPS, and response speed. These are the page-level signals AI crawlers read before anything else.
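
As a rough sketch, the page-level signals in question live in the document head, along the lines of (placeholder values throughout):

```html
<head>
  <title>Example Co - What we do</title>
  <meta name="description" content="One-sentence summary of the page.">
  <link rel="canonical" href="https://example.com/">
  <meta property="og:title" content="Example Co">
  <meta property="og:description" content="One-sentence summary of the page.">
  <meta property="og:image" content="https://example.com/og.png">
</head>
```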

Specific fixes, not grades

Every failed check returns an explanation of why it matters and an exact fix you can implement. No letter grades or vague scores: just what is wrong and how to address it.

Built with

Server-side auditing with a fast, lightweight frontend.

Beaconly uses a Cloudflare Worker to fetch and analyze each target domain server-side, with all checks running in parallel. The frontend is vanilla HTML, CSS, and JavaScript with no build step and no external dependencies.
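
The parallel fan-out pattern described above can be sketched roughly as follows (hypothetical helper names, not Beaconly's actual source; each check is assumed to return an object like { pass, fix? }):

```javascript
// Run every check concurrently against one target domain.
// A crashed check is reported as a failure rather than aborting the audit.
async function runAudit(domain, checks) {
  const results = await Promise.all(
    checks.map(async ({ name, run }) => {
      try {
        const outcome = await run(domain);
        return { name, ...outcome };
      } catch (err) {
        return { name, pass: false, fix: `Check could not run: ${err.message}` };
      }
    })
  );
  return results;
}
```

In a Worker, each run function would typically fetch() the target's robots.txt, llms.txt, or HTML and inspect it; Promise.all lets all checks share one wall-clock round trip instead of running sequentially.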

Cloudflare Pages · Cloudflare Workers · Vanilla JS · robots.txt parsing · JSON-LD analysis · llms.txt validation

FAQ

Common questions about AI discoverability.

What is Beaconly?
Beaconly is a free tool that audits whether a website is properly configured for AI crawler discovery and citation. It checks robots.txt permissions, llms.txt structure, JSON-LD schema signals, and page structure against the patterns AI systems like ChatGPT, Perplexity, and Claude use when indexing and citing content.

Why does AI discoverability matter?
AI systems like ChatGPT, Perplexity, and Google AI Overviews actively crawl the web and cite sources in their answers. If your site is not configured correctly, these systems cannot access, understand, or reference your content, regardless of how good it is.

What is llms.txt?
llms.txt is a file you place at the root of your domain that gives AI models a structured summary of your site, including what it is, what it covers, and which pages are most important. It follows a simple Markdown format and helps AI systems understand your site without having to crawl every page.
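
An llms.txt following the proposed format looks roughly like this (illustrative names and URLs):

```markdown
# Example Co

> One-line summary of what the site is and who it is for.

## Key pages

- [Docs](https://example.com/docs): Product documentation
- [Pricing](https://example.com/pricing): Plans and pricing
```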

Does passing these checks guarantee AI systems will cite my site?
No. These checks cover the technical prerequisites for AI discoverability. Whether AI systems ultimately cite your content also depends on content quality, authority, and relevance to the queries being asked. Think of this audit as making sure the door is open, not guaranteeing visitors will walk through it.

Is Beaconly free to use?
Yes. Beaconly is a free tool published by Orygn LLC. No account or signup is required.

Built by Orygn

Beaconly is one of several tools Orygn has built around AI visibility and security.

Orygn builds custom software, security tooling, and infrastructure-level systems. Beaconly came out of the same work we do for clients, applied to the problem of making sites discoverable to the AI systems people are increasingly using to find information.

Work with Orygn