Watch the live training this came from
This article is drawn from Shanee Moret's Day 2 live training on Codex, websites, agent-ready infrastructure, and real business-owner implementation.
Watch the replay →
Most business owners assume their biggest visibility problem is marketing. Post more. Run ads. Show up consistently on social.
That assumption was built for an internet where humans do the searching.
The internet is no longer exclusively that.
When someone needs a service provider today, they increasingly deploy an agent to research on their behalf. That agent does not scroll Instagram. It crawls your website, reads your robots.txt, evaluates your published content, and determines whether you are a credible expert in the category the human needs. A site that blocks the wrong crawlers does not underperform in agent search results — it disappears from them entirely.
There is now a test for this. It is at isitagentready.com. And during a recent live session with established business owners, the scores were: 17. 0. 25.
These were not new businesses. These were owners with years of experience, real client histories, and functioning offers. One of them discovered, mid-session, that their site was actively configured to block AI crawlers — a setting they had never knowingly chosen.
I scored 100. Not because I started there. Because I spent months iterating until I did.
This post teaches you what that score measures, why it matters for your revenue, and what to fix first.
For the complete framework on building an agent-ready business, read the full guide.
What the Test Actually Measures
Watch me explain this live — including the moment audience members ran their scores in real time.
isitagentready.com is hosted by Cloudflare. It tests whether your website can be found, crawled, and understood by AI agents operating on behalf of human buyers.
The test does not evaluate your design. The test does not evaluate your copywriting. It evaluates whether the infrastructure of your site makes you accessible to non-human visitors — the agents your potential customers will send to vet you before they ever make contact themselves.
A human visitor can navigate a slow-loading site, infer expertise from tone, and work around a confusing menu. An AI agent operates on structured data. It reads your robots.txt file to determine whether crawling is permitted at all. It looks for consistent published content as evidence of expertise. It checks whether your site architecture is parseable by a non-human reader. Fail on any of those dimensions and the agent reports back to its human: no credible expert found at that domain.
Your years of experience do not transfer to that agent automatically. You have to build the infrastructure that communicates them.
The Two-Layer Foundation Every Agent Needs to Find You
I think about agent visibility in terms of two foundational requirements. Both must be working before anything else on your site matters.
Layer 1: Crawl Access — Your robots.txt Configuration
Your robots.txt file is a small text document that tells crawlers what they are and are not allowed to access on your site. It was originally designed for search engine bots. Every AI agent that attempts to crawl your domain reads it.
Many business owners have never looked at this file. Install a security plugin, migrate platforms, or use a theme with built-in crawler restrictions, and your robots.txt may be blocking agents you want to allow, without your knowledge. Different AI systems identify their crawlers differently. "GPTBot" is OpenAI's crawler, the one that feeds ChatGPT. "Google-Extended" controls whether Google's AI systems can use your content. Each is a separate entry in the file, and each has different consequences depending on where your buyers are searching.
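To make this concrete, here is a minimal sketch of a robots.txt that explicitly permits the two crawlers named above. Treat it as an illustration, not a template: your platform may generate additional rules, other AI systems use their own user-agent names, and the right policy depends on which crawlers you actually want to allow.

```
# Illustrative robots.txt: explicitly allow the two AI crawlers discussed above.
# Your CMS or security plugin may add rules of its own.

User-agent: GPTBot
Allow: /

User-agent: Google-Extended
Allow: /

# A blocking configuration looks like this instead:
# User-agent: GPTBot
# Disallow: /

Sitemap: https://yourdomain.com/sitemap.xml
```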
If your site blocks ChatGPT's crawler and your target clients are using ChatGPT to find service providers, no amount of content production will fix that. The agent searching on their behalf will never reach you. Fix layer 1 before doing anything else.
Layer 2: Expertise Proof — Consistent Long-Form Published Content
When an AI agent attempts to verify expertise, it looks for proof of sustained, substantive engagement with a subject. Short-form social posts do not qualify. Ad copy does not qualify. A static service page does not qualify. Indexed long-form material — video, newsletters, articles — published consistently over time, qualifies.
Long-form video carries the heaviest weighting when agents evaluate expertise credibility. Every long-form piece you publish and index is a permanent addition to your agent-credibility record. A piece published in 2023 still contributes to your score today. The business owners who started this early have a structural advantage that is widening every month.
| Signal Type | Agent Weighting | Common Mistake |
|---|---|---|
| Long-form video (indexed) | High — primary proof signal | Publishing video but never indexing it on the website |
| Newsletter / long-form articles | High — expertise documentation | Writing newsletters that never connect to the site |
| Short-form social posts | Low — insufficient expertise signal | Treating social activity as a substitute for long-form |
| Static service pages | Low — no temporal expertise proof | Relying on homepage copy alone |
| robots.txt (correct config) | Gate — must pass to access above signals | Never reviewing the file after platform migrations |
When layer 1 and layer 2 are both broken, the score is 0 or near it. Layer 1 failure means the agent cannot enter. Layer 2 failure means the agent enters, finds no evidence, and leaves. A score of 17 or 25 almost always reflects partial failures on both layers: incomplete crawl access, thin long-form content, or some mix of the two.
Common Mistakes When Business Owners First Run the Test
Treating the score as a reflection of content quality. The score measures infrastructure and crawlability. A site with excellent writing and a broken robots.txt will still score near zero.
Fixing design instead of architecture. After seeing a low score, many business owners instinctively update their homepage design or rewrite their headline. Neither of those changes the agent-readiness score. The score responds to robots.txt configuration, structured content, and crawl access.
Assuming the problem is temporary. One business owner in the live session had a score of 0. When they asked Codex to review their site, Codex confirmed the site was actively blocking AI crawlers. That was a configuration requiring a deliberate fix — not a temporary indexing lag.
Waiting until the site is "ready" before fixing crawlability. Buyers are already deploying agents to research vendors in your category. Every day your site blocks those agents is revenue those agents are directing elsewhere. Fix layer 1 before anything else — it takes minutes once you know what to look for.
What to Do After You Run the Score
- Go to isitagentready.com and run your domain. Write the score down.
- Pull up your robots.txt file — it is at yourdomain.com/robots.txt — and read it. Look for any entries that block major AI crawlers by name; a quick way to check this programmatically is sketched after this list.
- Ask Codex: "Review my website at [URL] and tell me what an AI agent searching on behalf of a potential client would find — and what it would not be able to access."
- Count how many pieces of long-form indexed content you have published in the last 12 months. Be honest about the number.
- Identify one recurring long-form format you can commit to and connect to your website for automatic indexing going forward.
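If you want to run the robots.txt check from step 2 programmatically, the short Python sketch below uses the standard library's robotparser to ask whether a couple of AI crawler user-agents are permitted to fetch your homepage. The domain and crawler names are placeholders; substitute your own, and treat the output as a starting point rather than a verdict.

```python
# Minimal sketch: check which AI crawler user-agents your robots.txt allows.
# Assumes the file is served at /robots.txt; the crawler names are examples only.
from urllib import robotparser

SITE = "https://yourdomain.com"           # replace with your domain
CRAWLERS = ["GPTBot", "Google-Extended"]  # add any other crawlers you care about

parser = robotparser.RobotFileParser()
parser.set_url(f"{SITE}/robots.txt")
parser.read()  # fetches and parses the live file

for agent in CRAWLERS:
    allowed = parser.can_fetch(agent, f"{SITE}/")
    print(f"{agent}: {'allowed' if allowed else 'BLOCKED'} for {SITE}/")
```

If a crawler shows as blocked and you never chose that, look first at security plugins, platform settings, and theme defaults; those are the configurations that most often write these rules without the owner knowing.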
My score of 100 is the result of months of iteration. The score you see today is your baseline — every structural fix moves it. Learn about the platform infrastructure that enables full agent access, and understand what a fully agent-ready site looks like once these foundations are in place.
The Principle
The agent economy is already running searches, already vetting vendors, already returning results to buyers who never visited a website themselves.
Visibility in the agent economy comes from being crawlable and consistently documented — built so that when an agent arrives at your domain, it finds evidence that you are exactly what its human is looking for.
Fix layer 1. Build layer 2. The rest compounds.
Use the skills behind this system
The Growth Academy Skills Dashboard includes 100+ Codex skills and prompts for SMB owners, including website audits, GitHub and Cloudflare setup, permissions, business intelligence, sales, and operations workflows.
See the Skills Dashboard →