We build a tool that audits websites for AI search visibility. The obvious question: what does our own site score?
We ran GEO Auditor on geoauditor.app on March 23, 2026. The score: 49/100 — Poor.
This post is an honest breakdown of what the audit found and what we're doing about it. We're publishing it because we think it's more useful than a polished success story — and because the findings are a perfect illustration of why GEO is different from traditional SEO.
The full score breakdown
The composite GEO score is built from six dimensions, each scored 0–100. Ours came out as:
- AI Crawler Access: 95
- Technical Health: 86
- Brand Authority: 4
- Content E-E-A-T: 32
- Content Citability: 53
- Schema Markup: 65
Overall: 49/100. That's a sobering number for the team that built the tool. But it's also exactly the point — and the reason we're writing this post.
What we're actually doing well
AI Crawler Access: 95/100
All 14 AI crawlers we check for are permitted. Our robots.txt has a blanket Allow: / rule, so GPTBot, ClaudeBot, PerplexityBot, Google-Extended, OAI-SearchBot, and every other AI bot can read every page. We have no firewall rules blocking bots, and all our pages are server-rendered (no JavaScript-only content).
The only gap: we don't have an llms.txt file yet. That's a quick fix.
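For reference, llms.txt is a plain markdown file served at the site root: an H1 title, a blockquote summary, and H2 sections listing key pages with links. A minimal sketch of what ours might look like (the page list is illustrative, not our final file):

```markdown
# GEO Auditor

> GEO Auditor audits websites for AI search visibility and scores them
> across six dimensions, from crawler access to brand authority.

## Pages

- [Home](https://geoauditor.app/): run a free audit in 45 seconds
```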
Technical Health: 86/100
Our site is deployed on Cloudflare Workers with full SSR. LCP, INP, and CLS are all in the green range. Canonical tags are present, sitemap is valid, HTTPS is enforced. The issues: we're missing several security headers (HSTS, CSP, X-Frame-Options) that can be added in one Cloudflare configuration change.
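The missing headers can be set in the Cloudflare dashboard, or directly in the Worker by wrapping every response. A sketch of the wrapper approach — the header values here are illustrative defaults, not our exact production policy:

```typescript
// Wrap a Worker response and attach the security headers the audit flagged.
// Values are illustrative; tighten the CSP to match your actual asset origins.
function withSecurityHeaders(response: Response): Response {
  const headers = new Headers(response.headers);
  headers.set(
    "Strict-Transport-Security",
    "max-age=31536000; includeSubDomains"
  );
  headers.set("Content-Security-Policy", "default-src 'self'");
  headers.set("X-Frame-Options", "DENY");
  headers.set("X-Content-Type-Options", "nosniff");
  return new Response(response.body, {
    status: response.status,
    statusText: response.statusText,
    headers,
  });
}
```

In a Worker's fetch handler, you'd return `withSecurityHeaders(await fetchOrigin(request))` instead of the raw response, so every page gets the same headers from one place.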
Where we're failing — and why it matters
Brand Authority: 4/100 — the most important finding
This is the score we're most embarrassed about, and the most instructive one.
GEO Auditor has:
- No YouTube channel
- No Wikidata entry
- No Wikipedia article
- No LinkedIn company page
- No confirmed Reddit presence
- No public founder identity or social links on the website
- No GitHub organization
The consequence: when ChatGPT or Perplexity encounters our content, it can't confidently attribute it to a known entity. We might get quoted, but we won't get named. To an AI knowledge graph, we're an anonymous website.
This is also the most fixable problem on the list. A Wikidata entry and LinkedIn company page can be created in an afternoon, at zero cost.
Content E-E-A-T: 32/100
The audit found ten issues here. The critical ones:
- No named authors or team credentials. The site has no About page with team members, no author bylines, no founder LinkedIn link. AI engines can't evaluate the expertise behind the tool.
- Zero content marketing — no blog, guides, or resources. (This was true at the time of the audit. We've since built and published 10 articles. You're reading one of them.)
- No external validation or press mentions. No coverage from third-party sources, no customer quotes, no case studies.
- No original research published. We have proprietary data from thousands of audits — average GEO scores across industries, most common blocking patterns, the distribution of brand authority scores — and we haven't published any of it.
Content Citability: 53/100
The audit analyzed our homepage and found that the FAQ section scores well for citability — it has answer-first structure, specific claims, and named references. But the hero, marketing, and feature sections are almost entirely non-citable: marketing language ("make your website visible to AI search") doesn't produce quotable passages.
The citability engine also noted:
- Statistics per 500 words: 2 (low — aim for 5+)
- Average passage length: 70 words (good for quotability)
- Original research: false (no proprietary data published)
- Expert quotes: false
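To make the "statistics per 500 words" metric concrete, here's a rough sketch of how such a density measure can be computed. This is an illustration of the idea, not the actual citability engine, which weighs more signals than digit-bearing tokens:

```typescript
// Rough density metric: count word tokens containing a digit or a percent
// sign as "statistics", then normalize the count per 500 words of text.
function statsPer500Words(text: string): number {
  const words = text.split(/\s+/).filter(Boolean);
  if (words.length === 0) return 0;
  const statTokens = words.filter((w) => /\d/.test(w) || w.includes("%"));
  return (statTokens.length / words.length) * 500;
}
```

By this measure, marketing copy like "make your website visible to AI search" scores zero, while a sentence with concrete figures scores well above the 5-per-500 target.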
Schema Markup: 65/100
We have five JSON-LD blocks: Organization, WebSite, SoftwareApplication, WebPage (with Speakable), and FAQPage. They're all well-formed and server-rendered. The critical gap: sameAs count: 0. Our Organization schema has no links to Wikidata, LinkedIn, or any other external profile. The schema correctly declares who we are, but provides no way for AI engines to verify that identity against a knowledge graph.
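Closing that gap is a small change to the Organization block. A sketch of what the fixed schema would look like, with placeholder URLs where the real Wikidata and LinkedIn profiles will go once they exist:

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "GEO Auditor",
  "url": "https://geoauditor.app",
  "sameAs": [
    "https://www.wikidata.org/wiki/...",
    "https://www.linkedin.com/company/..."
  ]
}
```

The sameAs array is what lets an AI engine match the name in our markup against independent entity records, which is exactly the verification path the audit says we're missing.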
The fix plan — in order of effort and impact
This week (zero engineering required):
- Create a Wikidata entry for GEO Auditor
- Create a LinkedIn company page
- Add sameAs links to the Organization schema (5-minute code change)
- Add security headers in Cloudflare dashboard (2 minutes)
- Create public/llms.txt
This month:
- Add an About page with founder/team names and credentials
- Publish original research: "We audited 500 sites. Here's what we found."
- Get press coverage or at least one third-party mention
- Add more statistics and data to core landing page copy
What this tells us about GEO in general
The highest-scoring dimension on our site is the most technical one (crawler access). The lowest-scoring dimensions are the most human ones (brand presence, content depth, author identity). This pattern is consistent across nearly every site we audit.
Most developers and technical founders build sites that are technically correct but brand-invisible. The robot can read your page. It just doesn't know who you are.
The fixes that matter most for AI search visibility are not code changes. They're the offline, human-facing work: establishing your brand as a real entity with a verified presence, publishing content that demonstrates genuine expertise, and giving AI engines the structured signals they need to attribute your content by name.
Our own score of 49/100 is a reminder that technical correctness is the easy part.
We'll publish a follow-up post once we've implemented the fix plan above. Run your own audit to see where you stand — it takes 45 seconds, and it's free.