Tags: ai-visibility · geo · structured-data · llms-txt · json-ld

The Invisible Website That Fooled Every AI — And What It Means for Your Business

April 21, 2026 · OpenClaw Phnom Penh

Thirteen days ago, a completely blank website became the number-one cited source in Perplexity. No content. No text. No images. Just a white page with seven layers of structured data hidden underneath.

ChatGPT cited it independently. Google indexed it.

The site is phantomauthority.ai, and the person behind it is Sascha Deforth, who published the results as an IETF Internet-Draft on April 18, 2026.

His conclusion: RAG systems have zero content provenance verification. To a retrieval pipeline, well-structured signals are indistinguishable from verified facts.

For businesses in Phnom Penh trying to get found by AI, this is either terrifying or an enormous opportunity. We think it's both — and we've turned it into something practical.

What is the “Ghost Stack”?

Deforth's experiment used seven layers of machine-readable signals layered into an otherwise empty webpage:

  1. Semantic meta tags + “VibeTags” — brand metadata in the <head> that crawlers and social platforms read automatically
  2. JSON-LD structured data — schema.org markup (Organization, Person, Service, FAQPage) that Google uses for rich results and AI systems use for entity extraction
  3. sr-only narrative — content in the DOM that's accessible to screen readers and DOM-parsing AIs but doesn't render visually
  4. Microdata attributes — inline itemscope/itemprop markup for entity extractors
  5. llms.txt — an emerging standard file (like robots.txt but for LLMs) that provides a concise markdown summary of your site at /llms.txt
  6. reasoning.json — structured claims following the Agentic Reasoning Protocol (ARP), with Ed25519 cryptographic signatures
  7. AI discovery manifest — a /.well-known/ai-manifest.json file that AI bots can discover automatically
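To make layer 2 concrete, here is a minimal JSON-LD block of the kind these systems extract entities from. The organisation details below are placeholders, not taken from phantomauthority.ai:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "https://example.com",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Phnom Penh",
    "addressCountry": "KH"
  },
  "sameAs": ["https://www.linkedin.com/company/example"]
}
</script>
```

A page can carry several such blocks (Organization, Person, Service, FAQPage), and crawlers read each independently.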

Individually, each layer is well-documented. Together, they create what Deforth calls a “phantom authority” — a site that machines trust because it speaks their language fluently, even when no human has ever read a word on the page.

Why This Matters for Your Business

Here's the shift that most businesses haven't noticed yet:

People aren't just searching Google anymore. They're asking ChatGPT, Perplexity, Gemini, and Claude. And those systems don't rank pages the way Google does. They extract structured data, resolve entities, and synthesise answers from the most machine-readable sources they can find.

If your website isn't structured for AI consumption, you're invisible to the fastest-growing segment of search.

The Ghost Stack experiment proves that structured data alone — without any visible content — is enough to become a primary citation source. Now imagine what happens when you combine those same seven layers with actual content, real services, genuine expertise.

That's no longer phantom authority. That's just good infrastructure.

What We're Doing With This

We've packaged the seven-layer architecture into a reusable agent skill that we deploy as part of our Custom AI Workflows service.

The skill — GEO Ghost Stack — does two things:

  1. Audit mode: Scans your existing site and tells you exactly which layers are present, missing, or broken — with a 0–100 score.
  2. Build mode: Scaffolds the missing layers using your real business facts — services, FAQs, pricing, team info, location data.
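The audit mode described above can be sketched as a simple presence check over the seven layers. This is a minimal illustration, not the actual skill; the layer names, detection heuristics, and equal weighting are all assumptions made for the sketch:

```python
# Naive detection heuristics for the seven Ghost Stack layers.
# Real tooling would parse the DOM and fetch the files over HTTP;
# these substring checks are illustrative stand-ins.
LAYER_CHECKS = {
    "meta_tags":      lambda html, files: '<meta name=' in html,
    "json_ld":        lambda html, files: 'application/ld+json' in html,
    "sr_only":        lambda html, files: 'sr-only' in html,
    "microdata":      lambda html, files: 'itemscope' in html,
    "llms_txt":       lambda html, files: '/llms.txt' in files,
    "reasoning_json": lambda html, files: '/reasoning.json' in files,
    "ai_manifest":    lambda html, files: '/.well-known/ai-manifest.json' in files,
}

def audit(html: str, files: set[str]) -> tuple[int, list[str]]:
    """Return a 0-100 score and the list of missing layers."""
    missing = [name for name, check in LAYER_CHECKS.items()
               if not check(html, files)]
    score = round(100 * (len(LAYER_CHECKS) - len(missing)) / len(LAYER_CHECKS))
    return score, missing
```

With equal weights, a site that ships only JSON-LD and llms.txt scores 2 of 7 layers, or 29.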

We don't do phantom authority. Every layer we deploy is backed by real content and real claims. But the architecture is the same, because it works.

What a typical implementation includes

  • JSON-LD schemas wired to your actual services, pricing, team, and FAQs — not generic placeholders
  • Meta tags with geo-targeting for Phnom Penh, proper Open Graph for social sharing, and VibeTags for AI extractors
  • llms.txt — a concise markdown summary of your site that LLMs can read directly (we added one to this very site at /llms.txt)
  • AI discovery manifest at /.well-known/ai-manifest.json so AI bots can find and parse your site automatically
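For reference, the llms.txt format in the community spec is just markdown with a fixed shape: an H1 with the site name, a blockquote summary, then H2 sections containing link lists. A skeletal example, with every name and URL a placeholder:

```markdown
# Example Co

> AI workflow agency in Phnom Penh, Cambodia. We build custom AI
> workflows, automation, and AI-visibility infrastructure.

## Services

- [Custom AI Workflows](https://example.com/services/ai-workflows): agent skills and automation
- [AI Visibility Audits](https://example.com/services/geo): structured-data audits and builds

## Contact

- [Contact page](https://example.com/contact): bookings and enquiries
```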

The Provenance Problem (And Why Honesty Wins)

Deforth's experiment exposed a genuine vulnerability: AI systems will cite well-structured information regardless of whether it's true. His proposed fix — the Agentic Reasoning Protocol — brings DKIM-style cryptographic verification to AI-consumed data, anchoring claims to DNS records the way email anchors sender identity.

We think this is the right direction. In the meantime, the lesson for businesses is simple:

Structure your real expertise properly, and AI systems will find you. Fabricate expertise, and eventually the verification layer will catch up with you.

We build for the long game.

Get Started

Want to know how AI-readable your current site is? We'll run the audit and show you exactly where the gaps are — no obligation.

Credits: The Ghost Stack architecture and ARP protocol are the work of Sascha Deforth. The phantomauthority.ai experiment and IETF draft-deforth-arp-00 are his publications. The llms.txt specification is a community standard maintained by Answer.AI. Our skill is an independent implementation of these concepts.