George is right, and the implications are bigger than they look. Why does this matter?
Most brands, and their SEO teams, are still optimizing for content volume. They're publishing more posts, chasing more impressions, recalibrating keywords, and measuring traffic. Yet, AI search engines are doing something completely different. They're following digital trails of evidence, weighing where the evidence (a.k.a. proof points) comes from, and deciding whom to recommend based on what the digital trail actually shows.
The brands winning citations in the four major AI search engines (ChatGPT, Claude, Perplexity, and Gemini) right now aren't publishing more than their competitors. They're simply leaving more evidence.
The Old SEO + Content Playbook Is Quietly Dying
Despite these seismic shifts happening beneath everyone's feet, the old playbook for getting found hasn't caught up to how AI engines work.
Flash back to circa 2010. For 15 years, the rules were clear: publish a lot of content, optimize it for keywords, promote it on social, measure traffic, time on page, and conversions. Rinse and repeat. The brand that built the loudest content engine won the search results page.
Flash forward to today and that model is dying. The metrics that used to measure success are quietly leading brands off a cliff.
Half of B2B buyers now start their research on ChatGPT, not Google. Fifty-four percent use it to build their vendor shortlists. Nine out of ten pick the first recommendation an AI platform presents. ChatGPT, Claude, Perplexity, and Gemini cite only two to seven sources per answer, on average.
Notice what is NOT on that list of inputs:
- Your blog traffic.
- Your social engagement.
- Your time-on-page metrics.
AI engines aren't looking at any of those old metrics when they decide which companies to cite to a prospective buyer.
So the brands that are most "active" in the old sense are often the most invisible in the new one.
How AI Engines Actually Decide Who to Cite
To understand why a digital trail matters more than volume, it helps to think about what an AI engine actually does when someone asks it a purchase-intent question.
It doesn't search the web in real time and pick the best blog post. Instead, it synthesizes from a model that has already been trained on, and recently augmented with, a wide consensus of sources, often recent ones. It looks for cross-referenced mentions and weighs the credibility of where those mentions appear. It checks whether different online sources agree on who you are, what you do, and whether you're authoritative enough to recommend.
A digital trail, in this context, is therefore the cumulative evidence of your expertise across the surfaces AI engines actually trust.
Earned media coverage. Community discussions on Reddit, Hacker News, and niche forums. Third-party reviews on G2, Trustpilot, and category-specific platforms. Analyst commentary. Structured directories. Cross-platform consistency in how your brand is named and described.
Each of those is a breadcrumb. Collectively, they should form a rich digital trail that any AI engine can follow, verify, and use to confidently cite you when someone asks the right kind of question, anywhere along the buyer journey.
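As a rough mental model, you can picture this as a credibility-weighted count of mentions across surfaces. To be clear, this is a toy illustration, not how any engine actually scores sources; every weight and surface name below is invented for the sketch:

```python
# Toy model of "digital trail" strength: a credibility-weighted
# count of brand mentions across third-party surfaces.
# All weights are illustrative assumptions, not real engine internals.

SURFACE_WEIGHTS = {
    "earned_media": 0.9,   # trade-press coverage
    "community": 0.8,      # Reddit / forum threads
    "reviews": 0.85,       # G2, Trustpilot, etc.
    "analyst": 0.9,        # analyst or expert commentary
    "directory": 0.7,      # Crunchbase, Wikipedia, LinkedIn
    "owned_blog": 0.2,     # your own domain: weak, self-reported evidence
}

def trail_strength(mentions: dict[str, int]) -> float:
    """Sum mentions per surface, weighted by assumed credibility."""
    return sum(SURFACE_WEIGHTS.get(surface, 0.0) * count
               for surface, count in mentions.items())

# A brand with 20 blog posts vs. one with 5 cross-surface breadcrumbs:
volume_brand = {"owned_blog": 20}
trail_brand = {"earned_media": 1, "community": 1, "reviews": 1,
               "analyst": 1, "directory": 1}

print(trail_strength(volume_brand))  # 4.0
print(trail_strength(trail_brand))   # slightly higher, from 5 breadcrumbs
```

The point of the toy model: five verifiable breadcrumbs on trusted surfaces can outweigh twenty posts on a surface the models discount.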
Brent van der Heiden, the CEO of MapAtlas, made the structural version of this point in the same Neil Patel LinkedIn post and thread: structured data and entity recognition are the missing layer most brands ignore. If different sources call you slightly different things, or if your company messaging varies across platforms, an AI engine can't connect the trail. The breadcrumbs don't lead anywhere coherent.
Ben Wise, head of programmatic media at Google, validated something that should make every social media manager nervous: social clout does not translate when AI models prioritize formal citations and structured data. Your viral LinkedIn post with 500 likes and 100 reposts doesn't feed the model. A Reddit thread, a G2 review, and a contributed article do, and they carry far more weight when the model decides whom to cite.
The Five Surfaces AI Engines Trust
A digital trail isn't a single post or a single press hit. It's an interconnected web of signals across at least five different surfaces.
Earned media. Coverage in publications that AI engines weight as authoritative. Tier-2 trade pubs and category-specific outlets often perform better than top-tier mainstream outlets, because many of the largest publications have implemented AI crawl blockers that prevent their content from reaching the models in the first place. Approximately 73% of sites are now blocking AI crawlers. The implications for media strategy are significant.
Community channels. Reddit, Quora, niche forums, and Slack groups where buyers authentically compare options without brand involvement. Around 52.5% of what AI platforms cite comes from community channels, not from corporate websites. Most brands have no presence on the surfaces where the actual conversation happens.
Review platforms. Platforms like G2, Capterra, Trustpilot, Glassdoor, Yelp, and category-specific review sites. These are formal citations that are structured for machine consumption and weighted heavily.
Analyst and expert commentary. When credible third parties cite or quote you in a category context, AI engines treat that as a strong signal of relevance and credibility.
Structured directories. Crunchbase, Wikipedia, LinkedIn company pages, and industry-specific directories. The data here is highly structured, easy for machines to ingest, and forms the entity backbone an AI engine builds its understanding around.
When all five surfaces tell a consistent story about who you are and what you do, an AI engine has a coherent digital trail it can follow with confidence. When the surfaces are inconsistent, sparse, or missing, the engine has nothing to cite, and someone else (your competitor) gets the recommendation (citation) instead.
Why Content Volume Doesn't Equal Visibility
Most companies are stuck in a dated volume mindset and, worse, don't know it.
They confuse impressions for evidence. They count blog posts published, LinkedIn posts shared, and webinars hosted, then assume that activity equals visibility and authority. They treat their owned channels as the main game, often the only game, and treat earned media, community channels, and third-party surfaces as nice-to-haves.
That logic worked back when Google's algorithm rewarded content volume, keywords, and backlinks. It does not work today, when AI engines reward consensus and evidence.
Klemen Gradisar called this out in the same Neil Patel LinkedIn thread: companies still measuring content success with traffic instead of AI citations are measuring the wrong thing. The dashboard says everything is working. The AI search results say the brand is invisible.
The fix isn't to publish less. The fix is to think about every piece of content as a digital breadcrumb that needs a place to land. A single blog post on your domain is one breadcrumb, and one that both AI engines and human visitors take with a grain of salt. The same expertise echoed as a contributed article in a category publication, a relevant Reddit thread, an analyst quote, and a podcast appearance gives you five interconnected breadcrumbs that AI engines can verify across sources.
That's what a strong digital trail looks like.
The Playbook: Five Steps to Get Cited
If your goal is to be cited consistently across AI engines, the work shifts from publishing to placing, and from broadcasting to building consensus.
1. Audit your existing digital trail. Run searches on yourself across ChatGPT, Claude, Perplexity, and Gemini using buyer-intent queries. Note what gets cited and what doesn't. Note the surfaces the engines pull from. Most teams are surprised, often panicked, at how thin and inconsistent their existing trail looks, if it shows up at all.
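The audit in step 1 can be semi-automated once you've saved the answers. A minimal sketch, where the brand name, queries, and transcripts are all placeholders (in practice you'd collect real answers from each engine's app or API):

```python
import re

# Hypothetical saved transcripts: engine -> list of answer texts,
# collected by running the same buyer-intent queries on each engine.
transcripts = {
    "chatgpt": ["Top CRM tools include AcmeCRM (acmecrm.example) and others."],
    "perplexity": ["Sources: g2.com, reddit.com/r/sales. AcmeCRM is cited."],
    "gemini": ["Popular options are RivalCRM and OtherCRM."],
}

def audit(brand: str, transcripts: dict[str, list[str]]) -> dict[str, int]:
    """Count how many saved answers per engine mention the brand at all."""
    pattern = re.compile(re.escape(brand), re.IGNORECASE)
    return {engine: sum(1 for answer in answers if pattern.search(answer))
            for engine, answers in transcripts.items()}

print(audit("AcmeCRM", transcripts))
# {'chatgpt': 1, 'perplexity': 1, 'gemini': 0}
```

A zero for an engine is exactly the panic moment described above: that engine's trail of evidence about you is too thin to surface.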
2. Identify your category surfaces. Where does your category get discussed when no marketer is in the room? Reddit subreddits, Quora threads, niche communities, specific G2 categories. Map the five to ten places where buyers actually talk about products like yours. That's your real distribution map. Not another round of keyword recalibration.
3. Build cross-surface consistency. Make sure your name, description, and category positioning are identical everywhere you appear online: LinkedIn, Crunchbase, Wikipedia, your website, and your directory listings. Small inconsistencies cause AI engines to lose the digital trail. Cross-platform precision is unsexy, but it matters far more than another blog post.
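The consistency check in step 3 is also easy to script. A sketch with invented listing data (in practice you'd pull each profile by hand or via scraping):

```python
# Flag fields whose values differ across your directory listings.
# The listing data below is made up for illustration.
listings = {
    "website": {"name": "AcmeCRM", "category": "Sales CRM"},
    "linkedin": {"name": "AcmeCRM", "category": "Sales CRM"},
    "crunchbase": {"name": "Acme CRM Inc.", "category": "CRM Software"},
}

def inconsistencies(listings: dict[str, dict[str, str]]) -> dict[str, set[str]]:
    """Return each field that has more than one distinct value across platforms."""
    fields: dict[str, set[str]] = {}
    for profile in listings.values():
        for field, value in profile.items():
            fields.setdefault(field, set()).add(value)
    return {field: values for field, values in fields.items() if len(values) > 1}

for field, variants in inconsistencies(listings).items():
    print(f"{field}: {sorted(variants)}")
```

Every field this flags is a fork in the breadcrumb trail: two sources that an engine can't confidently resolve to the same entity.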
4. Earn media coverage in surfaces that feed the AI models. Tier-2 trade publications and online news sites, category-specific outlets, and contributed articles often do more for AI visibility than a single hit in a mainstream publication that has blocked AI crawlers. Choose your media targets accordingly.
5. Measure citations, not clicks. The dashboard that matters today tracks citation frequency across major AI engines, share of voice when your category is discussed, and accuracy of how AI describes your brand. Traffic remains useful, but it is downstream of the new scoreboard.
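The scoreboard in step 5 reduces to two numbers per brand. A sketch over a made-up answer log (engine, query, and brand names are all placeholders):

```python
# The "new scoreboard": citation frequency and share of voice,
# computed from logged AI answers. All data below is invented.
# Each record: (engine, query, brands cited in that answer).
answer_log = [
    ("chatgpt", "best sales crm", ["AcmeCRM", "RivalCRM"]),
    ("claude", "best sales crm", ["RivalCRM"]),
    ("perplexity", "crm for startups", ["AcmeCRM"]),
    ("gemini", "crm for startups", ["RivalCRM", "OtherCRM"]),
]

def citation_rate(brand, log):
    """Fraction of logged answers that cite the brand at all."""
    return sum(brand in cited for _, _, cited in log) / len(log)

def share_of_voice(brand, log):
    """Brand's share of all brand citations across the log."""
    total = sum(len(cited) for _, _, cited in log)
    mine = sum(cited.count(brand) for _, _, cited in log)
    return mine / total

print(citation_rate("AcmeCRM", answer_log))   # 0.5
print(share_of_voice("AcmeCRM", answer_log))  # 2 of 6 citations
```

Tracked weekly per engine and per query cluster, these two metrics replace traffic as the leading indicator.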
This is harder than publishing another blog post. It's also what is going to determine whether your brand gets cited in the next few years.
George Ilic's framing is correct, and it is going to age well.
The brands that win the next era of AI search aren't going to be the ones with the loudest content engine calibrated for AEO, SEO, GEO, etc. They're going to be the ones that left the most proof points in the right places, structured for machines to find, verify, and cite.
Posting a lot is SEO and content marketing thinking. Leaving a digital trail is something entirely different. Call it evidence infrastructure. We call it Machine Relations. Call it whatever you prefer. The point is the same: stop counting blog posts and traffic and start counting where your digital breadcrumbs lead.
The digital trails that exist today are the ones AI engines will be following for the next few years. The companies building them right now will own the consideration set, the ones AI engines actually cite. The ones still publishing into the void will wonder why their pipeline dried up completely.