1-minute Summary

The Past: SEO focused on keywords.

Now: The technical breakthroughs that produced LLMs like ChatGPT emerged from technology Google invented in 2017 (the Transformer architecture) and then built into its core ranking algorithms. Because the same technology now drives ranking, customized SEO “GPTs” costing $20/month give you intuitive tools that rapidly produce enduring SEO.

With Google’s ranking now based more on Entities (think topics) than on keywords, SEO done right, and done once, endures even as search engines evolve, because Entities represent the durable real world.

LLMs emerged from the same line of technology as Google’s 2012+ Entity Knowledge Graph and its 2021+ Multitask (and multimedia) Unified Model, MUM, so ranking now mimics the collective human interests of your market. Google now understands websites through multimedia and through user actions both within and surrounding a website. Therefore, optimizing your website structure, content, and conversion flow for the Entities in your market replaces laborious keyword research. Your customized GPT helps execute this optimization at 20x human speed, slashing SEO costs and thus rocketing ROI.

2 Bonus Points:

  1. “Generative Engine Optimization” (GEO) is mostly a salesy rebranding of traditional SEO.
  2. Firms claiming you can game LLMs via Q&A that cites your website don’t understand how LLMs work, or do understand and misrepresent it to gain clients. Either way, the tactic is almost certainly futile.

Overview: Search Engines Humanized

Designing your website’s overall “SEO Architecture & Writing” (SAW) will make the site rank well in perpetuity. Here’s why.

AI makes search engines work like collective human brains that think about things and not merely strings of text. Google’s Entity model of search understands topics as composites of all kinds of media and human actions. Imagine that all the people in your target markets are one mind, remembering everything related to your brand: as in human brains, keywords are now a small part of what drives that big AI brain. 

Therefore, a crucial and mostly one-time task in SEO today entails structuring your website content and conversion flow to reflect the Entities pertinent to your market. This work precedes and, for smaller budgets, can replace most SEO keywording. In other words, since 2022 the advice of a great 1960s adman would produce a compelling flow of intent through your site and to your bottom line.
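
To make this concrete: one common (though optional) way a page can declare its primary Entity is schema.org structured data. The following is a minimal, hypothetical Python sketch that emits such JSON-LD for an imaginary door-lock page; the page name and URLs are placeholders, not recommendations.

```python
import json

# Hypothetical example: declaring a page's primary Entity with schema.org
# JSON-LD so search engines can tie the page to a real-world "thing,"
# not merely to keyword strings. Names and URLs below are placeholders.
page_markup = {
    "@context": "https://schema.org",
    "@type": "WebPage",
    "name": "Smart Door Lock Buying Guide",
    "about": {                      # the page's main Entity
        "@type": "Thing",
        "name": "Smart lock",
        "sameAs": "https://en.wikipedia.org/wiki/Smart_lock",  # anchors the Entity
    },
}

# Print the JSON-LD payload a CMS would embed in a <script type="application/ld+json"> tag.
print(json.dumps(page_markup, indent=2))
```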

Although Google has completed only ~30% of its ongoing Entity coding (assigning “KGMIDs,” Knowledge Graph machine IDs) in its Knowledge Vault, the hierarchical connections within that 30% define most if not all of the main Entities addressed by most websites. Where Google has not yet established more specific sub-sub-Entities, traditional SEO keyword writing will improve rank while connecting to the more granular Entities coming soon.
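
If you want to see whether Google has already assigned a KGMID to an Entity you care about, Google’s public Knowledge Graph Search API returns those IDs. A hedged Python sketch follows; it assumes you have obtained an API key from Google Cloud, and the requests library is a third-party install.

```python
import requests  # third-party: pip install requests

API_KEY = "YOUR_GOOGLE_CLOUD_API_KEY"  # placeholder

def lookup_kgmid(term: str, limit: int = 3) -> None:
    """Query Google's Knowledge Graph Search API and print candidate Entities."""
    resp = requests.get(
        "https://kgsearch.googleapis.com/v1/entities:search",
        params={"query": term, "key": API_KEY, "limit": limit},
        timeout=10,
    )
    resp.raise_for_status()
    for item in resp.json().get("itemListElement", []):
        result = item.get("result", {})
        # "@id" holds the KGMID, e.g. "kg:/m/..." for an established Entity.
        print(result.get("@id"), "|", result.get("name"), "|", result.get("description"))

lookup_kgmid("smart lock")
```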

History: Google’s 2021+ MUM Begins This SEO Revolution

Now in “SEO Architecture & Writing” (SAW), the Writing includes any website content, not merely text. This brief history of the underlying tech explains:

  • The Multitask (and multimedia) Unified Model (MUM), which Google introduced in 2021, leverages advances in Large Language Models (LLMs) and other AI to understand the relationships among different modes of information, not merely text. MUM builds upon Google’s earlier work with the Knowledge Graph (see Introducing the Knowledge Graph: things, not strings, 2012). This Knowledge Graph (or Vault) relates real-world Entities to one another, much as humans do. MUM and later LLMs required the increased processing power of graphics processing units (GPUs) built for tensor math, a demand that rocketed Nvidia into the world’s top three companies by market capitalization.

By roughly mid-2022, SEO firms should have pivoted from keyword-based to Entity-based strategies. However, years of huge investments in keyword tools and procedures incentivize most SEO firms and tool suites to sell obsolete solutions. With most news for sale these days, even venerable publishers avoid explaining this, so that they don’t anger their big advertisers.

Now: Uniting Entity and LLM Models to Replace Managers

Replacement of experts and managers requires that LLMs (1) continue their current improvements in math and sequential reasoning, and (2) be allowed to act as “Agents” taking actions within workflows. Both have recently happened, but widespread availability remains 3 to 5 years away. However, large portions of most professions’ core tasks (for example, SEO and CRO writing, law, graphic design, coding, even CPA work, and plenty more) can be greatly accelerated even by earlier, less capable versions of LLMs. Professional firms with integrity will tell you this and price their services accordingly.

Uniting LLMs’ expert reasoning and freedom to act entails uniting real-world Entity structures with LLMs’ more fluid associations. Google and other big tech firms are now racing to accomplish this, and they announce monumental progress almost monthly.

For the tech-inclined reader, the following explanations may help you navigate our new LLM world:

  • Entity-based knowledge graphs and LLMs are fundamentally different. Knowledge graphs are highly structured, showing relationships among key nodes of meaning. Picture a city’s public transit map. LLMs, on the other hand, are more free-form, like a brain’s sprawling, tangled network of nerve connections, wherein topics are related by fine gradations of similarity (numbers in a vector database). A toy sketch of this contrast follows the list.
  • Uniting the two databases entails Retrieval-Augmented Generation (RAG), a technique essential for creating GPTs tailored to specific fields of expertise or to a firm’s cloud. RAG marries knowledge graphs, or any other curated body of information, with LLM processing to achieve the (un?)Holy Grail of enterprise computing: replacing costly executives with AI. For example, a specialized GPT can integrate all of a state’s laws to create a legal assistant that replaces paralegals and probably plenty of junior attorneys as well. A minimal RAG sketch also follows the list.
  • AI/LLM replacement of experts and managers requires that most if not all of the relevant expert information (or, for managers, the company’s cloud content) be available to RAG. For example, because top medical journals aren’t free, retail-level medical GPTs buy limited, incomplete access, meaning that a truly good doctor can still do better. On the other hand, most professions follow established public rules and guidelines, so dedicated GPTs can replace a big portion of billable time.
  • Until LLMs achieve precise and consistent sequential reasoning and math, professional tasks that depend on near-perfect output across multiple dependent steps will continue to require professional supervision proportionate to that complexity. Technical SEO, upon which all content production depends for SEO performance, is a prime instance of this principle: LLMs can’t yet handle it and in fact hardly help at all. But this ~1/3 of SEO can be done once to endure; it is table stakes in the game of website success.
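
The contrast in the first bullet above can be sketched in a few lines of Python. This is only a toy illustration, not Google’s actual data structures: the graph stores explicit, named relationships, while the vectors rank topics by graded similarity.

```python
import math

# Toy illustration, not Google's actual data structures.

# Knowledge-graph style: explicit, named edges between Entities ("transit map").
knowledge_graph = [
    ("Smart lock", "is_a", "Door hardware"),
    ("Smart lock", "used_for", "Home security"),
]

# LLM/embedding style: topics as vectors; similarity is a graded number.
# Real embeddings have hundreds or thousands of dimensions, not three.
embeddings = {
    "smart lock": [0.9, 0.1, 0.3],
    "deadbolt":   [0.8, 0.2, 0.4],
    "lavender":   [0.1, 0.9, 0.2],
}

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

print(cosine(embeddings["smart lock"], embeddings["deadbolt"]))  # high: closely related
print(cosine(embeddings["smart lock"], embeddings["lavender"]))  # low: barely related
```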
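
Likewise, the second bullet’s RAG pattern reduces to two steps: retrieve, then generate. In the sketch below the retrieval score is a toy word-overlap measure, generate() merely stands in for a real LLM call, and the documents are invented for illustration.

```python
# Toy RAG sketch: retrieval here is simple word overlap, and generate()
# is a placeholder for any real LLM API call. Documents are invented.

def score(question: str, doc: str) -> int:
    return len(set(question.lower().split()) & set(doc.lower().split()))

def generate(prompt: str) -> str:
    return f"[An LLM would answer here, grounded in:]\n{prompt}"  # placeholder

def rag_answer(question: str, documents: list[str]) -> str:
    # 1. Retrieve: rank the firm's own corpus (statutes, product specs,
    #    a knowledge-graph export) by relevance to the question.
    top = sorted(documents, key=lambda d: score(question, d), reverse=True)[:2]
    # 2. Augment and generate: hand that context to the LLM so its answer
    #    rests on retrieved facts, not only on its pre-trained weights.
    return generate("Context:\n" + "\n".join(top) + "\n\nQuestion: " + question)

docs = [
    "Example statute: residential door locks must meet the listed hardware grade.",
    "Smart locks connect to home-security systems over Wi-Fi or Z-Wave.",
    "Lavender is a flowering plant in the mint family.",
]
print(rag_answer("Which hardware grade must residential door locks meet?", docs))
```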

The Future: The Durability of Entity Classifications

Chats with LLMs, like those linked below, point out that Entity databases such as knowledge graphs may change substantially. However, even as those databases and LLMs’ algorithms and pre-trained data change, Entities refer to enduring things in the real world (shoes, Tolstoy, lavender) as well as to enduring abstract concepts like democracy, expensive, brilliant, zero, and evolution. Therefore, except in new and highly specialized fields for which Entity classification is not yet established, a website’s Entity structure can be designed now to endure in perpetuity.

In July 2024, top LLMs agreed with the SEO durability thesis of this blog post, adding the crucial caveat that a site’s technical SEO foundation remains outside LLMs’ purview. Good, honest SEO firms have long known that sound technical SEO must precede all subsequent content optimization. The LLMs also emphasize that keywording remains important; however, SEO firms should now use Entity research tools that first ascertain a business’s knowledge graph of offerings and market segments. Such tools, or else LLMs, then quickly generate keyword clusters under each main Entity, so that SEO GPTs produce market-focused, keyworded content at 20x human speed. For details, see this ChatGPT discussion (about halfway down, though the first half elucidates the core tech) and Gemini’s synopsis (the Claude LLM echoed those two).
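
As a rough illustration of that Entity-first workflow, the sketch below groups candidate keywords under the main Entity they share the most words with. The entities and keywords are invented, and a production tool would use embeddings or an Entity research API rather than word overlap.

```python
# Toy Entity-first keyword clustering. Entities and keywords are invented;
# real tools would use embeddings or an Entity research API, not word overlap.

entities = ["smart locks", "security cameras", "alarm monitoring"]

keywords = [
    "best smart locks for rentals",
    "smart locks vs deadbolts",
    "outdoor security cameras wireless",
    "alarm monitoring cost per month",
]

def overlap(a: str, b: str) -> int:
    return len(set(a.lower().split()) & set(b.lower().split()))

clusters: dict[str, list[str]] = {e: [] for e in entities}
for kw in keywords:
    best_entity = max(entities, key=lambda e: overlap(kw, e))
    clusters[best_entity].append(kw)

for entity, kws in clusters.items():
    print(entity, "->", kws)  # each cluster becomes one Entity-focused content hub
```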

Details for data scientists only: Some firms are now pioneering the use of LLMs to build entirely new structured knowledge graphs, but those would still refer to the same real world that websites and current search engines do. For more, see Neo4j Finds the Vector for Graph-LLM Integration and Unifying LLMs & Knowledge Graphs for GenAI: Use Cases & Best Practices.

Is “GEO” Mostly Marketing Spin?

Now some SEO firms use the term “Generative Engine Optimization” (GEO) to attract clients. However, GEO entails what good SEO firms have done for many years. Indeed, the neural-net connections implicit in both LLMs and search engines’ knowledge graphs mirror PageRank-like signals such as interlinking, reviews, mentions, conversion rates, subsequent off-site actions, and other measures of website helpfulness and legitimacy. So both old-school SEO and the current SEO-CRO approach explained in this post already encompass “GEO.”

One potential new SEO tactic emerges from the possibility that a user’s interactions with an LLM could be incorporated into future updates of the LLM’s core pre-trained database. (They aren’t incorporated between updates, a limitation that RAG aims to reduce.) For instance, if an LLM user frequently mentions a website that provides a filtered search for door locks, the LLM might subsequently suggest that website to anyone asking what kind of lock to buy. Such strenuous manipulation would likely prove futile, because a single user or group of users is a drop in the ocean of an LLM’s vast data. Still, the tactic is worth revisiting annually.

Sources and Credits

DISC’s 17-minute, 3-part blog series on AI/LLMs, or equivalent knowledge, will help you comprehend this post and plan your firm’s profits.

This post’s writing came from DISC’s decades of web marketing R&D and practice. LLMs were used only to verify some statements and to help with basic edits for clarity.