In the mid-'90s, a breakthrough occurred in the quest to decipher human language in the context of mapping and searching information on the nascent web. Researchers, notably Sergey Brin and Larry Page at Stanford, pursued innovative avenues to refine search quality. Ironically enough, instead of grappling with the intricacies of language semantics and having automatons read full web pages to understand the meaning and context of the text, they explored the topology of the web directly.

Their insight: by scrutinizing how pages interlinked, rather than their semantic content, computers could discern the relative importance of web content. Pages discussing Sergei Prokofiev, for instance, garnered significance not only from their own content but also from the prominence of the pages referring to them and the frequency of those mentions. This approach, blending web-topology analysis with just a touch of lexical examination, propelled search accuracy to unprecedented heights.
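To make the idea concrete, here is a minimal, illustrative sketch of a PageRank-style computation in Python. It is not Google's actual algorithm: the tiny link graph, the damping factor, and the page names are all assumptions chosen for illustration. The point is simply that importance scores emerge from link structure alone, with no reading of the pages' text.

```python
# Illustrative sketch only: iteratively redistribute each page's score across
# its outgoing links, so pages cited by prominent pages become prominent.
# The damping factor, iteration count, and example graph are assumptions.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}

    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                # Dangling page: spread its score evenly over all pages.
                share = damping * rank[page] / n
                for p in pages:
                    new_rank[p] += share
            else:
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
        rank = new_rank
    return rank


if __name__ == "__main__":
    # Hypothetical mini-web: a Prokofiev page cited by two other pages.
    web = {
        "prokofiev.example": [],
        "music-blog.example": ["prokofiev.example"],
        "encyclopedia.example": ["prokofiev.example", "music-blog.example"],
    }
    for page, score in sorted(pagerank(web).items(), key=lambda kv: -kv[1]):
        print(f"{score:.3f}  {page}")
```

Even in this toy graph, the Prokofiev page ends up ranked highly not because of anything it says, but because of who points to it.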

Content was not king; referral was.

Google, with its superior ranking algorithms, outshone competitors by delivering more pertinent and information-rich results at lightning speed. While Google's search interface remained familiar (a user entered keywords and received results), the underlying ranking mechanism revolutionized the search landscape forever. The success of Google underscored the efficacy of non-semantic approaches in achieving remarkable outcomes in search technology.

This is where AI cuts deep.

Google has built a $237.86 billion business on a paradigm that not only requires no deep understanding of the vast sea of content it references but somehow denies the value of doing so.

AI does the opposite.

And as AI pre-digests so much of the world's information and gets better at regurgitating cogent answers (whereas Google only provides an assortment of links, increasingly polluted by irrelevant ones placed there for promotion), the referral-based paradigm that made Google king starts to look fragile.
