This is an essay I recently wrote as part of my MRes Business Economics course.
The rise of large language models (LLMs), including generative pre-trained transformer models (GPTs), has the power to upend many traditional digital business models, including those in the search engine sector. LLMs allow consumers to easily obtain answers to their questions, in a personalised and engaging manner, without having to navigate through search results and paid advertisements. ChatGPT has already demonstrated that this is a disruptive technology with phenomenal consumer interest – more than 100 million people had used the product within 2 months of its launch, making it the fastest-growing consumer application in history[i], and demonstrating the power of S-curve dynamics and tipping points in this market.
Consequently, Google faces an existential threat to its primary business of search, with competitors integrating this AI technology into search functionality. However, investors and commentators are wrong to be concerned that we might be seeing Google’s Kodak moment[1].
Self-cannibalisation and the (fine) balance between explore and exploit
The rise of LLMs and the threat this non-incremental innovation poses hasn’t come as a surprise to Google. After all, Google has contributed significantly to this research endeavour, developing the transformer model architecture (the T in GPT). It is also known to have several in-house AI chatbots, such as LaMDA and Bard. This clearly demonstrates that Google has not succumbed to the inherent inertia of being market leader and has managed to innovate in emerging technologies.
Yet the grand challenge for Google is to balance the need to respond to the threat of LLMs, which could usurp its dominant market position, with the countervailing concern of self-cannibalising its traditional, lucrative search business. Until now, Google has balanced this trade-off well, exploiting existing revenue through the cash-cow of traditional search for as long as possible whilst investing in exploration of AI technologies in the background to avoid a future tipping point. However, with the recent release of OpenAI’s GPT-4 and the rolling-out of this technology into Bing, alongside Baidu’s introduction of the Ernie chatbot, Google needs to respond to this competition by exploring: integrating its own chatbot, Bard, into its search capabilities.
This comes at a direct cost: an AI response costs around 2 cents in computing power, roughly 7 times more than a traditional search. Therefore, a 10% shift of Google searches to AI responses could increase Google’s operating costs by between $700 million and $11.6 billion[ii]. Additionally, chatbot search would reduce revenue: responses would answer queries directly rather than directing users to other webpages alongside lucrative ads. Whilst 80% of Google searches do not display these lucrative ads at the top of search results, and some pages already include summary boxes, Google still needs to find ways to incorporate the display of ads into chat-style search to stay profitable.
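The wide range in that cost estimate reflects how sensitive it is to the underlying assumptions. As a rough, illustrative sketch only – assuming roughly 9 billion Google searches per day (a commonly cited but approximate figure) and taking the per-query costs above at face value – the arithmetic looks like this:

```python
# Illustrative sketch of the incremental-cost arithmetic. The per-query
# costs come from the essay; the daily search volume is an assumption.

def incremental_annual_cost(daily_searches, shift_share, ai_cost, traditional_cost):
    """Extra yearly operating cost if `shift_share` of searches move to AI responses."""
    extra_per_query = ai_cost - traditional_cost
    return daily_searches * 365 * shift_share * extra_per_query

# Assumptions: ~9bn searches/day; AI response ~$0.02, i.e. ~7x a
# traditional search (~$0.003); 10% of searches shift to AI.
cost = incremental_annual_cost(9e9, 0.10, 0.02, 0.02 / 7)

print(f"${cost / 1e9:.1f}bn per year")  # prints "$5.6bn per year"
```

Varying the search volume, shift share and per-query cost gap across plausible values reproduces the breadth of the published $0.7–11.6 billion range; the point is the sensitivity, not any single figure.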
It is currently mooted that this can be achieved by displaying ads in a sidebar next to the chat-style response, although it is questionable how likely consumers are to click on such links if they receive a correct answer from the chatbot. Another idea is to incorporate paid-for recommendations directly into the chat response: ask the search engine for things to do in Cambridge and it will give a sponsored mention to Aromi. However, sponsored recommendations, which may bias answers, could lead to consumer scepticism and shunning of such products.
If all else fails, the premium subscription model adopted by competing search engine Neeva could come to the rescue. This approach would force the consumer-side of the two-sided platform to pay directly, rather than the current approach of the advertiser-side subsidising the consumer-side to take advantage of indirect network effects. However, it remains to be seen whether a wide range of consumers are likely to fork out high subscription fees to access a service which has long been provided free-of-charge.
An alternative revenue stream comes from renting out LLM-technology services to other companies. OpenAI has already been working with Duolingo to create a conversational language bot and Khan Academy to design an online tutor.[iii] The company with the most advanced capabilities can open a large revenue stream from renting out this technology, much like Google does with cloud computing.
Regulatory risk leading to cautious innovation
In addition to the direct costs of upending its cash-cow business model, there is a significant element of risk in the use of LLM chatbots, which base their responses on predicting what text comes next without verifying the content they produce. This can lead to a whole host of problems associated with inaccurate responses, which may see hype for the use of AI in search fade, or worse, to politically biased or inflammatory material.
For smaller start-ups, the inherent risk associated with public perception and regulatory oversight is a lot smaller than for the $1.17 trillion-behemoth that is Google. This is particularly so given several anti-trust cases that are currently being pursued against Google in both the EU and US[iv].
Legal risk also comes from content providers whose data is being used in training datasets, without compensation, to develop LLMs. These content providers may see less traffic, as LLMs can provide their content without users needing to access the source website, reducing content creators’ revenue. If the courts decide in favour of content creators, then LLMs may become even more expensive.
Whilst the regulatory and reputation risk to Google from incorporating GPT into their search algorithms is high, the risk of not adapting is even greater. There are very few examples of companies having innovated and failing, compared to the number of examples of companies failing to innovate and being left behind. Google itself has several examples of innovative attempts which ultimately failed without imperilling the future of the firm, including Glass, Google+ and Desktop. Google needs to avoid falling into the risk-avoidance trap by incorporating GPT-style search into its business model.
Why innovation can be hard for a company to adjust to (frame flexibility, sunk-cost fallacy and institutional changes)
Internally, Google needs to navigate through uncharted territory, having never needed to fundamentally reinvent itself before. A strong advantage it has is that there does not need to be an internal paradigm shift in thinking. Google’s self-stated mission is “to organise the world’s information and make it universally accessible and useful”[v]. Adopting LLM chatbot search does not require a change in Google’s organisational identity or strategy, just in underlying technology. There is no need to change the notion of “who Google is” or “what Google does”, just “how Google does it”. Similarly, Google still faces the same competing search engines (with added AI) so does not need to adjust its competitive boundaries to face this challenge.
Whilst this frame flexibility is an advantage, Google still faces barriers to change from the sunk-cost fallacy – the belief that existing technology should be preserved because of the expense already incurred – and from vested interests which may resist institutional change that prioritises AI teams at the expense of existing search teams.
Google has demonstrated that it does not succumb to the sunk-cost bias: whilst it has developed and invested heavily in LLMs over the last few years, it has not felt the need to replace its search technology needlessly, which would have meant lower revenue and higher costs. Indeed, this reinforces the fine balance Google has trodden between explore and exploit. Long-run investment in innovative technologies also shows that Google does not suffer from organisational or incumbent inertia[vi].
On the other hand, adoption may require painful adjustment internally, as certain divisions are de-prioritised and newer groups, such as DeepMind and teams working on AI, are given increased funding and prominence. This will require management focus to steer the appropriate resources and internal power to the necessary divisions. To successfully, and profitably, incorporate AI capabilities into search, Google’s management must preserve the core functions of Google, which attract consumers, advertisers and skilled workers to the firm, whilst incorporating LLMs and fending off nascent competition.
Google’s strong advantages from barriers to entry and network effects
With a 92% share of the worldwide search engine market[vii], Google has established large barriers to entry through (direct) network effects and economies of scale that allowed it to amass $280 billion in revenue in 2022. The biggest barrier to entry is the large sunk cost arising from developing search technology, crawling the internet, storing such data and making it easily accessible through page-ranking algorithms. The UK’s CMA estimates that these barriers to entry, alongside computing power and marketing costs, mean a new entrant would need to spend around $10-30 billion to compete with Google[viii], demonstrating the large competitive moat Google has created.
Furthermore, Google benefits from having refined and personalised its algorithms over many years using big data, made possible thanks to its incumbent position, reflecting the strong (direct and indirect) network effects prevalent in this multi-sided platform. The more consumers have used Google the more it has been able to fine-tune its algorithms to present the most relevant results, demonstrating a virtuous cycle enhancing competitive advantage. Better search results have reduced customer switching, making customers loyal to Google.
One should also not forget the role of Google’s wide portfolio of products – including Google Maps, GMail, Chrome, Android and Drive – which makes people’s lives easier whilst further tying users into the Google ecosystem, reinforcing algorithmic dominance, and making switching to competing search engines more burdensome. Google could integrate GPT technology into these offerings to compete against Microsoft and entrench its competitive advantage. This could also offer another approach to revenue generation, with estimates suggesting Microsoft could generate $14.9bn over the next 5 years from commercial users paying for GPT technology in Office products.[ix]
Google’s market power does not automatically extend to the new market of LLM search, so there is potential for a competitor using GPT technology to steal market share and leverage network effects in a way that could challenge Google. This is the approach taken by Bing, which has thus far mainly attracted tech-savvy consumers without having a large effect on the wider customer base. Such a threat is limited, given Google’s own forays into this area, which restrict Microsoft’s first-mover advantage. Furthermore, Google already has a large customer base which, as discussed, is unlikely to switch easily given Google’s existing product portfolio.
Finally, a hidden tool in Google’s armoury is that it can utilise the wide training dataset it has access to through the Google Books Library Project[x], which can be harnessed to further improve LLM results, at the expense of competition. There is even potential for Google to use the millions of hours of YouTube video content, converting this into text and using the data to train sophisticated GPT models. Google’s access to these training datasets gives it a key competitive advantage that will act as a formidable barrier to entry in the future.
Will Google survive?
Only time will tell whether Google will be able to overcome the challenge posed by LLM search technologies and maintain its position as the leader in the search market, or whether this transpires to be Google’s tipping point. However, it has a strong base, having already developed significant AI capabilities which can be rolled out so long as the firm overcomes regulatory concern, successfully monetises the new technology and rebalances its internal organisational structures.
[1] This refers to the demise of the once-dominant Kodak following its botched response to the threat from digital cameras.
[i] The Battle for Internet Search, The Economist, accessed on 24th February 2023, https://www.economist.com/leaders/2023/02/09/the-battle-for-internet-search
[ii] Is Google’s 20-year dominance of search in peril, The Economist, accessed on 15th February 2023, https://www.economist.com/business/2023/02/08/is-googles-20-year-search-dominance-about-to-end
[iii] ChatGPT maker OpenAI unveils new model GPT-4, Financial Times, accessed on 25th March 2023, https://www.ft.com/content/8bed5cd7-9d1e-4653-8673-f28bb8176385
[iv] Google’s antitrust mess, CNBC, accessed on 25th March 2023, https://www.cnbc.com/2020/12/18/google-antitrust-cases-in-us-and-europe-overview.html
[v] Our Approach to Search, Google, accessed on 23rd February 2023, https://www.google.com/intl/en_uk/search/howsearchworks/our-approach/#:~:text=Google’s%20mission%20is%20to%20organise,a%20wide%20variety%20of%20sources
[vi] Raffaelli, Glynn, Tushman (2019), Frame flexibility: the role of cognitive and emotional framing in innovation adoption by incumbent firms. Strategic Management Journal, 1014.
[vii] Search Engine Market Share Worldwide, StatCounter, accessed on 23rd February 2023, https://gs.statcounter.com/search-engine-market-share
[viii] Is Google’s 20-year dominance of search in peril, The Economist, accessed on 15th February 2023, https://www.economist.com/business/2023/02/08/is-googles-20-year-search-dominance-about-to-end
[ix] Microsoft to add AI co-pilot to its Office software suite, Financial Times, accessed on 20th March 2023, https://www.ft.com/content/c74c7e48-d439-40d1-a42a-2cd9f1560edb
[x] Google Books Library Project, Google, accessed on 25th March 2023, https://books.google.com/intl/en-GB/googlebooks/library.html