ChatGPT and online search unbundling


Since the release of ChatGPT in November, there has been much speculation about what OpenAI's latest large language model (LLM) spells for the fate of Google Search. The speculation has only intensified with recent reports that Microsoft is preparing to integrate ChatGPT into its Bing search engine.

There are several reasons to believe that a ChatGPT-powered Bing (or any other search engine) won't pose a serious threat to Google's near-monopoly on search. LLMs have several critical issues to address before they can make a dent in the online search industry. Meanwhile, Google's share of the search market, technical capability, and financial resources will help it stay competitive (and possibly dominant) as conversational LLMs begin to make their mark in online search.

Instead, the real (and less discussed) potential of LLMs such as ChatGPT is the "unbundling" of online search, which is where the real opportunities lie for Microsoft and other companies. By integrating ChatGPT into successful products, businesses can chip away at the use cases of Google Search.

Although ChatGPT is a remarkable technology, it has several fundamental problems, which are also present in other LLMs. That's why Google, which already has similar technology, has taken a conservative approach to integrating conversational LLMs into its search engine.


As many users and researchers have shown, LLMs such as ChatGPT have several critical shortcomings:

- They can "hallucinate," generating grammatically fluent but factually incorrect responses.
- They do not cite their sources, which makes it difficult to verify their answers or investigate them further.
- Their running costs are enormous. According to one estimate, serving one million daily users costs ChatGPT around $100,000 per day.
- They are slow to respond. Search engine databases can return millions of results in milliseconds, while LLMs take several seconds to generate a response.
- They are slow to update. Google can add millions of records to its search index every hour at virtually no cost, whereas LLMs must undergo slow and expensive retraining every time they need new knowledge (ChatGPT's training data ends in 2021).
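To put the quoted cost estimate in perspective, a quick back-of-envelope calculation (using only the two figures cited above; the numbers are the estimate's, not official OpenAI figures):

```python
# Back-of-envelope check of the quoted estimate:
# ~$100,000/day to serve ~1,000,000 daily users.
daily_cost = 100_000      # USD per day (estimated)
daily_users = 1_000_000   # daily users (estimated)

cost_per_user = daily_cost / daily_users   # $0.10 per user per day
annual_cost = daily_cost * 365             # $36.5M per year at this rate

print(f"~${cost_per_user:.2f} per user per day")
print(f"~${annual_cost:,} per year")
```

Roughly ten cents per user per day may sound small, but at web-search scale (billions of queries per day), it is orders of magnitude more expensive than serving results from a pre-built index.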

A company like Microsoft might be able to solve these problems by using its very efficient Azure cloud and developing suitable LLM architectures, training techniques and complementary tools.

Microsoft and OpenAI might also be able to address the veracity issue by adding automated guardrails that check ChatGPT responses before showing them in Bing results.
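Such a guardrail could be as simple as a filter that sits between the model and the results page, only surfacing answers that pass a policy check. The sketch below is purely illustrative: the function names and the toy banned-terms policy are hypothetical, not anything Microsoft or OpenAI has described.

```python
# Illustrative guardrail sketch (hypothetical names and policy):
# an LLM answer is shown only if it passes a check; otherwise the
# search engine falls back to a safe default.

def passes_guardrail(answer: str, banned_terms: set[str]) -> bool:
    """Toy policy check: reject answers containing any banned term."""
    lowered = answer.lower()
    return not any(term in lowered for term in banned_terms)

def guarded_answer(answer: str, banned_terms: set[str]) -> str:
    """Return the model's answer only if the guardrail approves it."""
    if passes_guardrail(answer, banned_terms):
        return answer
    return "No verified answer available."

print(guarded_answer("Paris is the capital of France.", {"giraffe"}))
```

A production guardrail would of course be far more involved (fact-checking against retrieved documents, moderation models, confidence thresholds), but the control flow — generate, verify, then show or fall back — is the same.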

However, nothing prevents Google from doing the same. Google has immense data and computational resources and one of the most talented AI teams. Google also has the advantage of being the default search engine on Chrome...
