Produced by: Hisham Mugane
About $100 billion was wiped off the market value of Google's parent company Alphabet after a promotional demo of its chatbot "Bard", built to compete with the "ChatGPT" tool developed by Microsoft-backed OpenAI, produced an inaccurate answer, analysts and industry observers noted. Beyond accuracy, another challenge to building this kind of artificial intelligence into Google, Bing and other search engines is the cost of running these advanced platforms.
Large language models
Executives across the tech industry have been discussing the expense of operating AI like ChatGPT. Sam Altman, CEO of startup OpenAI, has said the chatbot, which writes out answers to search queries in seconds, has "eye-watering" computing costs of a couple of cents or more per conversation.
John Hennessy, chairman of Alphabet, told Reuters that an exchange with the kind of AI known as a "large language model" likely costs 10 times more than a standard keyword search.
Even accounting for the potential ad revenue that AI- and chat-based search could generate, analysts say these technologies could add billions of dollars a year to the costs of a company like Alphabet.
An additional $6 billion
Morgan Stanley estimated that the 3.3 trillion Google searches run last year cost the company roughly a fifth of a cent each, a figure that would rise depending on how much text the AI has to generate. If an AI like ChatGPT were to handle half of those queries with 50-word answers, the bank projects, it would add some $6 billion a year to Google's costs by 2024.
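The arithmetic behind that figure is easy to sketch. The back-of-envelope calculation below reproduces the rough magnitudes; the per-answer cost increment is a hypothetical value chosen to land near the published $6 billion estimate, not a number from the bank's model.

```python
# Back-of-envelope reconstruction of the Morgan Stanley-style estimate above.
# The per-answer increment is an illustrative assumption, not the bank's figure.

searches_per_year = 3.3e12   # Google searches last year, per the article
cost_per_search = 0.002      # "a fifth of a cent" per traditional search, in dollars

baseline_cost = searches_per_year * cost_per_search
print(f"Baseline search cost: ${baseline_cost / 1e9:.1f}B per year")  # ~$6.6B

# Suppose an LLM answers half of all queries, and each 50-word answer
# adds a hypothetical ~0.36 cents on top of the traditional lookup.
ai_share = 0.5
extra_cost_per_ai_answer = 0.0036  # assumed, chosen to match the estimate

extra_cost = searches_per_year * ai_share * extra_cost_per_ai_answer
print(f"Added AI cost: ${extra_cost / 1e9:.1f}B per year")            # ~$6B
```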
SemiAnalysis, a research and consulting firm focused on chip technology, arrived at a similar bill by a different route: adding ChatGPT-style AI to search would cost Alphabet an extra $3 billion, the firm said, an amount held down by Google's in-house chips, called Tensor Processing Units (TPUs).
Inference
What makes this form of AI more expensive than traditional search is the computing power involved. Experts say such AI depends on billions of dollars' worth of chips, a cost that has to be spread over their useful life of several years. Electricity likewise adds costs, and pressure, for companies with carbon-footprint targets.
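To see how chip depreciation and power translate into a per-query cost, consider the toy amortization below. Every number in it is a hypothetical placeholder, not a vendor or Google figure.

```python
# Illustrative amortization of AI-serving hardware cost per query.
# All inputs are hypothetical placeholders for the sake of the arithmetic.

chip_capex = 10_000         # dollars per accelerator (assumed)
useful_life_years = 4       # depreciation window (assumed)
queries_per_second = 1      # sustained throughput per accelerator (assumed)
power_watts = 400           # draw per accelerator (assumed)
electricity_per_kwh = 0.10  # dollars (assumed)

seconds_per_year = 365 * 24 * 3600
queries_over_life = queries_per_second * seconds_per_year * useful_life_years

capex_per_query = chip_capex / queries_over_life
energy_per_query = (power_watts / 1000) * electricity_per_kwh / (queries_per_second * 3600)

print(f"Amortized hardware: ${capex_per_query:.6f} per query")
print(f"Electricity:        ${energy_per_query:.8f} per query")
```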
The process of handling an AI-powered search query is known as "inference": a "neural network" loosely modeled on the biology of the human brain infers the answer to a question from its prior training. This is quite different from Google's traditional search, where the engine holds an index of information; when the user types a query, it returns the most relevant answers stored in that index.

Hennessy described the inference expense as "a couple year problem at worst." Richard Socher, CEO of You.com, another Google competitor, said adding an AI chat experience, along with applications for charts, videos and other generative technologies, raised expenses by between 30% and 50%. Another source close to the internet search giant cautioned that it is too early to say exactly how much chatbots might cost, because performance and usage can vary widely depending on the technology involved.
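To make that contrast concrete, here is a toy sketch in Python: the traditional engine answers with one cheap lookup into a prebuilt index, while the language model must run a full neural-network forward pass for every token it generates. Everything below is schematic; the `forward_pass` stub stands in for billions of multiply-adds across real model weights.

```python
# Schematic contrast: index lookup vs. LLM inference. Not a real engine.

# Traditional search: one cheap lookup into a precomputed index.
index = {"capital of france": ["Paris - Wikipedia", "Paris travel guide"]}

def keyword_search(query: str) -> list[str]:
    return index.get(query.lower(), [])

# LLM-style inference: one full forward pass through the network per
# generated token, so a 50-word answer means dozens of passes.
def forward_pass(tokens: list[str]) -> str:
    # Stand-in for billions of multiply-adds over the model's parameters.
    return "word"

def generate_answer(query: str, answer_tokens: int = 65) -> str:
    tokens = query.split()
    for _ in range(answer_tokens):
        tokens.append(forward_pass(tokens))  # each step re-runs the network
    return " ".join(tokens[len(query.split()):])

print(keyword_search("capital of France"))   # one lookup
print(len(generate_answer("capital of France").split()), "forward passes")
```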
Control costs
For years, researchers at Alphabet and elsewhere have studied how to train and run large language models more cheaply. Bigger models require more chips for inference and therefore cost more to operate. A senior technology executive told Reuters that such AI remains too expensive to put in the hands of millions of users.
The model behind ChatGPT, an AI that has dazzled users with its human-like answers, runs on some 175 billion parameters, the numeric values the algorithm takes into account. Its cost also varies with the length of the user's query, measured in "tokens", or parts of words.
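Because pricing scales with tokens, a rough cost model is simply token count times a per-token rate. The rate and the "four characters per token" heuristic in the sketch below are common rules of thumb used here purely as assumptions, not any provider's actual pricing.

```python
# Rough token-based cost estimate. The rate and the chars-per-token
# heuristic are assumptions, not any provider's actual pricing.

PRICE_PER_1K_TOKENS = 0.002  # dollars, hypothetical
CHARS_PER_TOKEN = 4          # common rule-of-thumb approximation

def estimate_cost(prompt: str, answer: str) -> float:
    total_tokens = (len(prompt) + len(answer)) / CHARS_PER_TOKEN
    return total_tokens / 1000 * PRICE_PER_1K_TOKENS

query = "What does it cost to run a large language model per query?"
answer = "word " * 50  # a 50-word answer, as in the estimates above
print(f"~${estimate_cost(query, answer):.5f} for this exchange")
```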
For their part, computer scientists at OpenAI have figured out how to optimize those inference costs with complex code that makes the chips run more efficiently.
A bigger challenge still is how to shrink the number of parameters in an AI model by a factor of 10 or even 100 without losing accuracy. "How you cull out parameters most effectively is still an open question," said Naveen Rao, who previously ran Intel's AI chip efforts and now works to lower AI computing costs through his startup MosaicML.
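One family of techniques behind the culling Rao describes is pruning: zeroing out the smallest-magnitude parameters so the model needs less compute and memory at inference time. Below is a minimal NumPy sketch assuming magnitude pruning as the method; the article does not name a specific technique, and real pruning is applied during or after training and followed by fine-tuning to recover accuracy.

```python
import numpy as np

# Minimal magnitude-pruning sketch: keep only the largest-|weight| entries.
# Illustrative only; not how any production model is actually compressed.

def prune_by_magnitude(weights: np.ndarray, keep_fraction: float) -> np.ndarray:
    k = int(weights.size * keep_fraction)
    threshold = np.sort(np.abs(weights), axis=None)[-k]  # k-th largest magnitude
    return np.where(np.abs(weights) >= threshold, weights, 0.0)

rng = np.random.default_rng(0)
layer = rng.normal(size=(512, 512))                    # one hypothetical weight matrix
sparse = prune_by_magnitude(layer, keep_fraction=0.1)  # drop 90% of parameters
print(f"Nonzero weights kept: {np.count_nonzero(sparse) / layer.size:.0%}")
```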
The subscription fee is part of the solution
Meanwhile, some are considering charging for access to premium AI, such as OpenAI's $20-a-month subscription for better ChatGPT service. Other tech experts said an alternative is to apply smaller AI models to simpler tasks, something Alphabet is exploring: the company recently said a smaller version of its LaMDA language model will power its chatbot Bard, requiring much less computing power and enabling it to scale to more users.
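The cost logic of that approach can be sketched as a router that sends easy queries to a cheap small model and reserves the expensive large model for hard ones. The difficulty heuristic and the per-query costs below are entirely made up for illustration.

```python
# Hypothetical router: cheap small model for simple queries, expensive
# large model only when needed. All costs and rules are made up.

SMALL_MODEL_COST = 0.0002  # dollars per query, assumed
LARGE_MODEL_COST = 0.0036  # dollars per query, assumed

def looks_simple(query: str) -> bool:
    # Toy heuristic standing in for a real difficulty classifier.
    return len(query.split()) < 8 and "explain" not in query.lower()

def route(query: str) -> tuple[str, float]:
    if looks_simple(query):
        return "small-model", SMALL_MODEL_COST
    return "large-model", LARGE_MODEL_COST

for q in ["weather in Paris", "Explain how transformers compute attention"]:
    model, cost = route(q)
    print(f"{q!r} -> {model} (${cost:.4f})")
```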
Asked about chatbots like ChatGPT and Bard, Alphabet chairman John Hennessy said at a conference last week that more focused models, rather than a single system that does everything, would help contain the costs.