AI chatbots have been in the news a lot lately. They are proving to be a boon for productivity, but the excitement over these new tools could be concealing a dirty secret.

The race to build high-performance, AI-powered search engines is likely to require a dramatic rise in computing power, and with it a massive increase in the amount of energy that tech companies require and the amount of carbon they emit.

Training large language models (LLMs), such as those underpinning OpenAI’s ChatGPT (which will power Microsoft’s souped-up Bing search engine) and Google’s equivalent, Bard, means parsing and computing linkages within massive volumes of data. That is why these models have tended to be developed by companies with sizable resources.

“Training these models takes a huge amount of computational power,” says Carlos Gómez-Rodríguez, a computer scientist at the University of Coruña in Spain. “It’s not that bad, but then you have to take into account [the fact that] not only do you have to train it, but you have to execute it and serve millions of users.” 

There’s also a big difference between using ChatGPT—which investment bank UBS estimates has 13 million users a day—as a standalone product and integrating it into Bing, which handles half a billion searches every day.
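To make that gap concrete, here is a minimal back-of-envelope sketch using only the two figures quoted above. It assumes, purely for illustration, that one Bing search would correspond to roughly one model query; the real per-query cost of serving an LLM is not something these numbers establish.

```python
# Rough scale comparison of the two daily usage figures cited above.
# Assumption (for illustration only): one search ~ one model query.

chatgpt_users_per_day = 13_000_000    # UBS estimate quoted above
bing_searches_per_day = 500_000_000   # "half a billion searches every day"

scale_factor = bing_searches_per_day / chatgpt_users_per_day
print(f"Bing's daily search volume is roughly {scale_factor:.0f}x "
      f"ChatGPT's estimated daily user count")
# -> roughly 38x
```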
