Letter: The arrival of natural language search carries a heavy price

Clive Cookson (“Plan to run computer with human brain cells”, Report, March 1) repeats the assertion by Stuart Russell, in his 2019 book Human Compatible, that the Frontier computer at Oak Ridge National Laboratory is the most powerful in the world, equivalent in power to a human brain but using a million times as much energy. However, he fails to draw out the implications of this for large language models such as ChatGPT, which use “generative artificial intelligence” to answer questions.

These models, with hundreds of billions of parameters trained on vast stores of data, use so much energy that only the largest commercial entities can afford to build them. In 2021, Bryan Catanzaro, vice-president of applied deep learning research at Nvidia, predicted that within five years a company could invest $1bn in compute time just to train a single language model.

This oligopoly of information is alarming enough, but the arrival of natural language search comes at an even higher price. Last December, OpenAI’s chief executive Sam Altman tweeted that ChatGPT’s average cost per query was “probably single-digit cents per chat”. A Morgan Stanley analysis put it at 2 cents: about seven times the average cost of a Google search query and, by implication, roughly seven times the energy. While we await the “several decades” it will take to develop synthetic brains (I’m not holding my organic breath), readers might want to ponder whether it’s worth burning the planet seven times faster just to answer a search with a sentence (which may be gibberish) rather than a series of traceable and verifiable links.

Sheila Hayman
Advisory Council, Minderoo Centre for Technology and Democracy
London NW1, UK

This post was originally published in the Financial Times