While misinformation and the threat of AI taking over human jobs continue to dominate the conversation about the dangers of artificial intelligence, a Boston University professor is sounding the alarm on another possible downside—the potentially sizable environmental impact of generative AI tools.
"As an AI researcher, I often worry about the energy costs of building artificial intelligence models," Kate Saenko, associate professor of computer science at Boston University, wrote in an article at The Conversation. "The more powerful the AI, the more energy it takes."
While the energy consumption of blockchains like Bitcoin and Ethereum has been studied and debated from Twitter to the halls of Congress, the effect of AI's rapid development on the planet has not yet received the same spotlight.
Professor Saenko aims to change that, though she acknowledged in the article that data on the carbon footprint of a single generative AI query is limited. Still, she said, research estimates put it at four to five times that of a simple search engine query.
Citing a 2019 report, Saenko noted that training a model called Bidirectional Encoder Representations from Transformers (BERT), which has 110 million parameters, on graphics processing units (GPUs) consumed as much energy as a round-trip transcontinental flight for one person.
In AI models, parameters are variables learned from data that guide the model's predictions. More parameters generally mean greater model complexity, which in turn requires more data and computing power. During training, parameters are adjusted to minimize errors.
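For readers who want to see what that adjustment looks like in practice, here is a minimal sketch, written with the PyTorch library, of a model whose two parameters are nudged repeatedly to reduce its prediction error. The example and its toy data are ours, not Saenko's.

```python
import torch

# A tiny model with exactly two learnable parameters: a weight and a bias.
model = torch.nn.Linear(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = torch.nn.MSELoss()

# Toy data following y = 3x + 1, plus a little noise.
x = torch.randn(256, 1)
y = 3 * x + 1 + 0.1 * torch.randn(256, 1)

for step in range(500):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)   # measure prediction error
    loss.backward()               # compute gradients w.r.t. the parameters
    optimizer.step()              # adjust parameters to reduce the error

print(sum(p.numel() for p in model.parameters()))  # parameter count: 2
```

GPT-3 performs the same kind of adjustment, but over 175 billion parameters and vastly more data, which is where the energy cost comes from.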
Saenko noted in comparison that training OpenAI's GPT-3 model, which has 175 billion parameters, consumed as much energy as 123 gasoline-powered passenger vehicles driven for one year, or around 1,287 megawatt-hours of electricity, and generated 552 tons of carbon dioxide. She added that this figure covers only the training needed to get the model ready to launch, before any consumers started using it.
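Those figures are easy to sanity-check. The short calculation below works out the grid carbon intensity implied by her numbers and the rough car equivalence; the per-vehicle figure of about 4.6 metric tons of CO2 per year is our assumption (a commonly cited EPA estimate), not something from the article.

```python
# Back-of-the-envelope check on the GPT-3 training figures Saenko cites.
energy_mwh = 1_287        # electricity used for training, in megawatt-hours
emissions_tons = 552      # metric tons of CO2 generated

# Carbon intensity of the electricity implied by those two numbers.
grid_intensity = emissions_tons * 1_000 / energy_mwh   # kg CO2 per MWh
print(f"Implied grid intensity: {grid_intensity:.0f} kg CO2/MWh")  # ~429

# Assumed average emissions of a passenger vehicle (EPA estimate, our assumption).
car_tons_per_year = 4.6
print(f"Equivalent cars: {emissions_tons / car_tons_per_year:.0f}")  # ~120
```

The result, roughly 120 vehicles, lines up with the 123-car comparison in the article, and the implied intensity of about 429 kg of CO2 per megawatt-hour is consistent with a fossil-fuel-heavy grid.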
"If chatbots become as popular as search engines, the energy costs of deploying the AIs could really add up," Saenko said, citing Microsoft's addition of ChatGPT to its Bing search engine earlier this month.
Not helping matters, more and more AI chatbots, like Perplexity AI and OpenAI's wildly popular ChatGPT, are getting mobile applications, making them even easier to use and exposing them to a much broader audience.
Saenko highlighted a Google study that found that a more efficient model architecture and processor, combined with a greener data center, can considerably reduce a model's carbon footprint.
"While a single large AI model is not going to ruin the environment," Saenko wrote, "if a thousand companies develop slightly different AI bots for different purposes, each used by millions of customers, then the energy use could become an issue."
Ultimately, Saenko concluded that more research is needed to make generative AI more efficient—but she’s optimistic.
"The good news is that AI can run on renewable energy," she wrote. "By bringing the computation to where green energy is more abundant, or scheduling computation for times of day when renewable energy is more available, emissions can be reduced by a factor of 30 to 40 compared to using a grid dominated by fossil fuels."
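To make that idea concrete, here is a toy sketch, not from Saenko's article, of the carbon-aware computing she describes: pick the region (or time window) whose grid is cleanest before launching a job. The grid intensity figures and the job's energy draw are rough, illustrative values of our own choosing.

```python
# Illustrative grid carbon intensities, in grams of CO2 per kWh (assumed values).
grid_intensity = {
    "coal_heavy_grid": 820,
    "gas_heavy_grid": 490,
    "hydro_wind_heavy_grid": 25,
}

job_energy_kwh = 1_000   # hypothetical energy draw of one training run

cleanest = min(grid_intensity, key=grid_intensity.get)
dirtiest = max(grid_intensity, key=grid_intensity.get)

low = grid_intensity[cleanest] * job_energy_kwh / 1_000    # kg CO2
high = grid_intensity[dirtiest] * job_energy_kwh / 1_000

print(f"{cleanest}: ~{low:.0f} kg CO2  vs  {dirtiest}: ~{high:.0f} kg CO2")
print(f"Reduction factor: {high / low:.0f}x")               # ~33x
```

With these illustrative numbers, running the same job on the cleanest grid instead of the dirtiest cuts emissions by roughly a factor of 33, in line with the 30-to-40-fold reduction Saenko describes.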