Context:
The field of artificial intelligence is booming, thanks in part to the hype surrounding new tools like ChatGPT.
The chatbot, developed by the Microsoft-backed company OpenAI, has captured the public imagination with its ability to converse, write code, and compose poetry and essays in a surprisingly human way.
It’s also spurred a race among tech giants to release similar, more sophisticated products.
Investment in artificial intelligence is growing rapidly. The global AI market is currently valued at $142.3 billion (€129.6 billion), and is expected to grow to nearly $2 trillion by 2030.
AI systems are already a big part of our lives, helping governments, industries and regular people be more efficient and make data-driven decisions. But there are some significant downsides to this technology.
AI has a big carbon footprint
To carry out the tasks they’re designed for, AI models need to process — or be “trained” on — mountains of data. To learn to recognize an image of a car, for example, an algorithm will need to churn through millions of pictures of cars. In the case of ChatGPT, the model is fed colossal text databases from the internet to learn to handle human language.
This data crunching happens in data centers. It requires a lot of computing power and is energy-intensive.
“The entire data center infrastructure and data transmission networks account for 2-4% of global CO2 emissions,” says Anne Mollen, a researcher at the Berlin-based NGO AlgorithmWatch. “This is not only AI, but AI is a large part of that.” That’s on a par with the aviation industry’s emissions.
In a 2019 paper, researchers from the University of Massachusetts, Amherst, found that training a common large AI model can emit up to 284,000 kilograms (626,000 pounds) of carbon dioxide equivalent — nearly five times the emissions of a car over its lifetime, including its manufacture.
“The first time I read this data, I was really shocked,” says Benedetta Brevini, associate professor of political economy of communication at the University of Sydney, Australia, and author of the book “Is AI Good for the Planet?”
“If you jump on a plane from London to New York, your carbon emissions will be 986 kilos. But to train one algorithm, we emit 284,000 kilos,” she says. “Why are we not having a conversation about how to reduce this carbon footprint?”
It’s important to note that the Massachusetts study’s estimate was for an especially energy-intensive AI model. Smaller models can run on a laptop and use less energy. But those that rely on deep learning, such as the algorithms that curate social media content or ChatGPT, need a significant amount of computing power.
Beyond the “training” phase, more emissions are created when the model is applied in the real world, something that can happen billions of times a day, such as every time an online translator translates a word or a chatbot answers a question.
Mollen of AlgorithmWatch says this application phase can account for up to 90% of the emissions in an AI model’s life cycle.
So what can be done to tackle AI’s footprint?
Brevini says environmental concerns need to be taken into account right from the start — in the algorithm design and training phases.
“We need to consider the entire production chain and all the environmental problems that are connected to this chain… most notably energy consumption and emissions, but also material toxicity and electronic waste,” says Brevini.
Rather than building bigger and bigger AI models, as is the current trend, Mollen suggests companies could scale them down, use smaller data sets and ensure the AI is trained on the most efficient hardware available.
Using data centers in regions that rely on renewable energy and don’t require huge amounts of water for cooling could also make a difference. Huge facilities in parts of the US or Australia, where fossil fuels make up a significant chunk of the energy mix, will produce more emissions than those in Iceland, where geothermal power is a main source of energy and lower temperatures make cooling servers easier.
Mollen notes that tech giants have a fairly good record when it comes to using renewable energy to power their operations. Google says its carbon footprint is zero, thanks to investment in offsets. It aims to be operating exclusively on carbon-free energy by 2030. Microsoft has pledged to be carbon negative by 2030, using carbon capture and storage technologies, and Meta plans to reach net-zero across its value chain by 2030.
But energy isn’t the only consideration. The huge amount of water data centers need to keep their servers from overheating has raised concerns in some water-stressed regions, such as Santiago, Chile.
Google’s data center there is “aggravating a drought in the area and local communities are actually revolting against the data center and against the construction of new data centers,” says Mollen.
Emissions aside, how is AI being used?
But even if big tech companies shrink AI’s energy use, there’s another issue that is potentially more damaging to the environment, according to David Rolnick, assistant professor in the School of Computer Science at McGill University in Canada and co-founder of the non-profit Climate Change AI.
He says there should be more focus on the way AI is being used to speed up activities that contribute to climate change.
One example he points to is the use of algorithms for advertising. These are deliberately “designed to increase consumption, which assuredly comes with a very significant climate cost,” he says.
Rolnick also draws on a report by tech consultancy Accenture and the World Economic Forum, which predicts AI and advanced analytics will help the oil and gas industry make $425 billion in additional profit by 2025.
Greenpeace has heavily criticized AI contracts between fossil fuel companies and Amazon, Microsoft and Google. In a report, the environmental organization said Shell, BP and ExxonMobil were using AI tools to expand their oil and gas operations, reduce costs, and in some cases boost production. It said such contracts were “significantly undermining the climate commitments” made by the tech giants.
Google has since said it will no longer build customized AI tools to help companies extract fossil fuels.
Playing catch-up with the technology
The role of artificial intelligence is only likely to become more significant in the future. And keeping up with such rapidly advancing technology will be a challenge. That’s why Rolnick says regulation is crucial to ensuring AI development is sustainable and doesn’t make emissions targets harder to reach.
“It’s really a question of what we’re prioritizing and getting in early and shaping those choices that are being made,” he says.
In the EU, lawmakers have spent the past two years working on the AI Act, expected to be a landmark piece of legislation that will govern AI and classify tools according to perceived risk. It’s unclear whether environmental concerns will feature in the bill.
Meanwhile, other governments are also working out how to deal with AI — how to encourage innovation in the field and reap the benefits this new technology brings while avoiding the potential dangers and protecting citizens.