The use of artificial intelligence (AI) has been rapidly increasing across a wide range of industries, but its growing energy demands have raised concerns about its environmental impact. To train AI algorithms known as models, tech giants such as Microsoft, Google, and OpenAI rely on cloud computing that runs on thousands of chips inside servers in massive data centres around the world. This process can consume an enormous amount of energy: training a single model can use more electricity than 100 US homes consume in a year.
The emissions resulting from this energy consumption also differ depending on the source of the electricity: data centres that rely on coal- or natural gas-fired plants are responsible for much higher emissions than those powered by solar or wind farms. While some companies have provided details about their energy use, there is no authoritative estimate of the total amount of power AI technology consumes.
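The relationship described above reduces to a single formula: emissions equal energy consumed multiplied by the carbon intensity of the grid supplying it. A minimal sketch in Python, where the intensity figures are illustrative assumptions rather than numbers from this article:

```python
# Sketch: how the same training run yields very different emissions
# depending on the grid behind the data centre.
def training_emissions_tonnes(energy_kwh: float, grid_kg_co2_per_kwh: float) -> float:
    """Estimate CO2 emissions (tonnes) as energy times grid carbon intensity."""
    return energy_kwh * grid_kg_co2_per_kwh / 1000.0  # kg -> tonnes

ENERGY_KWH = 1_287_000  # GPT-3's reported training energy (1.287 GWh)

# Rough, assumed intensities for different grids (kg CO2 per kWh):
for grid, intensity in {"coal-heavy": 0.9, "gas-fired": 0.4, "wind/solar": 0.02}.items():
    print(f"{grid}: ~{training_emissions_tonnes(ENERGY_KWH, intensity):,.0f} tonnes CO2")
```

Under these assumed intensities, the gas-fired estimate lands near the 502 tonnes reported for GPT-3, while a renewables-heavy grid would cut the figure by more than an order of magnitude.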
Sasha Luccioni, a researcher at AI company Hugging Face, is calling for greater transparency about the power usage and emissions of AI models. She has quantified the carbon impact of Bloom, her company's rival to OpenAI's GPT-3, and has tried to estimate the same figure for OpenAI's popular ChatGPT based on a limited set of publicly available data.
Training GPT-3 took 1.287 gigawatt hours, roughly the electricity that 120 US homes consume in a year, and generated 502 tonnes of carbon emissions, according to a research paper published in 2021. While training a model carries a huge upfront power cost, analysts found that in some cases it amounts to only about 40% of the power consumed by the actual use of the model. OpenAI's GPT-3 has 175 billion parameters, or variables, that the AI system learned through its training and retraining.
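The "120 US homes" comparison can be checked with back-of-the-envelope arithmetic. The average-household figure below is an assumption (roughly 10,700 kWh per year, in line with published US averages), not a number from this article:

```python
# Back-of-the-envelope check of the "120 US homes" comparison.
TRAINING_KWH = 1_287_000           # 1.287 GWh reported for GPT-3's training
AVG_US_HOME_KWH_PER_YEAR = 10_700  # assumed annual consumption of one US home

home_equivalents = TRAINING_KWH / AVG_US_HOME_KWH_PER_YEAR
print(f"~{home_equivalents:.0f} US homes for a year")  # ~120
```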
Despite the technology's increasing energy demands, companies such as Microsoft, Google, and Amazon have made carbon-negative or carbon-neutral pledges. Google is pursuing net-zero emissions across its operations by 2030, while Microsoft is buying renewable energy and working on ways to make large systems more efficient both to train and to run.
There are ways to make AI run more efficiently, such as scheduling training for times when power is cheaper or in surplus. However, a full accounting of the carbon emissions associated with the chips involved is still missing. Nvidia, the biggest producer of graphics processing units, has disclosed its direct emissions and the indirect emissions related to its energy purchases, but not all of the emissions it is indirectly responsible for. Luccioni believes that when Nvidia does share this information, it will show that GPUs consume about as much power as a small country.
In conclusion, AI is a remarkable achievement for modern society that can change the way people interact and positively affect many aspects of their lives. However, the technology's enormous energy consumption can also take a toll on the environment and, ultimately, on people's lives. Researchers and companies must therefore find ways to reduce the emissions it produces.