A recent study indicates that the deployment of artificial intelligence in the United States could result in a significant increase in carbon emissions.
This research, published in Environmental Research Letters, provides an estimate of the carbon consequences associated with the spread of AI. The authors evaluated how widespread adoption would change the country's electricity demand and carbon dioxide emissions.
According to the calculations, widespread adoption of AI would add roughly 900,000 tonnes of CO2 per year to the atmosphere. This contribution, while significant, remains moderate compared with total US emissions: a quantifiable but limited effect.
Regarding energy needs, the expansion of these technologies would require up to 12 petajoules of additional electricity each year. Such a quantity corresponds to the annual consumption of about 300,000 American households.
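As a quick sanity check, the orders of magnitude quoted above can be reproduced in a few lines. The average household consumption (about 10,500 kWh per year) and the total US CO2 emissions (about 5 billion tonnes per year) used below are rough reference values assumed for illustration, not figures taken from the study.

```python
# Back-of-the-envelope check of the article's figures (not from the study itself).
# Assumptions: average US household electricity use (~10,500 kWh/year) and
# total US CO2 emissions (~5 billion tonnes/year) are rough reference values.

JOULES_PER_KWH = 3.6e6
additional_kwh = 12e15 / JOULES_PER_KWH        # 12 PJ/year cited by the study
households = additional_kwh / 10_500           # assumed kWh per household per year
share_of_us_emissions = 900_000 / 5e9          # assumed ~5 Gt CO2/year US total

print(f"{additional_kwh / 1e9:.2f} TWh per year")      # ~3.33 TWh
print(f"~{households:,.0f} households")                 # ~317,000 households
print(f"~{share_of_us_emissions:.3%} of US emissions")  # ~0.018%
```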
Anthony R. Harding, co-author of the study, observes that these anticipated emissions remain modest compared with those of other economic activities, but are no less real. He advocates building energy-efficiency measures into the design and deployment of AI systems from the outset.
Among the approaches considered are algorithm optimization and the use of cleaner energy sources to contain the carbon footprint.
However, the rapid evolution of these tools raises questions about energy use over the coming decades. Targeted improvements to infrastructure and software could help limit the environmental impact while allowing AI to keep developing.
The energy-intensive operation of data centers
Data centers constitute the operational heart of artificial intelligence, hosting the servers dedicated to intensive computing. They consume large volumes of electricity, primarily to power processors and to keep them adequately cooled.
As AI spreads, the need for computing capacity grows, encouraging the construction of new facilities. This drives up energy consumption, often supplied by fossil sources, and with it greenhouse gas emissions.
Engineers are working on designing algorithms and hardware that are less energy-hungry. For example, model compression methods reduce the computing power required without significantly altering the results, making the systems more efficient overall.
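The study does not prescribe a particular technique, but as an illustration, here is a minimal sketch of one common compression approach: post-training dynamic quantization in PyTorch. The toy model below is purely illustrative; the idea is that storing weights as 8-bit integers instead of 32-bit floats cuts memory and compute at inference time with little impact on the outputs.

```python
# Minimal sketch of post-training dynamic quantization in PyTorch.
# The model is a toy stand-in, not anything from the study.
import torch
import torch.nn as nn

# Toy model standing in for a larger network.
model = nn.Sequential(
    nn.Linear(1024, 1024),
    nn.ReLU(),
    nn.Linear(1024, 10),
)

# Quantize the Linear layers' weights to int8; activations are
# quantized dynamically at runtime.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 1024)
print(model(x).shape, quantized(x).shape)  # same output shape, smaller weights
```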
Specialized processors, such as tensor processing units (TPUs), deliver better performance per watt consumed. These hardware advances also help improve energy efficiency.
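For illustration only, the performance-per-watt metric mentioned above is simply sustained throughput divided by power draw; the figures in this sketch are hypothetical, not measurements of any real chip.

```python
# Illustration of the "performance per watt" metric, with hypothetical numbers.

def perf_per_watt(throughput_tflops: float, power_watts: float) -> float:
    """Sustained throughput divided by power draw, in TFLOPS per watt."""
    return throughput_tflops / power_watts

# Hypothetical AI accelerator vs. hypothetical general-purpose CPU.
accelerator = perf_per_watt(throughput_tflops=100.0, power_watts=250.0)
cpu = perf_per_watt(throughput_tflops=2.0, power_watts=150.0)

print(f"accelerator: {accelerator:.2f} TFLOPS/W")  # 0.40
print(f"cpu:         {cpu:.3f} TFLOPS/W")          # 0.013
```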