AI for sustainability: a challenge and an opportunity

From the point of view of sustainability, AI poses both a challenge and an opportunity. The high energy consumption and resource intensity of AI systems, especially when combined with careless use, can lead to a heavy carbon footprint and wasted resources. On the other hand, AI-optimized processes can significantly reduce emissions and increase sustainability. In the recent FCAI Sustainable AI Solutions webinar, experts shared their views on both perspectives. Video and slides from the webinar are available, and below is a summary of the discussion.

Using AI to improve energy efficiency is a rather classic data-driven application of AI methods, and it can be done in many ways, says Professor Simo Särkkä from Aalto University in his presentation, which covers both optimizing energy efficiency with AI and making AI systems themselves energy efficient.

AI-based optimization creates opportunities for increasing energy efficiency in the generation and distribution of energy, industrial processes, building-level energy management, and logistics and transportation, to name a few examples.
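As a purely illustrative sketch of what such data-driven optimization can look like in the building-level case (a hypothetical example with synthetic data, not taken from the webinar), one could fit a simple model of heating energy as a function of outdoor temperature and thermostat setpoint, and then pick the most efficient setpoint within an assumed comfort range:

```python
# Minimal sketch with synthetic data: learn how a building's heating energy
# depends on outdoor temperature and thermostat setpoint, then choose the
# most efficient setpoint within an assumed comfort range.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic history: outdoor temperature (°C), setpoint (°C) -> energy (kWh/day)
outdoor = rng.uniform(-10, 15, 500)
setpoint = rng.uniform(18, 24, 500)
energy = 2.0 * (setpoint - outdoor) + rng.normal(0, 1.5, 500)  # toy physics

model = LinearRegression().fit(np.column_stack([outdoor, setpoint]), energy)

# Tomorrow's forecast and the comfort range to respect (assumed values)
forecast_outdoor = 2.0
candidate_setpoints = np.arange(19.0, 23.5, 0.5)

predicted = model.predict(
    np.column_stack([np.full_like(candidate_setpoints, forecast_outdoor),
                     candidate_setpoints])
)
best = candidate_setpoints[np.argmin(predicted)]
print(f"Suggested setpoint: {best:.1f} °C, "
      f"predicted energy: {predicted.min():.1f} kWh/day")
```

In practice the models, constraints and data would be far richer, but the pattern is the same: learn from measurements, then optimize over the settings you can control.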

However, in addition to harnessing AI for energy efficiency, it is equally important to make AI itself more energy efficient, says Särkkä. In fact, if the AI industry continues to grow at the current pace, it could consume as much energy as a small country by 2027. The main energy hogs are large deep learning models, especially foundation models.

To tackle the problem, Mats Sjöberg from CSC highlights greener methods in natural language processing. The current AI boom is largely driven by large language models, and their exponentially expanding scale and computational intensity lead to a growing carbon footprint and a high degree of centralization.

To alter this course, there is an urgent need for environmentally sustainable ways of designing and using NLP applications. Efficient data use and model training, environmentally efficient data centers, and compact, reusable models can have a significant impact on carbon emissions.
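To make the point concrete, here is a minimal, hypothetical sketch of what this can mean in practice (it assumes the codecarbon and transformers Python packages and was not part of the webinar): reach for a compact, distilled model where it suffices, and actually measure the estimated emissions of running it:

```python
# Minimal sketch, assuming the codecarbon and transformers packages are
# installed: run a compact, distilled sentiment model instead of a much
# larger general-purpose LLM, and estimate the emissions of doing so.
from codecarbon import EmissionsTracker
from transformers import pipeline

tracker = EmissionsTracker(project_name="compact-nlp-demo")
tracker.start()

# DistilBERT is a distilled (compressed) model, an example of the kind of
# compact, reusable model referred to above.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("Energy-efficient NLP keeps the carbon footprint down."))

emissions_kg = tracker.stop()  # estimated kg CO2-eq for this run
print(f"Estimated emissions: {emissions_kg:.6f} kg CO2-eq")
```

Measuring is not the same as reducing, but making the footprint visible is often the first step toward choosing smaller models and cleaner compute.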

Frédéric Parienté, representing the NVIDIA AI Technology Center, also puts emphasis on accelerated computing. Single-chip performance has increased rapidly during the past decade, and further advances will continue to reduce energy use and lower costs.

According to VTT’s Fabrice Saffre, part of the sustainability challenge is that AI is a victim of its own success. We are using AI to perform tasks for which it was never intended, and one corrective measure is to limit its use to the things it is uniquely good at. Many applications do not need anthropomorphic AI; instead, they can manage with “cheap and dirty” insect-grade adaptive behaviour.
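A minimal, hypothetical sketch of what such “insect-grade” behaviour might look like in code (the scenario and numbers are assumptions, not from the presentation): a reactive controller that slowly adapts a baseline with a moving average and needs no training, no large model and almost no compute:

```python
# Minimal sketch of "insect-grade" adaptive behaviour (hypothetical example):
# switch a device on or off based on a sensor reading and a threshold that
# slowly adapts to recent conditions. No learned model, negligible compute.
class ReactiveController:
    def __init__(self, alpha=0.05, margin=5.0):
        self.alpha = alpha      # adaptation rate of the baseline
        self.margin = margin    # how far above baseline counts as "activity"
        self.baseline = None    # running estimate of the quiet level

    def step(self, reading: float) -> bool:
        """Return True if the device should be on for this reading."""
        if self.baseline is None:
            self.baseline = reading
        # Exponential moving average: the only "adaptation" in the system.
        self.baseline = (1 - self.alpha) * self.baseline + self.alpha * reading
        return reading > self.baseline + self.margin

controller = ReactiveController()
for sensor_value in [10, 11, 10, 30, 32, 12, 11]:  # toy occupancy-sensor trace
    print(sensor_value, controller.step(sensor_value))
```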

This perspective might become especially valuable in the future. Even if there are realistic prospects of reducing the energy use and carbon footprint of AI, the current AI gold rush and the exponential growth in the use of different applications might still lead to an increase in overall emissions and energy consumption, Saffre sums up.

All in all, sustainability is something that the AI industry cannot neglect. There is a real need to develop AI systems that are more resource efficient, but there are also various ways to improve sustainability with the help of AI, as long as the measures are carefully selected.


For further information or to discuss how FCAI can collaborate with your research group or company, contact: isp@fcai.fi
More FCAI Industry and Society events