Welt der Wunder

Not believing, but knowing

Can artificial intelligence be environmentally friendly?

Image: AI-generated / Envato


From medical milestones to intelligent voice assistants and autonomous driving: AI is revolutionizing our lives. But the increasing complexity and the associated data and energy requirements raise an important question: How environmentally friendly can AI really be?


Why do AI models consume so much energy?

Developing and running AI models requires enormous amounts of computing power. For example, according to a study by the University of Massachusetts Amherst, training GPT-3, the language model behind the original version of ChatGPT, caused about 628 tons of CO₂ emissions. That is comparable to the lifetime emissions of five cars in the US. The reason: training requires specialized hardware and an immense amount of electricity, which is often generated from fossil fuels.

Because AI models grow larger and more complex as their performance increases (a phenomenon referred to as scaling), their energy consumption could continue to rise in the coming years. In addition, AI systems require computing power not only for training, but also for everyday use.

Solving AI’s sustainability issues

Despite these challenges, there are already a number of approaches to making AI systems more environmentally friendly:

Energy-efficient hardware

Modern AI accelerators such as Google's Tensor Processing Units (TPUs) and NVIDIA's Tensor Cores have been designed specifically to run AI applications more efficiently. These chips can drastically reduce energy requirements.

Optimizing AI models

AI experts are already working on making AI models smaller and more efficient. “Distillation” techniques can reduce their complexity without significantly affecting their performance.

Use of renewable energy

Tech giants like Google and Microsoft are investing heavily in data centers powered by renewable energy. For example, in 2020 Google reported that it matches 100 percent of its energy consumption with purchases of renewable energy.

Federated Learning

With this technique, model training takes place directly on users' devices instead of collecting and processing all data centrally; only the resulting model updates are sent back to a server. This not only reduces the energy consumption of cloud services, but also protects privacy.
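The principle can be sketched in a few lines. The following is a minimal simulation of federated averaging, assuming NumPy, a toy linear model, and synthetic per-device data; all names and numbers are illustrative, not a real federated-learning framework.

```python
# Minimal sketch of federated averaging: each device trains locally,
# the server only averages the resulting model weights.
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

# Each "device" holds its own private data; the raw data never leaves it.
devices = []
for _ in range(5):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    devices.append((X, y))

w = np.zeros(2)  # global model held by the server

for _ in range(100):  # communication rounds
    local_weights = []
    for X, y in devices:
        w_local = w.copy()
        # a few local gradient-descent steps on the device's own data
        for _ in range(5):
            grad = 2 * X.T @ (X @ w_local - y) / len(y)
            w_local -= 0.05 * grad
        local_weights.append(w_local)
    # only the model updates are sent back and averaged centrally
    w = np.mean(local_weights, axis=0)

print(np.round(w, 2))  # close to the true weights [2., -1.]
```

The privacy benefit comes from the last step: the server sees only averaged weight vectors, never the individual data points on the devices.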

How tensor chips work

The tensor chips used by Google, NVIDIA and others have been developed specifically for processing tensor data: multidimensional arrays that organize values along several dimensions (such as the rows and columns of a table or matrix) in order to represent and process complex data sets compactly. Calculations on such arrays are central to AI systems and are often repeated millions of times.
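What "multidimensional arrays" means in practice is easy to see in code. This is a minimal illustration using NumPy; the shapes chosen are arbitrary examples.

```python
# Tensors of increasing dimensionality, represented as NumPy arrays
import numpy as np

vector = np.array([1.0, 2.0, 3.0])     # 1-D: shape (3,)
matrix = np.arange(6.0).reshape(2, 3)  # 2-D: rows and columns
batch = np.zeros((32, 28, 28))         # 3-D: e.g. 32 grayscale images

print(vector.ndim, matrix.ndim, batch.ndim)  # 1 2 3

# The core AI workload: repeated matrix (tensor) multiplications
result = matrix @ vector
print(result)  # [ 8. 26.]
```

Operations like the final matrix-vector product are exactly what tensor chips accelerate in hardware.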

In addition, unlike traditional processors, tensor chips are designed for lower numerical precision and for many parallel calculations, since AI models often do not need full mathematical precision in every operation. Tensor chips can therefore save enormous amounts of memory and energy without significantly affecting the accuracy of the results. This makes them ideal for use in data centers and on mobile devices.
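The memory saving from lower precision can be demonstrated directly. The following sketch uses NumPy's 16-bit floats as a stand-in; real tensor chips use hardware formats such as bfloat16 or int8, which this example does not model.

```python
# Halving memory by storing weights in 16-bit instead of 32-bit floats
import numpy as np

weights32 = np.random.default_rng(0).normal(size=(1024, 1024)).astype(np.float32)
weights16 = weights32.astype(np.float16)

print(weights32.nbytes // 1024, "KiB")  # 4096 KiB
print(weights16.nbytes // 1024, "KiB")  # 2048 KiB, half the memory

# The precision lost is tiny compared with what AI models need
max_err = np.abs(weights32 - weights16.astype(np.float32)).max()
print(max_err < 1e-2)  # True
```

The same trade-off applies to energy: moving and multiplying half as many bytes costs correspondingly less power per calculation.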

How does AI model distillation work?

Distillation, also known as knowledge distillation, is a method for making large, complex AI models more efficient. It does this by transferring the knowledge of a powerful model (the “teacher model”) to a smaller, more efficient model (the “student model”). This technique aims to largely retain the performance of the “teacher model” while the “student model” requires fewer computational resources and operates faster.

The distillation process works because the teacher model not only provides the correct answers (labels) for a task, but also additional information in the form of “soft targets”. These soft targets are the probability distributions that the teacher model outputs for each possible answer.

For example, when completing the phrase "The weather today is…", a large language model might assign a high probability to "sunny" and lower probabilities to "rainy" or "cloudy". The student model is trained to mimic these probability distributions.
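The weather example above can be made concrete. This is a minimal sketch of soft targets using NumPy, with made-up logits; real distillation pipelines work on full vocabularies and batches, and the temperature value here is illustrative.

```python
# Soft targets: the teacher's full probability distribution, not just
# its single best answer, is what the student learns to imitate.
import numpy as np

def softmax(logits, T=1.0):
    # temperature T > 1 softens the distribution so that the smaller
    # probabilities carry more information for the student
    z = np.exp((logits - logits.max()) / T)
    return z / z.sum()

words = ["sunny", "rainy", "cloudy"]
teacher_logits = np.array([4.0, 1.0, 2.0])

hard_label = words[int(np.argmax(teacher_logits))]  # "sunny"
soft_targets = softmax(teacher_logits, T=2.0)
print(hard_label, np.round(soft_targets, 2))

# The student is trained to match this distribution, e.g. by minimizing
# the cross-entropy between teacher and student outputs
student_logits = np.array([3.5, 1.5, 2.0])
student_probs = softmax(student_logits, T=2.0)
loss = -(soft_targets * np.log(student_probs)).sum()
print(round(loss, 3))
```

The hard label alone would only say "sunny"; the soft targets additionally tell the student that "cloudy" is more plausible than "rainy", which is part of the teacher's knowledge.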

The advantage of this method is that the student model becomes more efficient without having to reprocess the entire training data. It requires less memory, energy and computing time, which is particularly important in resource-constrained applications such as mobile devices or IoT sensors.

At the same time, the student model remains able to make predictions that are similarly accurate to those of the original, larger model. Knowledge distillation is therefore a key technique for making AI applications more sustainable without compromising on performance. The best-known example is GPT-4o mini, the faster, smaller ChatGPT variant.

Using AI as a tool to make AI more green

Some projects show how AI can actively contribute to environmental friendliness:

  • DeepMind, a subsidiary of Google's parent company Alphabet, uses AI to reduce energy consumption in Google's data centers. Machine learning has cut the energy used for cooling them by up to 40 percent.
  • Cities like Singapore use AI to manage traffic more efficiently, minimize energy waste and reduce CO₂ emissions.
  • IBM’s “Green Horizon” project uses AI to monitor air quality and recommend measures to reduce emissions.
  • The ION Power Grid is an intelligent power grid that uses modern technologies such as AI to distribute energy efficiently, integrate renewable sources such as solar and wind power, and balance peak loads through intelligent load management.

How long will we have to wait for sustainable AI?

However, there is still a long way to go before green AI systems become widespread. A fundamental problem is the lack of transparency in AI operations: many companies do not publish data on the environmental impact of their systems. One possible solution would be to introduce strict standards and guidelines for the permissible energy consumption of AI. Energy efficiency labels for AI, similar to those for other appliances and buildings, could also lead to more initiatives to develop more energy-efficient AI.

AI is therefore not inherently harmful to the environment, but its potential to reduce environmental impact depends heavily on how it is developed and used. By combining energy-efficient technologies, renewable energies and conscious design principles, AI can not only become more sustainable, but also actively contribute to solving global environmental problems.
