Trend 2025: Energy requirements often depend on the size of the AI model

The enormous energy requirements of artificial intelligence (AI) are increasingly becoming the focus of discussion. No wonder: as early as 2019, a US study warned that training a single large neural network can emit as much CO2 as five (conventional) cars over their entire lifetimes. Training GPT-3 (the model family behind the version of ChatGPT that has been publicly available since the end of 2022) alone consumed as much electricity as a medium-sized nuclear power plant produces in around one hour, namely 1,287 megawatt hours.
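
A quick back-of-envelope check shows how these figures relate. This is only a minimal sketch: the 1,287 MWh value comes from the estimate above, while the assumed reactor output of roughly 1.3 gigawatts is an illustrative figure for a large unit.

```python
# Back-of-envelope check of the training-energy comparison above.
# The 1,287 MWh figure is the estimate quoted in the text; the reactor
# output of ~1.3 GW is an illustrative assumption for a large unit.

gpt3_training_mwh = 1_287      # reported training energy, MWh
reactor_output_mw = 1_300      # assumed electrical output of a large reactor, MW

hours_of_reactor_output = gpt3_training_mwh / reactor_output_mw
print(f"Training energy equals ~{hours_of_reactor_output:.2f} h of reactor output")
# -> roughly one hour, matching the comparison in the text
```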

Information and communication technology as a whole now accounts for around 2 to 4 percent of global greenhouse gas emissions, roughly as much as global air traffic. And demand is growing: researchers estimate that the energy consumption of artificial intelligence could rise to as much as 134 terawatt hours per year by 2027. In addition, AI training requires a considerable amount of water; training GPT-3 is said to have consumed around 700,000 liters of cooling water.

 

Unprecedented demand for computing power

Training large AI models requires an unprecedented amount of computing power. NVIDIA alone plans to sell more than 2 million of its latest H100 "Hopper" AI accelerators by the end of 2024. If they all ran at full load, they would draw around 1.6 gigawatts of power, more than one of the largest nuclear reactors can supply.
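
A rough calculation illustrates where a figure of this magnitude comes from. It is a sketch only: the unit count is taken from the text above, while the per-accelerator draw of about 800 watts (board power plus host and cooling overhead) is an assumption.

```python
# Rough estimate of the fleet power draw quoted above.
# The unit count comes from the text; the per-accelerator draw of ~800 W
# (board power plus host/cooling overhead) is an assumption for illustration.

units = 2_000_000            # projected H100 "Hopper" accelerators
watts_per_unit = 800         # assumed draw per accelerator incl. overhead, W

total_gw = units * watts_per_unit / 1e9
print(f"Full-load draw: ~{total_gw:.1f} GW")   # -> ~1.6 GW, as stated in the text
```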

In addition to training, the operation of AI systems also requires energy, of course. The average energy requirement of a single ChatGPT query is estimated at 3 to 9 watt hours. If all of the roughly 9 billion search queries made every day were answered by AI, the energy requirement of search would roughly triple. The integration of ChatGPT into Microsoft's Bing and of Gemini into Google's search engine indicates that the number of AI-generated search responses will in any case increase significantly.
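
To put the per-query estimate into perspective, the following sketch scales it to the quoted search volume. Both the 3 to 9 watt hours per answer and the 9 billion daily queries are taken from the text above; nothing else is assumed.

```python
# What the per-query estimate implies at search-engine scale.
# Both the 3-9 Wh range and the 9 billion queries/day come from the text.

queries_per_day = 9_000_000_000
wh_low, wh_high = 3, 9       # estimated energy per AI-generated answer, Wh

gwh_low = queries_per_day * wh_low / 1e9
gwh_high = queries_per_day * wh_high / 1e9
print(f"Daily energy for AI answers: {gwh_low:.0f}-{gwh_high:.0f} GWh")
# -> 27-81 GWh per day, i.e. roughly 10-30 TWh per year
```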

"The use of AI is also increasing in medical technology and industry, for example in production. The use of AI in industry promises to make processes more efficient and prevent production downtime. However, increasing machine efficiency also leads to significantly higher energy requirements here," explains Viacheslav Gromov, Managing Director of AI provider AITAD. 

 

AI must become more energy-efficient

Against this backdrop, the development and use of AI must become more energy-efficient, not only to save costs, but also to counter energy shortages and the consumption of resources required to generate energy. Last but not least, global warming and geopolitical events are forcing us to use energy sparingly.

Of course, the solution is not to do without AI, but to focus on greater energy efficiency when using it. AI and smart sensors can not only transform industry but also make production more energy-efficient. The central question, however, should already be addressed during the development of neural networks: where in the data collected by sensors is the desired information actually extracted? There is great potential for savings in the field of discriminative AI, which is particularly important for industrial use. Unlike generative AI (such as ChatGPT, Midjourney or Gemini), this type of AI is not used to create content but to analyze and evaluate data. Discriminative AI therefore answers questions directed at devices and machines such as: "What is happening right now and what will happen in the future?"

 

Bigger does not always mean more powerful

Ultimately, energy consumption depends essentially on the size of the artificial neural network and, consequently, on the number of underlying, mostly parallelized computing operations. This applies to both the training phase and the use of the model. So what could be more obvious than keeping models as small and as efficient to execute as possible? Small does not necessarily mean lower performance. For a few years now, it has been possible to run AI as an embedded system on even the smallest semiconductors, from MCUs to FPGAs. If a Cortex-M microcontroller (possibly with an NPU and DMA) executes the same number of parallel MAC operations as a typical Intel Core desktop processor, it can be 20 to 60 times more energy-efficient, simply because of its inherently constrained design.
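
The following sketch illustrates how such an efficiency factor can be estimated. The power and throughput figures are purely illustrative assumptions, not measurements of specific chips; only the 20 to 60 times range comes from the text above.

```python
# Illustrative energy-per-inference comparison between an MCU-class device and
# a desktop-class CPU. The power and throughput figures are assumptions chosen
# only to illustrate the 20-60x efficiency factor mentioned in the text.

def joules_per_inference(power_w: float, inferences_per_s: float) -> float:
    """Energy spent per inference at a given power draw and throughput."""
    return power_w / inferences_per_s

mcu = joules_per_inference(power_w=0.05, inferences_per_s=50)      # Cortex-M + NPU (assumed)
pc  = joules_per_inference(power_w=65.0, inferences_per_s=2_000)   # desktop CPU (assumed)

print(f"MCU: {mcu*1e3:.2f} mJ per inference")
print(f"PC:  {pc*1e3:.2f} mJ per inference")
print(f"PC uses ~{pc/mcu:.0f}x more energy per inference")
```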

"Embedded AI of this kind works fully autonomously at the network edge - with the major advantage that it can evaluate all the data received from the sensor in real time and therefore requires hardly any connectivity. Embedded AI is becoming increasingly important for industry in many fields of application and at the same time helps to massively reduce energy consumption," explains Gromov in more detail.

 

Energy-saving training on a standard PC? It's possible with embedded AI

This starts with the training of AI models for embedded AI system components. While large models require an extensive server infrastructure for training (sometimes tens of thousands of GPU servers, each costing more than USD 400,000, need to be connected), AI development for an embedded system is often possible on a standard PC. This is also because embedded AI answers very specific questions: predicting the health of a drive, simple voice control (including the recognition of more complex word structures), monitoring and testing weld seams, or assessing the health of teeth based on the ultrasonic sounds recorded during brushing, along with countless other use cases such as relieving service teams or predicting failures in industrial robots.
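
As a rough illustration of the scale involved, the sketch below defines a tiny 1D convolutional network of the kind that could classify short sensor windows, for example vibration data from a drive. The architecture, window length and class count are illustrative assumptions, not AITAD's actual designs; a model of this size trains comfortably on a standard PC.

```python
# Minimal sketch of the kind of model behind an embedded-AI component:
# a tiny 1D-CNN that classifies short sensor (e.g. vibration) windows.
# Architecture, window length and class count are illustrative assumptions.

import torch
import torch.nn as nn

class TinySensorNet(nn.Module):
    def __init__(self, n_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 8, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(8, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(4),
        )
        self.classifier = nn.Linear(16 * 16, n_classes)   # for 256-sample windows

    def forward(self, x):                                  # x: (batch, 1, 256)
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = TinySensorNet()
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params:,} parameters")   # roughly 1,500 parameters, small enough for MCU-class flash
```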

As embedded AI systems are always very resource-limited, their energy requirement is also very low; depending on the use case, they draw only a few milliamperes. This means that these systems can be battery-operated, and in many cases even energy harvesting would be possible.
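
A simple calculation shows what a current draw of a few milliamperes means in practice. Both the 5 mA average draw and the 2,500 mAh cell capacity are illustrative assumptions.

```python
# What "a few milliamperes" means for battery operation.
# The 5 mA average draw and the 2,500 mAh cell capacity are illustrative assumptions.

battery_mah = 2_500        # assumed capacity of a single AA-sized cell
avg_current_ma = 5         # assumed average current draw of the embedded-AI node

runtime_h = battery_mah / avg_current_ma
print(f"Runtime on one cell: ~{runtime_h:.0f} h (~{runtime_h/24:.0f} days)")
# -> ~500 h, about three weeks of continuous operation on a single cell
```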

 

Embedded AI saves power and is versatile

Even if embedded AI systems are not a universal solution, they can in principle be used wherever sensors are used. They are ideal for saving energy and thus achieving greater efficiency with a better climate balance. Pleasant side effects are the much deeper analysis of the sensor data, real-time capability and the protection of data privacy: it is not the raw sensor data that is transmitted, but only the evaluation results. Last but not least, development and unit costs are likely to be significantly lower than those of networked AI systems.

"With the help of embedded AI, the use of AI can be decentralized so that even large networked production facilities can work with significantly lower energy consumption without sacrificing the advantages of AI. Embedded AI should therefore be used wherever possible and necessary in order to remain fit for the future and conserve resources," concludes Gromov. 

 

 

Viacheslav Gromov is the founder and CEO of AITAD. The company develops electronics-related artificial intelligence (embedded AI) that performs locally defined tasks in devices and machines in real time. He is the author of numerous articles and various textbooks in the field of semiconductors. Gromov is active as an expert in various AI and digitalization committees, including DIN and DKE as well as the German government (DIT, BMBF). AITAD is AI Champion Baden-Württemberg 2023, one of the Top100 Innovators 2023 and winner of the embedded award 2023 in the AI category. 

 

AITAD is a German embedded AI provider. The company focuses on the development, testing and series production of AI electronic systems, particularly in connection with machine learning in an industrial context (especially system components).

As a development partner, AITAD handles the entire process from data collection to development and delivery of the system components. As a result, innovative adaptations can be made to the product with little expertise and few resources required on the customer's side. The focus is on future-oriented, disruptive, innovative adaptations with the greatest possible impact on structures and product strategies.

AITAD specializes in preventive/predictive maintenance, user interaction and functional innovations. AITAD takes a different approach to many manufacturers: instead of a ready-made AI solution, an individual system is developed for each customer. First, the company examines how the customer's products can benefit from the use of AI and presents the advantages and possibilities. It then develops the system at all levels, builds a prototype in-house on its own prototyping EMS line based on the collected data, and remains on hand for series production and system maintenance. AITAD acts as an interdisciplinary full-stack provider in the fields of data science, mechanical engineering and embedded hardware and software. AITAD also conducts internal and external research into numerous algorithmic and semiconductor fundamentals of AI technology.

In 2023, AITAD received the embedded award in the AI category, the Top100 innovation award for medium-sized companies and was named AI Champion Baden-Württemberg. 

For more information visit: https://aitad.de

 
