Posted On: 09/24/2025 5:13:43 PM
Post# of 175

Why AI Chatbots Require Lots of Energy
Artificial intelligence chatbots like ChatGPT and Google’s Gemini have quickly become part of daily life for millions of people. With just a few words, users can ask questions, write documents, or even create computer code. These tools may seem like they provide instant answers out of thin air, but behind the scenes they require a huge amount of energy.
The demand for power comes mainly from the data centers that run and train these systems. In 2023, data centers in the United States alone used about 4.4 percent of the country’s electricity. Globally, they accounted for about 1.5 percent of energy use. Experts warn that data-center electricity demand could double by 2030 as more people turn to AI every day.
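To put those percentages in absolute terms, here is a minimal back-of-envelope sketch in Python, assuming total US electricity consumption of roughly 4,000 terawatt-hours in 2023 (that total is an assumption for illustration, not a figure from this article):

# Rough conversion of the shares above into absolute energy figures.
# The ~4,000 TWh total for US electricity use in 2023 is an assumed
# round number for illustration only.
US_TOTAL_TWH = 4_000
DATA_CENTER_SHARE = 0.044  # 4.4 percent, per the article

data_center_twh = US_TOTAL_TWH * DATA_CENTER_SHARE
print(f"US data centers: ~{data_center_twh:.0f} TWh in 2023")  # ~176 TWh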
There are two major reasons why chatbots consume so much power: training and inference. Training happens when a large language model is first built. The system is fed enormous amounts of data so it can learn patterns and make predictions. Because modern models are extremely large, they cannot fit on a single graphics card or even one server.
Instead, many servers with multiple processors must work together for weeks or months. For example, training OpenAI’s GPT-4 is estimated to have used around 50 gigawatt-hours of electricity, which is enough to power the entire city of San Francisco for three days.
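That comparison is easy to sanity-check with a short Python sketch, assuming San Francisco consumes roughly 17 gigawatt-hours of electricity per day (that citywide figure is an assumption for illustration):

# Back-of-envelope check of the "San Francisco for three days" claim.
# The ~17 GWh/day citywide consumption figure is an assumption.
GPT4_TRAINING_GWH = 50   # estimated GPT-4 training energy, per the article
SF_DAILY_GWH = 17        # assumed daily electricity use for San Francisco

days_of_city_power = GPT4_TRAINING_GWH / SF_DAILY_GWH
print(f"~{days_of_city_power:.1f} days of citywide power")  # ~2.9 days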
Inference, which is the process of responding to user requests, also adds to the energy bill. While each request takes far less computation than training, inference happens at enormous scale around the world. By mid-2025, users were sending more than 2.5 billion prompts to ChatGPT every single day, or roughly 29,000 every second. Each of these requests must be handled by powerful servers to provide near-instant answers. Other popular chatbots, such as Google’s Gemini, add even more pressure on energy use.
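The scale of that inference load is easiest to see with a little arithmetic. A minimal sketch, assuming an average of about 0.3 watt-hours per prompt (a commonly cited ballpark, not a figure from this article):

# Back-of-envelope estimate of ChatGPT's daily inference energy.
# The 0.3 Wh per prompt figure is an assumed average; real values
# vary with model size and prompt length.
PROMPTS_PER_DAY = 2.5e9  # per the article, as of mid-2025
WH_PER_PROMPT = 0.3      # assumed average energy per response

prompts_per_second = PROMPTS_PER_DAY / 86_400
daily_mwh = PROMPTS_PER_DAY * WH_PER_PROMPT / 1e6  # Wh -> MWh
print(f"~{prompts_per_second:,.0f} prompts/second")      # ~28,935
print(f"~{daily_mwh:,.0f} MWh/day for inference alone")  # ~750 MWh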
Researchers are working to better understand and measure these demands. Some have created tools to track how much energy different AI models consume when they generate responses. However, one challenge is that major technology companies rarely release full details about their energy use. This lack of transparency makes it difficult to know the true environmental cost of AI or to prepare for how much demand will grow in the future.
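One example of such a tool is the open-source CodeCarbon library for Python, which estimates the energy use and emissions of a block of code from hardware power readings. A minimal sketch (run_model is a hypothetical stand-in for a real AI workload):

# Minimal sketch using the open-source CodeCarbon tracker.
# run_model() is a hypothetical placeholder; in practice it would be
# a real training or inference call.
from codecarbon import EmissionsTracker

def run_model():
    sum(i * i for i in range(10_000_000))  # stand-in compute workload

tracker = EmissionsTracker()
tracker.start()
run_model()
emissions_kg = tracker.stop()  # estimated kilograms of CO2-equivalent
print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")

Estimates from tools like this are approximations, which is one reason researchers keep calling on companies to publish their own measurements.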
For now, experts believe that both policymakers and users have a role to play. Policymakers can push companies to share more information and set clearer rules on energy use. Users can also demand better transparency and be mindful of how they use AI tools. While chatbots are useful and powerful, understanding their hidden cost is important in shaping a future where technology and sustainability can work together.
As these chatbots use more energy, novel solutions such as those commercialized by companies like PowerBank Corporation (NASDAQ: SUUN) (Cboe CA: SUNN) (FRA: 103) could come in handy to boost the uptake of renewable energy and limit fossil fuel use and its damaging effects.
Please see full terms of use and disclaimers on the TechMediaWire website applicable to all content provided by TMW, wherever published or re-published: https://www.TechMediaWire.com/Disclaimer

