Carbon-Neutral AI: Rethinking How Intelligence Uses Energy

Artificial Intelligence is advancing at an incredible pace. From large-scale models to autonomous systems, modern AI continues to push the limits of computation. However, this progress raises an important question:

How sustainable is AI?

Training and running modern AI systems can consume enormous amounts of energy. Recent studies estimate that training a large AI model can require hundreds of MWh of electricity, while data centers already account for roughly 1–2% of global electricity consumption. As AI continues to scale, its carbon footprint will inevitably grow unless we rethink how these systems operate. As the figure below suggests, AI's energy consumption may soon exceed that of a country of 100 million people.

Figure 1: Projected AI power demand vs. Japan's energy consumption in 2024.

Most research today focuses on making AI more computationally efficient through pruning, quantization, or model compression. While effective, these approaches still assume that energy is always available.
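To make the efficiency techniques above concrete, here is a toy sketch (not from any specific library or paper) of post-training quantization: weights are mapped from 32-bit floats to 8-bit integers with a single scale factor, shrinking storage 4x at the cost of a small reconstruction error.

```python
import numpy as np

def quantize_int8(weights):
    """Uniform symmetric post-training quantization to int8."""
    scale = np.abs(weights).max() / 127.0          # one scale for the whole tensor
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Map int8 codes back to approximate float32 weights."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=1000).astype(np.float32)       # stand-in for a weight tensor
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

print(q.nbytes, w.nbytes)                          # int8 storage is 4x smaller
print(bool(np.abs(w - w_hat).max() <= scale / 2 + 1e-6))
```

Note that this reduces memory and arithmetic cost per inference, but the model still runs whenever it is asked to; the technique says nothing about *when* or *how much* to compute.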

But what if AI systems could adapt their computation to the energy available?
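One way to picture this idea, as a purely hypothetical sketch (the model names, energy costs, and accuracies below are invented for illustration): a system holds several model variants and, before each inference, picks the most accurate one that fits the energy currently available.

```python
# Hypothetical model variants: per-inference energy cost (mJ) and accuracy.
MODELS = [
    {"name": "tiny",  "energy_mj": 5,   "accuracy": 0.80},
    {"name": "base",  "energy_mj": 40,  "accuracy": 0.90},
    {"name": "large", "energy_mj": 300, "accuracy": 0.94},
]

def select_model(budget_mj):
    """Return the most accurate model affordable under the energy budget."""
    affordable = [m for m in MODELS if m["energy_mj"] <= budget_mj]
    if not affordable:
        return None  # not enough energy: defer or skip the inference
    return max(affordable, key=lambda m: m["accuracy"])

print(select_model(50)["name"])  # enough for "base", not "large"
print(select_model(3))           # below even the tiny model: None
```

The point of the sketch is the control loop, not the numbers: computation becomes a function of the energy budget rather than a fixed cost.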

Hello World!

$display("Hello World!");

Hi! Welcome to "The Thinking Circuit", our new blog, where we will share technical content along with moments from our group.

In this blog, our posts will cover topics such as Spiking Neural Networks, chip design, and more. We aim to provide general information for educational purposes.

From time to time, we may also share content that is not directly related to our research but still reflects aspects of our everyday life as researchers.

Stay curious, keep building, and catch you in the next post.