Carbon-Neutral AI: Rethinking How Intelligence Uses Energy
Artificial Intelligence is advancing at an incredible pace. From large-scale models to autonomous systems, modern AI continues to push the limits of computation. However, this progress raises an important question:
How sustainable is AI?
Training and running modern AI systems consume enormous amounts of energy. Recent studies estimate that training a single large AI model can require hundreds of MWh of electricity, while data centers already account for roughly 1–2% of global electricity consumption [1]. As AI continues to scale, its carbon footprint will inevitably grow unless we rethink how these systems operate. As Figure 1 shows, projected AI energy demand may soon exceed the annual consumption of a country of 100 million people.
Figure 1: Projected AI power demand vs. Japan's energy consumption in 2024.
Most research today focuses on making AI more computationally efficient through pruning, quantization, or model compression. While effective, these approaches still assume that energy is always available.
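To make the quantization idea concrete, here is a minimal sketch of uniform symmetric int8 quantization in plain Python. The function names and the toy weight values are illustrative, not taken from any particular library; real frameworks apply the same principle per tensor or per channel.

```python
def quantize_int8(weights):
    """Map floats onto the int8 range [-127, 127] with a single scale.

    Storing int8 values instead of float32 cuts memory (and the energy
    spent moving data) by roughly 4x, at the cost of a small rounding error.
    """
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float values from the quantized integers."""
    return [v * scale for v in q]

weights = [0.82, -1.54, 0.03, 0.40]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
# Each restored value differs from the original by at most scale / 2.
```

The rounding error is bounded by half the scale, which is why quantization preserves model quality so well in practice: the error per weight is tiny relative to the weight's magnitude.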
But what if AI systems could adapt their computation to the energy available?
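One way to picture energy-adaptive computation is a scheduler that picks the most capable model that fits the energy currently available. The sketch below is a hypothetical illustration: the model names, per-inference energy costs, and accuracy figures are made up for the example.

```python
def pick_model(energy_budget_j, models):
    """Select the most accurate model whose per-inference energy cost
    (in joules) fits the available budget; return None if nothing fits,
    signalling that the work should be deferred until energy is available."""
    affordable = [m for m in models if m["cost_j"] <= energy_budget_j]
    if not affordable:
        return None
    return max(affordable, key=lambda m: m["accuracy"])

# Hypothetical model tiers with illustrative cost/accuracy numbers.
MODELS = [
    {"name": "tiny",  "cost_j": 0.5, "accuracy": 0.70},
    {"name": "base",  "cost_j": 2.0, "accuracy": 0.85},
    {"name": "large", "cost_j": 8.0, "accuracy": 0.92},
]

chosen = pick_model(3.0, MODELS)  # a 3 J budget affords the "base" tier
```

The point is not the specific numbers but the control loop: instead of assuming unlimited energy, the system degrades gracefully, trading accuracy for energy when supply is scarce and deferring work entirely when it runs out.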