How much energy can we still save in computing systems? In a perspective published in PNAS, researchers led by the Complexity Science Hub and the Santa Fe Institute argue that the emerging field of stochastic thermodynamics could hold the key to quantifying the energy demands of computing processes—whether in the human brain or in digital devices.
Every computational process in every computer, data center, and server requires energy. Globally, electricity consumption for information technology—from individual mobile phones and Bitcoin mining to large AI models—has surged dramatically in recent decades. By 2018, this consumption already accounted for 6-10% of total energy usage, surpassing that of air travel. “With compute-intensive technologies like ChatGPT and other AI applications, energy demand is continually on the rise,” says CSH PostDoc Jan Korbel.
INEVITABLE ENERGY LOSS
Part of the energy required for information processing is inevitably lost as heat from every system. But how significant is this loss? Are computers already using energy very efficiently, or is there still considerable room for optimization? And how does this depend on the nature of the computation? Could more efficient systems make a substantial contribution to climate protection?
For over a century, physicists and computer scientists have sought to understand how energy is converted and transferred in information-processing systems. The focus has primarily been on the fundamental lower limit on the energy that any computation must dissipate as heat, a bound that cannot be undercut. However, for a long time, the mathematical tools needed to investigate this were lacking.
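The best-known such limit is Landauer's bound, which states that erasing one bit of information must dissipate at least k_B·T·ln 2 of heat, where k_B is the Boltzmann constant and T the temperature. As a rough illustration (not part of the study itself), this minimal Python sketch computes the bound at room temperature; the function name is ours:

```python
import math

# Boltzmann constant in joules per kelvin (exact value in the 2019 SI).
K_B = 1.380649e-23

def landauer_bound_joules(temperature_kelvin: float) -> float:
    """Minimum heat dissipated per erased bit at the given temperature,
    according to Landauer's principle: k_B * T * ln(2)."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature (~300 K), the bound is only a few zeptojoules
# per bit, many orders of magnitude below what real hardware dissipates.
print(f"{landauer_bound_joules(300.0):.3e} J per bit")  # ~2.871e-21 J
```

The gap between this theoretical floor and the energy actual chips consume is one way to frame the question of how much room for optimization remains.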
A NEW FIELD
The emergence of a completely new field—stochastic thermodynamics—has changed this, as the international group of researchers elaborates in the PNAS perspective.
“Stochastic thermodynamics provides us with the tools to investigate and quantify with equations all that’s going on with systems, even arbitrarily far from equilibrium,” says David Wolpert from the Santa Fe Institute. These tools include mathematical theorems, uncertainty relations, and even thermodynamic speed limits that apply to the behavior of non-equilibrium systems across all scales—from the very small to macroscopic systems.
“This allows us to assess how much energy is lost as heat, highlighting the costs of making everything faster—not just in terms of our electricity bills, but also for the climate,” Korbel notes. “On a practical level, these tools can help us understand how much better systems can become, whether energy can be saved, or whether the maximum has already been reached.”
WHAT HAS CHANGED?
Until about twenty years ago, there were hardly any mathematical tools available to describe systems that are not in thermal equilibrium; such systems had to be treated as closed, with no energy flowing to or from other systems or the environment. In short, until the early 21st century, energy use in computers—whether natural or digital—could only be described in highly abstracted and simplified terms, even though real computers operate far from thermal equilibrium, as the study authors emphasize.
“Real computers always dissipate a certain amount of heat into the environment. Furthermore, these computers contain countless subsystems that are not equally well interconnected,” explains Korbel. “For example, while chips within a computer’s CPU can exchange information rapidly, it takes longer for data to transfer from the CPU to the RAM, and even longer for information to be transmitted between computers and data centers.” Examining individual components in isolation inevitably leads to some abstraction, Korbel adds.
Additionally, real computers are subject to physical constraints. “For example, you never give a computer a task and then throw it away. Rather, you assign it many tasks, and each of these tasks should be completed as quickly as possible. Previously processed information is also already in memory, and so on,” Korbel explains.
PIONEERS
For a long time, all these characteristics of real computers could not be accounted for in studies of energy flows. “This only changed when pioneers like David Wolpert – who had previously worked in fields like physics, complexity science, and machine learning – recognized that tools from these disciplines could also be applied to computer science,” says Korbel. This marked the birth of stochastic thermodynamics.
WHY ISN'T THE TECH INDUSTRY UTILIZING THIS?
For one, these methods are still relatively new. “They are still not well known even among physicists, and computer scientists and engineers have likely never heard of them,” explains Korbel. “Moreover, the current pressure on tech companies seems to be more about producing the fastest devices or using the latest technology. Energy costs and the efficiency of energy use do play a role—data centers are typically water-cooled, for example—but improvements are usually achieved by optimizing materials or through clever algorithms,” Korbel remarks.
The two researchers also emphasize that the potential benefits of stochastic thermodynamics extend far beyond artificial computers like laptops and phones. Cells perform calculations that are far from equilibrium, and the same is true for neurons in the brain. “The brain is typically very efficient due to evolution, but even here the question arises: How far are we from the maximum?” Korbel ponders.
About the study
The study “Is stochastic thermodynamics the key to understanding the energy costs of computation?,” by D. H. Wolpert, J. Korbel et al., was published in PNAS (doi: 10.1073/pnas.2321112121).