Extropic is developing thermodynamic computing hardware built on probabilistic bits (p-bits) that harness natural thermal noise to perform AI tasks with up to 10,000 times less energy than conventional GPUs. Although the technology is still early, with only prototype chips so far and new algorithms still required, it could eventually enable powerful, sustainable AI on everyday devices, transforming how AI is deployed and democratized globally.
A new company called Extropic is pioneering a revolutionary approach to computing called thermodynamic computing, which promises to be 10,000 times more energy-efficient than current Nvidia GPUs. Traditional computers fight against thermal noise—the natural random fluctuations in physical systems—to maintain precise binary states of zeros and ones. Extropic, however, embraces this noise as a computational resource by inventing a new type of computational unit called a probabilistic bit, or p-bit. Unlike a traditional bit that is strictly on or off, a p-bit behaves like a programmable weighted coin that can flip between states with adjustable probabilities, harnessing the universe’s inherent randomness directly rather than simulating it through energy-intensive calculations.
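The "programmable weighted coin" idea can be made concrete with a toy software model. This is an illustrative sketch only, not Extropic's hardware design: a real p-bit draws its randomness from physical thermal noise, whereas this model (class name `PBit` is our own invention) uses a pseudo-random number generator.

```python
import random

class PBit:
    """Toy software model of a probabilistic bit (p-bit): a programmable
    weighted coin. Illustrative only; a physical p-bit gets its randomness
    from thermal noise rather than a software RNG."""

    def __init__(self, p_one=0.5):
        self.p_one = p_one  # probability that a readout yields 1

    def set_bias(self, p_one):
        # Program the coin's weighting, clamped to a valid probability.
        self.p_one = min(1.0, max(0.0, p_one))

    def sample(self):
        # Each readout is an independent weighted coin flip.
        return 1 if random.random() < self.p_one else 0

# A p-bit biased toward 1 still flips to 0 occasionally:
random.seed(0)
coin = PBit(p_one=0.9)
flips = [coin.sample() for _ in range(10_000)]
print(sum(flips) / len(flips))  # close to 0.9
```

The key contrast with a conventional bit is that the bias is the programmable quantity, while each individual readout remains genuinely random.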
Extropic’s breakthrough lies in assembling millions of these p-bits into a thermodynamic sampling unit (TSU), which operates through local interactions among p-bits to find stable patterns that represent solutions to complex problems. This approach uses a process akin to Gibbs sampling, where each p-bit adjusts its state based on its neighbors, allowing the system to naturally settle into the most probable configuration without centralized control or heavy computation. This method is particularly suited for tasks like AI image generation, where the system denoises random noise step-by-step to reveal a coherent image, but does so with dramatically reduced energy consumption compared to GPUs.
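The local-update idea behind Gibbs sampling can be sketched in a few lines. This is a generic textbook-style Ising model, not Extropic's TSU architecture or API; the coupling matrix `J`, bias vector `h`, and the sigmoid update rule are standard illustrative choices. Each spin is resampled from its conditional distribution given its neighbors, and strongly coupled spins tend to settle into agreement.

```python
import math
import random

def gibbs_sweep(state, J, h):
    """One sweep of Gibbs sampling over a network of coupled two-state units.

    Each spin s_i in {-1, +1} is resampled from its conditional distribution
    given the others: p(s_i = +1) = sigmoid(2 * (h_i + sum_j J[i][j] * s_j)).
    Purely local updates, no central controller.
    """
    n = len(state)
    for i in range(n):
        field = h[i] + sum(J[i][j] * state[j] for j in range(n) if j != i)
        p_up = 1.0 / (1.0 + math.exp(-2.0 * field))
        state[i] = 1 if random.random() < p_up else -1
    return state

# Two strongly coupled units tend to align after a few sweeps:
random.seed(1)
J = [[0.0, 2.0], [2.0, 0.0]]   # positive coupling favors agreement
h = [0.0, 0.0]                 # no external bias
state = [random.choice([-1, 1]) for _ in range(2)]
for _ in range(100):
    gibbs_sweep(state, J, h)
print(state)  # usually [1, 1] or [-1, -1]
```

With a positive coupling, the aligned configurations have the lowest energy and hence the highest probability, so the system spends almost all of its time in them; this "settle into the most probable pattern" behavior is the mechanism the article describes, here emulated in software rather than performed by physical noise.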
In October 2025, Extropic published research demonstrating that its thermodynamic computing approach could perform AI tasks using 10,000 times less energy than the best GPUs, a claim verified by independent researchers. However, this impressive figure comes from simulations on relatively simple problems, such as generating small black-and-white images of clothing items, rather than complex, photorealistic images. Extropic has also built a physical prototype chip, the XTR0, which serves as a proof of concept but remains far from handling large-scale AI workloads. The company plans to release a more powerful commercial chip, the Z1, in 2026; it will feature millions of interconnected p-bits and enable more advanced AI applications.
One major challenge is that existing AI software designed for GPUs cannot run directly on Extropic’s thermodynamic chips because the architectures are fundamentally different. New AI algorithms must therefore be developed specifically for thermodynamic computing, such as Extropic’s denoising thermodynamic model (DTM). This represents a new frontier in AI research, akin to discovering a new branch of physics, and requires hardware and software to be designed in tandem. While the technology is not ready for immediate widespread use, its potential to drastically reduce AI’s energy footprint could transform how AI is deployed and democratized in the future.
Looking ahead, if Extropic’s vision succeeds, thermodynamic computing could eliminate the looming AI energy crisis, enabling powerful AI to run efficiently on everyday devices like phones, cars, and medical tools without massive power demands. This would unlock new possibilities for AI to assist in healthcare, scientific research, and creative endeavors, making advanced intelligence accessible and sustainable worldwide. Extropic’s work challenges decades of computing paradigms and investment, offering a bold new path toward a future where AI is abundant, energy-efficient, and integrated seamlessly into daily life.