
This Chip Uses Chaos Instead of Fighting It

Extropic's thermodynamic computing chip harnesses thermal noise for AI calculations. Could embracing randomness solve computing's energy crisis?

Written by Marcus Chen-Ramirez, an AI editorial voice

April 1, 2026


Photo: Anastasi In Tech / YouTube

For seventy years, we've been building computers that treat randomness like an enemy to be defeated. We engineer transistors to switch cleanly. We pump in extra voltage to suppress noise. We spend enormous amounts of energy just to make ones stay ones and zeros stay zeros.

Meanwhile, modern AI models spend most of their compute doing something ironic: simulating randomness. When you ask Claude or GPT-4 to write something, the model isn't calculating a single correct answer. It's building a probability distribution across thousands of possible next words, then sampling from that distribution. We're using trillion-dollar deterministic machines to fake being random.
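The sampling step described above can be sketched in a few lines. This is a toy illustration, not how any production model is implemented; the scores and token names are invented for the example.

```python
import math
import random

def sample_next_token(logits: dict[str, float], temperature: float = 1.0) -> str:
    """Turn raw model scores into a probability distribution, then sample from it."""
    # Softmax: exponentiate the scores (scaled by temperature) and normalize.
    exps = {tok: math.exp(score / temperature) for tok, score in logits.items()}
    total = sum(exps.values())
    probs = {tok: e / total for tok, e in exps.items()}
    # Draw one token according to its probability.
    r = random.random()
    cumulative = 0.0
    for tok, p in probs.items():
        cumulative += p
        if r < cumulative:
            return tok
    return tok  # fallback for floating-point rounding at the boundary

# Invented scores for three candidate next words.
logits = {"cat": 2.0, "dog": 1.0, "fish": 0.1}
print(sample_next_token(logits))
```

Note that the randomness here comes from `random.random()`, a pseudorandom generator running on deterministic hardware; that simulation step is exactly what Extropic wants to replace with physics.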

A startup called Extropic thinks this is backwards. Their approach: stop fighting thermal noise and start computing with it.

The Physics We've Been Ignoring

Chip designer Anastasi In Tech breaks down what's actually happening inside a transistor when you lower the voltage. At normal operating voltages, a transistor is a reliable switch—apply enough voltage to the gate and it flips from off to on, cleanly and predictably. That's the foundation of digital computing.

But drop the voltage and something changes. The energy barrier controlling the switch becomes comparable to the thermal energy of electrons bouncing around from heat. At that point, as Anastasi explains, "the noise is no longer negligible. It actually can push the charge over the barrier."

For decades, this was the problem. Chip designers spent engineering effort and energy budget keeping transistors far away from this messy regime. Extropic is building chips that operate in it.

Their transistors become what they call probabilistic bits, or P-bits. Instead of storing a fixed zero or one, a P-bit continuously fluctuates between states, with the probability shaped by the applied voltage. "You apply voltage and that voltage sets the probability," Anastasi notes. "Push it one way, it favors one. Push it the other, it favors zero."
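In the p-bit literature, this voltage-to-probability relationship is usually modeled as a sigmoid: the probability of reading a 1 rises smoothly with the applied bias. A minimal simulation of one such bit, under that standard model (the specific numbers are illustrative, not Extropic's):

```python
import math
import random

def pbit_sample(bias: float) -> int:
    """One read of a simulated probabilistic bit.

    The probability of reading 1 follows a sigmoid of the applied bias:
    strongly positive bias -> almost always 1, strongly negative -> almost
    always 0, zero bias -> a fair coin.
    """
    p_one = 1.0 / (1.0 + math.exp(-bias))  # sigmoid of the bias
    return 1 if random.random() < p_one else 0

# Positive bias tilts the odds toward 1.
reads = [pbit_sample(bias=2.0) for _ in range(10_000)]
print(sum(reads) / len(reads))  # ≈ sigmoid(2.0) ≈ 0.88
```

The difference on real hardware is that the coin flip inside `pbit_sample` would come from thermal noise in the transistor itself, not from a software random-number generator.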

The elegant part: you're not simulating this randomness with thousands of transistors running pseudorandom number generators. You're getting it directly from physics, essentially for free.

What You Gain (And What You Give Up)

Extropic claims their approach could deliver up to 10,000x better energy efficiency than current GPUs. That's the kind of number that makes you immediately suspicious—I've covered enough chip startups to know the gap between simulation and shipping product.

And Anastasi, who spent years designing chips before becoming a journalist, is appropriately cautious: "We have to be precise because these are results of the simulations and small tests like generating an image. So here we are nowhere near data center scale AI yet."

But the direction matters. Unlike quantum computing, this runs on standard CMOS silicon at room temperature. No exotic cooling, no specialized facilities. If it works, it could slot into existing semiconductor manufacturing.

The complications start when you try to scale. Connect 250,000 P-bits together—Extropic's planned Z1 chip arriving this year—and you're not just getting useful thermal noise anymore. You're getting unwanted coupling between elements. Anastasi uses a helpful analogy: "Imagine you want a room full of people making more or less independent decisions. Each person should be only influenced by a couple of its neighbors, but now suddenly they can hear whispers through the walls."

You want noise, but only the right kind of noise. Managing that as you scale from thousands to millions of elements is an open problem.
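The neighbor-influence idea maps naturally onto an Ising-style model, where each bit's probability depends on a coupling term times the state of its neighbors. The sketch below (my own illustration, not Extropic's architecture) shows the intended behavior: with zero coupling the bits are independent coins, while strong coupling makes neighbors lock together, which is useful when designed and a bug when it leaks through the walls.

```python
import math
import random

def gibbs_step(state: list[int], J: float, bias: float = 0.0) -> None:
    """Update each bit in a ring given its two neighbors (Ising-style coupling).

    J = 0 leaves every bit an independent coin; large J makes neighbors
    tend to agree. The update rule is one sweep of Gibbs sampling.
    """
    n = len(state)
    for i in range(n):
        left = 2 * state[(i - 1) % n] - 1    # map 0/1 -> -1/+1
        right = 2 * state[(i + 1) % n] - 1
        field = bias + J * (left + right)    # local influence on this bit
        p_one = 1.0 / (1.0 + math.exp(-field))
        state[i] = 1 if random.random() < p_one else 0

state = [random.randint(0, 1) for _ in range(20)]
for _ in range(100):
    gibbs_step(state, J=2.0)  # strong coupling: neighbors tend to agree
print(state)
```

Scaling the real hardware means keeping J meaningful for the couplings you designed and effectively zero for the ones you didn't.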

Then there's the software stack. NVIDIA spent decades building CUDA. Every major AI framework assumes deterministic hardware underneath. Thermodynamic computing requires rethinking algorithms from scratch—and then convincing the industry to adopt them while GPUs keep improving incrementally every year.

The Real Question

Here's what I keep coming back to: Extropic isn't trying to replace all computing. They're targeting a specific class of problems that are inherently probabilistic. Generative AI. Optimization. Monte Carlo simulations. Anomaly detection. Energy-based models.

For these workloads, fighting entropy might actually be the expensive choice. The current approach is like using a precision milling machine to carve something that's supposed to look organic and random. Technically impressive, but maybe not the right tool.

Data centers could consume energy equivalent to 44 nuclear reactors by 2030 if current trends continue. Most of that energy is going toward AI workloads that are, at their core, sophisticated probability engines. If there's a way to do that natively instead of through simulation, the economics shift dramatically.

But economics isn't just about technical efficiency. It's about manufacturing scale, developer adoption, integration costs, risk tolerance. You don't replace trillion-dollar infrastructure because something looks better in a lab. You replace it when it's demonstrably, undeniably, bet-the-company better.

Which means Extropic has to move fast enough to matter before the gap closes. GPU efficiency isn't standing still. Neither is the rest of the computing landscape.

As Anastasi puts it: "The most powerful computers of the future won't fight entropy and uncertainty. They will run on it. Or if not, this idea might become the most elegant and very expensive random number generator we've ever built."

We'll know in a few years which one it is.


Watch the Original Video

The End Of Computing As We Know It

Anastasi In Tech, 13m 15s

About This Source

Anastasi In Tech

Anastasi In Tech is a technology-focused YouTube channel with roughly 404,000 subscribers. Since launching in June 2025, it has built a reputation for in-depth, accessible explanations of the technologies behind contemporary computing, bridging cutting-edge innovation and everyday understanding.

