By Futurist Thomas Frey

Just when we thought computers were reaching their limits, nature quietly handed us the next leap. In the Swiss town of Vevey, researchers at the startup FinalSpark are cultivating human brain organoids—mini-brains grown from stem cells—and plugging them into electrode arrays to act as living processors. These clumps, each measuring just a few millimetres, are no longer just models for neuroscience—they’re becoming the underlying architecture of tomorrow’s computing infrastructure.

Biological neurons already outstrip silicon on raw metrics: they’re approximately one million times more energy efficient than current artificial neurons, and they self-organise, self-repair and rewire. What we once simulated, we’re now assimilating. Rather than mimic the brain with chips, we’re tapping the brain’s hardware itself. The implication: “wetware” computing is no longer science fiction—it’s system design.

Think about how today’s data centres are limited by thermal constraints, physical density and energy costs. By 2030, racks won’t just contain GPUs—they’ll house bioreactors with neural tissue, nutrient and microfluidic loops, electrode matrices and hybrid circuits. By 2040, entire cloud subnets will run on “neural node farms” that interface living neurons with photonic interconnects and AI frameworks. The computing substrate shifts from metal and silicon to living networks.

This changes what it means to build algorithms. We no longer program line by line; we stimulate neural cultures, train them via electrical inputs, query their responses via electrodes, and harvest patterns. When the substrate itself is a neural network grown in a dish, “software” becomes a prompt, “hardware” becomes biology, and “execution” becomes stimulation and response. The boundaries between life and machine blur.
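To make that inversion concrete, here is a minimal sketch of what such a stimulate-train-query loop could look like in code. Everything in it is an assumption for illustration: the `MockOrganoid` class, its `stimulate`, `read_spikes` and `reward` methods, and the reward scheme are hypothetical stand-ins, not FinalSpark’s interface or any published protocol.

```python
# Hypothetical sketch: closed-loop "training" of a wetware module.
# The class, its methods and the reward rule are illustrative assumptions,
# not a real product API or a published lab protocol.
import random


class MockOrganoid:
    """Stand-in for an organoid exposed through an electrode array."""

    def __init__(self, n_electrodes=32):
        self.n_electrodes = n_electrodes
        self.weights = [random.random() for _ in range(n_electrodes)]
        self._last = [0.0] * n_electrodes

    def stimulate(self, pattern):
        """Deliver a stimulus pattern (one value per electrode)."""
        self._last = pattern

    def read_spikes(self):
        """Return a mock firing-rate readout for the last stimulus."""
        return [w * p for w, p in zip(self.weights, self._last)]

    def reward(self, amount):
        """Reinforce the most recent stimulus-response pairing."""
        self.weights = [w + amount * p for w, p in zip(self.weights, self._last)]


def train(module, dataset, epochs=10):
    """The 'program' is a schedule of stimuli, readouts and rewards."""
    for _ in range(epochs):
        for stimulus, target in dataset:
            module.stimulate(stimulus)
            response = module.read_spikes()
            # Score how closely the response matches the desired pattern...
            error = sum((r - t) ** 2 for r, t in zip(response, target))
            # ...and reinforce in proportion to how well it did.
            module.reward(1.0 / (1.0 + error))


if __name__ == "__main__":
    organoid = MockOrganoid(n_electrodes=4)
    data = [([1.0, 0.0, 1.0, 0.0], [0.8, 0.0, 0.8, 0.0])]
    train(organoid, data)
    organoid.stimulate([1.0, 0.0, 1.0, 0.0])
    print(organoid.read_spikes())
```

The point of the sketch is the role reversal: the code never encodes the computation itself, only the stimulation schedule and the reward signal, while the learning happens in the tissue.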

The organisational impact is massive. NVIDIA-style GPU farms will compete with biocompute farms built around organoid arrays. Nations will vie not just for chip fabrication but for bio-fabs. The economic value shifts: instead of transistor count, we’ll count neuron-modules. Data centres will need clean rooms, micro-bioreactor technicians and neural-culture monitoring teams—roles we haven’t yet defined.

And yet, the ethical terrain is uncharted. When you connect living human brain tissue to computing arrays, you must ask: at what point does the organoid become sentient? There are only ~10⁴ neurons per organoid today, compared with ~10¹¹ in a human brain, but as scale and complexity increase, so too does the risk. We’ll wrestle with rights for neural hardware, debates over energy access in brain labs, even whether your computer might “want” something. It will trigger new categories of property: Are these organoid-modules “owned,” leased, co-habited? What happens when one “dies”? The machines we build become living because their substrate is alive.

For individuals and organisations, the strategic lesson is clear: abstraction layers are collapsing. The future of compute isn’t just about bigger models—it’s about different mediums. Learn software, learn optics—but if you ignore biology, you’ll miss the platform shift. Organisations that treat compute as a biological ecosystem, not those doubling down on brute silicon scaling, will win the next wave.

Imagine a biotech cloud startup in 2035 selling metered “neural-compute hours.” You book organoid-time to run a generative module and pay in tokens. Your app spins up a cohort of neuron modules, trains them over 24 hours, then retires them. The platform auto-cleans the culture, refreshes, restarts. You no longer code architectures—you design biology. Software engineers partner with cell biologists. DevOps includes micro-fluidic engineers. The stack changes.
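As a purely illustrative sketch of how that booking flow might feel to a developer, here is a toy client for such a marketplace. The `BioCloudClient` class, its `reserve_modules`, `train` and `retire` calls, and the token tariff are all invented for this 2035 scenario; no such service or API exists.

```python
# Illustrative only: an imagined client for a future "neural-compute hours"
# marketplace. The client, its methods and the pricing are hypothetical.
from dataclasses import dataclass


@dataclass
class Reservation:
    modules: int          # number of organoid modules leased
    hours: float          # wall-clock window booked
    tokens_charged: int   # payment in platform tokens


class BioCloudClient:
    TOKENS_PER_MODULE_HOUR = 5  # assumed tariff for the example

    def reserve_modules(self, modules, hours):
        """Lease a cohort of neuron modules for a fixed window."""
        cost = int(modules * hours * self.TOKENS_PER_MODULE_HOUR)
        return Reservation(modules=modules, hours=hours, tokens_charged=cost)

    def train(self, reservation, dataset_uri):
        """Run a stimulation/reward schedule on the leased modules."""
        print(f"Training {reservation.modules} modules for "
              f"{reservation.hours}h on {dataset_uri}")

    def retire(self, reservation):
        """Return the tissue for cleaning and culture refresh."""
        print(f"Retiring {reservation.modules} modules; refresh queued")


if __name__ == "__main__":
    client = BioCloudClient()
    lease = client.reserve_modules(modules=8, hours=24)
    client.train(lease, dataset_uri="s3://example-bucket/prompts.jsonl")
    client.retire(lease)
    print(f"Charged {lease.tokens_charged} tokens")
```

The shape is familiar cloud tooling; what changes is what sits behind the endpoints—living cultures with lifespans, cleaning cycles and refresh schedules instead of stateless virtual machines.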

The human divide shifts too. Today’s coders optimise algorithms. Tomorrow’s system designers optimise biology-machine interfaces. The skill-sets will include electrode mapping, neuron-density scheduling, organoid lifespan modelling, bio-heat dissipation—not just memory access patterns. The education system will scramble because the curriculum hasn’t caught up.

And for consumers? Your next phone won’t just have a neural co-processor—it’ll lease you wetware micro-modules via micro-bioreactors embedded under your case. You’ll run personalised apps on transient neuron-slices, hardware optimised for you, biological compute that adapts and decays. You may view the device as an ecosystem, not a tool.

Final Thoughts
The substrate of our machines is shifting from silicon to tissue. Wetware computing isn’t just faster hardware—it’s alive hardware. In the next decade we’ll move from simulating life to computing in life. The largest infrastructure build-outs won’t be factories—they’ll be bio-labs. Ownership will flip: processor makers will become cultivators of neural breeds. If you assumed the future of computing was quantum chips or better AI, think again—it’s living nodes. Get ready to plug your ambition into biology.

Original column: ‘Wetware’: Scientists use human mini-brains to power computers