Brain Cells: A Week of Training Turns Biological Chips into Doom Players

Human brain cells on a chip learned to play the video game Doom in around a week, marking a notable step in programmable biological computing and renewing debate over practical applications and limits.
What If Brain Cells on Chips Become Easily Programmable?
Researchers at Cortical Labs developed a programming interface that allows neuron-powered chips to be controlled using the Python programming language. An independent developer, Sean Cole, used that interface to teach a clump of living human neurons to interact with Doom in around a week. Brett Kagan, Chief Scientific Officer and Chief Operations Officer at Cortical Labs, highlights the accessibility and flexibility of the new interface, noting that this demonstration required far less specialised biological expertise than prior work.
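The article does not publish the interface itself, but the closed-loop pattern it describes, encode game state as electrode stimulation, decode spike activity into an action, and deliver a feedback signal, can be sketched in plain Python. Everything below is a hypothetical stand-in simulator: the `Neurons` class, its `stimulate` and `feedback` methods, and the encode/decode helpers are invented for illustration and are not the Cortical Labs API.

```python
import random

random.seed(0)  # reproducible illustration

class Neurons:
    """Toy stand-in for a neuron-on-chip culture (hypothetical, not a real API)."""
    def __init__(self, n_electrodes=8):
        self.n_electrodes = n_electrodes
        # Weights stand in for plastic connections shaped by feedback.
        self.weights = [0.0] * n_electrodes

    def stimulate(self, pattern):
        """Apply a stimulation pattern; return simulated spike counts."""
        return [w + p + random.gauss(0, 0.1)
                for w, p in zip(self.weights, pattern)]

    def feedback(self, pattern, reward):
        """Nudge only the stimulated channels, mimicking reward-shaped plasticity."""
        for i, p in enumerate(pattern):
            self.weights[i] += 0.02 * reward * p

def encode_state(enemy_position, n_electrodes=8):
    """Map a game observation onto electrode channels (one-hot here)."""
    return [1.0 if i == enemy_position else 0.0 for i in range(n_electrodes)]

def decode_action(spikes):
    """Read the most active channel as the chosen action."""
    return max(range(len(spikes)), key=lambda i: spikes[i])

# Closed training loop: stimulate, decode an action, reward a match.
culture = Neurons()
for step in range(200):
    target = random.randrange(8)
    pattern = encode_state(target)
    action = decode_action(culture.stimulate(pattern))
    culture.feedback(pattern, 1.0 if action == target else -0.5)

# Crude check of how often the decoded action tracks the encoded state.
hits = sum(1 for t in range(8)
           if decode_action(culture.stimulate(encode_state(t))) == t)
```

The point of the sketch is the loop structure, stimulation in, spikes out, feedback closing the loop, not the learning dynamics, which in a living culture are far richer than a weight vector.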
What Happens When Training Speeds Outpace Silicon-based Systems?
Kagan says the neuronal chips learnt much faster than traditional, silicon-based machine learning systems and should be able to improve further with newer learning algorithms. The Doom-playing system performed better than a randomly firing player but remained far below the level of top human players. Observers at several universities say the jump from simpler tasks to Doom signals progress in handling real-time decision-making, but they also note substantial unknowns about how neurons on a chip interpret inputs and coordinate outputs without conventional sensors or architectures.
What If Biological Computers Move Toward Real-World Control?
Playing a complex, real-time game has been framed by researchers as a stepping stone toward controlling physical systems. Yoshikatsu Hayashi at the University of Reading describes playing Doom as a simpler analogue of controlling an entire robotic arm; Hayashi and colleagues are attempting similar control using a jelly-like hydrogel system. Andrew Adamatzky at the University of the West of England emphasises that the system’s ability to cope with complexity, uncertainty and real-time decision-making is much closer to tasks such as robot control than earlier demonstrations were.
- Pong used clumps of more than 800,000 living brain cells grown on microelectrode arrays to control paddles.
- The Doom demonstration used about a quarter as many neurons (around 200,000) on a CL-1-style neural computing system's microelectrode array and was programmed in Python.
- The neuronal chip played better than random and learnt faster than traditional silicon-based machine learning, but remained well below the best human players.
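Claims like "better than random" are usually grounded by scoring the trained system and a uniform-random player on the same trials and comparing means. A minimal illustrative sketch, in which the "trained" policy and its 70% hit rate are invented numbers, not measurements from the Doom experiment:

```python
import random

def evaluate(policy, trials=1000, seed=1):
    """Mean score of a policy over identically seeded scored trials."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        target = rng.randrange(4)            # the "correct" move this trial
        total += 1 if policy(target, rng) == target else 0
    return total / trials

def random_policy(target, rng):
    return rng.randrange(4)                  # ignores the game state entirely

def trained_policy(target, rng):
    # Hypothetical: tracks the state imperfectly, right ~70% of the time.
    return target if rng.random() < 0.7 else rng.randrange(4)

baseline = evaluate(random_policy)           # expected near 0.25
trained = evaluate(trained_policy)           # expected well above baseline
```

Sharing the trial generator between both policies keeps the comparison fair; the gap between the two means, not either absolute score, is what supports a "better than random" claim.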
Experts also flag unanswered scientific questions. Steve Furber at the University of Manchester notes that researchers still do not fully understand how neurons on these arrays perceive the game state or internalise expected behaviours. Those gaps shape the most immediate technical barriers to wider application.
Readers should expect more demonstrations that emphasise programmability, faster biological learning and attempts to translate game-based control into physical actuation, alongside sustained research into how living neural networks encode sensory input and goals. The week-long Doom training makes clear that interfaces and learning algorithms are central levers for progress, but performance limits and mechanistic uncertainty will temper immediate deployments of brain cells as computing substrates.




