Tech giant IBM has announced a prototype “brain-like” chip, which, according to the company, has the potential to revolutionize the energy efficiency of artificial intelligence (AI) systems. The development comes amid growing concerns regarding the carbon emissions generated by sprawling computer warehouses that power these AI systems.
The new chip, modeled on how the human brain functions, promises more efficient AI for data centers and a reduced drain on smartphone batteries. As described by Thanos Vasilopoulos, a scientist at IBM’s Zurich-based research lab, the human brain’s efficiency is unparalleled.
The brain achieves remarkable performance while consuming far less power than the machines we have built, Vasilopoulos explained. Harnessing that efficiency could empower devices in low-power environments such as cars, mobile phones, and cameras.
Vasilopoulos highlighted potential benefits for cloud providers, saying, “They could utilize these chips to diminish their energy expenditure and carbon footprint.”
While most contemporary chips are fundamentally digital, storing data as 0s and 1s, IBM’s chip uses components called “memristors,” or memory resistors, which are analog in nature: each one can store a range of values rather than just 0 or 1.
The difference between digital and analog can be compared to the difference between an ordinary light switch and a dimmer: one is either on or off, while the other can be set anywhere in between. The human brain also works in an analog way, and memristors behave much like its synapses.
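As a loose illustration of that light-switch analogy (a toy sketch, not IBM’s actual design), a digital bit can only be 0 or 1, whereas an analog, memristor-like element can settle at any value in between; the names and numbers below are invented purely for illustration:

```python
# Illustrative sketch only: a toy contrast between digital (switch) and
# analog (dimmer/memristor-like) storage. Names and values are made up
# and do not reflect IBM's chip.

DIGITAL_LEVELS = {0, 1}            # a bit: fully off or fully on

class AnalogCell:
    """A toy 'dimmer': stores any conductance between 0.0 and 1.0."""
    def __init__(self):
        self.conductance = 0.0

    def program(self, value: float) -> None:
        # Clamp to the physically allowed range, like setting a dimmer.
        self.conductance = min(1.0, max(0.0, value))

cell = AnalogCell()
cell.program(0.37)                 # a single cell holds a graded value,
print(cell.conductance)            # not just 0 or 1 -> 0.37
```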
Elaborating on this, Prof Ferrante Neri of the University of Surrey said that memristors are an example of nature-inspired computing, essentially mimicking brain functions.
These components can “recollect” their electrical history, akin to synapses in biological systems. “Interlinked memristors can emulate a network resembling a biological brain,” Neri said.
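A hedged sketch of why a grid of such elements resembles a neural network: in analog in-memory computing, a layer’s weights can be stored as conductances, and applying input voltages produces output currents proportional to a matrix-vector product. The simulation below uses ordinary Python with NumPy, and the array sizes and values are invented for illustration:

```python
import numpy as np

# Toy simulation of an analog crossbar: the weights of one neural-network
# layer are stored as memristor conductances; applying input voltages
# yields output currents proportional to a matrix-vector product
# (Ohm's and Kirchhoff's laws). All values are illustrative only.

conductances = np.array([[0.2, 0.8, 0.5],     # one row per output line
                         [0.9, 0.1, 0.4]])
input_voltages = np.array([0.3, 0.6, 0.9])    # one entry per input line

output_currents = conductances @ input_voltages
print(output_currents)   # the layer's weighted sums, computed "in memory"
```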
However, he also sounded a note of caution: while the advance hints that brain-like chips may be on the horizon, challenges such as material costs and manufacturing complexity remain substantial roadblocks to broad-scale adoption.
Although the chip is primarily analog, it also contains digital elements. This hybrid design ensures compatibility with existing AI systems, making integration smoother.
Today, many smartphones boast AI chips, integral for tasks like photo processing. The iPhone, for instance, incorporates a “neural engine” chip.
With IBM’s breakthrough, the chips in future devices, from phones to cars, could become significantly more energy-efficient. This could lead to longer battery life and pave the way for new applications.
The transformative potential doesn’t end there. If chips akin to IBM’s prototype were to replace those in the large computer banks that drive high-power AI systems, the energy savings could be monumental, and such a switch would also substantially reduce the water required to cool these energy-intensive data centers. To put things in perspective, a massive data center’s electricity consumption equates to that of a medium-sized town.
Commenting on the innovation, Professor James Davenport of the University of Bath observed that while IBM’s findings are intriguing, the journey ahead isn’t straightforward. The chip doesn’t provide an immediate solution but symbolizes “a potential initial step.”