- Author: Zoe Corbyn
- Role: Technology Reporter
- Reporting from San Francisco
The power demands of modern computing are growing at an alarming rate.
According to a recent report by the International Energy Agency (IEA), electricity consumption from data centers, artificial intelligence (AI) and cryptocurrencies could double from 2022 levels by 2026.
It is estimated that the combined consumption of these three sectors in 2026 could roughly equal Japan's annual electricity needs.
Companies like Nvidia — whose computer chips are the basis for most AI applications today — are working on developing more energy-efficient hardware.
But could the alternative be to build computers with a fundamentally different architecture, one that is more energy efficient?
Some companies certainly think so, and they're looking at the structure and function of an organ that uses a fraction of the power of a traditional computer to do more work faster: the brain.
In neuromorphic computing, electronic devices mimic neurons and synapses, and are interconnected in such a way that they resemble the electrical networks of the brain.
This isn't new — researchers have been working on this technique since the 1980s.
But the energy demands of the AI revolution are increasing the pressure to bring the technology into the real world.
Current systems and platforms exist mainly as research tools, but proponents say they could deliver huge gains in energy efficiency.
Those with commercial ambitions include hardware giants such as Intel and IBM.
Some smaller companies are also in this space. “This opportunity is waiting for the company that can figure it out,” says Dan Hutcheson, an analyst at TechInsights. “And the chances are that it could finish off Nvidia.”
In May, SpiNNcloud Systems, a spin-off of the Dresden University of Technology (TU Dresden), announced it would begin selling the first commercial neuromorphic supercomputers and was taking pre-orders.
“We have reached the commercialization of neuromorphic supercomputers before other companies,” says Hector Gonzalez, its co-CEO.
Tony Kenyon, professor of nanoelectronics and nanophotonic materials at University College London, who works in this field, says this is an important development.
He says, “While there is still no killer app… there are many areas where neuromorphic computing will provide significant benefits in energy efficiency and performance, and I'm sure we will see widespread adoption as this technology matures.”
Neuromorphic computing covers a range of approaches, from designs that are loosely brain-inspired to near-exact simulations of the human brain (something we are, in fact, nowhere near achieving).
But it has some fundamental design qualities that make it different from traditional computing.
First, unlike traditional computers, neuromorphic computers do not have separate memory and processing units. Instead, both functions are performed together in one place on a single chip.
Eliminating the need to shuttle data between the two cuts energy consumption and speeds up processing times, Professor Kenyon explains.
An event-driven approach to computing may also become common.
Unlike traditional computing, where every part of the system is always on and available to communicate with any other part at all times, activity in neuromorphic computing can be far sparser.
Simulated neurons and synapses act only when they have something to communicate, much as many of the neurons and synapses in our brains fire only when there is a reason to.
Working only when there is something to process also saves power.
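To make that concrete, here is a minimal sketch, in plain Python rather than any vendor's toolkit, of a leaky integrate-and-fire neuron, one of the simplest models used in neuromorphic systems. The class name and parameter values are invented for illustration: the neuron accumulates weighted inputs and produces an output event (a "spike") only when its potential crosses a threshold, staying silent otherwise.

```python
# Minimal leaky integrate-and-fire (LIF) neuron, a common simplified model
# in neuromorphic computing. All names and values here are illustrative.

class LIFNeuron:
    def __init__(self, threshold=1.0, leak=0.9):
        self.potential = 0.0      # membrane potential (state held locally)
        self.threshold = threshold
        self.leak = leak          # fraction of potential retained each step

    def receive(self, weighted_input):
        """Integrate one input; return True only if a spike (event) fires."""
        self.potential = self.potential * self.leak + weighted_input
        if self.potential >= self.threshold:
            self.potential = 0.0  # reset after firing
            return True           # a spike: the only time output is produced
        return False              # otherwise the neuron stays silent

neuron = LIFNeuron()
inputs = [0.0, 0.6, 0.0, 0.7, 0.0, 0.0]   # a mostly quiet input stream
for step, x in enumerate(inputs):
    if neuron.receive(x):
        print(f"spike at step {step}")    # events are sparse
```

Because output is produced only when the threshold is crossed, downstream units have nothing to do on most steps, which is where the claimed power savings come from.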
And while modern computers are digital, using 1s and 0s to represent data, a neuromorphic computer could be analog.
That historically significant method of computing relies on continuous signals, and can be useful where data coming from the outside world needs to be analysed.
However, for reasons of simplicity, most commercially oriented neuromorphic efforts are digital.
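As a rough numerical illustration of the analog idea, and of why memory and processing need not be separate units, consider a resistive "crossbar" array, a structure often discussed in neuromorphic research. The sketch below, with invented numbers, shows the principle: weights are stored as conductances at the points where computation happens, and a matrix-vector multiplication emerges from circuit laws rather than from a sequence of digital instructions.

```python
import numpy as np

# Sketch of analog in-memory computing in a resistive crossbar: a voltage on
# each row and a stored conductance at each crosspoint produce a current
# (Ohm's law, I = G * V), and currents add up along each column (Kirchhoff's
# current law), yielding a weighted sum where the weights are stored.

G = np.array([[0.2, 0.5],      # conductances = stored weights (the memory)
              [0.7, 0.1],
              [0.4, 0.9]])     # 3 input rows x 2 output columns

V = np.array([1.0, 0.5, 0.2])  # row voltages = a continuous input signal

I = V @ G                      # column currents = the matrix-vector product
print(I)                       # [0.63 0.73]: computed *in* the memory array
```

In a physical device the sums appear in a single analog step, without the weights ever moving, which is the appeal for both speed and energy use.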
Proposed commercial applications fall into two main categories.
One, which SpiNNcloud is focused on, is providing a more energy-efficient, higher-performance platform for AI applications, including image and video analysis, speech recognition, and the large language models that power chatbots like ChatGPT.
The second is in “edge computing” applications, where data is processed in real time on connected devices that operate under power constraints, rather than in the cloud. Autonomous vehicles, robots, cell phones and wearable technology could all benefit.
However, technical challenges remain. Developing the software needed to run the chips has long been seen as the main obstacle to the progress of neuromorphic computing.
Having the hardware is one thing, but it must be programmed to work, and this may require developing an entirely different style of programming from that used by traditional computers.
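To give a flavour of how different that style can be, here is a toy sketch, not any vendor's actual programming model, of event-driven execution. Instead of a clocked loop that updates every unit on every tick, the program drains a time-ordered queue of spike events, touching only the neurons that actually receive something; the network wiring, delays and threshold below are all invented.

```python
import heapq

# Toy event-driven simulation: work is driven by a time-ordered queue of
# spike events rather than a global clock updating every unit. The network
# wiring, delays, weights and threshold are invented for illustration.

connections = {"A": [("B", 1.0)], "B": [("C", 1.0)]}  # source -> (target, delay)
potential = {"A": 0.0, "B": 0.0, "C": 0.0}
THRESHOLD = 1.0

events = [(0.0, "A", 1.5)]        # (arrival time, target neuron, weight)
while events:
    time, name, weight = heapq.heappop(events)
    potential[name] += weight     # only this neuron does any work
    if potential[name] >= THRESHOLD:
        potential[name] = 0.0     # fire and reset
        print(f"{name} spiked at t={time}")
        for target, delay in connections.get(name, []):
            heapq.heappush(events, (time + delay, target, 1.5))
```

Nothing in this loop iterates over the whole network; idle neurons cost nothing, which mirrors the event-driven behaviour of the hardware itself.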
“The potential of these devices is enormous … the problem is how do you make them work,” says Mr. Hutcheson. He estimates it will be at least a decade or two before the benefits of neuromorphic computing are truly realized.
There are also cost issues. Whether they use silicon, as commercially oriented efforts do, or other materials, making radically new chips is expensive, Professor Kenyon says.
Intel's current prototype neuromorphic chip is called Loihi 2.
In April, the company announced it had brought together 1,152 Loihi 2 chips to create Hala Point, a large-scale neuromorphic research system comprising more than 1.15 billion simulated neurons and 128 billion simulated synapses.
Intel claims it is the largest such system in the world to date, with a neuron capacity roughly equivalent to that of an owl's brain.
At the moment it is still a research project for Intel.
But Hala Point is “showing that there's some real feasibility” for AI applications, says Mike Davies, director of Intel's Neuromorphic Computing Lab.
He says the microwave-oven-sized Hala Point is “commercially relevant” and that “rapid progress” is being made on the software side.
IBM has named its latest brain-inspired prototype chip NorthPole.
Unveiled last year, it is an evolution of IBM's earlier prototype chip, TrueNorth. Testing shows it is more energy efficient, more space efficient and faster than any chip currently on the market, says Dharmendra Modha, the company's chief scientist for brain-inspired computing. He says his group is now working to demonstrate that the chips can be linked together into a larger system.
“The path to market will be the story to come,” he says. Dr. Modha says a big innovation with NorthPole is that it was designed in conjunction with its software, so that the full capabilities of the architecture can be exploited from the start.
Other smaller neuromorphic companies include BrainChip, SynSense and Innatera.
SpiNNcloud's supercomputer commercializes neuromorphic computing technology developed by researchers at TU Dresden and the University of Manchester as part of the European Union's Human Brain Project.
Those efforts produced two research-focused neuromorphic supercomputers: the SpiNNaker1 machine at the University of Manchester, which simulates over a billion neurons and has been operational since 2018, and the second-generation SpiNNaker2 machine at TU Dresden, currently being configured, which has the capacity to simulate at least five billion neurons. Commercially available systems such as those offered by SpiNNcloud can reach at least 10 billion neurons, says Mr. Gonzalez.
Professor Kenyon says that in the future there will be different types of computing platforms – traditional, neuromorphic and quantum, which is another novel type of computing that’s on the horizon – all working together.