Ultra-Low-Power AI Chip: Each Spike Needs Only 0.02% of the Usual Energy

Machine Heart Reports

Machine Heart Editorial Department

This artificial intelligence chip sets a new low for power consumption.
The human brain is remarkably compact, yet it handles all of our computational tasks. Inspired by this, many researchers have begun exploring artificial networks that mimic the way the brain processes neural signals. Networks of this type are called spiking neural networks (SNNs).
Spiking neural networks were first proposed by Professor Wolfgang Maass in 1997. They are a new generation of artificial neural networks built on the brain's operating mechanisms and are regarded as the third generation of neural network models. They are currently the biologically inspired models closest to brain-like computing, able to process biological stimulus signals and help explain the brain's complex intelligent behavior.
SNNs aim to bridge the gap between neuroscience and machine learning by computing with models that closely match the mechanisms of biological neurons, which sets them fundamentally apart from today's popular neural networks and machine learning methods.
SNNs operate on spikes: discrete events that occur at specific points in time, rather than the continuous values common in other networks. A neuron's state is described by differential equations modeling the corresponding biological processes, the most important of which is the neuron's membrane potential. In essence, once the membrane potential reaches a certain threshold, the neuron fires a spike and its potential is reset.
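To make the mechanism concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the simplest spiking model of this kind. The time constant, resistance, threshold, and drive current are all illustrative assumptions, not values from the study.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: an illustrative sketch,
# not the paper's circuit model. All constants are assumed for demonstration.

def simulate_lif(input_current, dt=1e-4, tau=0.02, r=1e7,
                 v_threshold=1.0, v_reset=0.0):
    """Integrate a constant input current; return the spike times (s).

    dt          -- simulation step (s)
    tau         -- membrane time constant (s)
    r           -- membrane resistance (ohms)
    v_threshold -- firing threshold (V); crossing it triggers a spike
    v_reset     -- membrane potential after each reset (V)
    """
    v, spikes = v_reset, []
    for step in range(1000):
        # Leaky integration: dV/dt = (-V + R*I) / tau
        v += dt * (-v + r * input_current) / tau
        if v >= v_threshold:          # threshold crossed -> spike
            spikes.append(step * dt)  # record the spike time
            v = v_reset               # reset the membrane potential
    return spikes

print(simulate_lif(2e-7))  # spike times for a 200 nA drive
```

With these assumed constants the membrane charges toward 2 V, crosses the 1 V threshold roughly every 14 ms, and resets, producing a regular spike train.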
The brain, however, contains some 100 billion tiny neurons, each connected to roughly 10,000 others through synapses, and these neurons represent information through coordinated patterns of electrical spikes. Simulating such neurons in hardware on a compact device, while keeping the computation energy efficient, has proven very challenging.
In a recent study, researchers at the Indian Institute of Technology Bombay built ultra-low-power artificial neurons that allow SNN hardware to be arranged far more compactly.
Paper link: https://ieeexplore.ieee.org/document/9782075
New Research Achieves a 5,000-Fold Reduction in Energy per Spike
Just like neurons in the brain, artificial neurons spike when an energy threshold is exceeded. These SNNs rely on circuits in which a current source charges a leaky capacitor until a threshold level is reached, at which point the artificial neuron fires and the stored charge resets to zero. The problem is that existing SNNs require large transistor currents to charge their capacitors, leading to high power consumption and overly rapid firing of the artificial neurons.
In this study, Professor Udayan Ganguly of the Indian Institute of Technology Bombay and his colleagues created an SNN that relies on a new, compact current source to charge the capacitors: the band-to-band tunneling (BTBT) current.
With BTBT, a quantum-tunneling current charges the capacitor extremely slowly, meaning far less energy is needed. The BTBT approach also removes the need for large capacitors to store large amounts of charge, paving the way for smaller on-chip capacitors and thus saving space.
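A back-of-envelope calculation shows why a smaller charging current shrinks both the capacitor and the energy bill. For a target inter-spike time T, the capacitor must hold charge Q = I·T, so C = I·T/V_th, and the energy drawn from the supply per spike is roughly E ≈ Q·V_dd = I·T·V_dd: energy per spike scales directly with the charging current. The numbers below are illustrative assumptions, not the paper's measured values.

```python
# Why a smaller charging current saves both energy and area: for a target
# inter-spike time T, the capacitor must hold Q = I * T, so C = I * T / V_TH,
# and the energy pulled from the supply per spike is E ~ Q * V_DD = I * T * V_DD.
# All numbers are illustrative assumptions, not measurements from the paper.

V_TH, V_DD, T = 0.5, 1.0, 1e-3   # threshold (V), supply (V), spike period (s)

for name, i_charge in [("nA-scale transistor current", 1e-9),
                       ("pA-scale tunneling current", 1e-12)]:
    c_needed = i_charge * T / V_TH   # capacitor size needed for period T (F)
    e_spike = i_charge * T * V_DD    # energy drawn per spike (J)
    print(f"{name}: C = {c_needed:.1e} F, E/spike = {e_spike:.1e} J")
```

Under these assumptions, dropping from a nanoampere to a picoampere charging current cuts both the capacitor (2 pF to 2 fF) and the energy per spike (1 pJ to 1 fJ) by a factor of about 1,000 at the same firing rate.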
The researchers tested the BTBT neuron approach in a 45-nanometer commercial silicon-on-insulator (SOI) transistor technology, and the results showed substantial savings in both energy and area. On that basis, they announced a new low-power AI chip that implements spiking neural networks.
Researchers from the Indian Institute of Technology Bombay, including Maryam Shojaei Baghini (left) and Professor Udayan Ganguly (right)
"Compared with state-of-the-art [artificial] neurons implemented in hardware spiking neural networks, we achieved a 5,000-fold reduction in energy per spike at a similar area, and a 10-fold reduction in standby power at a similar area and energy per spike," Ganguly explained.
The researchers applied the SNN to a speech recognition model that used 20 artificial neurons for the initial input encoding plus 36 additional artificial neurons. The model recognized spoken words effectively, validating the feasibility of the approach in real-world applications.
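The article does not describe how audio reaches the 20 input neurons. One common front end in SNN speech pipelines is rate coding, in which each input neuron fires with probability proportional to the magnitude of one normalized feature (for example, the energy in one frequency band). The sketch below illustrates that generic idea under those assumptions; it is not the encoder the authors used.

```python
import random

# Hypothetical rate-coding front end: each input neuron fires in a given
# time step with probability proportional to one normalized audio feature.
# This is a generic SNN encoding sketch, not the encoder used in the paper.

def rate_encode(features, n_steps=100, seed=0):
    """Turn normalized features in [0, 1] into binary spike trains."""
    rng = random.Random(seed)
    return [[1 if rng.random() < f else 0 for _ in range(n_steps)]
            for f in features]

features = [0.1, 0.8, 0.4]       # stand-ins for 3 of the 20 band energies
trains = rate_encode(features)
print([sum(t) for t in trains])  # spike counts track feature magnitude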
The technology is applicable to voice activity detection, speech classification, motion pattern recognition, navigation, biomedical signal classification, and more. While such workloads can run on today's servers and supercomputers, SNNs can bring them to edge devices such as mobile phones and IoT sensors, especially in energy-constrained settings.
Ganguly said his team has demonstrated the usefulness of the BTBT approach for specific applications, such as keyword detection. Their next goal is to build an ultra-low-power neurosynaptic core and develop a real-time on-chip learning mechanism, technologies that are key to autonomous biologically inspired neural networks.
Reference Links:
https://spectrum.ieee.org/low-power-ai-spiking-neural-net
https://jishuin.proginn.com/p/763bfbd6cfac


© THE END

For reprints, please contact this public account for authorization

Submissions or inquiries: [email protected]
