Spiking neural network promises improved AI

A biomimicking neural network housed in a microchip could lead to faster and more efficient applications of artificial intelligence.

Advances in artificial intelligence technology are leading to the development of neural networks that mimic the biology of the brain (© 2021 KAUST)

This is the claim of researchers at KAUST in Saudi Arabia, whose ‘spiking’ neural network lays the foundation for much-improved hardware-based AI computing systems.

Artificial intelligence is gaining traction in areas including advanced automation, data mining and healthcare. These systems are based on a mathematical artificial neural network (ANN) composed of layers of decision-making nodes. Labelled data is fed into the system to ‘train’ the model to respond in a certain way; the decision-making rules are then locked in and the model is put into service on standard computing hardware.
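For readers unfamiliar with that workflow, the toy example below shows the conventional train-then-deploy pattern using scikit-learn's MLPClassifier on synthetic labelled data. It is purely illustrative and is not the software used in the KAUST work.

```python
# Illustrative only: a conventional ANN workflow - train on labelled data,
# freeze the learned weights, then deploy for inference. scikit-learn's
# MLPClassifier is a stand-in; the KAUST work does not use this library.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 8))               # synthetic features
y_train = (X_train.sum(axis=1) > 0).astype(int)   # synthetic labels

model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
model.fit(X_train, y_train)       # 'training' phase: weights are learned here

X_new = rng.normal(size=(5, 8))
print(model.predict(X_new))       # deployment: the decision rules are now fixed
```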

While this method works, it is an approximation of the complex, powerful and efficient neural network that makes up the human brain.

“An ANN is an abstract mathematical model that bears little resemblance to real nervous systems and requires intensive computing power,” said Wenzhe Guo, a Ph.D. student in the research team. “A spiking neural network, on the other hand, is constructed and works in the same way as the biological nervous system and can process information in a faster and more energy-efficient way.”

According to KAUST, spiking neural networks (SNNs) emulate the structure of the nervous system as a network of synapses that transmit information via ion channels in the form of action potentials, or spikes, as they occur. This event-driven behaviour, implemented mathematically as a so-called ‘leaky integrate-and-fire model’, makes SNNs very energy efficient. Plus, the structure of interconnected nodes is said to provide a high degree of parallelisation, which further boosts processing power and efficiency. It also lends itself to implementation directly in computing hardware as a neuromorphic chip.
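The leaky integrate-and-fire model can be written in a few lines: a neuron's membrane potential integrates incoming current, leaks back towards rest, and emits a spike when it crosses a threshold. The sketch below is a generic textbook version with illustrative parameter values, not the specific formulation implemented on the KAUST chip.

```python
# A minimal sketch of a leaky integrate-and-fire (LIF) neuron.
# Parameter values are illustrative placeholders, not from the paper.
import numpy as np

def lif_neuron(input_current, dt=1.0, tau=20.0, v_rest=0.0, v_reset=0.0, v_thresh=1.0):
    """Simulate a single LIF neuron; return the membrane trace and spike times."""
    v = v_rest
    voltages, spike_times = [], []
    for step, i_in in enumerate(input_current):
        # Membrane potential leaks towards rest while integrating the input current.
        v += (-(v - v_rest) + i_in) * dt / tau
        if v >= v_thresh:                # threshold crossing: the neuron 'fires'
            spike_times.append(step * dt)
            v = v_reset                  # reset after the spike (event-driven output)
        voltages.append(v)
    return np.array(voltages), spike_times

# A constant drive strong enough to push the neuron over threshold repeatedly.
_, spikes = lif_neuron(np.full(200, 1.5))
print(spikes)                            # regular spike times, roughly every 22 steps
```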

“We used a standard low-cost FPGA microchip and implemented a spike-timing-dependent plasticity model, which is a biological learning rule discovered in our brain,” Guo said in a statement.

Furthermore, this biological model does not need teaching signals or labels, allowing the neuromorphic computing system to learn real-world data patterns without supervised training.
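In a pair-based form of spike-timing-dependent plasticity, a synapse is strengthened when the presynaptic spike arrives just before the postsynaptic one and weakened when the order is reversed, so learning is driven purely by spike timing rather than labels. The sketch below uses generic textbook time constants and amplitudes, not the parameters of the KAUST implementation.

```python
# A minimal sketch of pair-based spike-timing-dependent plasticity (STDP).
# Amplitudes and time constants are illustrative, not the chip's values.
import numpy as np

def stdp_weight_change(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """dt = t_post - t_pre. Positive dt -> potentiation, negative -> depression."""
    if dt > 0:
        return a_plus * np.exp(-dt / tau_plus)    # pre before post: strengthen
    return -a_minus * np.exp(dt / tau_minus)      # post before pre: weaken

# Update a synaptic weight from observed spike pairs; no labels are needed,
# the rule reacts only to relative spike timing.
weight = 0.5
for t_pre, t_post in [(10.0, 15.0), (40.0, 38.0), (70.0, 72.0)]:
    weight = np.clip(weight + stdp_weight_change(t_post - t_pre), 0.0, 1.0)
    print(f"pre={t_pre:.0f}, post={t_post:.0f} -> weight={weight:.4f}")
```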

“Since SNN models are very complex, our main challenge was to tailor the neural network settings for optimal performance,” Guo said. “We then designed the optimal hardware architecture considering a balance of cost, speed and energy consumption.”

KAUST adds that the team’s brain-on-a-chip was more than 20 times faster and 200 times more energy efficient than other neural network platforms.

“Our ultimate goal is to build a compact, fast and low-energy brain-like hardware computing system. The next step is to improve the design and optimize product packaging, miniaturize the chip and customize it for various industrial applications through collaboration,” Guo said.

The team’s findings have been published in IEEE Transactions on Neural Networks and Learning Systems and are available here.