Several types of neural networks offer great potential for future security, but one form of neuromorphic computing, the spiking neural network (SNN), is available now, promising improvements in performance and in both personal and AI privacy.
Neuromorphic computing attempts to improve on traditional (von Neumann model) computing by co-locating memory and processing, by encouraging parallel processing, and by being event driven. The idea takes inspiration from the biological structure and functioning of the human brain. The result can, in theory, dramatically increase processing speed and decrease electrical power consumption.
Its primary components, implemented in silicon, are artificial neurons (providing basic processing capabilities), synapses (providing communication between neurons), and memory elements (storing the state of the neurons and the weights of the synapses).
“It is,” explains Jamie Boote, a principal security consultant at Black Duck, “a type of deep learning that models hardware and software architecture on natural structures (mostly neurons) implemented in silicon that mimics how neurons create and break connections with each other.”
An SNN is typified by translating data into short bursts of electrical signals known as ‘spikes’. It is the pattern of the voltage spikes that contains the data. The volume of neurons able to respond to data (effectively, an event) allows large-scale multiprocessing on a single device, reducing footprint, elapsed processing time, and power consumption.
It is worth noting neuromorphic computing is not synonymous with the deep neural networks (DNNs) already in wide use within industry – although there is growing interest in using the former to accelerate the latter. An SNN model can be up to 100 times smaller than a conventional DNN model as used today.
Neuromorphic computing is a developing field, and its applications are still being explored and researched. Its strengths, however, indicate numerous areas where it may prove superior to traditional von Neumann style computers. To recap, those strengths include massively parallel processing close to or at the data source (that is, at the edge), very low power consumption (ideal for battery-powered devices), event-driven operation, and the ability to learn and adapt autonomously in real time.
Working at the edge with a long battery life makes SNNs suitable for smart sensors, as used in drones, wearable devices, and IoT in general. Coupled with radar and/or infrared sensors, they could provide numerous applications within facilities management, from automated light switches, through detection of unauthorized physical access in sensitive areas, to locating any persons left behind in a large building (a major hotel or office complex) following an evacuation. Within robotics and autonomous systems, fast parallel processing provides instant object recognition for navigation in dynamic environments. In healthcare, the event-driven element is ideal for monitoring physiological conditions.
Within corporate IT infrastructures and cybersecurity, the ability to respond instantly to changing patterns could provide instant recognition or indication of a cyberattack, while fast processing could detect and respond to security threats in general. The potential exists but has yet to be realized; and cybersecurity professionals are largely in the wait and see phase.
“If it can deliver on the promised [inherent] characteristics,” says Kris Bondi, CEO and co-founder of Mimoto, “it will address several key failings of legacy security solutions. While most anomaly detection is relegated to after-breach forensics, neuromorphic computing promises to analyze user behaviors faster and more granularly to identify and respond to threats in real-time.”
Satyam Sinha, CEO and co-founder at Acuvity comments, “These [neuromorphic] traits are very beneficial to tasks such as threat detection and response as they aid in processing large amounts of network traffic in real-time. It also enables scaling up to detect anomalies and adapt to new attack vectors to provide a more accurate analysis. Neuromorphic systems can also process data locally on devices or on the edge networks which reduce the need for centralized data storage. In addition to better efficiency, this approach improves privacy in AI applications as it requires less transmission of data, reducing the number of transit points where the data can be intercepted or leaked.”
But Bondi adds, “There are also known limitations of neuromorphic computing that need to be overcome before it can become a stable, commercial security solution. These include the ability to work with unstructured data or to conduct real-time data processing, which are critical requirements for enterprise-ready solutions.”
The biggest problem right now for market adoption is the market’s lack of understanding and the absence of a track record in use. “Neuromorphic computing is still new, barely more than experimental, and not yet properly understood by the engineers who will benefit from its use,” says David Benas, a principal security consultant at Black Duck. Inertia reigns. “The more advanced and more probabilistic the system, the less the average engineer will understand it,” he suggests.
“Unfortunately,” says Madhu Shashanka, chief data scientist and co-founder at Concentric AI, “it has become somewhat of a buzzword these days like ‘cognitive computing’ where anything could be claimed as ‘neuromorphic’ to the extent it mimics the distributed processing nature of the brain.”
His concern is that the buzz around the technology will make us overly trust one of the side effects of neuromorphic computing: increased privacy. By being close to the source, by working from electrical signals rather than identifiable names and information, and by processing that data internally without requiring external storage, neuromorphic systems promise greater privacy.
“I find the claim that it provides better control over privacy just because it mimics the brain to be dubious,” continues Shashanka. “A different computing architecture will come with different levels of privacy tradeoffs but there is no magic bullet – one has to build data privacy and security into the architecture design from the start at the system level.”
Others worry about the potential for dual use. Whenever a new technology is introduced for defenders, it is rapidly subverted by attackers, who typically exploit it both as a new attack vector and as an attack tool. Just consider AI.
“Think smarter drones, real-time threat detection, human-machine symbiosis and much more, like a good sci-fi movie,” comments Boaz Barzel, field CTO at OX Security. “But here’s the twist: Just like any dual-use tech, it won’t stay on the ‘good side’ for long. The same capabilities that make it ideal for adaptive robotics or autonomous vehicles also make it perfect for autonomous attack systems, undetectable surveillance, and decision-making malware that evolves on the fly. We’re building digital brains, but we haven’t figured out how to keep them ethical.”
Benas adds, “My prediction is, when things like SNNs become more mainstream, we’ll see the typical spike in security problems for the first few years, followed by a cooling off period when the technology is better understood. And the cycle continues with each new breakthrough, technology, paradigm, platform, and so on. The optimist inside me thinks that, by and large, we’ll all get better at securing everything because of it.”
While most discussions of neuromorphic computing talk about potential, there is at least one device already in production: the T1 Spiking Neural Processor, a microcontroller produced by Innatera that makes full use of the advantages offered by neuromorphic devices.
Innatera is a Dutch spinoff from the Delft University of Technology, founded in 2018. From the beginning, its purpose was to bring ‘brain-like intelligence to sensors’. “Sensors are everywhere,” says co-founder and CEO Sumeet Kumar. “Last year alone, about 4 billion new devices packed with sensors came online, including phones, watches and other wearables, cars and robotics. These sensors are used to gather information to direct the devices’ applications.”
But they suffer from two fundamental problems. Firstly, the devices tend to be battery powered, but processing the sensor data is power hungry. These batteries require frequent changing or charging, potentially interrupting the always-on requirement of the sensors.
Secondly, the sensors gather huge amounts of data that currently needs to be sent to the cloud for processing. This can introduce latency and creates the potential for sensitive and sometimes intimate personal data to be leaked or stolen. “You’re sending private data to a place where you no longer have control over that data,” Kumar points out.
His solution is to use the parallel processing and the local memory of neuromorphic computing to keep data with its source. Sensor data is received, processed and fed directly into the application with no chance of leakage. At the same time, the low power consumption of SNNs extends the battery life enormously.
The Innatera microcontroller brings intelligence close to the sensor with a spiking neural network engine and a RISC-V processor core. “Our chips allow you to analyze sensor data in real time to detect and identify patterns of interest. What makes these processors so special is that they mimic how your brain works. So, every time you see something, hear something, or smell something with your personal human sensors, there are certain processes that occur in your brain that allow it to identify what’s happening in the world around you,” he continues.
So, just as the human brain interprets vision by receiving and analyzing light waves, and audio by receiving and analyzing sound waves, so this chip receives and analyzes sensor data. “It works by encoding the critical information that is present inside a sensor data stream into individual voltage spikes. This data is encoded into precisely when these spikes occur in time – if a feature is very important, the spike occurs early. If a feature is less important, the spike occurs late. The SNN manipulates the timing relationships between all these spikes to discover what patterns are hidden inside the sensor data.”
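The time-to-first-spike scheme Kumar describes can be sketched in a few lines of plain Python. This is an illustrative model only, not Innatera's implementation: the function name, the 0-to-1 feature scale, and the encoding window are all assumptions made for the example.

```python
def latency_encode(features, window_ms=10.0):
    """Time-to-first-spike encoding: map each normalized feature
    value (0..1) to a spike time; stronger features spike earlier.

    Illustrative only -- real neuromorphic hardware uses analog
    circuits and richer coding schemes.
    """
    spikes = []
    for neuron_id, value in enumerate(features):
        if value <= 0.0:
            continue  # silent neuron: no spike for an absent feature
        # Importance 1.0 -> spike at t=0; importance near 0 -> spike late
        spike_time = (1.0 - min(value, 1.0)) * window_ms
        spikes.append((neuron_id, spike_time))
    # Downstream SNN layers work on the relative timing of these spikes
    return sorted(spikes, key=lambda s: s[1])

events = latency_encode([0.9, 0.0, 0.3, 1.0])
# The strongest feature (index 3) fires first; the weakest (index 2) last
```

The key point is that information lives in *when* each neuron fires, not in a numeric payload, which is what lets the hardware stay silent (and power-frugal) when nothing interesting is present in the sensor stream.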
Those patterns can be converted into instructions for the application using the sensor, but all done locally. For example, a wearable health monitor can measure and respond to its sensor data without any personal information leaving the device. “That raw data never has to be sent out of the device. It can be processed with a very high accuracy in real time directly inside of the device, preserving the privacy of the user.”
How the microcontroller interprets the spike patterns from different types of sensors driving different applications will vary and will need to be encoded into each microcontroller installation. To facilitate this, Innatera has developed and provides a special SDK named Talamo. It allows engineers to build SNN models directly in PyTorch using libraries provided by the firm.
“Engineers can build neural networks very quickly using a well understood workflow without having to understand too much about what’s inside the chip. Talamo includes an architecture simulator so that simulations and design iterations can be run without touching the hardware. When satisfied with an optimized model, a compiler allows the model to be mapped onto the chip without any need to understand what’s going on inside the chip. The SDK ensures a simple development process without the need to retrain existing engineers.”
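The spiking behavior such models capture can be illustrated with the basic unit of any SNN, a leaky integrate-and-fire neuron. The sketch below is a minimal plain-Python simulation with hypothetical parameters; it is not Talamo code and does not reflect Innatera's silicon, only the general event-driven principle.

```python
def lif_run(input_current, threshold=1.0, leak=0.9):
    """Simulate one leaky integrate-and-fire (LIF) neuron.

    The membrane potential decays ('leaks') each step, integrates
    incoming current, and emits a discrete spike event only when it
    crosses the threshold. Parameters are illustrative.
    """
    potential = 0.0
    spike_times = []
    for t, current in enumerate(input_current):
        potential = potential * leak + current  # leak, then integrate
        if potential >= threshold:
            spike_times.append(t)  # event: the neuron fires a spike
            potential = 0.0        # reset membrane potential after firing
    return spike_times

# Sparse input: the neuron stays silent until enough charge accumulates
spikes = lif_run([0.6, 0.6, 0.0, 0.0, 0.6, 0.6])
```

Because output is produced only at threshold crossings, computation (and hence power draw) scales with the activity in the data rather than with a fixed clock, which is the property the article's power-consumption claims rest on.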
Innatera’s SNN T1 chip has been with sampling customers since early last year, and the spiking neural processor entered volume production in December. Some of the sampling customers are already building applications.
SNNs for sensors is a new but promising area. The first applications are, unsurprisingly, relatively simple. “They include presence sensing for smart lighting, and security cameras, similar solutions for doorbells, audio recognition, and activity monitoring and cardiovascular monitoring in leisure and health wearables,” says Kumar.
But what Innatera’s SNN microcontroller demonstrates is that neuromorphic computing has moved from theory to practice. Its future, and its speed of evolution, will depend only on the demands of the market: navigation built into free-moving robots, and complex instructions for shop floor robotics, will come in the future.
We asked Kumar if his system would be able to hear, transcribe and print a copy from a sensitive conversation on a single device without any audio having to leave that device. He noted that the current crop of chips isn’t yet sufficiently complex to achieve that particular composite task, but added, “Absolutely. That is the very intent of these chips, to be able to take all that data, convert it into actionable patterns, actionable insights, and act on the result without the data having to be sent anywhere else. So essentially, that sort of audio recording and transcribing would happen right on the device.”
Related: Amazon Ends Privacy Feature That Let Echo Users Opt Out of Sending Recordings to Company
Related: ‘JekyllBot:5’ Vulnerabilities Allow Remote Hacking of Hospital Robots
Related: Brain-Inspired System Aims to Improve Threat Detection