According to researchers at Intel, the American tech company, and Ireland-based professional services firm Accenture, experimental computer chips that try to imitate the way human brains function could speed up the use of voice and gesture commands in automobiles.
Neuromorphic computing, a cutting-edge technology, could use considerably less energy than the conventional central processing units and graphics processing units that connect a car wirelessly to the cloud. The AI (artificial intelligence) capabilities of today’s cars cannot understand many speech and gesture commands, partly because of the energy those functions require.
Automotive manufacturers understand the need for AI methods that consume less energy, which is one reason neuromorphic computing could be beneficial, said Tim Shea, a technology researcher at Accenture Labs. “They’re already running up against limitations of [current chips] not being scalable enough,” he said.
Last week, German automaker Mercedes-Benz AG announced that it had joined the Intel Neuromorphic Research Community to explore how neuromorphic chips for vehicle-related AI uses could help improve energy efficiency, speed and accuracy.
“With the knowledge we’ll gain, we want to achieve a significant boost for our AI applications in and around our vehicles,” said Jasmin Eichler, director of future technologies at Mercedes-Benz, in a statement.
According to Mike Davies, director of Intel’s Neuromorphic Computing Lab, Intel’s neuromorphic chips could begin selling commercially within five years.
Human-like thinking
Researchers at Accenture Labs say applications driven by neuromorphic chips inside a car could recognize when a person is shivering and adjust the temperature automatically, or respond to a voice command to turn on the car or roll down a window. The chips would be integrated into the car itself and would not need to connect to the cloud to work.
Accenture Labs worked on a neuromorphic computing experiment this year with an undisclosed car maker. In the experiment, a neuromorphic chip made by Intel Labs, named Loihi, recognized voice commands such as “start the engine.” The chip consumed 1,000 times less power and responded 200 milliseconds faster than a standard GPU, Mr. Shea said.
Intel is among several companies, universities and startups, such as International Business Machines (IBM), SynSense and Applied Brain Research, that are studying neuromorphic computing. “The industry is looking for new ways of developing AI systems with much lower power consumptions,” said Alan Priestley, AI technologies analyst at US-based research firm Gartner.
Less energy and quicker learning
Energy consumption is an obstacle to some AI deployments. Developing a single AI model, for example, can have a carbon footprint equivalent to the lifetime emissions of five average cars, according to researchers at the University of Massachusetts, Amherst.
With neuromorphic computing, it is possible to train machine-learning models using a fraction of the data it takes to train them on traditional computing hardware. That means the models learn similarly to the way human babies learn, by seeing an image or toy once and being able to recognize it forever.
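That one-shot style of learning can be pictured with a toy sketch. The example below is purely illustrative and is not Intel's or Accenture's actual method: it stores a single feature vector per item and recognizes new inputs by nearest-neighbor matching, so one "exposure" to an item is enough to recognize it later.

```python
import math

# Illustrative one-shot recognition sketch (an assumption, not a real
# neuromorphic algorithm): remember one example per item, then classify
# new inputs by the closest stored example.

def one_shot_store(memory, label, features):
    """Remember a single example of a new item."""
    memory[label] = features

def recognize(memory, features):
    """Return the stored label whose features are closest (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(memory, key=lambda label: dist(memory[label], features))

memory = {}
one_shot_store(memory, "toy car", [0.9, 0.1, 0.2])
one_shot_store(memory, "teddy bear", [0.1, 0.8, 0.7])

print(recognize(memory, [0.85, 0.15, 0.25]))  # closest to "toy car"
```

A real neuromorphic system would use spiking neurons rather than distance lookups, but the data-efficiency point is the same: a single stored example, not thousands of training passes.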
The technique uses significantly less energy than today’s graphics processing units (GPUs), one of the main types of computer chips used for AI systems, especially neural networks. Neural networks are used in speech recognition and understanding, as well as computer vision.
Another advantage of the computing technique is that it is “event-driven,” meaning it is only computing and using energy when it is activated by an event, such as a voice or gesture command. “It’s not just computing all the time in a uniform way, whether there’s activity or not,” said Alex Kass, principal director at Accenture Labs.
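The cost difference Mr. Kass describes can be sketched in a few lines. This is a simplified illustrative model, not Intel's Loihi API: a clock-driven loop burns a cycle on every time step whether or not anything happens, while an event-driven loop only does work when an event arrives.

```python
# Illustrative sketch of event-driven vs. clock-driven computation
# (an assumption-level model, not a real neuromorphic chip interface).

def clock_driven(total_steps, events, handler):
    """Check for an event at every time step -- cost grows with elapsed time."""
    cycles = 0
    for t in range(total_steps):
        cycles += 1                  # one unit of work per step, event or not
        if t in events:
            handler(events[t])
    return cycles

def event_driven(events, handler):
    """Wake up only when an event occurs -- cost grows with activity."""
    cycles = 0
    for t in sorted(events):
        cycles += 1                  # work only on the steps that matter
        handler(events[t])
    return cycles

events = {3: "voice: start the engine", 570: "gesture: roll down the window"}
heard = []
print(clock_driven(1000, events, heard.append))  # 1000 cycles
print(event_driven(events, heard.append))        # 2 cycles
```

With two commands over a thousand quiet time steps, the event-driven loop does 2 units of work against the clock-driven loop's 1,000, which is the intuition behind the energy savings.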
Because neuromorphic chips do their computing inside the car itself, they can be installed without any need to access the cloud. Even in areas with low connectivity, such as national forests, the AI functions would still work, Accenture researchers say.
According to Gartner, neuromorphic chips are expected to be the prevailing computing architecture for new, advanced types of AI deployments by 2025, displacing graphics processing units for those uses by that year.