MIT Breakthrough Reduces Neural Network Power Consumption

Neural networks are powerful, but a major roadblock to deploying them is the amount of power they need to operate. Engineers at MIT have made a breakthrough that reduces the power consumption of complex neural networks by as much as 95 percent.

This reduction in power consumption will allow neural networks to run even on mobile devices.

MIT engineers created a new chip designed to make neural networks a practical option for battery-powered devices. The breakthrough will allow smartphones, for example, to do the heavy processing that is usually offloaded to servers in the cloud.

Instead of Siri sending a translation request to a server in the cloud and waiting for a response, the new chip opens the door to performing these translations directly on a person's phone. The chip also speeds up neural-network computations by three to seven times. In the future, the same chips could be used in household appliances.

Today, neural networks typically rely on data being uploaded to remote servers over the Internet, which consumes bandwidth and makes the process energy intensive.

The new chip reinvents how neural networks are implemented today. Conventional networks consist of layers of artificial neurons; each neuron receives input from the layer below and passes a signal up when a certain threshold is met. The new chip instead performs these computations with analog circuits, processing inputs in parallel.

This approach requires far less data to be shuttled among layers, and it is that data movement that makes conventional neural-network chips so energy intensive.
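As a rough illustration of the layered computation described above, each neuron takes a weighted sum of the inputs from the layer below and "fires" only when that sum crosses a threshold. Here is a minimal sketch in Python; the weights, inputs, and threshold are made-up values for illustration, not from the MIT chip:

```python
# Minimal sketch of one neural-network layer: each neuron computes a
# weighted sum (dot product) of the inputs from the layer below and
# outputs 1 only when that sum crosses its threshold.
# All numbers here are illustrative, not real network parameters.

def layer_forward(inputs, weights, threshold=0.0):
    outputs = []
    for neuron_weights in weights:  # one weight row per neuron
        total = sum(w * x for w, x in zip(neuron_weights, inputs))
        outputs.append(1 if total > threshold else 0)
    return outputs

inputs = [0.5, -1.0, 0.25]
weights = [
    [0.2, -0.4, 1.0],   # neuron 1
    [-0.6, 0.1, 0.3],   # neuron 2
]
print(layer_forward(inputs, weights))  # -> [1, 0]
```

On a conventional chip, every one of those multiply-and-add steps means moving weights and inputs between memory and processor; the analog approach computes the whole sum in parallel where the data already sits.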

MIT notes that for the process to work, the weights of the network's connections must be binary. Theoretical analysis suggests that this binary constraint will not have a major impact on the accuracy of the computations: the chip's results came within two to three percent of those produced by conventional, non-binary neural networks.
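To see what binary weights mean in practice, here is a hedged sketch: each full-precision weight is replaced by its sign (+1 or -1), so the dot product collapses into simple additions and subtractions. The numbers are invented for illustration and are not taken from MIT's analysis:

```python
# Sketch of binary-weight quantization: each weight is replaced by its
# sign (+1 or -1), so the dot product reduces to adding or subtracting
# inputs. All values are made up for illustration.

def binarize(weights):
    return [1 if w >= 0 else -1 for w in weights]

def dot(ws, xs):
    return sum(w * x for w, x in zip(ws, xs))

full_weights = [0.7, -0.3, 0.5, -0.9]
inputs = [1.0, 2.0, -1.0, 0.5]

print(dot(full_weights, inputs))             # full-precision result: -0.85
print(dot(binarize(full_weights), inputs))   # binary approximation: -2.5
```

The binary result differs in magnitude but keeps the same sign, which is often enough for a threshold decision; this is the kind of small accuracy trade-off the two-to-three percent figure refers to.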

The chip isn’t alone in advancing the field of neural computing; researchers and engineers have worked on reducing the power consumption of neural networks before. MIT’s solution, however, is the first time a binary-weight chip has been used for image-based artificial intelligence applications.

Silicon Valley is also working on low-power AI chips for smartphones, appliances and other devices. Big chip companies are pursuing more efficient machine learning, and ARM introduced two chips earlier in the year capable of tasks such as facial recognition and translation.

Apple’s iPhone X already includes a neural engine built into the phone to power facial recognition.

Qualcomm has also put a focus on AI with the introduction of its Snapdragon 845 chip, which was designed with AI workloads in mind and features a GPU to help with the processing. The company’s 820E was designed specifically for robots, drones and industrial devices.

IBM offers some of the most ambitious work with its neuromorphic chip technology. Modeled after the human brain, these chips have the theoretical capability to run on a fraction of the power that conventional chips require. They are still in the very early stages of development, but they too have the potential to deliver neural computing and machine learning with a much smaller energy footprint.

Melissa Thompson
Melissa Thompson writes about a wide range of topics, revealing interesting things we didn't know before. She is a freelance USA Today producer, and a Technorati contributor.