Electronics Assembly Knowledge, Vision & Wisdom
Neural Networks and Artificial Intelligence
Researchers have developed a chip that increases the speed of neural-network computations by three to seven times and reduces power consumption by 93 to 96 percent.
Technology Briefing

Technology Briefing is brought to you in association with Audio-Tech, publisher of critically acclaimed programs including Trends Magazine.

Subscribe to their monthly reports and learn about big ideas, new products, new management techniques, breakthrough concepts, and trailblazing technologies.
Transcript
Most recent advances in artificial-intelligence systems have come courtesy of neural networks. These are densely interconnected meshes of simple information processors that learn to perform tasks by analyzing huge sets of training data.

Until now, neural nets have been large and their computations energy intensive, so they're not very practical for handheld devices. Most smartphone apps that rely on neural nets simply upload data to internet servers, which process it and send the results back to the phone.

But, according to new research presented at the International Solid-State Circuits Conference, MIT researchers have developed a special-purpose chip that increases the speed of neural-network computations by three to seven times, while reducing power consumption 93 to 96 percent. That could make it practical to run neural networks locally on smartphones or even to embed them in household appliances.

Neural networks are typically arranged into layers. A single processing node in one layer of the network will generally receive data from several nodes in the layer below and pass data to several nodes in the layer above. Each connection between nodes has its own "weight," which indicates how large a role the output of one node will play in the computation performed by the next.

Training the network is a matter of setting those weights.
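
To make the weighted-connection idea concrete, here is a minimal sketch in Python; the function and variable names are illustrative and not taken from the MIT work:

```python
import numpy as np

def node_output(inputs, weights, bias=0.0):
    """Compute one node's output: a weighted sum of the outputs of the
    nodes in the layer below, passed through a simple nonlinearity
    (ReLU here, purely as an example)."""
    weighted_sum = np.dot(inputs, weights) + bias
    return max(0.0, weighted_sum)

# Example: a node receiving data from three nodes in the layer below.
inputs = np.array([0.5, -1.2, 0.8])   # outputs of the layer below
weights = np.array([0.9, 0.3, -0.4])  # one weight per connection
print(node_output(inputs, weights))   # value passed to the layer above
```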

The MIT researchers' new chip improves efficiency by replicating the brain more faithfully than prior designs. In the chip, a node's input values are converted into electrical voltages and then multiplied by the appropriate weights. Only the combined voltages are converted back into a digital representation and stored for further processing.

The chip can thus calculate dot products for multiple nodes (six at a time, in the prototype) in a single step, instead of shuttling between a processor and memory for every computation.
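
Conceptually, computing the dot products for several nodes at once is a single matrix-vector multiply rather than a separate memory fetch for every weight. A rough digital sketch of that one-step computation, with ordinary NumPy standing in for the chip's analog circuitry:

```python
import numpy as np

rng = np.random.default_rng(0)

inputs = rng.standard_normal(64)         # input values (voltages, on the chip)
weights = rng.standard_normal((6, 64))   # one row of weights per node

# One "step": dot products for all six nodes at once,
# instead of fetching each weight from memory individually.
layer_outputs = weights @ inputs
print(layer_outputs.shape)               # (6,) -> one result per node
```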

One of the keys to the system is that all the weights are either 1 or -1. That means they can be implemented within the memory itself as simple switches that either close a circuit or leave it open. Recent theoretical work suggests that neural nets trained with only two weight values should lose little accuracy, somewhere between 1 and 2 percent.
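
To see why weights of 1 or -1 map onto simple open-or-closed switches, note that the multiplication disappears: each input is simply added or subtracted. A minimal, illustrative sketch of binarizing trained weights and computing the resulting dot product:

```python
import numpy as np

def binarize(weights):
    """Replace each trained weight with +1 or -1 (its sign)."""
    return np.where(weights >= 0, 1.0, -1.0)

def binary_dot(inputs, binary_weights):
    """With weights restricted to +1/-1, the 'multiply' is just adding or
    subtracting each input -- the role the switches play in the chip."""
    return np.sum(np.where(binary_weights > 0, inputs, -inputs))

weights = np.array([0.7, -0.2, 0.05, -1.3])
inputs = np.array([0.5, 1.0, -0.8, 0.3])
print(np.dot(inputs, weights))                 # full-precision dot product
print(binary_dot(inputs, binarize(weights)))   # binary-weight approximation
```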

In experiments, the MIT researchers ran the full implementation of a neural network on a conventional computer and the binary-weight equivalent on their chip. The chip's results were generally within 2 to 3 percent of the conventional network's.