Sensory Integrated Artificial Brain System
Recent work combines ultra-fast electronic skin and an ultra-fast nervous system with the latest innovations in vision sensing and AI for robots.
Technology Briefing

Transcript


Another exciting breakthrough in robotic capabilities was presented at the Robotics: Science and Systems conference in July 2020 by a team from the National University of Singapore (or NUS).

The NUS team has developed “a sensory integrated artificial brain system” that mimics biological neural networks and can run on a power-efficient neuromorphic processor, such as Intel’s Loihi chip. By integrating artificial skin and vision sensors, the system equips robots with the ability to draw accurate conclusions about the objects they are grasping, based on the data those sensors capture in real time.

Until now, fusing visual and tactile information to produce a highly precise response within milliseconds has remained a technological challenge. However, the recent work at NUS combines ultra-fast electronic skin and an ultra-fast nervous system with the latest innovations in vision sensing and AI for robots. As a result, robots can become smarter and more intuitive in their physical interactions.

Enabling a human-like sense of touch in robotics could significantly improve current functionality, and even lead to new applications. For example, on the factory floor, robotic arms fitted with electronic skins could easily adapt to different items, using tactile sensing to identify and grip unfamiliar objects with the right amount of pressure to prevent slipping.

In the new robotic system, the NUS team applied an advanced artificial skin known as Asynchronous Coded Electronic Skin (or ACES), which they developed in 2019. This novel sensor detects touches more than 1,000 times faster than the human sensory nervous system. It can also identify the shape, texture, and hardness of objects 10 times faster than the blink of an eye.
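
To make that event-driven idea concrete, the sketch below shows, in Python, how an asynchronous skin might report touches: each sensing element (taxel) emits a timestamped event only when its pressure changes, instead of being polled at a fixed frame rate. The event format, names, and threshold here are illustrative assumptions, not the actual ACES encoding.

```python
from dataclasses import dataclass

@dataclass
class TactileEvent:
    taxel_id: int      # which sensing element changed
    timestamp_us: int  # microsecond timestamp of the change
    polarity: int      # +1 for a pressure increase, -1 for a decrease

def encode_events(prev_frame, curr_frame, t_us, threshold=0.05):
    """Compare two pressure readings and emit events for changed taxels only."""
    events = []
    for taxel_id, (p0, p1) in enumerate(zip(prev_frame, curr_frame)):
        delta = p1 - p0
        if abs(delta) >= threshold:
            events.append(TactileEvent(taxel_id, t_us, 1 if delta > 0 else -1))
    return events

# Example: a 4-taxel patch where only taxel 2 changes enough to fire an event.
prev = [0.00, 0.10, 0.20, 0.00]
curr = [0.01, 0.11, 0.40, 0.00]
print(encode_events(prev, curr, t_us=1_250))
```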

Making an ultra-fast artificial skin sensor solves about half the puzzle of making robots smarter. The other critical piece is an artificial brain that can ultimately achieve perception and learning.

To break new ground in robotic perception, the NUS team explored neuromorphic technology — an area of computing that emulates the neural structure and operation of the human brain — to process sensory data from the artificial skin. Since the NUS team leaders are members of the Intel Neuromorphic Research Community, it was a natural choice to use Intel’s Loihi neuromorphic research chip for the new robotic system.
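
For intuition about the computation such chips are built around, here is a minimal sketch of a textbook leaky integrate-and-fire neuron: input current accumulates on a leaky membrane potential, and the neuron fires a spike when a threshold is crossed. This is a generic teaching model, not Loihi's actual neuron implementation or programming interface.

```python
def lif_neuron(input_current, v_rest=0.0, v_thresh=1.0, leak=0.9):
    """Simulate one leaky integrate-and-fire neuron over a current sequence.

    Returns the time steps at which the neuron spiked.
    """
    v = v_rest
    spike_times = []
    for t, i_in in enumerate(input_current):
        v = leak * v + i_in        # membrane potential leaks, then integrates
        if v >= v_thresh:          # threshold crossing produces a spike
            spike_times.append(t)
            v = v_rest             # reset after the spike
    return spike_times

# Sustained input drives the neuron over threshold; isolated weak inputs
# decay away before a spike can form.
print(lif_neuron([0.3, 0.4, 0.5, 0.0, 0.0, 0.6, 0.6]))  # -> [2, 6]
```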

In their initial experiments, the researchers fitted a robotic hand with the artificial skin and used it to read Braille, passing the tactile data to Loihi via the cloud to convert the micro bumps felt by the hand into semantic meaning. Loihi achieved over 92 percent accuracy in classifying the Braille letters while using 20 times less power than a standard microprocessor.
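
For context on the Braille task itself, a Braille cell is a two-by-three grid of dots, and the set of raised dots determines the letter. The toy lookup below shows the mapping the classifier effectively has to learn; the real system inferred it from noisy spiking tactile data on Loihi rather than from a table.

```python
# Standard Braille: dots are numbered 1-2-3 down the left column and
# 4-5-6 down the right; a letter is identified by its set of raised dots.
BRAILLE = {
    frozenset({1}): "a",
    frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",
    frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",
    # ... remaining letters omitted for brevity
}

def decode_cell(raised_dots):
    """Map a set of raised dot positions to a letter ('?' if unknown)."""
    return BRAILLE.get(frozenset(raised_dots), "?")

print(decode_cell({1, 4}))      # -> c
print(decode_cell({2, 3, 6}))   # -> ? (not in this toy table)
```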

The team improved the robot’s perception capabilities by combining both vision and touch data in a spiking neural network. In their experiments, the researchers tasked a robot equipped with both artificial skin and vision sensors to classify various opaque containers containing differing amounts of liquid. They also tested the system’s ability to identify rotational slip, which is important for stable grasping.
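
One simple way to picture event-level fusion of the two senses is to merge the timestamped touch and vision event streams into a single, chronologically ordered stream feeding the network. The snippet below sketches that idea; the tuple format is an assumption for illustration, not the team's actual data representation.

```python
import heapq

# Timestamped events from each modality: (time_us, modality, channel).
touch_events  = [(100, "touch", 5), (450, "touch", 2)]
vision_events = [(120, "vision", 9), (300, "vision", 1)]

# Both streams are already time-ordered, so a streaming merge keeps the
# fused stream ordered without buffering everything first.
for t_us, modality, channel in heapq.merge(touch_events, vision_events):
    print(f"{t_us:>4} us  {modality:<6} channel {channel}")
```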

In both tests, the spiking neural network that used both vision and touch data was able to classify objects and detect object slippage; its classification was 10 percent more accurate than that of a system using vision alone. Moreover, using a technique developed by the team, the network could classify the sensory data while it was still being accumulated, unlike the conventional approach where data is classified only after it has been fully gathered. In the process, the researchers also demonstrated the efficiency of neuromorphic technology: Loihi processed the sensory data 21 percent faster than a top-performing graphics processing unit (or GPU) while using more than 45 times less power.
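
The classify-while-accumulating idea can be sketched as an anytime decision rule: per-class evidence (for example, output-neuron spike counts) is updated event by event, and a label is emitted as soon as one class leads the runner-up by a confident margin, rather than after the full recording. The margin rule and numbers below are illustrative assumptions, not the team's actual method.

```python
def anytime_classify(event_stream, n_classes, margin=5):
    """Decide as soon as one class's evidence leads the runner-up by `margin`.

    Returns (predicted_class, events_consumed), or (None, events_consumed)
    if the stream ends before a confident decision is reached.
    """
    counts = [0] * n_classes
    n_seen = 0
    for cls in event_stream:
        n_seen += 1
        counts[cls] += 1
        ranked = sorted(counts, reverse=True)
        if ranked[0] - ranked[1] >= margin:   # clear winner: stop early
            return counts.index(ranked[0]), n_seen
    return None, n_seen

# Output-neuron spikes arriving over time; class 1 pulls ahead quickly,
# so the decision comes before the stream is fully consumed.
stream = [1, 0, 1, 1, 2, 1, 1, 1, 0, 1, 1, 1]
print(anytime_classify(stream, n_classes=3))  # -> (1, 8)
```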

This research brings the world a step closer to building power-efficient and trustworthy robots that can respond quickly and appropriately in unexpected situations. It also provides a compelling glimpse of a future of robotics in which information is both sensed and processed in an event-driven manner, combining multiple modalities.

Moving forward, the NUS researchers plan to further develop their novel robotic system for applications in the logistics and food manufacturing industries where there is a high demand for robotic automation, especially in the post-COVID era.
