Thursday, July 16, 2020

Singapore researchers use neuromorphic computing to give robots a sense of touch

Reporter : Ji Hongmei
Publisher : China Science Daily via ScienceNet
Direct translation

Image : Benjamin Tee, one of the researchers working on neuromorphic computing. Picture courtesy of NUS News
 

On 16 July 2020, two researchers from the National University of Singapore (NUS) published a study in Robotics: Science and Systems showing the promise of Intel's neuromorphic computing for robotics when combined with event-based visual and tactile perception. The work demonstrates that adding a sense of touch to robotic systems can significantly improve their capabilities compared with today's vision-only systems, and that neuromorphic chips can outperform traditional architectures at processing such sensory data.

Human touch is sensitive enough to feel subtle differences between surfaces, even when those differences are only a single layer of molecules. Most robots today, however, operate on visual processing alone. Researchers at the National University of Singapore hope to change this with an artificial skin they recently developed. According to their research, this artificial skin can detect touch more than 1,000 times faster than the human sensory nervous system, and can identify an object's shape, texture, and hardness 10 times faster than the blink of an eye.
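The article does not publish the sensor's encoding scheme, but the speed of such skins comes from event-driven readout: an event is emitted only when the signal changes meaningfully, rather than sampling every taxel at a fixed frame rate. The sketch below illustrates this generic idea with delta-modulation encoding; the function name, threshold, and test signal are all illustrative, not taken from the NUS design.

```python
# Illustrative sketch (not the NUS skin's actual encoder): delta-modulation
# event encoding. An event fires only when the pressure signal moves more
# than a fixed threshold away from the last reference level.
import numpy as np

def encode_events(samples, times, threshold=0.05):
    """Yield (time, +1/-1) events whenever the signal crosses the threshold."""
    events = []
    reference = samples[0]
    for t, s in zip(times, samples):
        while s - reference >= threshold:    # pressure rose past threshold
            reference += threshold
            events.append((t, +1))
        while reference - s >= threshold:    # pressure fell past threshold
            reference -= threshold
            events.append((t, -1))
    return events

# A brief tap on the sensor produces a short burst of ON then OFF events.
t = np.linspace(0.0, 0.02, 200)                  # 20 ms sampled at 10 kHz
pressure = np.exp(-((t - 0.01) / 0.002) ** 2)    # Gaussian "tap"
print(encode_events(pressure, t)[:5])
```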

However, creating artificial skin is only the first step toward realizing this vision. "If making robots smarter is a puzzle, ultra-fast artificial skin sensors solve only about half of it," said Benjamin Tee, an assistant professor in the NUS Department of Materials Science and Engineering and the Institute for Health Innovation and Technology. "Robots also need an artificial brain that can ultimately perceive and learn, which is the other key piece of the puzzle. Our unique work pairing an AI skin system with neuromorphic chips such as Intel's Loihi is an important step toward energy efficiency and scalability."

To achieve a breakthrough in robot perception, the NUS team began exploring the potential of neuromorphic technology, using the Intel Loihi neuromorphic research chip to process sensory data from the artificial skin. In an initial experiment, the researchers used a robotic hand fitted with the artificial skin to read Braille, passing the tactile data to Loihi via the cloud to convert the micro-bumps felt by the hand into their meaning. Loihi classified the Braille letters with more than 92% accuracy while consuming 20 times less power than a standard von Neumann processor.
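To make the classification step concrete, here is a plain-NumPy toy of a spiking readout, not the authors' Loihi implementation: a leaky integrate-and-fire (LIF) layer integrates tactile spike trains over time, and the output neuron that fires most often names the letter. The weights, layer sizes, and neuron parameters below are placeholders.

```python
# Minimal LIF spiking-classification sketch, assuming 60 skin taxels and
# 26 Braille letter classes; weights are random stand-ins for a trained net.
import numpy as np

rng = np.random.default_rng(0)
N_IN, N_CLASSES, T = 60, 26, 100             # taxels, letters, time steps

W = rng.normal(0.0, 0.3, (N_CLASSES, N_IN))  # stand-in trained weights

def lif_classify(spike_train, tau=0.9, v_thresh=1.0):
    """Integrate input spikes over T steps; the class whose output
    neuron spikes most often wins."""
    v = np.zeros(N_CLASSES)
    counts = np.zeros(N_CLASSES, dtype=int)
    for t in range(T):
        v = tau * v + W @ spike_train[t]     # leak + weighted input current
        fired = v >= v_thresh
        counts += fired
        v[fired] = 0.0                       # reset neurons that spiked
    return int(np.argmax(counts))

# Random stand-in for one tactile recording: a (T, N_IN) binary spike array.
x = (rng.random((T, N_IN)) < 0.05).astype(float)
print("predicted letter index:", lif_classify(x))
```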

Building on this work, the NUS team further improved the robot's perception by combining visual and tactile data in a spiking neural network (SNN). To do so, they had a robot classify various opaque containers holding different amounts of liquid, using sensory input from the artificial skin and an event-based camera. The researchers also used the same tactile and visual sensors to test the perception system's ability to recognize rotational slip, which is essential for a stable grip.
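A minimal sketch of the fusion idea follows, assuming spike counts from each modality are simply concatenated before a linear readout; the published system trains a full spiking network on both streams, which this toy does not reproduce, and the layer sizes and weights are placeholders.

```python
# Hedged visual-tactile fusion sketch: per-modality spike counts are
# concatenated into one feature vector, then classified linearly.
import numpy as np

rng = np.random.default_rng(1)
N_TACTILE, N_VISION, N_CLASSES = 60, 128, 4   # e.g. 4 liquid fill levels

W_fused = rng.normal(0.0, 0.1, (N_CLASSES, N_TACTILE + N_VISION))

def fuse_and_classify(tactile_spikes, vision_events):
    """tactile_spikes: (T, N_TACTILE); vision_events: (T, N_VISION)."""
    feat = np.concatenate([tactile_spikes.sum(0), vision_events.sum(0)])
    return int(np.argmax(W_fused @ feat))

T = 100
tactile = (rng.random((T, N_TACTILE)) < 0.05).astype(float)
vision = (rng.random((T, N_VISION)) < 0.02).astype(float)
print("container class:", fuse_and_classify(tactile, vision))
```

Using both streams gives the classifier complementary cues: vision sees the container's outline while touch senses its weight and compliance, which is why the combined system beats the vision-only baseline.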

After capturing this sensory data, the research team sent it to both a GPU and Intel's Loihi neuromorphic research chip to compare their processing capabilities. The newly published results show that combining event-based vision and touch in a spiking neural network raised object-classification accuracy by 10% over a vision-only system. The researchers also demonstrated the promise of neuromorphic technology for powering such robotic devices: Loihi processed the sensory data 21% faster than a high-performance GPU while consuming 45 times less power.
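As a quick sanity check on these figures, running 21% faster while drawing 45 times less power implies roughly a 54-fold reduction in energy per inference. The absolute latency and power values in the snippet below are invented baselines; only the two ratios come from the article.

```python
# Energy-per-inference comparison implied by the reported ratios.
gpu_latency_ms, gpu_power_w = 10.0, 100.0     # hypothetical GPU baseline
loihi_latency_ms = gpu_latency_ms / 1.21      # 21% faster (from article)
loihi_power_w = gpu_power_w / 45.0            # 45x less power (from article)

gpu_energy = gpu_latency_ms * gpu_power_w     # energy ~ latency x power
loihi_energy = loihi_latency_ms * loihi_power_w
print(f"energy ratio (GPU / Loihi): {gpu_energy / loihi_energy:.1f}x")  # ~54.4x
```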

The two researchers are also members of the Intel Neuromorphic Research Community (INRC). Mike Davies, director of Intel's Neuromorphic Computing Lab, commented: "This research from the National University of Singapore offers a compelling glimpse of the future of robotics, in which information is sensed and processed in an event-driven, multimodal manner. More and more studies like this show that once we rethink the entire system around an event-based paradigm, spanning sensors, data formats, algorithms, and hardware architecture, neuromorphic computing can significantly reduce latency and power consumption."
