Concepedia

Publication | Open Access

An artificial sensory neuron with visual-haptic fusion

Citations: 293
References: 39
Year: 2020

TLDR

Human behavior relies on adaptive, plastic, event‑driven sensory neurons that efficiently integrate multiple cues to accurately depict the environment. The study develops a bimodal artificial sensory neuron to implement sensory fusion processes. The neuron collects optical and pressure signals via photodetector and pressure sensors, transmits them through an ionic cable, and integrates them into post‑synaptic currents using a synaptic transistor. Synchronizing visual and haptic cues excites the neuron at multiple levels, enabling control of skeletal myotubes and a robotic hand, and simulations show improved recognition of multi‑transparency patterns, indicating potential for cyborg and neuromorphic systems with supramodal perception.

Abstract

Human behaviors are extremely sophisticated, relying on an adaptive, plastic, and event-driven network of sensory neurons. Such a neuronal system analyzes multiple sensory cues efficiently to establish an accurate depiction of the environment. Here, we develop a bimodal artificial sensory neuron to implement sensory fusion processes. This bimodal artificial sensory neuron collects optical and pressure information from a photodetector and pressure sensors respectively, transmits the bimodal information through an ionic cable, and integrates it into post-synaptic currents via a synaptic transistor. The sensory neuron can be excited at multiple levels by synchronizing the two sensory cues, which enables the manipulation of skeletal myotubes and a robotic hand. Furthermore, the enhanced recognition capability achieved with fused visual/haptic cues is confirmed by simulating a multi-transparency pattern-recognition task. Our biomimetic design has the potential to advance cyborg and neuromorphic systems by endowing them with supramodal perceptual capabilities.
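The multi-level excitation from synchronized cues can be sketched as a toy numerical model. All names, weights, and the supralinear synergy term below are illustrative assumptions for intuition only, not the paper's device physics:

```python
def epsc(stimulus, weight, threshold=0.1):
    """Toy excitatory post-synaptic current for one sensory channel.

    `stimulus` is a normalized intensity in [0, 1]; below `threshold`
    the channel stays silent, mimicking event-driven behavior.
    (Illustrative model, not the device's measured response.)
    """
    if stimulus < threshold:
        return 0.0
    return weight * stimulus


def fused_response(optical, pressure, w_opt=1.0, w_press=1.0, gain=0.5):
    """Combine visual and haptic EPSCs with a supralinear synergy term.

    The assumed multiplicative `gain` term rewards synchronized cues,
    so the fused output exceeds the sum of the unimodal responses.
    """
    i_opt = epsc(optical, w_opt)
    i_press = epsc(pressure, w_press)
    return i_opt + i_press + gain * i_opt * i_press


# Synchronized bimodal input exceeds either unimodal response:
unimodal = fused_response(0.8, 0.0)   # 0.8 (optical channel only)
bimodal = fused_response(0.8, 0.8)    # 1.92 (sum plus synergy term)
```

In this sketch, grading the two inputs produces the "multiple levels" of excitation described above, and the synergy term captures why fused visual/haptic cues can outperform either modality alone in recognition.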
