Bioinspired sensors and applications in intelligent robots: a review

Yanmin Zhou (Department of Control Science and Engineering, College of Electronics and Information Engineering, Tongji University, Shanghai, China)
Zheng Yan (Department of Control Science and Engineering, College of Electronics and Information Engineering, Tongji University, Shanghai, China)
Ye Yang (Department of Control Science and Engineering, College of Electronics and Information Engineering, Tongji University, Shanghai, China)
Zhipeng Wang (Department of Control Science and Engineering, College of Electronics and Information Engineering, Tongji University, Shanghai, China)
Ping Lu (Department of Control Science and Engineering, College of Electronics and Information Engineering, Tongji University, Shanghai, China)
Philip F. Yuan (Department of Control Science and Engineering, College of Electronics and Information Engineering, Tongji University, Shanghai, China)
Bin He (Department of Control Science and Engineering, College of Electronics and Information Engineering, Tongji University, Shanghai, China)

Robotic Intelligence and Automation

ISSN: 2754-6969

Article publication date: 4 April 2024

Issue publication date: 6 May 2024

Abstract

Purpose

Vision, audition, touch, olfaction and taste are the five important senses humans use to interact with the real world. As robots face increasingly complex environments, a sensing system with various types of sensors is essential for intelligent robots. To mimic human-like abilities, sensors with perception capabilities similar to those of humans are indispensable. However, most research has concentrated on analyzing the literature on single-modal sensors and their robotic applications.

Design/methodology/approach

This study presents a systematic review of five bioinspired senses, including a brief introduction to multimodal sensing applications and a discussion of current trends and future directions of the field, which may offer continuing insights.

Findings

This review shows that bioinspired sensors can enable robots to better understand the environment and that combinations of multiple sensors can support intelligent robot behavior.

Originality/value

The review starts with a brief survey of the biological sensing mechanisms of the five senses, followed by their bioinspired electronic counterparts. Their applications in robots are then reviewed as another emphasis, covering the main application scopes of localization and navigation, object identification, dexterous manipulation, compliant interaction and so on. Finally, the trends, difficulties and challenges of this research are discussed to help guide future research on intelligent robot sensors.

Citation

Zhou, Y., Yan, Z., Yang, Y., Wang, Z., Lu, P., Yuan, P.F. and He, B. (2024), "Bioinspired sensors and applications in intelligent robots: a review", Robotic Intelligence and Automation, Vol. 44 No. 2, pp. 215-228. https://doi.org/10.1108/RIA-07-2023-0088

Publisher: Emerald Publishing Limited

Copyright © 2024, Yanmin Zhou, Zheng Yan, Ye Yang, Zhipeng Wang, Ping Lu, Philip F. Yuan and Bin He.

License

Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial & non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


1. Introduction

Since 1959, robots have played an essential role in many dangerous, exhausting and repetitive tasks, such as continuous operation in industrial settings to improve efficiency and quality (Oztemel and Gursev, 2020). With the improvement of perception range and accuracy in recent decades, intelligent robots have gradually stepped out of isolated industrial scenes and begun to integrate into human daily life. As shown in Figure 1, to meet increasingly diverse and complex application requirements, intelligent robots should possess:

  • environmental perception capabilities as inputs for robot interactions (Tian et al., 2017);

  • motion execution capabilities to cope with complex tasks (Abdelaal et al., 2021); and

  • thinking and evolution capabilities to regulate and improve their behaviors adaptively according to the various environments (Pointeau et al., 2014).

Furthermore, the foundation of robot intelligence is environmental perception (Young, 2017), the basis of which is sensors that transform physical excitations from the environment into electrical signal outputs. As shown in Figure 1, the sensors applied in intelligent robots should have the following characteristics:

  • rich sensing types to meet the demand of complex dynamic environment perception;

  • excellent sensing performance to ensure reliable information inputs for intelligent robots; and

  • ingenious device structure to satisfy the mechanical and electrical constraints of intelligent robots.

Inspired by biological organisms, researchers have constructed various intelligent robot structures for different scenarios (Ren and Liang, 2014), such as humanoid robots in household and social scenes (Van Doorn et al., 2017; Kuindersma et al., 2016), four-legged robots on complicated ground (Guizzo, 2019; Hao et al., 2020) and flying robots in the sky (Zeng and Zhang, 2017; Mozaffari et al., 2016). Likewise, humans have five important senses (i.e. vision, audition, touch, olfaction and taste), which provide natural bioinspired prototypes for intelligent robot sensors, as shown in Figure 2. Studying the sensing mechanisms of biological systems is expected to promote research on intelligent robot sensing systems that meet the above characteristics.

To date, several surveys have been published on the sensing of intelligent robots, covering vision sensors (Steffen et al., 2019), odor sensors (Chen and Huang, 2019), vibration sensors (Pearson et al., 2011) and tactile sensors (Yang et al., 2019). However, most of this research concentrated on single-modal sensors and their robotic applications. To this end, this article presents a comprehensive review of the sensors corresponding to the five human senses, together with a brief discussion of multimodal sensing and the current trends and prospects of this field, which may offer continuing insights.

The rest of this article is organized as follows: Section 2 briefly reviews the biological perception mechanisms of vision, audition, touch, olfaction and taste; Section 3 introduces the corresponding bioinspired sensors for robots; Section 4 introduces the applications of bioinspired sensors on intelligent robots; Section 5 discusses the trends, difficulties and challenges of bioinspired sensors; and Section 6 summarizes the review.

2. Biological sensing mechanisms of five senses

Through sensory organs, humans are able to perceive and interact with their surroundings. The five senses (sight, hearing, touch, smell and taste) are fundamental to human perception and are given particular weight. The first three respond to physical stimuli, whereas the latter two perceive changes in the chemical composition of the environment. In this section, we briefly discuss the biological perception mechanisms of these five sense organs.

2.1 Biological vision sensing mechanism

Vision is integral to human perception, acting as the primary conduit for gathering environmental information through eyes (Chen et al., 2017). Human eyes, spherical in nature, are sectioned into two major parts – front and back – bisected by the crystalline lens situated centrally. This lens is transparent, elastic and convex, adjusting its shape via ligament suspension to accommodate varying distances of target objects. Both the lens and the cornea, located at the front of the eyeball, collaboratively manage the eye’s focus, whereas the dilation and constriction of the pupil within the colored iris regulate the volume of light entering the eye. At the rear of the eyeball lies the retina, the designated area for image reception. Cellular electrical signals here respond to fluctuations in light and are transmitted through optic nerve fibers to the pertinent brain region, thereby constructing our visual experience (Gu et al., 2020).

2.2 Biological auditory sensing mechanism

The human ear is capable of detecting sounds within a frequency range of 20 Hz–20 kHz, primarily relying on the conductive capabilities of the outer and middle ear, in conjunction with the sensory functions of the inner ear, auditory nerve and auditory cortex (Yao and Zhang, 2001). Sound reaches the inner ear via both air and bone conduction. Air conduction is a sound transmission mechanism in which sound waves, collected by the pinna, traverse the external auditory canal to reach the tympanic membrane and then propagate through the ossicular chain. This chain comprises three ossicles: the malleus, incus and stapes. Specifically, the footplate of the stapes is in close contact with the oval window of the inner ear. The activity of the oval window membrane excites the adjacent inner ear lymph. This fluid incites the spiral organs situated on the basilar membrane to generate nerve impulses. These impulses are then conveyed via the auditory nerve to the auditory cortex in the brain, resulting in the sensation of hearing. Bone conduction refers to an alternative sound transmission mechanism where sound waves reach the inner ear via the vibration of the skull. It causes the lymph fluid to vibrate, which, in turn, stimulates the basilar membrane to generate nerve impulses. These impulses are similarly transmitted to the auditory cortex, culminating in the perception of sound (Zhang and Gan, 2011).

2.3 Biological tactile sensing mechanism

Human tactile perception is facilitated by the skin, abundant in mechanoreceptors that are part of the peripheral nervous system. Sensitive nerve cells, such as Merkel disks, Meissner corpuscles and free nerve endings, reside in the dermis layer. The skin’s soft tissue deforms upon tactile contact, triggering these embedded nerve cells to generate electrical pulses. These pulses reflect the stress, strain and temperature experienced by the surrounding soft tissue during stimulation (Romo and Salinas, 1999; Maeno et al., 1998; Dahiya et al., 2009; Abraira and Ginty, 2013). Consequently, the types and locations of the tactile signal are differentiated, giving rise to the sensation of touch that travels along the nerve fibers to the brain.

2.4 Biological olfactory sensing mechanism

Human odor receptors are located in the olfactory epithelium in the upper nasal cavity, which is mainly composed of olfactory sensory neurons, supporting cells and basal cells (Lledo et al., 2005; Sharma et al., 2019; Bushdid et al., 2014). After reaching the surface of the olfactory epithelium, the dendrite ends become rounded and extend dozens of cilia into the mucus. The plasma membrane of the cilia carries G protein-coupled odor receptors, which have seven helical transmembrane structures that bind odor molecules. The odor receptors then activate the specific G protein Golf, open cyclic nucleotide-gated ion channels, produce membrane depolarization and form an action potential. The oscillation pattern (frequency and amplitude) of the potential generated on the olfactory bulb is affected by the odor, thereby realizing the detection and recognition of odor molecules (Firestein, 2001).

2.5 Biological taste sensing mechanism

The generation of human taste depends on taste buds (Roper and Chaudhari, 2017; Taruno et al., 2013). Taste cells are cloaked in a bilipid layer membrane, with microvilli adorning their apices. The rest of the surface is wrapped by flattened groove-like cells and sealed off from the external environment. The microvilli can interact with the mouth's saliva only via a small orifice at the peak of the taste bud. When taste substances are absorbed by the lipid membranes of different taste cells, various electrical properties (such as membrane potential) change to encode the corresponding taste experiences.

3. Bioinspired robot sensors of five senses

To endow intelligent robots with human-like senses, researchers have committed to the study of sensing technology rooted in biological sensing principles (Pfeifer et al., 2007). In this section, we overview the advancements in the field, focusing on representative biomimetic sensors.

3.1 Bioinspired vision sensors

Charge-coupled device (CCD) and complementary metal oxide semiconductor (CMOS) vision sensors are commonly used in the robotics field (Fossum and Hondongwa, 2014). However, CCD sensors (Zhang et al., 2020) and CMOS sensors (Fossum and Hondongwa, 2014; Nikonov and Young, 2013) typically sample and transmit a large number of pixels within a set time frame, which inevitably leads to delays and redundancy in information while also placing significant demands on bandwidth, energy consumption and storage.

As a result, bioinspired vision sensors have continued to emerge, including dynamic vision sensors (DVSs) (Lichtsteiner et al., 2006; Lichtsteiner et al., 2008), asynchronous time-based image sensors (ATISs) (Posch et al., 2010; Posch et al., 2007) and dynamic and active pixel vision sensors (DAVISs) (Brandli et al., 2014a; Lenero-Bardallo et al., 2015). Bioinspired vision sensors combine the biological "where" and "what" functions of the human visual system, using an asynchronous, event-driven method to process visual information. They boast benefits such as redundancy suppression, integrated processing, rapid sensing, a wide dynamic range and low power consumption.

In 2002, Kramer et al. proposed a DVS that simulates the function of human retinal Y cells to perceive dynamic information in the scene (Kramer, 2002). In 2004, Lichtsteiner proposed an improved, practical DVS providing a more uniform and symmetric output of positive and negative events, thus significantly expanding the dynamic perception range (Lichtsteiner et al., 2004). This made the DVS the first commercial neuromorphic sensor, which behaves like its biological model: it responds to natural events occurring in the observed scene in an event-driven rather than clock-driven mode (Sivilotti, 1991; Mahowald, 1992). Pixels respond autonomously to relative intensity changes with microsecond temporal resolution. Each pixel asynchronously sends an ON event if its logarithmically compressed light intensity increases by a fixed amount; conversely, an OFF event is sent. This scheme enables continuous information transmission and processing, with communication bandwidth consumed only by activated pixels. In 2017, Xu et al. further reduced DVS noise by introducing the concept of a smoothing filter (Xu et al., 2017).
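This ON/OFF event-generation rule can be illustrated with a short simulation. The following is a minimal, idealized sketch (the frame-based input, the threshold value and the single-event-per-crossing behavior are our simplifying assumptions; a real DVS performs this asynchronously in analog circuitry):

```python
import numpy as np

def dvs_events(frames, times, threshold=0.2):
    """Idealized DVS pixel model: emit (t, x, y, polarity) events whenever
    the log intensity at a pixel drifts by more than `threshold` from the
    level stored at its last event."""
    eps = 1e-6                                   # avoid log(0)
    ref = np.log(frames[0] + eps)                # per-pixel reference level
    events = []
    for frame, t in zip(frames[1:], times[1:]):
        log_i = np.log(frame + eps)
        diff = log_i - ref
        for y, x in np.argwhere(diff >= threshold):
            events.append((t, x, y, +1))         # ON: brightness increased
        for y, x in np.argwhere(diff <= -threshold):
            events.append((t, x, y, -1))         # OFF: brightness decreased
        fired = np.abs(diff) >= threshold
        ref[fired] = log_i[fired]                # reset only fired pixels
    return events
```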

ATIS integrates the biological visual perception functions of "where" and "what," leveraging a variety of biological heuristics such as event-based imaging. Posch committed to enhancing the DVS by adding the DPS structure, leading to the ATIS (Posch et al., 2010; Posch et al., 2014; Chen et al., 2011). The sensor consists of a fully autonomous pixel array, in which each pixel incorporates an event-based change detector (CD) and an exposure measurement (EM) unit based on pulse width modulation (PWM). The CD detects brightness changes within its field of view and locally triggers the pixel's EM unit. ATIS outputs both time-contrast event data and the absolute intensity of each event.

DAVIS, another vision sensor that simulates the "where" and "what" of biological visual perception, was proposed by Christian Brandli in 2014 (Brandli et al., 2014a; Berner et al., 2013; Brandli et al., 2014b). DAVIS combines the advantages of the DVS and the active pixel sensor (APS) at the pixel level, coordinating synchronous APS image frame outputs with asynchronous DVS event outputs. The incorporation of a shared photodiode and a more compact APS circuit results in a DAVIS pixel area 60% smaller than that of ATIS.

3.2 Bioinspired auditory sensors

Most bioinspired auditory sensor designs are derived from the structure of the human ear. For example, many scholars have used nano/microstructures and piezoelectric materials to fabricate a variety of flexible vibration sensors on ultrathin substrates. Such designs adjust the resonance frequency by optimizing the three-dimensional (3D) geometry of the membrane, allowing selective absorption of vibration frequencies through parameter combinations that filter certain frequency ranges (An et al., 2018). Some researchers have used piezoelectric materials as substitutes for damaged cochlear hair cells to provide the electrical signal in an artificial auditory system (Fan et al., 2015; Yang et al., 2015). These sensors can rapidly detect vibrations across a wide frequency range (from 100 to 3,200 Hz), showing potential to replace artificial auditory organs (Yu et al., 2016; Gu et al., 2015; Seol et al., 2014; Yang et al., 2014; Wu et al., 2018). In addition, Mannoor et al. 3D-printed fully functional bioinspired ears using hydrogels embedded with living cells, which maintain structural and shape integrity (Mannoor et al., 2013).

Most artificial auditory systems struggle in conditions with low signal-to-noise ratios and varying acoustic environments. Inspired by the human hearing mechanism, Lenk et al. (2023) developed a microelectromechanical cochlea as a bioinspired acoustic sensor with integrated signal processing and adaptability tuned for noisy conditions. Composed of bionic acoustic sensors with thermomechanical feedback mechanisms, it exhibits active auditory sensing, allowing the sensor to adjust its properties to different acoustic environments. Real-time feedback is used to tune the sensing and processing properties, and dynamic switching between linear and nonlinear characteristics improves the detection of signals in noisy conditions, increases the sensor's dynamic range and enables adaptation to changing acoustic environments.

In the human ear, spike-based auditory neurons biologically process sound signals in a remarkably energy-efficient manner. Inspired by this, Yun et al. (2023) constructed a self-aware artificial auditory neuron module by serially connecting a triboelectric nanogenerator (TENG) and a bi-stable resistor, detecting the sound pressure level and encoding it into spike form, which is then relayed to artificial synapses [metal-oxide-semiconductor field-effect transistors (MOSFETs)] as input neurons for a spiking neural network. The TENG serves as a sound sensor and a current source that awakens the resistor, which acts as a device-level neuron, unlike a traditional circuit-based neuron.
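In software, this kind of amplitude-to-spike encoding can be approximated with a leaky integrate-and-fire model. The sketch below is a conceptual analogy only (the gain, time constant and threshold are illustrative assumptions, not parameters of the TENG module):

```python
import numpy as np

def encode_spikes(pressure, dt=1e-4, gain=200.0, tau=5e-3, threshold=1.0):
    """Leaky integrate-and-fire encoder: the state `v` integrates the
    rectified sound-pressure samples and emits a spike (then resets)
    whenever it crosses `threshold`, so louder sounds yield higher
    spike rates, mimicking spike-based auditory encoding."""
    v, spikes = 0.0, []
    for i, p in enumerate(pressure):
        v += dt * (gain * abs(p) - v / tau)   # leaky integration
        if v >= threshold:
            spikes.append(i * dt)             # record spike time
            v = 0.0                           # reset after firing
    return spikes
```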

In addition, another research focus is designing sensors inspired by organisms that have unique vibrational receptors, able to perceive an extremely wide range of sound waves or to selectively filter and discern specific frequency ranges. The study and practical implementation of such sound-sensing mechanisms could result in hearing devices that surpass human capabilities. For instance, the design of a resistance strain gauge based on a flexible metal film, capable of detecting minute vibrations with an amplitude of merely 10 nm, was inspired by the unique crack structures found on spider metatarsals (Kang et al., 2014). Similarly, sensor fibers that emulate the exceptional vibrational sensing performance of the Venus flytrap can sensitively detect directional vibration signals ranging from 20 kHz to several GHz (McConney et al., 2009; Fratzl and Barth, 2009; Talukdar et al., 2015). The outstanding acoustic localization ability of the parasitoid fly Ormia ochracea is another typical bioinspired prototype for auditory sensors with microphone arrays (Miles et al., 1997; Lisiewski et al., 2011).

3.3 Bioinspired tactile sensors

Tactile perception is a crucial component of the intelligent perception of autonomous robots. It provides information on the force and surface characteristics at the contact point between the robot and the object (Chen et al., 2021). Tactile sensors can be classified according to human-like transduction mechanisms, such as changes in capacitance, resistance and charge. Pressure-sensitive resistors use the piezoresistive effect to sense touch, their resistance changing under external mechanical forces (Stassi et al., 2014; Kou et al., 2018). Based on this principle, force-sensing resistors with multilayer structures (Fukui et al., 2011; Teshigawara et al., 2011; Drimus et al., 2014; Büscher et al., 2015) are widely used in pressure-sensing equipment for contact localization (Dahiya et al., 2011). Capacitive tactile sensors operate on the principle of plate capacitance, wherein the capacitance value shifts as normal forces vary the distance between the plates (Lee et al., 2008); tangential force can also be detected by embedding multiple capacitors. Piezoelectric sensors use the piezoelectric effect to sense dynamic contact force (Dahiya et al., 2009; Seminara et al., 2011). The quantum tunneling effect sensor can change from an insulator to a conductor under compression, achieving normal and tangential force sensing with a sensitivity up to 0.45 mV/mN and a maximum dynamic value of 20 N (Zhang et al., 2012).
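The parallel-plate principle behind capacitive tactile sensing can be written out directly. Below is a minimal sketch that inverts a capacitance reading into a normal-force estimate, assuming a linear-elastic dielectric layer (the spring constant, geometry and permittivity values are illustrative assumptions, not a specific sensor's calibration):

```python
EPS0 = 8.854e-12   # vacuum permittivity, F/m

def force_from_capacitance(c_meas, area, d0, k_spring, eps_r=3.0):
    """Parallel-plate model C = eps_r * EPS0 * area / d: a normal force
    compresses the dielectric from rest gap d0 to d, and the elastic
    layer is modeled as a linear spring, so F = k_spring * (d0 - d)."""
    d = eps_r * EPS0 * area / c_meas   # plate separation implied by C
    return k_spring * (d0 - d)         # Hooke's-law force estimate
```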

Innovative configurations using optical and air pressure sensing mechanisms, as well as multimodal tactile sensors, are also emerging. Optical sensors measure the contact force based on light reflection between materials with different refractive indexes under stress, as exemplified by the "GelSight" tactile sensor proposed by Johnson et al. (2011) and TacLINK proposed by JAIST (Van Duong, 2020). Tactile sensors based on air/hydraulic pressure measurement use pressure sensors that are generally used for air and liquid pressure sensing. These sensors use the air or liquid inside them as a vibration propagation medium, allowing the simultaneous gathering of high-frequency responses and deformation, as demonstrated by the BioTac finger from SynTouch LLC (Lai et al., 2011; Yu et al., 2018). The structure-acoustic tactile sensor uses an accelerometer and a microphone to measure the contact force (Kyberd et al., 1998), offering a broad detection bandwidth, though limited to dynamic sensing. A sensor imitating the seashell effect can estimate the distance to an object by comparing the noise level in the environment and inside the sensor (Jiang and Smith, 2012). Li et al. designed a magnetostrictive tactile sensor able to detect the force and stiffness of the manipulated object (Li et al., 2018).

Inspired by the touch mechanism of human skin, Zhou et al. (2023) proposed a customizable, modular and multimodal robotic tactile sensor. This sensor, comprising pressure, proximity, acceleration and temperature detectors, is scalable and apt for broad surface coverage on robots. With a joint focus on mechanical structure and data fusion algorithm design, a multilevel event-driven data fusion algorithm is used to efficiently process information from a large array of tactile sensors.
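The event-driven idea can be sketched as send-on-delta reporting: each taxel transmits only when its reading has changed appreciably since its last report, keeping bus traffic manageable for large arrays. The threshold and packet format below are our own illustrative assumptions, not details of the cited algorithm:

```python
import numpy as np

class EventDrivenArray:
    """Send-on-delta reporting for a tactile array: only taxels whose
    value moved by more than `threshold` since their last report are
    transmitted, mimicking event-driven data reduction."""

    def __init__(self, n_taxels, threshold=0.05):
        self.last = np.zeros(n_taxels)
        self.threshold = threshold

    def update(self, readings):
        changed = np.abs(readings - self.last) > self.threshold
        events = [(int(i), float(readings[i])) for i in np.flatnonzero(changed)]
        self.last[changed] = readings[changed]
        return events   # (taxel index, new value) pairs to put on the bus
```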

3.4 Bioinspired olfactory sensors

The electronic nose is a typical artificial olfactory system that mimics biological olfaction. Early related research can be traced back to the 1990s, focusing on the mechanical structure design of the odor-sensing module. Ishida et al. (1999) proposed a biologically inspired implementation of a 3D odor compass to detect the flow direction, consisting of four odor sensors, two motors and a fan. The odor sensors obtain relatively independent stimulation in separate airflows through mechanical isolation (like the septum in biological noses and the sniffing effect of silkworm wings), and the two motors rotate the compass head individually to seek the direction with balanced outputs from the sensors. However, the mechanical electronic nose is inefficient and vulnerable because of the limitations of its mechanical structure.

Therefore, another focus of bioinspired olfactory sensor research lies in the design of odor sensor arrays (as olfactory receptors) and signal processing systems (as olfactory cortical neural networks). The bioinspired olfactory sensor inherits the advantages of biochemical systems: high sensitivity, fast response and strong specificity. In 2012, Juhun Park's team proposed an electronic nose using carbon nanotube transistor sensors and odor-sensitive materials to express the olfactory sensory neurons of the cfOR5269 receptors (Park et al., 2012). This nose was used to selectively detect hexanal, a food oxidation indicator. In 2016, Jing's team designed an electronic nose consisting of six semiconductor gas sensors and five infrared sensors and proposed a bioinspired neural network to process the odor concentration signal of the electronic nose. Classification of eight Chinese liquors of similar alcohol content from different brands was achieved without any signal preprocessing or feature selection (Jing et al., 2016).

In vivo olfactory sensing has also developed rapidly in recent years. Lu et al. fixed the odor-binding receptor of honeybees on the surface of an impedance chip to specifically detect pheromones (Lu et al., 2014). Strauch's team from Rome University used in vivo calcium imaging to study the response of olfactory neurons on the Drosophila antennae to volatile substances produced by cancerous and noncancerous cells (Strauch et al., 2014). Nowotny's team from Sussex University used microelectrodes to record Drosophila olfactory neurons in vivo; with a support vector machine classifier, they achieved classification of 36 different alcoholic gases and 35 industrial gases (Nowotny et al., 2014).

3.5 Bioinspired taste sensors

The first electronic tongue, which relies on the cross-reaction of a sensor array to analyze liquid samples, was introduced by Vlasov and his team in 2008. Researchers have harnessed the local field potential of the taste cortex alongside the action potential of a single neuron (spike) to effectively differentiate between substances such as sucrose, salt, hydrochloric acid and denatonium benzoate (Liu et al., 2010; Zhang et al., 2014; Qin et al., 2017). Much like olfactory receptors, the receptors for bitter, sweet and umami tastes are G protein coupled. Scientists from Seoul University and Ohio State University expressed the human taste receptor hT1R2+hT1R3 heterodimer on the membrane of HEK-293 cells. These cells were then shattered into nanovesicles bearing taste receptors and anchored on the surface of a field-effect transistor, creating a novel sensing technology for detecting sweet substances (Song et al., 2014). Other researchers have used a quartz crystal microbalance to devise a bioinspired taste sensor capable of effectively detecting the bitter substance denatonium within a particular concentration range (Wu et al., 2013). In 2019, Wang developed a new type of bioelectronic taste sensor specifically for detecting bitter substances (Wang et al., 2019).

The field of in vivo taste sensing is rapidly advancing as well. In 2016, Qin introduced an in vivo taste biosensor using brain-computer interface technology (Qin et al., 2016). They implanted a microelectrode array into the taste cortex of rats and recorded the electrophysiological activities of the neural network under taste stimulation. This biosensor demonstrated high sensitivity in detecting bitterness. Other researchers (Qin et al., 2017) have also reported a comprehensive animal biosensor, which leverages brain-machine interface technology in conjunction with anesthetized rats for the detection of bitter substances.

4. Robotic applications of bioinspired sensors

Bioinspired sensing forms the foundation for human-like robot behavior and safe human–robot interaction (Figure 3). In this section, we aim to survey the related applications of bioinspired sensors on intelligent robots.

4.1 Applications of vision in robots

Vision, which is widely used in robot perception, allows for the identification of the shape, location, color and movement of target objects (Servières et al., 2021). One of the most researched fields is simultaneous localization and mapping (SLAM), commonly used for environmental perception, comprehension, self-localization and path planning in unfamiliar environments. SLAM serves as the foundation for autonomous navigation and obstacle avoidance of robots (Cadena et al., 2016; Fuentes-Pacheco et al., 2015). Vision-based SLAM (VSLAM) integrates SLAM with GPU, RGB-D cameras and stereo cameras (Schneider et al., 2018). Nowadays, the robustness and reliability of VSLAM have been greatly improved, resulting in its widespread use in practical applications (Chen et al., 2022).

Applications of bioinspired vision sensors are emerging. For example, in 2018, Scaramuzza's group took frequency-accumulated images as input to a deep learning network, based on an "image frame" concept (Zhang and Scaramuzza, 2018). The ON and OFF pulse streams were accumulated into grayscale images according to their time-domain frequency, and a ResNet then predicted steering wheel angles in an automatic driving scene. In 2019, Zhu mapped the pulse stream outputs of a DVS into grayscale images in time-domain order and estimated optical flow using the proposed EV-FlowNet network (Zhu et al., 2019). Zhu also proposed, for the first time, a visual reconstruction framework inspired by biological asynchronous spikes: combining spatiotemporal spike information, a neuromorphic adaptive adjustment mechanism reconstructs vision with ultrahigh temporal resolution for natural scene perception, analysis and recognition (Zhu et al., 2022).
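The "image frame" representation underlying these pipelines can be reproduced in a few lines: events are binned into per-polarity count images over a time window, which a conventional frame-based network can then consume. The window bounds and normalization below are illustrative choices, not the cited papers' exact preprocessing:

```python
import numpy as np

def events_to_frame(events, height, width, t0, t1):
    """Accumulate (t, x, y, polarity) events falling in [t0, t1) into a
    two-channel count image (ON and OFF), normalized to [0, 1] so the
    result can be fed to a frame-based network such as a ResNet."""
    frame = np.zeros((2, height, width))
    for t, x, y, p in events:
        if t0 <= t < t1:
            frame[0 if p > 0 else 1, y, x] += 1
    peak = frame.max()
    return frame / peak if peak > 0 else frame
```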

4.2 Applications of audition in robots

Language is one of the primary means of information expression and communication, not just for humans but also for robots. In the realm of human–robot interaction and communication, language plays a key role in the user interface and is instrumental in controlling speech-active robots. SHAKEY-II, which might be the first endeavor in this field, emerged in 1960 (Frenkel and Karen, 1985). Nowadays, almost all intelligent terminals, such as mobile phones, laptops and vehicles, are equipped with speech recognition software.

Sound source localization, a classic application of nonspeech sound detection in robotics, is accomplished by determining the direction of a sound source with respect to one or more microphones (Sasaki et al., 2011). Jwu-Sheng Hu et al. proposed a method combining direction-of-arrival estimation and bearing-only SLAM for the simultaneous localization of a mobile robot and an unknown number of sound sources in the environment (Hu et al., 2011). Guanqun Liu et al. equipped a wheeled robot with a 32-channel concentric microphone array to build a two-dimensional sound source map using direction localization (Sasaki et al., 2006). When multiple sound sources are present, bioinspired attention mechanisms are used to localize the loudest one (Michaud et al., 2007) or the closest one (Nguyen and Choi, 2016).
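A common building block of such systems is time-difference-of-arrival (TDOA) estimation between microphone pairs. Below is a minimal sketch using the standard GCC-PHAT estimator; converting the delay to a bearing assumes a far-field source and a two-microphone array (the padding, spacing and sound speed are illustrative assumptions):

```python
import numpy as np

def gcc_phat_delay(sig, ref, fs):
    """Delay of `sig` relative to `ref` (seconds) via GCC-PHAT, which
    whitens the cross-spectrum so the correlation peak depends mainly
    on phase, making it robust to reverberation."""
    n = 2 * max(len(sig), len(ref))
    spec = np.fft.rfft(sig, n) * np.conj(np.fft.rfft(ref, n))
    cc = np.fft.irfft(spec / (np.abs(spec) + 1e-12), n)
    cc = np.concatenate((cc[-n // 2:], cc[:n // 2]))  # center zero lag
    return (np.argmax(np.abs(cc)) - n // 2) / fs

def bearing(delay, mic_distance, c=343.0):
    """Far-field bearing (radians) implied by an inter-microphone delay."""
    return np.arcsin(np.clip(c * delay / mic_distance, -1.0, 1.0))
```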

Artificial bioinspired sound sensors that outperform human hearing have been developed, broadening the applications of underwater robots. For instance, acoustic navigation systems are used for the navigation and localization of autonomous underwater vehicles (AUVs), where global positioning system and high-frequency radio signals are unavailable in the underwater environment (Siddiqui et al., 2015; Allotta et al., 2014). Optoacoustic imaging and passive acoustics are even used for exo-ocean exploration and understanding (Aguzzi et al., 2020).

4.3 Applications of tactile in robots

Robots come into physical interaction with humans and the environment under a variety of circumstances. Tactile sensing is required for scenarios such as safe human–robot collaboration, skillful manipulation and special applications like soft robots. For these reasons, tactile sensors are installed on various parts of the robot, including the hands, arms, feet or even the whole body. Active infrared proximity sensors, which resemble the hairs on some insect legs, might have been the first tactile sensors applied in robotics. As early as 1989, a real-time motion planning system for a robot arm manipulator, fully covered with active infrared proximity sensors, was proposed to operate among arbitrarily shaped obstacles (Cheung and Lumelsky, 1989). For robots to closely interact with humans and the environment, direct force sensing is necessary (Suen et al., 2018), and it is further involved in the recognition of touch gestures and emotions (Li et al., 2022).

Currently, several commercially available tactile sensors, like the Tekscan sensor (Brimacombe et al., 2009), the "GelSight" optical sensor (Li and Adelson, 2013) and the BioTac sensor (Ruppel et al., 2019), are integrated into robot hands or fingers to enable sophisticated hand operations. To achieve whole-body compliance upon contact, large-area tactile skins have been the focus of many studies, addressing challenges such as scalability, robot coverage and real-time data processing. Solutions often involve flexible materials, modular design and artificial intelligence (Chun et al., 2018; Dean-Leon et al., 2019). Gordon Cheng constructed a robot skin consisting of 1,260 self-organized and self-calibrated hexagonal sensor "cells," each carrying temperature, pressure (normal force), acceleration (in the x, y and z directions) and proximity sensors, plus a microcontroller for local computation, data reading and communication. Robots covered by this skin can estimate their position, velocity and torque from the sensor data and subsequent calculation. Moreover, soft robots integrated with tactile sensing exhibit significant potential in shape sensing, feedback control, manipulation, tactile exploration, reaction and so on (Tang et al., 2021; Shih et al., 2020).
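The per-cell organization of such a skin can be summarized as a small record type. The field names and neighbor-list routing below are our own paraphrase of the sensing channels described above, not the actual firmware interface:

```python
from dataclasses import dataclass, field

@dataclass
class SkinCell:
    """One hexagonal skin 'cell': its local sensing channels plus a
    neighbor list, so readings can be routed hop-by-hop across the
    self-organized cell network to the host computer."""
    cell_id: int
    temperature: float = 0.0
    pressure: float = 0.0                  # normal force
    acceleration: tuple = (0.0, 0.0, 0.0)  # x, y, z
    proximity: float = 0.0
    neighbors: list = field(default_factory=list)  # adjacent cell ids
```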

4.4 Applications of olfactory in robots

Sniffing robots, generally equipped with electronic noses, can complete hazardous tasks such as the identification and localization of gas leaks and environmental monitoring of volatile substances. To improve detection capability and identification accuracy, the research mainly focuses on the design of electronic noses and gas signal processing algorithms.

Early results in this field focused on the design of electronic nose structures. Rozas et al. constructed a robot with a mechanically structured nose (using a fan for active sniffing) with six different types of semiconductor gas sensors for Cf, Cb, Cl and Cr sensing. The robot located the odor source by moving along the increase in odor concentration (Rozas et al., 1991). To address the challenges of complex and variable airflow, active search strategies are also essential, in addition to the electronic nose, in robotic odor source localization. Therefore, various algorithms have been proposed to guide path planning for mobile robots, such as gradient-based algorithms and bioinspired algorithms (Chen and Huang, 2019). Furthermore, the use of multiple robots equipped with odor sensors is also an important research hotspot in robotic active olfaction. Hayes et al. used a group of six robots to construct an odor concentration distribution map of the area of interest, followed by map analysis to obtain the odor source information (Hayes et al., 2002). Recently, with the high miniaturization and low power consumption of gas sensors, aerial drones carrying electronic noses have been used for environmental chemical sensing applications (Burgués and Marco, 2020), or even work in swarms (Szulczyński et al., 2017) for rapid odor distribution acquisition over relatively large unknown areas for agriculture and security services.
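As a concrete illustration of the gradient-based family, the loop below climbs a finite-difference estimate of the local concentration gradient until it flattens out. The sensing callable, step sizes and stopping test are illustrative assumptions; real plumes are turbulent and usually demand the more elaborate strategies cited above:

```python
import numpy as np

def chemotaxis(sense, start, step=0.2, probe=0.05, max_iter=200):
    """Gradient-ascent odor search: `sense(x, y)` returns a concentration
    reading; the robot estimates the local gradient by finite differences
    and steps toward increasing concentration."""
    pos = np.array(start, dtype=float)
    for _ in range(max_iter):
        gx = sense(pos[0] + probe, pos[1]) - sense(pos[0] - probe, pos[1])
        gy = sense(pos[0], pos[1] + probe) - sense(pos[0], pos[1] - probe)
        grad = np.array([gx, gy]) / (2 * probe)
        norm = np.linalg.norm(grad)
        if norm < 1e-6:            # gradient vanished: likely at the source
            break
        pos += step * grad / norm  # unit step uphill
    return pos
```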

4.5 Applications of taste in robots

Similar to electronic noses, electronic tongues consist of a taste sensor array that can analyze one or more substances. An electronic tongue could give robots a sense of taste by responding to chemicals in solutions or solids. Several commercial platforms are available, such as the SA 402B taste-sensing system (Intelligent Sensor Technology Co., Ltd., Tokyo, Japan), the TS-5000Z (Insent Inc., Atsugi-shi, Japan) and the αAstree2 (AlphaMOS, Toulouse, France). These systems normally feature a robotic arm and a rotary sample holder platform, where the arm holds testing probes with taste sensors and inserts them into the target samples; the experimental data are then recorded and analyzed by a computer. However, these systems can hardly be identified as intelligent robots, because their behaviors are limited to automatic sensor data collection and analysis.

Compared with electronic noses, the applications of electronic tongues in robots are less common. Russell constructed a mobile robot called the Autonomous Sensory Tongue Investigator, capable of following chemical trails with its robot tongue (Russell, 2006). As an early work in this field, the robot could collect nonvolatile chemicals (NaCl in the experiments) from the ground by lowering and raising a sensor-holding mechanism. It tasted the ionic compound by dissolving it in the distilled water of a dampened cotton tape, then followed trails using simple algorithms. However, it sensed only ionic compound concentration rather than distinguishing between different compounds. Bianca Ciui et al. proposed a robot hand that could detect chemicals via its fingertips (Ciui et al., 2018). The robot hand, equipped with a chemical-flavor sensing glove, was able to touch and sense glucose, ascorbic acid and capsaicin constituents. It discriminated sweetness with the middle finger, sourness with the index finger and spiciness with the ring finger, making it capable of discriminating a wide range of food samples.

5. Trends, difficulties and challenges of bioinspired sensors

As we mentioned, considerable progress has been made in the systematic research of the performance optimization, structure design, integration, flexibility and bionics of sensing devices (Ran et al., 2021). However, before these bioinspired sensing systems can be integrated into various intelligent robots, several critical elements will require thorough investigation.

5.1 Surpassed biological perception

Although various bioinspired sensors have been developed, their perception performances are yet to match their biological counterparts. With the advent of artificial intelligence, considerable strides have been made in integrating different artificial transduction technologies with AI or even biological materials, along with the use of appropriate tools for data processing. This has paved the way for next-generation devices that are expected to surpass human and biological perception. Such advancements in computing and sensing could redefine how robots interact with the surrounding world, opening up a plethora of applications and possibilities.

5.2 Multifunction and integration

Combining multiple sensors and microprocessors allows not just abundant sensing but also the incorporation of artificial intelligence functionalities such as information processing, logical discrimination, self-diagnosis and cognitive processes (Dong et al., 2021). This kind of sensor array offers many advantages, such as versatility, high performance, small size, suitability for mass production and easy operation (Wu et al., 2016). However, many challenges remain, including device performance degradation; multidimensional, multistimulus crosstalk decoupling under simultaneous detection; achieving consistency in the mechanical, thermal and electrical properties among the components of the integrated sensing system; and algorithmic requirements for high speed and low power consumption.

5.3 Application of advanced materials

Materials form the primary basis of sensor performance and its improvement. For instance, advancements in optical fibers and superconducting materials have provided a material foundation for the development of new sensors (Campbell and Dincă, 2017). Many modern sensors are based on semiconductors, such as infrared sensors, laser sensors and fiber optic sensors (Lim et al., 2020). In addition, flexible sensors show great application potential in future human–computer interaction systems, intelligent robots, mobile health care and other fields, which also rely heavily on the research and development of new materials such as nanomaterials (He et al., 2020; Pei et al., 2019). Furthermore, exploring the use of neuromorphic materials to enable bioinspired sensing in energy-efficient and computationally powerful ways has become a research hotspot.

5.4 Easy application on robots

The application of sensors on robots presents various hardware and engineering demands, such as reliability, cost, manufacturability and economy, as illustrated in Figure 4. A distributed robot sensing system could comprise thousands of diverse sensors and associated electronic components, making wiring, power management, data processing and reliability significant concerns. Modularity is an established technique for organizing and simplifying such complex systems. Provisions must be included for fault tolerance, graceful degradation, self-diagnostics and self-healing. Moreover, the viability of a robotic sensing system largely depends on manufacturability and maintenance: a system that is easy to manufacture, install and maintain will naturally have a competitive economic edge.

5.5 Multimodal sensors fusion

By emulating how humans process signals from various senses to make decisions and adapt to the environment, it is expected that more robust and efficient systems can be devised. Research ideas in this field include developing cross-modal algorithms that enhance perception by sharing multimodal sensor information; implementing a brain-inspired sensing fusion framework that integrates sensor information hierarchically at multiple abstraction levels to refine the fusion process; and adaptively adjusting the correlations and weights of different sensor modes based on bioinspired attention mechanisms to further promote the intelligent fusion process, as sketched below.
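One minimal instance of such adaptive weighting is confidence-weighted averaging: each modality's estimate is combined with a weight obtained from a softmax over reliability scores, so attention shifts toward whichever sensor currently reports the highest confidence. The scoring scheme and temperature are our own illustrative assumptions:

```python
import numpy as np

def fuse_modalities(estimates, confidences, temperature=1.0):
    """Attention-like multimodal fusion: softmax the per-modality
    confidence scores into weights, then return the weighted average
    of the estimates. A lower `temperature` sharpens attention onto
    the most reliable modality; a higher one flattens the weights."""
    c = np.asarray(confidences, dtype=float) / temperature
    w = np.exp(c - c.max())         # numerically stable softmax
    w /= w.sum()
    est = np.asarray(estimates, dtype=float)
    return np.average(est, axis=0, weights=w)
```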

6. Conclusion

The intelligent perception of robots hinges on the ability to acquire, process and comprehend a myriad of environmental information, with multimodal sensors serving as the foundational element. Humans are endowed with efficient senses (visual, auditory, tactile, olfactory and gustatory) that capture environmental information and react directly to physical-world stimuli. This information is transformed into neural electrical signals at the point of stimulation, which are subsequently relayed to the brain via neural networks, enabling comprehension and recognition of the environment. This biological process has spurred the emergence of a variety of biomimetic sensors, each designed to emulate this natural mechanism.

This work first briefly surveys the physiological principles of the biological vision, audition, touch, olfaction and taste senses, which serve as the basis for the sensing mechanisms of corresponding biomimetic sensors. The subsequent focus is on the robotic applications of various biomimetic sensors, including autonomous positioning, environmental detection and autonomous navigation. The requirements and expectations of bioinspired sensors and their applications on robots are discussed at the end of this review. It is hoped that the survey proves beneficial to practitioners designing bioinspired sensing systems and to researchers working on intelligent robotics involving environment perception.

Figures

Figure 1. Summary of intelligent robot capabilities and perception requirements in complex environments

Figure 2. Five bioinspired senses applications on intelligent robots

Figure 3. Robot task implementation algorithm architecture

Figure 4. Summary of robot sensing system requirements and expectations

References

Abdelaal, A.E., Liu, J., Hong, N., Hager, G.D. and Salcudean, S.E. (2021), “Parallelism in autonomous robotic surgery”, IEEE Robotics and Automation Letters, Vol. 6 No. 2, pp. 1824-1831.

Abraira, V.E. and Ginty, D.D. (2013), “The sensory neurons of touch”, Neuron, Vol. 79 No. 4, pp. 618-639.

Aguzzi, J., Flexas, M.D.M., Flögel, S., Lo Iacono, C., Tangherlini, M., Costa, C., Marini, S., Bahamon, N., Martini, S. and Fanelli, E. (2020), “Exo-ocean exploration with deep-sea sensor and platform technologies”, Astrobiology, Vol. 20 No. 7, pp. 897-915.

Allotta, B., Costanzi, R., Meli, E., Pugi, L., Ridolfi, A. and Vettori, G. (2014), “Cooperative localization of a team of AUVs by a tetrahedral configuration”, Robotics and Autonomous Systems, Vol. 62 No. 8, pp. 1228-1237.

An, B.W., Heo, S., Ji, S., Bien, F. and Park, J.-U. (2018), “Transparent and flexible fingerprint sensor array with multiplexed detection of tactile pressure and skin temperature”, Nature Communications, Vol. 9 No. 1, p. 2458.

Andrychowicz, O.M., Baker, B., Chociej, M., Jozefowicz, R., Mcgrew, B., Pachocki, J., Petron, A., Plappert, M., Powell, G. and Ray, A. (2020), “Learning dexterous in-hand manipulation”, The International Journal of Robotics Research, Vol. 39 No. 1, pp. 3-20.

Berner, R., Brandli, C., Yang, M., Liu, S.-C. and Delbruck, T. (2013), “A 240 × 180 10 mW 12 µs latency sparse-output vision sensor for mobile applications”, 2013 Symposium on VLSI Circuits, IEEE, pp. C186-C187.

Brandli, C., Berner, R., Yang, M., Liu, S.-C. and Delbruck, T. (2014a), “A 240 × 180 130 dB 3 µs latency global shutter spatiotemporal vision sensor”, IEEE Journal of Solid-State Circuits, Vol. 49 No. 10, pp. 2333-2341.

Brandli, C., Muller, L. and Delbruck, T. (2014b), “Real-time, high-speed video decompression using a frame- and event-based DAVIS sensor”, 2014 IEEE International Symposium on Circuits and Systems (ISCAS), IEEE, pp. 686-689.

Brimacombe, J.M., Wilson, D.R., Hodgson, A.J., Ho, K.C. and Anglin, C. (2009), “Effect of calibration method on Tekscan sensor accuracy”.

Burgués, J. and Marco, S. (2020), “Environmental chemical sensing using small drones: a review”, Science of the Total Environment, Vol. 748, p. 141172.

Büscher, G.H., Kõiva, R., Schürmann, C., Haschke, R. and Ritter, H.J. (2015), “Flexible and stretchable fabric-based tactile sensor”, Robotics and Autonomous Systems, Vol. 63, pp. 244-252.

Bushdid, C., Magnasco, M.O., Vosshall, L.B. and Keller, A. (2014), “Humans can discriminate more than 1 trillion olfactory stimuli”, Science, Vol. 343 No. 6177, pp. 1370-1372.

Cadena, C., Carlone, L., Carrillo, H., Latif, Y., Scaramuzza, D., Neira, J., Reid, I. and Leonard, J.J. (2016), “Past, present, and future of simultaneous localization and mapping: toward the robust-perception age”, IEEE Transactions on Robotics, Vol. 32 No. 6, pp. 1309-1332.

Campbell, M.G. and Dincă, M. (2017), “Metal–organic frameworks as active materials in electronic sensor devices”, Sensors, Vol. 17 No. 5, p. 1108.

Chen, X.-X. and Huang, J. (2019), “Odor source localization algorithms on mobile robots: a review and future outlook”, Robotics and Autonomous Systems, Vol. 112, pp. 123-136.

Chen, C., Jafari, R. and Kehtarnavaz, N. (2017), “A survey of depth and inertial sensor fusion for human action recognition”, Multimedia Tools and Applications, Vol. 76 No. 3, pp. 4405-4425.

Chen, D.G., Matolin, D., Bermak, A. and Posch, C. (2011), “Pulse-modulation imaging–review and performance analysis”, IEEE Transactions on Biomedical Circuits and Systems, Vol. 5 No. 1, pp. 64-82.

Chen, W., Shang, G., Ji, A., Zhou, C., Wang, X., Xu, C., Li, Z. and Hu, K. (2022), “An overview on visual slam: from tradition to semantic”, Remote Sensing, Vol. 14 No. 13, p. 3010.

Chen, J., Zhu, Y., Chang, X., Pan, D., Song, G., Guo, Z. and Naik, N. (2021), “Recent progress in essential functions of soft electronic skin”, Advanced Functional Materials, Vol. 31 No. 42, p. 2104686.

Cheung, E. and Lumelsky, V.J. (1989), “Proximity sensing in robot manipulator motion planning: system and implementation issues”, IEEE Transactions on Robotics and Automation, Vol. 5 No. 6, pp. 740-751.

Chun, K.Y., Son, Y.J., Jeon, E.S., Lee, S. and Han, C.S. (2018), “A self-powered sensor mimicking slow- and fast-adapting cutaneous mechanoreceptors”, Advanced Materials, Vol. 30 No. 12, p. 1706299.

Ciui, B., Martin, A., Mishra, R.K., Nakagawa, T., Dawkins, T.J., Lyu, M., Cristea, C., Sandulescu, R. and Wang, J. (2018), “Chemical sensing at the robot fingertips: toward automated taste discrimination in food samples”, ACS Sensors, Vol. 3 No. 11, pp. 2375-2384.

Dahiya, R.S., Cattin, D., Adami, A., Collini, C., Barboni, L., Valle, M., Lorenzelli, L., Oboe, R., Metta, G. and Brunetti, F. (2011), “Towards tactile sensing system on chip for robotic applications”, IEEE Sensors Journal, Vol. 11 No. 12, pp. 3216-3226.

Dahiya, R.S., Metta, G., Valle, M. and Sandini, G. (2009), “Tactile sensing–from humans to humanoids”, IEEE Transactions on Robotics, Vol. 26 No. 1, pp. 1-20.

Dang, T., Khattak, S., Mascarich, F. and Alexis, K. (2019), “Explore locally, plan globally: a path planning framework for autonomous robotic exploration in subterranean environments”, 2019 19th International Conference on Advanced Robotics (ICAR), IEEE, pp. 9-16.

Dean-Leon, E., Guadarrama-Olvera, J.R., Bergner, F. and Cheng, G. (2019), “Whole-body active compliance control for humanoid robots with robot skin”, 2019 International Conference on Robotics and Automation (ICRA), IEEE, pp. 5404-5410.

Dong, B., Shi, Q., Yang, Y., Wen, F., Zhang, Z. and Lee, C. (2021), “Technology evolution from self-powered sensors to AIoT enabled smart homes”, Nano Energy, Vol. 79, p. 105414.

Drimus, A., Kootstra, G., Bilberg, A. and Kragic, D. (2014), “Design of a flexible tactile sensor for classification of rigid and deformable objects”, Robotics and Autonomous Systems, Vol. 62 No. 1, pp. 3-15.

Fan, X., Chen, J., Yang, J., Bai, P., Li, Z. and Wang, Z.L. (2015), “Ultrathin, rollable, paper-based triboelectric nanogenerator for acoustic energy harvesting and self-powered sound recording”, ACS Nano, Vol. 9 No. 4, pp. 4236-4243.

Firestein, S. (2001), “How the olfactory system makes sense of scents”, Nature, Vol. 413 No. 6852, pp. 211-218.

Fossum, E.R. and Hondongwa, D.B. (2014), “A review of the pinned photodiode for CCD and CMOS image sensors”, IEEE Journal of the Electron Devices Society, Vol. 2 No. 3.

Fratzl, P. and Barth, F.G. (2009), “Biomaterial systems for mechanosensing and actuation”, Nature, Vol. 462 No. 7272, pp. 442-448.

Frenkel, A. and Karen, A. (1985), Robots: Machines in Man’s Image, New York, NY.

Fuentes-Pacheco, J., Ruiz-Ascencio, J. and Rendón-Mancha, J.M. (2015), “Visual simultaneous localization and mapping: a survey”, Artificial Intelligence Review, Vol. 43 No. 1, pp. 55-81.

Fukui, W., Kobayashi, F., Kojima, F., Nakamoto, H., Imamura, N., Maeda, T. and Shirasawa, H. (2011), “High-speed tactile sensing for array-type tactile sensor and object manipulation based on tactile information”, Journal of Robotics, Vol. 2011.

Gu, L., Cui, N., Liu, J., Zheng, Y., Bai, S. and Qin, Y. (2015), “Packaged triboelectric nanogenerator with high endurability for severe environments”, Nanoscale, Vol. 7 No. 43, pp. 18049-18053.

Gu, L., Poddar, S., Lin, Y., Long, Z., Zhang, D., Zhang, Q., Shu, L., Qiu, X., Kam, M. and Javey, A. (2020), “A biomimetic eye with a hemispherical perovskite nanowire array retina”, Nature, Vol. 581 No. 7808, pp. 278-282.

Guizzo, E. (2019), “By leaps and bounds: an exclusive look at how Boston dynamics is redefining robot agility”, IEEE Spectrum, Vol. 56 No. 12, pp. 34-39.

Hao, Q., Wang, Z., Wang, J. and Chen, G. (2020), “Stability-guaranteed and high terrain adaptability static gait for quadruped robots”, Sensors, Vol. 20 No. 17, p. 4911.

Hayes, A.T., Martinoli, A. and Goodman, R.M. (2002), “Distributed odor source localization”, IEEE Sensors Journal, Vol. 2 No. 3, pp. 260-271.

He, S., Li, S., Nag, A., Feng, S., Han, T., Mukhopadhyay, S.C. and Powel, W. (2020), “A comprehensive review of the use of sensors for food intake detection”, Sensors and Actuators A: Physical, Vol. 315, p. 112318.

Hu, J.-S., Chan, C.-Y., Wang, C.-K., Lee, M.-T. and Kuo, C.-Y. (2011), “Simultaneous localization of a mobile robot and multiple sound sources using a microphone array”, Advanced Robotics, Vol. 25 Nos 1/2, pp. 135-152.

Ishida, H., Kobayashi, A., Nakamoto, T. and Moriizumi, T. (1999), “Three-dimensional odor compass”, IEEE Transactions on Robotics and Automation, Vol. 15 No. 2, pp. 251-257.

Jiang, L.-T. and Smith, J.R. (2012), “Seashell effect pretouch sensing for robotic grasping”, ICRA, pp. 2851-2858.

Jing, T., Meng, Q.H. and Ishida, H. (2021), “Recent progress and trend of robot odor source localization”, IEEJ Transactions on Electrical and Electronic Engineering, Vol. 16 No. 7, pp. 938-953.

Jing, Y.-Q., Meng, Q.-H., Qi, P.-F., Cao, M.-L., Zeng, M. and Ma, S.-G. (2016), “A bioinspired neural network for data processing in an electronic nose”, IEEE Transactions on Instrumentation and Measurement, Vol. 65 No. 10, pp. 2369-2380.

Johnson, M.K., Cole, F., Raj, A. and Adelson, E.H. (2011), “Microgeometry capture using an elastomeric sensor”, ACM Transactions on Graphics (TOG), Vol. 30 No. 4, pp. 1-8.

Kang, D., Pikhitsa, P.V., Choi, Y.W., Lee, C., Shin, S.S., Piao, L., Park, B., Suh, K.-Y., Kim, T.-I. and Choi, M. (2014), “Ultrasensitive mechanical crack-based sensor inspired by the spider sensory system”, Nature, Vol. 516 No. 7530, pp. 222-226.

Kou, H., Zhang, L., Tan, Q., Liu, G., Lv, W., Lu, F., Dong, H. and Xiong, J. (2018), “Wireless flexible pressure sensor based on micro-patterned graphene/PDMS composite”, Sensors and Actuators A: Physical, Vol. 277, pp. 150-156.

Kramer, J. (2002), “An integrated optical transient sensor”, IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing, Vol. 49 No. 9, pp. 612-628.

Kuindersma, S., Deits, R., Fallon, M., Valenzuela, A., Dai, H., Permenter, F., Koolen, T., Marion, P. and Tedrake, R. (2016), “Optimization-based locomotion planning, estimation, and control design for the atlas humanoid robot”, Autonomous Robots, Vol. 40 No. 3, pp. 429-455.

Kyberd, P.J., Evans, M. and Te Winkel, S. (1998), “An intelligent anthropomorphic hand, with automatic grasp”, Robotica, Vol. 16 No. 5, pp. 531-536.

Lai, C.-W., Lo, Y.-L., Yur, J.-P. and Chuang, C.-H. (2011), “Application of fiber Bragg grating level sensor and Fabry-Perot pressure sensor to simultaneous measurement of liquid level and specific gravity”, IEEE Sensors Journal, Vol. 12 No. 4, pp. 827-831.

Leboutet, Q., Dean-Leon, E., Bergner, F. and Cheng, G. (2019), “Tactile-based whole-body compliance with force propagation for mobile manipulators”, IEEE Transactions on Robotics, Vol. 35 No. 2, pp. 330-342.

Lee, H.-K., Chung, J., Chang, S.-I. and Yoon, E. (2008), “Normal and shear force measurement using a flexible polymer tactile sensor with embedded multiple capacitors”, Journal of Microelectromechanical Systems, Vol. 17, pp. 934-942.

Lenero-Bardallo, J.A., Häfliger, P., Carmona-Galán, R. and Rodriguez-Vazquez, A. (2015), “A bio-inspired vision sensor with dual operation and readout modes”, IEEE Sensors Journal, Vol. 16 No. 2, pp. 317-330.

Lenk, C., Hövel, P., Ved, K., Durstewitz, S., Meurer, T., Fritsch, T., Männchen, A., Küller, J., Beer, D. and Ivanov, T. (2023), “Neuromorphic acoustic sensing using an adaptive microelectromechanical cochlea with integrated feedback”, Nature Electronics, Vol. 6 No. 5, pp. 1-11.

Li, R. and Adelson, E.H. (2013), “Sensing and recognizing surface textures using a GelSight sensor”, Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 1241-1247.

Li, Y., Wang, B., Zhang, B., Weng, L., Huang, W. and Liu, H. (2018), “Design and output characteristics of magnetostrictive tactile sensor for detecting force and stiffness of manipulated objects”, IEEE Transactions on Industrial Informatics, Vol. 15 No. 2, pp. 1219-1225.

Li, Y.-K., Meng, Q.-H., Wang, Y.-X., Yang, T.-H. and Hou, H.-R. (2022), “MASS: a multisource domain adaptation network for cross-subject touch gesture recognition”, IEEE Transactions on Industrial Informatics, Vol. 19 No. 3, pp. 3099-3108.

Lichtsteiner, P., Delbruck, T. and Kramer, J. (2004), “Improved on/off temporally differentiating address-event imager”, Proceedings of the 2004 11th IEEE International Conference on Electronics, Circuits and Systems, ICECS 2004, IEEE, pp. 211-214.

Lichtsteiner, P., Posch, C. and Delbruck, T. (2006), “A 128 × 128 120 dB 30 mW asynchronous vision sensor that responds to relative intensity change”, 2006 IEEE International Solid-State Circuits Conference - Digest of Technical Papers, IEEE, pp. 2060-2069.

Lichtsteiner, P., Posch, C. and Delbruck, T. (2008), “A 128 × 128 120 dB 15 µs latency asynchronous temporal contrast vision sensor”, IEEE Journal of Solid-State Circuits, Vol. 43 No. 2, pp. 566-576.

Lim, H.R., Kim, H.S., Qazi, R., Kwon, Y.T., Jeong, J.W. and Yeo, W.H. (2020), “Advanced soft materials, sensor integrations, and applications of wearable flexible hybrid electronics in healthcare, energy, and environment”, Advanced Materials, Vol. 32 No. 15, p. 1901924.

Lisiewski, A., Liu, H., Yu, M., Currano, L. and Gee, D. (2011), “Fly-ear inspired micro-sensor for sound source localization in two dimensions”, The Journal of the Acoustical Society of America, Vol. 129 No. 5, pp. EL166-EL171.

Liu, Q., Ye, W., Xiao, L., Du, L., Hu, N. and Wang, P. (2010), “Extracellular potentials recording in intact olfactory epithelium by microelectrode array for a bioelectronic nose”, Biosensors and Bioelectronics, Vol. 25 No. 10, pp. 2212-2217.

Lledo, P.-M., Gheusi, G. and Vincent, J.-D. (2005), “Information processing in the mammalian olfactory system”, Physiological Reviews, Vol. 85 No. 1, pp. 281-317.

Lu, Y., Zhuang, S., Zhang, D., Zhang, Q., Zhou, J., Dong, S., Liu, Q. and Wang, P. (2014), “Olfactory biosensor using odorant-binding proteins from honeybee: ligands of floral odors and pheromones detection by electrochemical impedance”, Sensors and Actuators B: Chemical, Vol. 193, pp. 420-427.

Maeno, T., Kobayashi, K. and Yamazaki, N. (1998), “Relationship between the structure of human finger tissue and the location of tactile receptors”, JSME International Journal Series C, Vol. 41 No. 1, pp. 94-100.

Mahowald, M. (1992), “VLSI analogs of neuronal visual processing: a synthesis of form and function”, PhD thesis, California Institute of Technology, Pasadena, CA.

Mannoor, M.S., Jiang, Z., James, T., Kong, Y.L., Malatesta, K.A., Soboyejo, W.O., Verma, N., Gracias, D.H. and McAlpine, M.C. (2013), “3D printed bionic ears”, Nano Letters, Vol. 13 No. 6, pp. 2634-2639.

McConney, M.E., Anderson, K.D., Brott, L.L., Naik, R.R. and Tsukruk, V.V. (2009), “Bioinspired material approaches to sensing”, Advanced Functional Materials, Vol. 19 No. 16, pp. 2527-2544.

Michaud, F., Côté, C., Létourneau, D., Brosseau, Y., Valin, J.-M., Beaudry, É., Raïevsky, C., Ponchon, A., Moisan, P. and Lepage, P. (2007), “Spartacus attending the 2005 AAAI conference”, Autonomous Robots, Vol. 22 No. 4, pp. 369-383.

Miles, R., Tieu, T., Robert, D. and Hoy, R. (1997), “A mechanical analysis of the novel ear of the parasitoid fly Ormia ochracea”, Proceedings: Diversity in Auditory Mechanics, pp. 18-24.

Mozaffari, M., Saad, W., Bennis, M. and Debbah, M. (2016), “Unmanned aerial vehicle with underlaid device-to-device communications: performance and tradeoffs”, IEEE Transactions on Wireless Communications, Vol. 15 No. 6, pp. 3949-3963.

Nguyen, Q. and Choi, J. (2016), “Selection of the closest sound source for robot auditory attention in multi-source scenarios”, Journal of Intelligent & Robotic Systems, Vol. 83 No. 2, pp. 239-251.

Nikonov, D.E. and Young, I.A. (2013), “Overview of beyond-CMOS devices and a uniform methodology for their benchmarking”, Proceedings of the IEEE, Vol. 101 No. 12, pp. 2498-2533.

Nowotny, T., de Bruyne, M., Berna, A.Z., Warr, C.G. and Trowell, S.C. (2014), “Drosophila olfactory receptors as classifiers for volatiles from disparate real world applications”, Bioinspiration & Biomimetics, Vol. 9 No. 4, p. 46007.

Oztemel, E. and Gursev, S. (2020), “Literature review of industry 4.0 and related technologies”, Journal of Intelligent Manufacturing, Vol. 31 No. 1, pp. 127-182.

Park, J., Lim, J.H., Jin, H.J., Namgung, S., Lee, S.H., Park, T.H. and Hong, S. (2012), “A bioelectronic sensor based on canine olfactory nanovesicle–carbon nanotube hybrid structures for the fast assessment of food quality”, The Analyst, Vol. 137 No. 14, pp. 3249-3254.

Pearson, M.J., Mitchinson, B., Sullivan, J.C., Pipe, A.G. and Prescott, T.J. (2011), “Biomimetic vibrissal sensing for robots”, Philosophical Transactions of the Royal Society B: Biological Sciences, Vol. 366 No. 1581, pp. 3085-3096.

Pei, Y., Wang, W., Zhang, G., Ding, J., Xu, Q., Zhang, X., Yang, S., Shen, N., Lian, Y. and Zhang, L. (2019), “Design and implementation of T-type MEMS heart sound sensor”, Sensors and Actuators A: Physical, Vol. 285, pp. 308-318.

Pfeifer, R., Lungarella, M. and Iida, F. (2007), “Self-organization, embodiment, and biologically inspired robotics”, Science, Vol. 318 No. 5853, pp. 1088-1093.

Pham, T., Li, G., Bekyarova, E., Itkis, M.E. and Mulchandani, A. (2019), “MoS2-based optoelectronic gas sensor with sub-parts-per-billion limit of NO2 gas detection”, ACS Nano, Vol. 13 No. 3, pp. 3196-3205.

Pointeau, G., Petit, M. and Dominey, P.F. (2014), “Successive developmental levels of autobiographical memory for learning through social interaction”, IEEE Transactions on Autonomous Mental Development, Vol. 6 No. 3, pp. 200-212.

Posch, C., Hofstatter, M., Litzenberger, M., Matolin, D., Donath, N., Schon, P. and Garn, H. (2007), “Wide dynamic range, high-speed machine vision with a 2 × 256 pixel temporal contrast vision sensor”, 2007 IEEE International Symposium on Circuits and Systems, IEEE, pp. 1196-1199.

Posch, C., Matolin, D. and Wohlgenannt, R. (2010), “A QVGA 143 dB dynamic range frame-free PWM image sensor with lossless pixel-level video compression and time-domain CDS”, IEEE Journal of Solid-State Circuits, Vol. 46 No. 1, pp. 259-275.

Posch, C., Serrano-Gotarredona, T., Linares-Barranco, B. and Delbruck, T. (2014), “Retinomorphic event-based vision sensors: bioinspired cameras with spiking output”, Proceedings of the IEEE, Vol. 102 No. 10, pp. 1470-1484.

Qin, Z., Zhang, B., Hu, L., Zhuang, L., Hu, N. and Wang, P. (2016), “A novel bioelectronic tongue in vivo for highly sensitive bitterness detection with brain–machine interface”, Biosensors and Bioelectronics, Vol. 78, pp. 374-380.

Qin, Z., Zhang, B., Gao, K., Zhuang, L., Hu, N. and Wang, P. (2017), “A whole animal-based biosensor for fast detection of bitter compounds using extracellular potentials in rat gustatory cortex”, Sensors and Actuators B: Chemical, Vol. 239, pp. 746-753.

Ran, Z., He, X., Rao, Y., Sun, D., Qin, X., Zeng, D., Chu, W., Li, X. and Wei, Y. (2021), “Fiber-optic microstructure sensors: a review”, Photonic Sensors, Vol. 11 No. 2, pp. 227-261.

Ren, L. and Liang, Y. (2014), “Preliminary studies on the basic factors of bionics”, Science China Technological Sciences, Vol. 57 No. 3, pp. 520-530.

Romo, R. and Salinas, E. (1999), “Sensing and deciding in the somatosensory system”, Current Opinion in Neurobiology, Vol. 9 No. 4, pp. 487-493.

Roper, S.D. and Chaudhari, N. (2017), “Taste buds: cells, signals and synapses”, Nature Reviews Neuroscience, Vol. 18 No. 8, pp. 485-497.

Rozas, R., Morales, J. and Vega, D. (1991), “Artificial smell detection for robotic navigation”, Fifth International Conference on Advanced Robotics: Robots in Unstructured Environments, IEEE, pp. 1730-1733.

Ruppel, P., Jonetzko, Y., Görner, M., Hendrich, N. and Zhang, J. (2019), “Simulation of the SynTouch BioTac sensor”, Intelligent Autonomous Systems 15: Proceedings of the 15th International Conference IAS-15, Springer, pp. 374-387.

Russell, R.A. (2006), “TASTI follows chemical trails with its robot tongue”, 2006 IEEE International Conference on Robotics and Biomimetics, IEEE, pp. 257-262.

Sasaki, Y., Fujihara, T., Kagami, S., Mizoguchi, H. and Oro, K. (2011), “32-channel omni-directional microphone array design and implementation”, Journal of Robotics and Mechatronics, Vol. 23 No. 3, pp. 378-385.

Sasaki, Y., Kagami, S. and Mizoguchi, H. (2006), “Multiple sound source mapping for a mobile robot by self-motion triangulation”, 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, IEEE, pp. 380-385.

Schneider, T., Dymczyk, M., Fehr, M., Egger, K., Lynen, S., Gilitschenski, I. and Siegwart, R. (2018), “Maplab: an open framework for research in visual-inertial mapping and localization”, IEEE Robotics and Automation Letters, Vol. 3 No. 3, pp. 1418-1425.

Seminara, L., Capurro, M., Cirillo, P., Cannata, G. and Valle, M. (2011), “Electromechanical characterization of piezoelectric PVDF polymer films for tactile sensors in robotics applications”, Sensors and Actuators A: Physical, Vol. 169 No. 1, pp. 49-58.

Seol, M.L., Woo, J.H., Lee, D.I., Im, H., Hur, J. and Choi, Y.K. (2014), “Nature-replicated nano-in-micro structures for triboelectric energy harvesting”, Small, Vol. 10 No. 19, pp. 3887-3894.

Servières, M., Renaudin, V., Dupuis, A. and Antigny, N. (2021), “Visual and visual-inertial SLAM: state of the art, classification, and experimental benchmarking”, Journal of Sensors, Vol. 2021, pp. 1-26.

Sharma, A., Kumar, R., Aier, I., Semwal, R., Tyagi, P. and Varadwaj, P. (2019), “Sense of smell: structural, functional, mechanistic advancements and challenges in human olfactory research”, Current Neuropharmacology, Vol. 17 No. 9, pp. 891-911.

Shih, B., Shah, D., Li, J., Thuruthel, T.G., Park, Y.-L., Iida, F., Bao, Z., Kramer-Bottiglio, R. and Tolley, M.T. (2020), “Electronic skins and machine learning for intelligent soft robots”, Science Robotics, Vol. 5 No. 41, p. eaaz9239.

Siddiqui, S.I., Ludvigsen, M. and Dong, H. (2015), “Analysis, verification and optimization of AUV navigation using underwater acoustic models”, OCEANS 2015 - MTS/IEEE Washington, DC, IEEE, pp. 1-6.

Sivilotti, M.A. (1991), Wiring Considerations in Analog VLSI Systems, with Application to Field-Programmable Networks, PhD thesis, California Institute of Technology, Pasadena, CA.

Song, H.S., Jin, H.J., Ahn, S.R., Kim, D., Lee, S.H., Kim, U.-K., Simons, C.T., Hong, S. and Park, T.H. (2014), “Bioelectronic tongue using heterodimeric human taste receptor for the discrimination of sweeteners with human-like performance”, ACS Nano, Vol. 8 No. 10, pp. 9781-9789.

Sousa, P., Marques, L. and Almeida, A.T. (2008), “Toward chemical-trail following robots”, 2008 Seventh International Conference on Machine Learning and Applications, IEEE, pp. 489-494.

Stassi, S., Cauda, V., Canavese, G. and Pirri, C.F. (2014), “Flexible tactile sensing based on piezoresistive composites: a review”, Sensors, Vol. 14 No. 3, pp. 5296-5332.

Steffen, L., Reichard, D., Weinland, J., Kaiser, J., Roennau, A. and Dillmann, R. (2019), “Neuromorphic stereo vision: a survey of bio-inspired sensors and algorithms”, Frontiers in Neurorobotics, Vol. 13, p. 28.

Strauch, M., Lüdke, A., Münch, D., Laudes, T., Galizia, C.G., Martinelli, E., Lavra, L., Paolesse, R., Ulivieri, A. and Catini, A. (2014), “More than apples and oranges: detecting cancer with a fruit fly’s antenna”, Scientific Reports, Vol. 4 No. 1, p. 3576.

Suen, M.-S., Lin, Y.-C. and Chen, R. (2018), “A flexible multifunctional tactile sensor using interlocked zinc oxide nanorod arrays for artificial electronic skin”, Sensors and Actuators A: Physical, Vol. 269, pp. 574-584.

Szulczyński, B., Wasilewski, T., Wojnowski, W., Majchrzak, T., Dymerski, T., Namieśnik, J. and Gębicki, J. (2017), “Different ways to apply a measurement instrument of E-nose type to evaluate ambient air quality with respect to odour nuisance in a vicinity of municipal processing plants”, Sensors, Vol. 17 No. 11, p. 2671.

Talukdar, A., Faheem Khan, M., Lee, D., Kim, S., Thundat, T. and Koley, G. (2015), “Piezotransistive transduction of femtoscale displacement for photoacoustic spectroscopy”, Nature Communications, Vol. 6 No. 1, p. 7885.

Tang, H., Nie, P., Wang, R. and Sun, J. (2021), “Piezoresistive electronic skin based on diverse bionic microstructure”, Sensors and Actuators A: Physical, Vol. 318, p. 112532.

Taruno, A., Vingtdeux, V., Ohmoto, M., Ma, Z., Dvoryanchikov, G., Li, A., Adrien, L., Zhao, H., Leung, S. and Abernethy, M. (2013), “CALHM1 ion channel mediates purinergic neurotransmission of sweet, bitter and umami tastes”, Nature, Vol. 495 No. 7440, pp. 223-226.

Teshigawara, S., Tsutsumi, T., Shimizu, S., Suzuki, Y., Ming, A., Ishikawa, M. and Shimojo, M. (2011), “Highly sensitive sensor for detection of initial slip and its application in a multi-fingered robot hand”, 2011 IEEE International Conference on Robotics and Automation, IEEE, pp. 1097-1102.

Tian, Y.-H., Chen, X.-L., Xiong, H.-K., Li, H.-L., Dai, L.-R., Chen, J., Xing, J.-L., Chen, J., Wu, X.-H. and Hu, W.-M. (2017), “Towards human-like and transhuman perception in AI 2.0: a review”, Frontiers of Information Technology & Electronic Engineering, Vol. 18 No. 1, pp. 58-67.

Van Doorn, J., Mende, M., Noble, S.M., Hulland, J., Ostrom, A.L., Grewal, D. and Petersen, J.A. (2017), “Domo arigato Mr Roboto: emergence of automated social presence in organizational frontlines and customers’ service experiences”, Journal of Service Research, Vol. 20 No. 1, pp. 43-58.

Van Duong, L. (2020), “Large-scale vision-based tactile sensing for robot links: design, modeling, and evaluation”, IEEE Transactions on Robotics, Vol. 37 No. 2, pp. 390-403.

Wang, J., Kong, S., Chen, F., Chen, W., Du, L., Cai, W., Huang, L., Wu, C. and Zhang, D.-W. (2019), “A bioelectronic taste sensor based on bioengineered Escherichia coli cells combined with ITO-constructed electrochemical sensors”, Analytica Chimica Acta, Vol. 1079, pp. 73-78.

Wu, C., Du, L., Zou, L., Huang, L. and Wang, P. (2013), “A biomimetic bitter receptor-based biosensor with high efficiency immobilization and purification using self-assembled aptamers”, The Analyst, Vol. 138 No. 20, pp. 5989-5994.

Wu, C., Kim, T.W., Sung, S., Park, J.H. and Li, F. (2018), “Ultrasoft and cuttable paper-based triboelectric nanogenerators for mechanical energy harvesting”, Nano Energy, Vol. 44, pp. 279-287.

Wu, X., Han, Y., Zhang, X., Zhou, Z. and Lu, C. (2016), “Large-area compliant, low-cost, and versatile pressure-sensing platform based on microcrack-designed carbon black@polyurethane sponge for human–machine interfacing”, Advanced Functional Materials, Vol. 26 No. 34, pp. 6246-6256.

Xu, Y., Liu, S., Lu, H. and Zhang, Z. (2017), “A neighbor pixel communication filtering structure for dynamic vision sensors”, Eighth International Conference on Graphic and Image Processing (ICGIP 2016), SPIE, pp. 309-314.

Yang, J., Chen, J., Liu, Y., Yang, W., Su, Y. and Wang, Z.L. (2014), “Triboelectrification-based organic film nanogenerator for acoustic energy harvesting and self-powered active acoustic sensing”, ACS Nano, Vol. 8 No. 3, pp. 2649-2657.

Yang, J., Chen, J., Su, Y., Jing, Q., Li, Z., Yi, F., Wen, X., Wang, Z. and Wang, Z.L. (2015), “Eardrum-inspired active sensors for self-powered cardiovascular system characterization and throat-attached anti-interference voice recognition”, Advanced Materials, Vol. 27 No. 8, pp. 1316-1326.

Yang, J.C., Mun, J., Kwon, S.Y., Park, S., Bao, Z. and Park, S. (2019), “Electronic skin: recent progress and future prospects for skin-attachable devices for health monitoring, robotics, and prosthetics”, Advanced Materials, Vol. 31 No. 48, p. 1904765.

Yao, J. and Zhang, Y.-T. (2001), “Bionic wavelet transform: a new time-frequency method based on an auditory model”, IEEE Transactions on Biomedical Engineering, Vol. 48 No. 8, pp. 856-863.

Young, R. (2017), “A general architecture for robotics systems: a perception-based approach to artificial life”, Artificial Life, Vol. 23 No. 2, pp. 236-286.

Yu, A., Chen, X., Wang, R., Luo, J., Chen, L., Zhang, Y., Wu, W., Liu, C. and Yuan, H. (2016), “Triboelectric nanogenerator as a self-powered communication unit for processing and transmitting information”, ACS Nano, Vol. 10 No. 4, pp. 3944-3950.

Yu, J., Yu, H. and Li, D. (2018), “Design and characteristic analysis of cross-capacitance fuel-level sensor”, Sensors, Vol. 18 No. 11, p. 3984.

Yun, S.-Y., Han, J.-K., Lee, S.-W., Yu, J.-M., Jeon, S.-B. and Choi, Y.-K. (2023), “Self-aware artificial auditory neuron with a triboelectric sensor for spike-based neuromorphic hardware”, Nano Energy, Vol. 109, p. 108322.

Zeng, Y. and Zhang, R. (2017), “Energy-efficient UAV communication with trajectory optimization”, IEEE Transactions on Wireless Communications, Vol. 16 No. 6, pp. 3747-3760.

Zhang, C., Hu, H., Fang, D. and Duan, J. (2020), “The CCD sensor video acquisition system based on FPGA&MCU”, 2020 IEEE 9th Joint International Information Technology and Artificial Intelligence Conference (ITAIC), IEEE, pp. 995-999.

Zhang, F., Zhang, Q., Zhang, D., Lu, Y., Liu, Q. and Wang, P. (2014), “Biosensor analysis of natural and artificial sweeteners in intact taste epithelium”, Biosensors and Bioelectronics, Vol. 54, pp. 385-392.

Zhang, T., Liu, H., Jiang, L., Fan, S. and Yang, J. (2012), “Development of a flexible 3-D tactile sensor system for anthropomorphic artificial hand”, IEEE Sensors Journal, Vol. 13 No. 2, pp. 510-518.

Zhang, X. and Gan, R.Z. (2011), “A comprehensive model of human ear for analysis of implantable hearing devices”, IEEE Transactions on Biomedical Engineering, Vol. 58 No. 10, pp. 3024-3027.

Zhang, Z. and Scaramuzza, D. (2018), “Perception-aware receding horizon navigation for MAVs”, 2018 IEEE International Conference on Robotics and Automation (ICRA), IEEE, pp. 2534-2541.

Zhou, Y., Zhao, J., Lu, P., Wang, Z. and He, B. (2023), “TacSuit: a wearable large-area, bioinspired multi-modal tactile skin for collaborative robots”, IEEE Transactions on Industrial Electronics, Vol. 71 No. 2.

Zhu, A.Z., Yuan, L., Chaney, K. and Daniilidis, K. (2019), “Unsupervised event-based learning of optical flow, depth, and egomotion”, Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 989-997.

Zhu, L., Dong, S., Huang, T. and Tian, Y. (2022), “Ultra-high temporal resolution visual reconstruction from a fovea-like spike camera via spiking neuron model”, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 45 No. 1, pp. 1233-1249.

Acknowledgements

This work was supported in part by the National Natural Science Foundation of China (Nos. U2013602, 62088101, 51975415, 52002286 and 61825303), in part by the National Key Research and Development Program of China (No. 2020AAA0108905), in part by the Science and Technology Commission of Shanghai Municipality (Nos. 2021SHZDZX0100 and 22ZR1467100) and in part by the Fundamental Research Funds for the Central Universities.

Declaration of competing interest: The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Corresponding author

Bin He can be contacted at: hebin@tongji.edu.cn
