PERSPECTIVE article

Front. Neurorobot., 11 December 2018
Volume 12 - 2018 | https://doi.org/10.3389/fnbot.2018.00084

Feel-Good Robotics: Requirements on Touch for Embodiment in Assistive Robotics

  • 1Elastic Lightweight Robotics, Department of Electrical Engineering and Information Technology, Robotics Research Institute, Technische Universität Dortmund, Dortmund, Germany
  • 2Institute for Mechatronic Systems, Mechanical Engineering, Technische Universität Darmstadt, Darmstadt, Germany
  • 3Neuroinformatics Group, Center of Excellence Cognitive Interaction Technology, Bielefeld University, Bielefeld, Germany
  • 4German Research Center for Artificial Intelligence, Robotics Innovation Center, Bremen, Germany
  • 5Robotics Group, University of Bremen, Bremen, Germany
  • 6Department of Cognitive and Clinical Neuroscience, Medical Faculty Mannheim, Central Institute of Mental Health, Heidelberg University, Mannheim, Germany
  • 7Department of Health Science and Technology, Faculty of Medicine, Center for Sensory-Motor Interaction, Aalborg University, Aalborg, Denmark
  • 8School of Applied Psychology, Institute Humans in Complex Systems, University of Applied Sciences and Arts Northwestern Switzerland, Olten, Switzerland
  • 9Delft Haptics Lab, Department of Cognitive Robotics, Faculty 3mE, Delft University of Technology, Delft, Netherlands
  • 10DLR German Aerospace Center, Institute of Robotics and Mechatronics, Oberpfaffenhofen, Germany
  • 11Cognitive Neuropsychology, Department of Psychology, University of Zurich, Zurich, Switzerland

The feeling of embodiment, i.e., experiencing the body as belonging to oneself and being able to integrate objects into one's bodily self-representation, is a key aspect of human self-consciousness and has been shown to importantly shape human cognition. An extension of such feelings toward robots has been argued as being crucial for assistive technologies aiming at restoring, extending, or simulating sensorimotor functions. Empirical and theoretical work illustrates the importance of sensory feedback for the feeling of embodiment and immersion; here, we focus on the perceptual level of touch and the role of tactile feedback in various assistive robotic devices. We critically review how different facets of tactile perception in humans, i.e., affective, social, and self-touch, might influence embodiment. This is particularly important as current assistive robotic devices, such as prostheses, orthoses, exoskeletons, and devices for teleoperation, often limit touch to low-density and spatially constrained haptic feedback, i.e., the mere touch sensation linked to an action. Here, we analyze, discuss, and propose how and to what degree tactile feedback might increase the embodiment of certain robotic devices, e.g., prostheses, and the feeling of immersion in human-robot interaction, e.g., in teleoperation. Based on recent findings from cognitive psychology on interactive processes between touch and embodiment, we discuss technical solutions for specific applications, which might be used to enhance embodiment, and facilitate the study of how embodiment might alter human-robot interactions. We postulate that high-density and large-surface sensing and stimulation are required to foster embodiment of such assistive devices.

1. Introduction

Due to recent societal trends and technical improvements, assistive robots, i.e., devices that enable or support the performance of a functional task, have gained increased importance in various applications such as prostheses, orthoses, exoskeletons, and devices for teleoperation (Dollar and Herr, 2008; Beckerle et al., 2017; Veneman et al., 2017; Fani et al., 2018). A central issue in assistive robotics is to what extent the device might and should be integrated into the user's bodily self-representation, which has been shown to be highly plastic (e.g., Moseley et al., 2012). It has previously been argued that particularly robots aiming at restoring, improving, or enhancing human sensorimotor functions might benefit from enhanced embodiment (Pazzaglia and Molinari, 2016; Makin et al., 2017; Niedernhuber et al., 2018). This concerns ownership, i.e., the sensation that an object belongs to the body, as well as agency, i.e., the feeling of being in control of an object's movements (Synofzik et al., 2008). Thus, the crucial question is: how should assistive robots be designed for optimized integration with the user's body?

There has been a keen interest within psychological, neuroscientific, and philosophical studies to investigate to what extent an external object might become part of oneself. Many of these studies focused on the investigation of tool use. Importantly, due to the nature of tool use, this literature largely focuses on motor aspects and visuomotor contingencies. Yet, at least since the seminal study on the rubber hand illusion (Botvinick and Cohen, 1998), research has increasingly investigated the influence of sensory feedback and multisensory processing on the experience of the bodily self. Multisensory integration seems to be a crucial underpinning of the illusion (Bremner and Spence, 2017). Interestingly, a large body of literature suggests that various aspects of touch might differently modulate the feeling of ownership in rubber-hand-illusion-like setups. For example, illusory ownership of a virtual hand can be elicited when participants actively touch the limb (Hara et al., 2015), i.e., during self-touch. Specific, low-speed tactile stimulation that activates unmyelinated afferents and is associated with positive feelings, i.e., affective touch (Löken et al., 2009), was observed to increase illusory ownership of a rubber hand (Crucianelli et al., 2013, 2017; van Stralen et al., 2014). These results suggest subtle and complex influences of multi-faceted tactile information on the sense of self and the integration of an external object into the person's bodily self-representation. The network of receptors in the human skin is spatially distributed and provides high-density information.

The majority of contemporary tactile feedback techniques, however, focus on low-density and spatially constrained tactile feedback related to active touch of external objects, typically through the fingers and hand (Antfolk et al., 2013b; Schofield et al., 2014; Svensson et al., 2017; Stephens-Fripp et al., 2018). Yet, this by no means covers all facets of the tactile afferent signals humans typically get from their body when interacting with their environment. Various interaction scenarios also rely on passive touch, i.e., being touched either by oneself (self-touch) or by someone else (social/interpersonal touch). Beyond purely functional information about the consequence of a self-generated action, passive and especially social touch might be important in non-verbal communication, for example, in transferring emotional states (Hertenstein et al., 2006, 2009). The required spatial resolution is not yet known, which highlights the limitations of most current technologies, but we think that cutting-edge sensing and stimulation devices could help to explore this, as argued in detail below.

These facets of touch have preliminarily been investigated in scenarios where they are mediated through a human-machine interface and related to prosthetics, telerobotics, and assistance (Haans and IJsselsteijn, 2009; Hara et al., 2015; Huisman, 2017), but are far from being fully understood. Additionally, the applicability of the different facets might depend on the application domain: while it intuitively makes sense that affective tactile feedback might enhance the integration of a prosthesis into the bodily self and might foster more natural interpersonal interaction, other assistive devices, e.g., for teleoperation in hazardous areas, might not require integration into their user's bodily self, but might nevertheless benefit from more natural feedback to allow for intuitive interaction. Thus, a careful look into the requirements regarding tactile feedback in different domains is indispensable in order to design robots that actually 'feel good' to their users.

Here, we will discuss to what degree engineering well-integrated tactile feedback could increase the integration of assistive robotic devices such as prostheses (Rosén et al., 2009; Marasco et al., 2011; Bensmaia and Miller, 2014), exoskeletons (Avizzano and Bergamasco, 1999; O'Malley and Gupta, 2008; Frisoli et al., 2009; Ben-Tzvi and Ma, 2015; Mallwitz et al., 2015; Shokur et al., 2016; Planthaber et al., 2018), and telerobotic devices (Gomez-Rodriguez et al., 2011; Gallo et al., 2012; Sengül et al., 2012; Pamungkas and Ward, 2014; Weber and Eichberger, 2015) into their users' body representations. Possibly, increased experience of the bodily self facilitates the use and increases the acceptance and performance of such devices (D'Alonzo and Cipriani, 2012; D'Alonzo et al., 2015; Imaizumi et al., 2016). Furthermore, we consider virtual reality (VR) training approaches as a flexible and low-cost measure (Holden, 2005) to support and enhance users in interacting with assistive robots.

We give theoretical and technical perspectives and suggestions to develop human-robot interactions that embrace affective and social facets of touch as well as self-touch. To this end, we argue that a shift toward spatially distributed and high-density tactile sensing and stimulation would open up new avenues to empirically investigate user-oriented assistance in applied sciences as well as in neuroscientific and psychological research. Finally, the integration of different touch principles might enhance usability and user experience. We discuss how tactile feedback relates to the perception of robotic devices on a perceptual level, how it influences bodily self-experience, and how it is shaped by human-machine interfaces. The subsequent sections discuss potential technologies and applications of high-density bidirectional interfaces with tactile stimulation over large surfaces and their relation to bodily self-experience.

2. Technical Solutions for Stimulation and Sensing Devices

We argue that the provision of artificial touch feedback enhances bodily self-experience and that the integration of assistive devices into the bodily self improves control (Castellini et al., 2014; Beckerle, 2017). This is highly desirable whenever a human user must learn to use a robotic device as if it were a real extension of his or her body rather than a tool (Hahne et al., 2017). Hence, exploring the functions of the human sense of touch is an important research topic, and it still requires substantial technological advancements in several aspects of human-machine interfacing (HMI). From the engineer's point of view, all facets of touch can be realized via tactile sensing to detect the act of being touched (Dahiya et al., 2010; Zou et al., 2017) and tactile stimulation to elicit the feeling of being touched (Franceschi et al., 2017). To provide all required facets of touch, the HMI must mimic the human sense of touch and implement highly sensitive, distributed sensing and stimulation. Therefore, an ideal HMI comprises an integrated shape-conformable, high-density, large-surface tactile stimulator and tactile sensor with characteristics similar to those of the human skin (Dahiya et al., 2010). First works have attempted to provide such HMIs (Kim et al., 2014), but many open research questions remain, as outlined in the remainder of this article. Interestingly, robotics research has already started to use whole-body artificial skin to endow humanoid robots with a more human-like body experience, which is also used for control purposes, e.g., implementing a safety margin around the robot or supporting it in reaching objects (Roncone et al., 2016).

2.1. Tactile Stimulation

Tactile stimulation is mainly applied to restore or transmit tactile feedback to a human. Methods for restoring tactile feedback have, for instance, been investigated in prosthetics (Li et al., 2017). However, the focus in previous work has been on providing active touch to improve manipulation or grasping – for example, using force feedback to allow manipulation of delicate objects. So far, affective and social touch have essentially not been considered. Tactile sensations can be elicited using invasive (Raspopovic et al., 2014) and non-invasive (Li et al., 2017) electrical stimulation to depolarize cutaneous afferents or by mechanical stimulation, e.g., vibration motors, force, and torque applicators (Schofield et al., 2014), to directly activate skin mechanoreceptors. However, most of the presented feedback systems are rather simple and include a single stimulator delivering stimuli related to one variable of the assistive robot only, e.g., grasping force of a prosthesis and/or hand aperture (Antfolk et al., 2013a). A few multichannel interfaces have been presented recently, coding feedback information through careful positioning of the stimulators (spatial coding) or using a combination of spatial and parameter modulation (mixed coding) (Dosen et al., 2017; Strbac et al., 2017).
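To make the difference between spatial and mixed coding concrete, the following minimal sketch (hypothetical Python/NumPy code, not taken from the cited systems) maps a normalized grasp-force value onto a small stimulator array: spatial coding selects only which channel is active, whereas mixed coding additionally modulates the intensity of that channel.

```python
import numpy as np

N_CHANNELS = 8  # hypothetical number of stimulation sites arranged around the forearm


def spatial_coding(force, n_channels=N_CHANNELS):
    """Spatial coding: the feedback variable is encoded purely by *where*
    stimulation is delivered; the intensity of the active channel stays fixed."""
    force = float(np.clip(force, 0.0, 1.0))
    channel = min(int(force * n_channels), n_channels - 1)
    pattern = np.zeros(n_channels)
    pattern[channel] = 1.0  # constant, comfortable stimulation level
    return pattern


def mixed_coding(force, n_channels=N_CHANNELS):
    """Mixed coding: the coarse force level selects the channel, while the
    residual within that level modulates the intensity on that channel."""
    force = float(np.clip(force, 0.0, 1.0))
    scaled = force * n_channels
    channel = min(int(scaled), n_channels - 1)
    residual = scaled - channel  # position of the force within the channel's bin
    pattern = np.zeros(n_channels)
    pattern[channel] = 0.3 + 0.7 * residual  # vary between a weak and a strong level
    return pattern


# Example: a mid-range grasp force activates the same channel in both schemes,
# but only mixed coding also conveys where in that bin the force lies.
print(spatial_coding(0.55))
print(mixed_coding(0.55))
```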

Human-to-human social and affective touch is typically delivered in the form of distributed, non-stationary pressure patterns, e.g., a handshake, and/or gentle motions across the skin, e.g., a caress. To realistically emulate this type of interaction, a specific tactile interface is required, providing high-density stimulation through a dense network of spatially distributed stimulation channels, potentially covering large areas of the user's limb. Electrotactile stimulation is particularly suitable for this application due to its compactness, low power consumption, and rather immediate activation of cutaneous afferents. However, it needs to be considered that electrical stimulation requires calibration as the elicited sensations depend on body location. In addition, if not properly applied, the stimulation can be uncomfortable. Recently, flexible matrix electrodes for electrotactile stimulation have been presented and tested (Štrbac et al., 2016; Franceschi et al., 2017). The electrodes can be printed with a desired distribution and density of stimulation points.
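Since the elicited sensations depend on body location and electrode placement, each pad of such a matrix electrode typically has to be calibrated for the individual user. The sketch below shows one simple way this could be organized, an ascending-staircase search for a usable amplitude range per pad; the callbacks deliver_pulse, user_feels, and user_uncomfortable are hypothetical placeholders for stimulator hardware and user responses, not an existing API.

```python
def calibrate_pad(deliver_pulse, user_feels, user_uncomfortable,
                  start_ma=0.5, step_ma=0.2, max_ma=8.0):
    """Search for a usable amplitude range [sensation threshold, comfort limit] on one pad.

    deliver_pulse(amplitude_ma): apply a short test pulse at the given current.
    user_feels() / user_uncomfortable(): report the user's response to the last pulse.
    Returns (threshold_ma, comfort_limit_ma), or None if no sensation was elicited.
    """
    amplitude = start_ma
    threshold = None
    while amplitude <= max_ma:
        deliver_pulse(amplitude)
        if threshold is None and user_feels():
            threshold = amplitude  # lowest amplitude the user can detect
        if user_uncomfortable():
            # back off one step so stimulation always stays below the discomfort level
            return (threshold, amplitude - step_ma) if threshold is not None else None
        amplitude += step_ma
    return (threshold, max_ma) if threshold is not None else None
```

During operation, each pad's commanded intensity would then be scaled into its individual range between sensation threshold and comfort limit.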

2.2. Tactile Sensing

Flexible sensors providing distributed tactile force measurement (artificial skins) have already been presented (Kim et al., 2014; Büscher et al., 2015) and are potentially able to capture the complex mechanical interaction characteristics of social and affective touch. In order to endow an anthropomorphic robotic system such as a hand prosthesis with artificial touch, the tactile sensors need to cover a two-dimensionally curved surface (Kim et al., 2014), while maintaining high resolution and sensitivity. First promising attempts in this direction have been made by combining conductive elastomers with molded interconnect devices (MID) (Kõiva et al., 2013). The sensors also need to withstand the wear typical of affective touch situations such as petting and stroking, where the surfaces slide against each other. Overall, maintenance cycles, if any, need to be far apart for the technical systems to be of additional benefit to the user. Many current tactile sensor developments capture only the forces normal to the surface; we argue that, for affective touch sensing, capturing shear forces is crucial, as affective touch often involves sliding. To the best of our knowledge, no single sensor exists at present that achieves all the desired features in one device: normal and shear force sensing, high spatial resolution and sensitivity, and the capability of covering curved surfaces, while being robust enough to be used outside confined laboratory setups.
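To illustrate why sliding contacts, and hence shear sensing, matter for affective touch, the following sketch (hypothetical code, assuming the sensor provides per-frame contact centroids, e.g., force-weighted taxel positions) estimates the velocity of a stroke moving across a taxel array and checks whether it falls in the range typically reported as pleasant and optimal for C-tactile afferents, roughly 1-10 cm/s (cf. Löken et al., 2009).

```python
import numpy as np


def stroke_velocity(centroids_cm, timestamps_s):
    """Estimate the speed of a contact moving across a taxel array.

    centroids_cm: (T, 2) array of contact-centroid positions in cm, e.g., the
                  force-weighted mean taxel position per frame.
    timestamps_s: (T,) array of sample times in seconds.
    Returns the mean speed in cm/s.
    """
    centroids_cm = np.asarray(centroids_cm, dtype=float)
    timestamps_s = np.asarray(timestamps_s, dtype=float)
    displacements = np.linalg.norm(np.diff(centroids_cm, axis=0), axis=1)
    return float(np.sum(displacements) / np.sum(np.diff(timestamps_s)))


def is_ct_optimal(speed_cm_s, low=1.0, high=10.0):
    """Check whether a stroking speed lies in the range typically reported as
    pleasant / C-tactile-optimal (roughly 1-10 cm/s, cf. Löken et al., 2009)."""
    return low <= speed_cm_s <= high


# Example: a 4 cm stroke across the array within one second
trace = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0), (4.0, 0.0)]
times = [0.0, 0.25, 0.5, 0.75, 1.0]
speed = stroke_velocity(trace, times)
print(speed, is_ct_optimal(speed))
```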

2.3. Integration of Sensing and Stimulation

Tactile sensing and stimulation need to be properly integrated. The data measured by the sensor need to be coded into stimulation profiles delivered by the stimulator. A particular challenge is to accommodate the abundance of tactile data (spatially distributed and high-density). Recently, a prototype system has been demonstrated for transmitting tactile information captured by an artificial skin sensor (64 taxels, i.e., tactile pixels) to a human participant using electrotactile stimulation through a matrix of 32 pads arranged around the forearm. As there were more taxels than pads, the information from several neighboring taxels was fused and mapped to a single pad. Despite using this compromise to cope with current technical limitations, the experimental results showed that human users could recognize a range of shapes (letters, geometries, and lines), retrace the exact trajectory, and guess the direction of movement of the tactile stimulus (Franceschi et al., 2017). This is an encouraging result for the prospects of inducing and restoring affective and social touch: ideally, repeated, coherent, and discernible tactile stimulus patterns 'perceived' by the robot could be forwarded to the human via a biologically plausible mapping of tactile stimulation. What this mapping should look like, as well as the desirable psychophysical properties of high-density stimulation, are still open questions. Moreover, since a tactile sensation is often associated with a characteristic thermal sensation, e.g., coldness for metal surfaces or warmth for human touch, first studies have investigated multimodal feedback including thermal stimulation (Gallo et al., 2012; Pacchierotti et al., 2017).
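A minimal sketch of such a taxel-to-pad reduction is given below. The 8x8 taxel grid and the 8x4 pad layout are illustrative assumptions, since the cited prototype only specifies 64 taxels and 32 pads, and the actual fusion rule used by Franceschi et al. (2017) may differ.

```python
import numpy as np


def taxels_to_pads(taxel_frame, pad_shape=(8, 4), reduce=np.max):
    """Fuse an 8x8 taxel frame down to a 32-pad stimulation pattern.

    taxel_frame: (8, 8) array of taxel readings (e.g., normal force per taxel).
    pad_shape:   layout of the electrode matrix, here assumed to be 8x4 = 32 pads.
    reduce:      how neighboring taxels are fused (np.max keeps sharp contacts,
                 np.mean smooths them).
    Returns an array of shape pad_shape with one stimulation intensity per pad.
    """
    taxel_frame = np.asarray(taxel_frame, dtype=float)
    rows, cols = taxel_frame.shape
    pr, pc = pad_shape
    assert rows % pr == 0 and cols % pc == 0, "taxel grid must tile the pad grid"
    block = taxel_frame.reshape(pr, rows // pr, pc, cols // pc)
    pads = reduce(block, axis=(1, 3))
    # Normalize to 0..1 so each pad can be scaled into its calibrated amplitude range.
    peak = pads.max()
    return pads / peak if peak > 0 else pads


# Example: a single contact near the upper-left corner of the taxel array
frame = np.zeros((8, 8))
frame[1, 1] = 2.0
print(taxels_to_pads(frame))
```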

3. Applications in Assistive Robotics

We believe that the ability to control an assistive robotic device distinctly influences its usage, acceptance, and integration into the user's bodily self. Moreover, corresponding sensory feedback, as a main part of the sensory-motor loop, is highly relevant for a person to perceive a body part as belonging to oneself. This section discusses how high-density tactile feedback does or could potentially improve different applications of assistive robotics, organized by the relevant facets of touch.

3.1. Active Touch

Large-surface tactile sensing is an important aspect of modern robotics. Although grippers with high-resolution sensors are able to manipulate and recognize objects by tactile guidance alone (Aggarwal et al., 2015), these data are typically used only by the robot's control algorithms and not provided to the human user. Directly conveying active touch could enhance the operator's control over the system and understanding of the environment. Similarly, exoskeletons that assist patients with disabilities often lack tactile feedback, simplify it distinctly, or make use of sensory substitution, e.g., substituting tactile by vibrotactile feedback (Shokur et al., 2016). Certain sensory substitution approaches try to provide multimodal tactile feedback, e.g., including pressure, vibration, shear force, and temperature (Kim et al., 2010). Such multimodal stimulation could reestablish complex active touch sensations in healthy individuals and patients with somatosensory deprivation by either providing somatotopically matching haptic stimulation (Kim et al., 2010) or by stimulating unaffected body parts with unrestricted sensation (Meek et al., 1989). We argue that, especially in the latter case, high-density and large-surface interfaces could enhance touch sensation, possibly by fostering sensory remapping. Natural touch sensation would not only improve the overall experience, but could further enhance learning of simple (Bark et al., 2015) as well as complex motor behavior (Sigrist et al., 2015). This is especially true when combined with VR-based training scenarios, which currently often lack tactile feedback as well (Weiss et al., 2006; Bovet et al., 2018).
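As a toy illustration of these two routing options, the sketch below (hypothetical site names and scaling, not taken from the cited systems) converts per-fingertip contact forces into drive levels for vibrotactile stimulators, either at somatotopically matching sites or remapped to sites on an unaffected forearm.

```python
# Hypothetical sensor sites on a sensorized (prosthetic) hand and stimulator sites on the body.
SENSOR_SITES = ["thumb_tip", "index_tip", "middle_tip", "ring_tip", "little_tip"]

# Somatotopic routing: each fingertip drives a stimulator at its matching site on the residual limb.
SOMATOTOPIC_MAP = {s: s.replace("_tip", "_site_residual_limb") for s in SENSOR_SITES}
# Remapped routing: fingertips drive stimulators on an unaffected body part, e.g., the other forearm.
REMAPPED_MAP = {s: f"forearm_motor_{i}" for i, s in enumerate(SENSOR_SITES)}


def substitution_pattern(forces_n, mapping, max_force_n=10.0):
    """Convert per-finger contact forces (newtons) into vibrotactile drive levels (0..1)
    at the stimulator sites defined by `mapping`."""
    return {mapping[site]: min(max(force, 0.0) / max_force_n, 1.0)
            for site, force in forces_n.items()}


# Example: a two-finger pinch sensed at the thumb and index finger.
forces = {"thumb_tip": 3.0, "index_tip": 6.5, "middle_tip": 0.0,
          "ring_tip": 0.0, "little_tip": 0.0}
print(substitution_pattern(forces, SOMATOTOPIC_MAP))
print(substitution_pattern(forces, REMAPPED_MAP))
```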

An enhancement of control is even more important for assistive devices that replace a body part, such as prostheses. Here, spatially distributed, high-density tactile feedback could provide the missing tactile sensation and allow for an integrated processing of visual and somatosensory feedback to restore the user's feeling of agency and the perceived integrity of the body, as investigations with rather simple vibrotactile feedback suggest (Prewett et al., 2012; Witteveen et al., 2015). It has been shown that amputees perceive a varying degree of ownership for their prosthetic devices (Kern et al., 2009), which has been proposed to be an important factor for prosthesis acceptance (Ehrsson et al., 2008). Touch might play a key role in this process: recent studies show that synchronous touches applied to both the residual limb and a prosthetic glove (Ehrsson et al., 2008) as well as physiologically appropriate cutaneous feedback (Marasco et al., 2011) are capable of inducing vivid ownership sensations for a prosthesis, which then becomes not merely a tool but an integrated part of the body (Graczyk et al., 2018). It turns out that the synchrony of stimulation and context (Rohde et al., 2011; Bekrater-Bodmann et al., 2012) as well as the synchrony of multimodal sensory input (Choi et al., 2016) are of key importance.

3.2. Passive Touch

Passive touch is highly relevant for various domains of assistive robotics. Examples are telerobotics, e.g., when visual sight is very limited or simply unavailable under water, and the everyday use of robotic prostheses, e.g., when being touched in the dark or outside the field of view. Thus, an artificial skin to detect contacts with the environment should cover the whole surface of the robotic device. Such an approach could enhance the robot's, and thus also the user's or teleoperator's, insight into the actual environmental situation. Therefore, the HMI might cover large areas of the user's body to haptically mirror the robot's surface to the user. This could foster the integration of a teleoperated robot's body into the operator's self-representation for higher immersion and thus better control, and could contribute to the restoration of perceived body integrity of individuals using prostheses. Despite technical feasibility, HMI design would then also need to consider the cognitive burden on the human processing the various feedback signals (Beckerle et al., 2017).

3.3. Affective and Social Touch

Providing feedback with affective qualities might also contribute to the integration of robotic devices into the bodily self-representation, similarly to recent observations in the rubber hand illusion paradigm (Crucianelli et al., 2013, 2017; van Stralen et al., 2014). The application of affective touch feedback in prosthesis users is promising, since the skin of the residual limb potentially remains sensitive to those signals. Hence, we expect prosthetic applications to strongly benefit from high-density and large-surface feedback.

Social touch is at the basis of daily interpersonal interaction and non-verbal communication and first tailored haptic stimulators are under development (Culbertson et al., 2018). Since social touch can support human-human interaction, it might be helpful in robot-assisted caregiving contexts as well as in enhancing the acceptance of prosthetic devices, particularly for the upper extremity, as prior findings show that successful prosthesis use in terms of satisfaction and frequency is positively associated with social integration (Ham and Cotton, 2013).

Today, VR-based training scenarios help to improve user control in teleoperation and exoskeleton-mediated assistance (Lugo-Villeda et al., 2009; Folgheraiter et al., 2012; Fani et al., 2018). While haptic guidance supports the users' trajectory tracking capabilities (Lugo-Villeda et al., 2009), the support of a therapist could be mimicked by introducing the feeling of him or her touching the patient to prevent unwanted movements. Such social touch must be distinguishable from the forces introduced to guide the patient and could further improve patient support, especially in VR-based therapy applications.

3.4. Self-Touch

Self-touch is often neglected, but it is important for establishing and maintaining our own body representation (Bremner and Spence, 2017). Moreover, this kind of touch has been shown to increase body ownership (Hara et al., 2015), and first preliminary studies have extended such scenarios to rather complex situations of self-touch (Dieguez et al., 2009; Huynh et al., 2018). Research on embodiment in VR also shows that correct self-contact is more important for the feeling of embodiment than the mapping of movements (Bovet et al., 2018). Yet, tactile feedback is often limited to the hand or single body parts, which was shown to constrain the implementation of passive self-touch in VR (Bovet et al., 2018).

4. Conclusion and Outlook

This paper points out the importance of various facets of tactile feedback for the embodiment of assistive robotic devices. Notably, self-touch, affective touch, and social touch should be considered as they modulate embodiment on different levels and ultimately concern psychosocial factors that determine how the device will be used in real life. To communicate this diverse tactile information between human and machine, sensing and stimulation need to be spatially distributed and provided at high density. Our review shows that several promising technologies such as flexible tactile sensors or electrotactile stimulation exist, but they need to be improved and integrated into one interface. The design of the non-stationary and complex stimulation patterns that provide the actual tactile feedback might take human-human interaction as a model.

Finally, we expect that versatile and realistic tactile feedback covering the different facets of touch will enhance the usability and user experience of assistive robots. We advocate the inclusion of self-touch, affective touch, and social touch to provide interfaces that approximate the capabilities of human skin. We recommend fundamental research toward a practical framework for the psychology of tool use that can provide guidelines for engineering for embodiment. A tight collaboration of engineering and psychology will be needed to devise experimental protocols as well as day-to-day interaction strategies between machines and humans. Advanced touch technology embedded into robotic artifacts will likely be the main tool to foster a real synergy with users.

Author Contributions

PB coordinated the development of the paper and the integration of individual contributions. All authors contributed content, perspectives, and references as well as discussed and revised the manuscript.

Funding

This work received support from the German Research Foundation (DFG) through the projects “Users Body Experience and Human-Machine Interfaces in (Assistive) Robotics” (no. BE 5729/3&11) and “TACT-HAND: improving control of prosthetic hands using tactile sensors and realistic machine learning” (no. CA 1389/1), from the Independent Research Fund Denmark through the project ROBIN (no. 8022-00243A), and from the Swiss National Science Foundation (no. 170511). The support by the German Research Foundation and the Open Access Publishing Fund of Technische Universität Darmstadt, Germany, is acknowledged.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

Aggarwal, A., Kampmann, P., Lemburg, J., and Kirchner, F. (2015). Haptic object recognition in underwater and deep-sea environments. J. Field Robot. 32, 167–185. doi: 10.1002/rob.21538

Antfolk, C., D'Alonzo, M., Rosén, B., Lundborg, G., Sebelius, F., and Cipriani, C. (2013a). Sensory feedback in upper limb prosthetics. Exp. Rev. Med. Devices 10, 45–54. doi: 10.1586/erd.12.68

Antfolk, C., D'Alonzo, M., Controzzi, M., Lundborg, G., Rosen, B., Sebelius, F., et al. (2013b). Artificial redirection of sensation from prosthetic fingers to the phantom hand map on transradial amputees: vibrotactile versus mechanotactile sensory feedback. IEEE Trans. Neural Syst. Rehabil. Eng. 21, 112–120. doi: 10.1109/TNSRE.2012.2217989

Avizzano, C. A., and Bergamasco, M. (1999). “Haptic interfaces: a new interaction paradigm,” in IEEE/RSJ International Conference on Intelligent Robots and Systems (Kyongju).

Bark, K., Hyman, E., Tan, F., Cha, E., Jax, S. A., Buxbaum, L. J., et al. (2015). Effects of vibrotactile feedback on human learning of arm motions. IEEE Trans. Neural Syst. Rehabil. Eng. 23, 51–63. doi: 10.1109/TNSRE.2014.2327229

Beckerle, P. (2017). Commentary: proceedings of the first workshop on peripheral machine interfaces: going beyond traditional surface electromyography. Front. Neurorobot. 11:32. doi: 10.3389/fnbot.2017.00032

Beckerle, P., Salvietti, G., Unal, R., Prattichizzo, D., Rossi, S., Castellini, C., et al. (2017). A human-robot interaction perspective on assistive and rehabilitation robotics. Front. Neurorobot. 11:24. doi: 10.3389/fnbot.2017.00024

Bekrater-Bodmann, R., Foell, J., Diers, M., and Flor, H. (2012). The perceptual and neuronal stability of the rubber hand illusion across contexts and over time. Brain Res. 1452, 130–139. doi: 10.1016/j.brainres.2012.03.001

Bensmaia, S. J., and Miller, L. E. (2014). Restoring sensorimotor function through intracortical interfaces: progress and looming challenges. Nat. Rev. Neurosci. 15:313. doi: 10.1038/nrn3724

Ben-Tzvi, P., and Ma, Z. (2015). Sensing and force-feedback exoskeleton (safe) robotic glove. IEEE Trans. Neural Syst. Rehabil. Eng. 23, 992–1002. doi: 10.1109/TNSRE.2014.2378171

Botvinick, M., and Cohen, J. (1998). Rubber hands 'feel' touch that eyes see. Nature 391:756. doi: 10.1038/35784

Bovet, S., Debarba, H. G., Herbelin, B., Molla, E., and Boulic, R. (2018). The critical role of self-contact for embodiment in virtual reality. IEEE Trans. Visual. Comput. Graph. 24, 1428–1436. doi: 10.1109/TVCG.2018.2794658

Bremner, A. J., and Spence, C. (2017). “The development of tactile perception,” in Advances in Child Development and Behavior, Vol. 52 (Cambridge, MA: Elsevier), 227–268. doi: 10.1016/bs.acdb.2016.12.002

Büscher, G. H., Kõiva, R., Schürmann, C., Haschke, R., and Ritter, H. J. (2015). Flexible and stretchable fabric-based tactile sensor. Robot. Auton. Syst. 63, 244–252. doi: 10.1016/j.robot.2014.09.007

Castellini, C., Artemiadis, P. K., Wininger, M., Ajoudani, A., Alimusaj, M., Bicchi, A., et al. (2014). Proceedings of the first workshop on peripheral machine interfaces: going beyond traditional surface electromyography. Front. Neurorobot. 8:22. doi: 10.3389/fnbot.2014.00022

Choi, W., Li, L., Satoh, S., and Hachimura, K. (2016). Multisensory integration in the virtual hand illusion with active movement. BioMed Res. Int. 2016:8163098. doi: 10.1155/2016/8163098

Crucianelli, L., Krahé, C., Jenkinson, P. M., and Fotopoulou, A. K. (2017). Interoceptive ingredients of body ownership: affective touch and cardiac awareness in the rubber hand illusion. Cortex 104, 180–192. doi: 10.1016/j.cortex.2017.04.018

Crucianelli, L., Metcalf, N. K., Fotopoulou, A., and Jenkinson, P. M. (2013). Bodily pleasure matters: velocity of touch modulates body ownership during the rubber hand illusion. Front. Psychol. 4:703. doi: 10.3389/fpsyg.2013.00703

Culbertson, H., Nunez, C. M., Israr, A., Lau, F., Abnousi, F., and Okamura, A. M. (2018). “A social haptic device to create continuous lateral motion using sequential normal indentation,” in IEEE Haptics Symposium (San Francisco, CA), 32–39.

Dahiya, R. S., Metta, G., Valle, M., and Sandini, G. (2010). Tactile sensing – from humans to humanoids. IEEE Trans. Robot. 26, 1–20. doi: 10.1109/TRO.2009.2033627

D'Alonzo, M., and Cipriani, C. (2012). Vibrotactile sensory substitution elicits feeling of ownership of an alien hand. PLoS ONE 7:e50756. doi: 10.1371/journal.pone.0050756

D'Alonzo, M., Clemente, F., and Cipriani, C. (2015). Vibrotactile stimulation promotes embodiment of an alien hand in amputees with phantom sensations. IEEE Trans. Neural Syst. Rehabil. Eng. 23, 450–457. doi: 10.1109/TNSRE.2014.2337952

Dieguez, S., Mervier, M. R., Newby, N., and Blanke, O. (2009). Feeling numbness for someone else's finger. Curr. Biol. 19, R1108–R1109. doi: 10.1016/j.cub.2009.10.055

Dollar, A. M., and Herr, H. (2008). Lower extremity exoskeletons and active orthoses: challenges and state-of-the-art. IEEE Trans. Robot. 24, 144–158. doi: 10.1109/TRO.2008.915453

Dosen, S., Markovic, M., Strbac, M., Belic, M., Kojic, C., Bijelic, G., et al. (2017). Multichannel electrotactile feedback with spatial and mixed coding for closed-loop control of grasping force in hand prostheses. IEEE Trans. Neural Syst. Rehabil. Eng. 25, 183–195. doi: 10.1109/TNSRE.2016.2550864

Ehrsson, H. H., Rosén, B., Stockselius, A., Ragnö, C., Köhler, P., and Lundborg, G. (2008). Upper limb amputees can be induced to experience a rubber hand as their own. Brain 131, 3443–3452. doi: 10.1093/brain/awn297

Fani, S., Ciotti, S., Catalano, M. G., Grioli, G., Tognetti, A., Valenza, G., et al. (2018). Simplifying telerobotics: wearability and teleimpedance improves human-robot interactions in teleoperation. IEEE Robot. Autom. Mag. 25, 77–88. doi: 10.1109/MRA.2017.2741579

Folgheraiter, M., Jordan, M., Straube, S., Seeland, A., Kim, S. K., and Kirchner, E. A. (2012). Measuring the improvement of the interaction comfort of a wearable exoskeleton. Int. J. Soc. Robot. 4, 285–302. doi: 10.1007/s12369-012-0147-x

Franceschi, M., Seminara, L., Došen, S., Štrbac, M., Valle, M., and Farina, D. (2017). A system for electrotactile feedback using electronic skin and flexible matrix electrodes: experimental evaluation. IEEE Trans. Hapt. 10, 162–172. doi: 10.1109/TOH.2016.2618377

Frisoli, A., Salsedo, F., Bergamasco, M., Rossi, B., and Carboncini, M. C. (2009). A force-feedback exoskeleton for upper-limb rehabilitation in virtual reality. Appl. Bionics Biomech. 6, 115–126. doi: 10.1155/2009/378254

Gallo, S., Santos-Carreras, L., Rognini, G., Hara, M., Yamamoto, A., and Higuchi, T. (2012). “Towards multimodal haptics for teleoperation: design of a tactile thermal display,” in IEEE International Workshop on Advanced Motion Control (Sarajevo).

Gomez-Rodriguez, M., Peters, J., Hill, J., Schölkopf, B., Gharabaghi, A., and Grosse-Wentrup, M. (2011). Closing the sensorimotor loop: haptic feedback facilitates decoding of motor imagery. J. Neural Eng. 8:036005. doi: 10.1088/1741-2560/8/3/036005

Graczyk, E. L., Resnik, L., Schiefer, M. A., Schmitt, M. S., and Tyler, D. J. (2018). Home use of a neural-connected sensory prosthesis provides the functional and psychosocial experience of having a hand again. Sci. Rep. 8:9866. doi: 10.1038/s41598-018-26952-x

Haans, A., and IJsselsteijn, W. A. (2009). The virtual Midas touch: helping behavior after a mediated social touch. IEEE Trans. Haptics 2, 136–140. doi: 10.1109/TOH.2009.20

Hahne, J. M., Markovic, M., and Farina, D. (2017). User adaptation in myoelectric man-machine interfaces. Sci. Rep. 7:4437. doi: 10.1038/s41598-017-04255-x

Ham, R., and Cotton, L. T. (2013). Limb Amputation: From Aetiology to Rehabilitation. Boston, MA: Springer.

Hara, M., Pozeg, P., Rognini, G., Higuchi, T., Fukuhara, K., Yamamoto, A., et al. (2015). Voluntary self-touch increases body ownership. Front. Psychol. 6:1509. doi: 10.3389/fpsyg.2015.01509

Hertenstein, M. J., Holmes, R., McCullough, M., and Keltner, D. (2009). The communication of emotion via touch. Emotion 9:566. doi: 10.1037/a0016108

Hertenstein, M. J., Keltner, D., App, B., Bulleit, B. A., and Jaskolka, A. R. (2006). Touch communicates distinct emotions. Emotion 6:528. doi: 10.1037/1528-3542.6.3.528

Holden, M. K. (2005). Virtual environments for motor rehabilitation. Cyberpsychol. Behav. 8, 187–211. doi: 10.1089/cpb.2005.8.187

Huisman, G. (2017). Social touch technology: a survey of haptic technology for social touch. IEEE Trans. Haptics 10, 391–408. doi: 10.1109/TOH.2017.2650221

Huynh, T. V., Scherf, A., Bittner, A., Saetta, G., Lenggenhager, B., and Beckerle, P. (2018). “Design of a wearable robotic hand to investigate multisensory illusions and the bodily self of humans,” in International Symposium on Robotics (Munich).

Imaizumi, S., Asai, T., and Koyama, S. (2016). Embodied prosthetic arm stabilizes body posture, while unembodied one perturbs it. Conscious. Cogn. 45, 75–88. doi: 10.1016/j.concog.2016.08.019

Kern, U., Busch, V., Rockland, M., Kohl, M., and Birklein, F. (2009). Prävalenz und Risikofaktoren von Phantomschmerzen und Phantomwahrnehmungen in Deutschland [Prevalence and risk factors of phantom limb pain and phantom sensations in Germany]. Der Schmerz 23, 479–488. doi: 10.1007/s00482-009-0786-5

Kim, J., Lee, M., Shim, H. J., Ghaffari, R., Cho, H. R., Son, D., et al. (2014). Stretchable silicon nanoribbon electronics for skin prosthesis. Nat. Commun. 5:5747. doi: 10.1038/ncomms6747

Kim, K., Colgate, J. E., Santos-Munné, J. J., Makhlin, A., and Peshkin, M. A. (2010). On the design of miniature haptic devices for upper extremity prosthetics. IEEE/ASME Trans. Mechatr. 15, 27–39. doi: 10.1109/TMECH.2009.2013944

Kõiva, R., Zenker, M., Schürmann, C., Haschke, R., and Ritter, H. J. (2013). “A highly sensitive 3d-shaped tactile sensor,” in IEEE/ASME International Conference on Advanced Intelligent Mechatronics (Wollongong).

Li, K., Fang, Y., Zhou, Y., and Liu, H. (2017). Non-invasive stimulation-based tactile sensation for upper-extremity prosthesis: a review. IEEE Sens. J. 17, 2625–2635. doi: 10.1109/JSEN.2017.2674965

Löken, L. S., Wessberg, J., Morrison, I., McGlone, F., and Olausson, H. (2009). Coding of pleasant touch by unmyelinated afferents in humans. Nat. Neurosci. 12:547. doi: 10.1038/nn.2312

Lugo-Villeda, L. I., Frisoli, A., Sandoval-Gonzalez, O., Padilla, M. A., Parra-Vega, V., Avizzano, C. A., et al. (2009). “Haptic guidance of light-exoskeleton for arm-rehabilitation tasks,” in IEEE International Symposium on Robot and Human Interactive Communication (Toyama).

Makin, T. R., de Vignemont, F., and Faisal, A. A. (2017). Neurocognitive barriers to the embodiment of technology. Nat. Biomed. Eng. 1:0014. doi: 10.1038/s41551-016-0014

Mallwitz, M., Will, N., Teiwes, J., and Kirchner, E. A. (2015). “The capio active upper body exoskeleton and its application for teleoperation,” in Proceedings of the 13th Symposium on Advanced Space Technologies in Robotics and Automation. ESA/Estec Symposium on Advanced Space Technologies in Robotics and Automation (ASTRA-2015) ESA (Noordwijk).

Marasco, P. D., Kim, K., Colgate, J. E., Peshkin, M. A., and Kuiken, T. A. (2011). Robotic touch shifts perception of embodiment to a prosthesis in targeted reinnervation amputees. Brain 134, 747–758. doi: 10.1093/brain/awq361

Meek, S. G., Jacobsen, S. C., and Goulding, P. P. (1989). Extended physiologic taction: design and evaluation of a proportional force feedback system. J. Rehabil. Res. Dev. 26, 53–62.

Moseley, G. L., Gallace, A., and Spence, C. (2012). Bodily illusions in health and disease: physiological and clinical perspectives and the concept of a cortical ‘body matrix.' Neurosci. Biobehav. Rev. 36, 34–46. doi: 10.1016/j.neubiorev.2011.03.013

Niedernhuber, M., Barone, D. G., and Lenggenhager, B. (2018). Prostheses as extensions of the body: progress and challenges. Neurosci. Biobehav. Rev. 92, 1–6. doi: 10.1016/j.neubiorev.2018.04.020

O'Malley, M. K., and Gupta, A. (2008). “Haptic interfaces,” in HCI beyond the GUI: Design for Haptic, Speech, Olfactory, and other nontraditional Interfaces (Burlington, MA), 25–64. doi: 10.1016/B978-0-12-374017-5.00002-X

Pacchierotti, C., Sinclair, S., Solazzi, M., Frisoli, A., Hayward, V., and Prattichizzo, D. (2017). Wearable haptic systems for the fingertip and the hand: taxonomy, review, and perspectives. IEEE Trans. Haptics 10, 580–600. doi: 10.1109/TOH.2017.2689006

Pamungkas, D. S., and Ward, K. (2014). “Electro-tactile feedback system for achieving embodiment in a tele-operated robot,” in IEEE International Conference on Control Automation Robotics & Vision (Marina Bay Sands), 1706–1711.

Pazzaglia, M., and Molinari, M. (2016). The embodiment of assistive devices—from wheelchair to exoskeleton. Phys. Life Rev. 16, 163–175. doi: 10.1016/j.plrev.2015.11.006

Planthaber, S., Mallwitz, M., and Kirchner, E. A. (2018). Immersive robot control in virtual reality to command robots in space missions. J. Softw. Eng. Appl. 11:341. doi: 10.4236/jsea.2018.117021

Prewett, M. S., Elliott, L. R., Walvoord, A. G., and Coovert, M. D. (2012). A meta-analysis of vibrotactile and visual information displays for improving task performance. IEEE Trans. Syst. Man Cybern. C 42, 123–132. doi: 10.1109/TSMCC.2010.2103057

Raspopovic, S., Capogrosso, M., Petrini, F. M., Bonizzato, M., Rigosa, J., Di Pino, G., et al. (2014). Restoring natural sensory feedback in real-time bidirectional hand prostheses. Sci. Transl. Med. 6:222ra19. doi: 10.1126/scitranslmed.3006820

Rohde, M., Di Luca, M., and Ernst, M. O. (2011). The rubber hand illusion: feeling of ownership and proprioceptive drift do not go hand in hand. PLoS ONE 6:e21659. doi: 10.1371/journal.pone.0021659

Roncone, A., Hoffmann, M., Pattacini, U., Fadiga, L., and Metta, G. (2016). Peripersonal space and margin of safety around the body: learning visuo-tactile associations in a humanoid robot with artificial skin. PLoS ONE 11:e0163713. doi: 10.1371/journal.pone.0163713

Rosén, B., Ehrsson, H. H., Antfolk, C., Cipriani, C., Sebelius, F., and Lundborg, G. (2009). Referral of sensation to an advanced humanoid robotic hand prosthesis. Scand. J. Plast. Reconstruct. Surg. Hand Surg. 43, 260–266. doi: 10.3109/02844310903113107

Schofield, J. S., Evans, K. R., Carey, J. P., and Hebert, J. S. (2014). Applications of sensory feedback in motorized upper extremity prosthesis: a review. Exp. Rev. Med. Devices 11, 499–511. doi: 10.1586/17434440.2014.929496

Sengül, A., van Elk, M., Rognini, G., Aspell, J. E., Bleuler, H., and Blanke, O. (2012). Extending the body to virtual tools using a robotic surgical interface: evidence from the crossmodal congruency task. PLoS ONE 7:e49473. doi: 10.1371/journal.pone.0049473

Shokur, S., Gallo, S., Moioli, R. C., Donati, A. R. C., Morya, E., Bleuler, H., et al. (2016). Assimilation of virtual legs and perception of floor texture by complete paraplegic patients receiving artificial tactile feedback. Sci. Rep. 6:32293. doi: 10.1038/srep32293

Sigrist, R., Rauter, G., Marchal-Crespo, L., Riener, R., and Wolf, P. (2015). Sonification and haptic feedback in addition to visual feedback enhances complex motor task learning. Exp. Brain Res. 233, 909–925. doi: 10.1007/s00221-014-4167-7

Stephens-Fripp, B., Alici, G., and Mutlu, R. (2018). A review of non-invasive sensory feedback methods for transradial prosthetic hands. IEEE Access 6, 6878–6899. doi: 10.1109/ACCESS.2018.2791583

Štrbac, M., Belic, M., Isakovic, M., Kojic, V., Bijelic, G., Popovic, I., et al. (2016). Integrated and flexible multichannel interface for electrotactile stimulation. J. Neural Eng. 13:046014. doi: 10.1088/1741-2560/13/4/046014

Strbac, M., Isakovic, M., Belic, M., Popovic, I., Simanic, I., Farina, D., et al. (2017). Short- and long-term learning of feedforward control of a myoelectric prosthesis with sensory feedback by amputees. IEEE Trans. Neural Syst. Rehabil. Eng. 25, 2133–2145. doi: 10.1109/TNSRE.2017.2712287

Svensson, P., Wijk, U., Björkman, A., and Antfolk, C. (2017). A review of invasive and non-invasive sensory feedback in upper limb prostheses. Exp. Rev. Med. Devices 14, 439–447. doi: 10.1080/17434440.2017.1332989

Synofzik, M., Vosgerau, G., and Newen, A. (2008). I move, therefore i am: a new theoretical framework to investigate agency and ownership. Conscious. Cogn. 17, 411–424. doi: 10.1016/j.concog.2008.03.008

van Stralen, H. E., van Zandvoort, M. J. E., Hoppenbrouwers, S. S., Vissers, L. M. G., Kappelle, L. J., and Dijkerman, H. C. (2014). Affective touch modulates the rubber hand illusion. Cognition 131, 147–158. doi: 10.1016/j.cognition.2013.11.020

Veneman, J., Burdet, E., van der Kooij, H., and Lefeber, D. (2017). “Emerging directions in lower limb externally wearable robots for gait rehabilitation and augmentation - a review,” in Advances in Cooperative Robotics, eds M. O. Tokhi and G. S. Virk (London, UK: World Scientific Publishing Co. Pte. Ltd.), 840–850.

Weber, B., and Eichberger, C. (2015). “The benefits of haptic feedback in telesurgery and other teleoperation systems: a meta-analysis,” in International Conference on Universal Access in Human-Computer Interaction (Los Angeles, CA).

Weiss, P. L., Kizony, R., Feintuch, U., and Katz, N. (2006). Virtual reality in neurorehabilitation. Textbook Neural Repair Rehabil. 51, 182–197. doi: 10.1017/CBO9780511545078.015

Witteveen, H. J. B., Rietman, H. S., and Veltink, P. H. (2015). Vibrotactile grasping force and hand aperture feedback for myoelectric forearm prosthesis users. Prosth. Orthot. Int. 39, 204–212. doi: 10.1177/0309364614522260

Zou, L., Ge, C., Wang, Z. J., Cretu, E., and Li, X. (2017). Novel tactile sensor technology and smart tactile sensing systems: a review. Sensors 17:2653. doi: 10.3390/s17112653

Keywords: embodiment, affective touch, social touch, self-touch, human-machine interfaces, tactile feedback, assistive robotics

Citation: Beckerle P, Kõiva R, Kirchner EA, Bekrater-Bodmann R, Dosen S, Christ O, Abbink DA, Castellini C and Lenggenhager B (2018) Feel-Good Robotics: Requirements on Touch for Embodiment in Assistive Robotics. Front. Neurorobot. 12:84. doi: 10.3389/fnbot.2018.00084

Received: 03 September 2018; Accepted: 26 November 2018;
Published: 11 December 2018.

Edited by:

Sung-Phil Kim, Ulsan National Institute of Science and Technology, South Korea

Reviewed by:

Solaiman Shokur, Alberto Santos Dumont Association for Research Support, Brazil
Stéphane Lallée, Facebook, United States

Copyright © 2018 Beckerle, Kõiva, Kirchner, Bekrater-Bodmann, Dosen, Christ, Abbink, Castellini and Lenggenhager. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Philipp Beckerle, philipp.beckerle@tu-dortmund.de
