<?xml version="1.0" encoding="UTF-8" standalone="no"?><!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd"><article xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML" article-type="research-article"><front><journal-meta><journal-id journal-id-type="publisher-id">Front. Robot. AI</journal-id><journal-title>Frontiers in Robotics and AI</journal-title><abbrev-journal-title abbrev-type="pubmed">Front. Robot. AI</abbrev-journal-title><issn pub-type="epub">2296-9144</issn><publisher><publisher-name>Frontiers Media S.A.</publisher-name></publisher></journal-meta><article-meta><article-id pub-id-type="publisher-id">311495</article-id><article-id pub-id-type="doi">10.3389/frobt.2018.00062</article-id><article-categories><subj-group subj-group-type="heading"><subject>Robotics and AI</subject><subj-group><subject>Original Research</subject></subj-group></subj-group></article-categories><title-group><article-title>Design of a Wearable Fingertip Haptic Device for Remote Palpation: Characterisation and Interface with a Virtual Environment</article-title></title-group><contrib-group><contrib corresp="yes" contrib-type="author"><name><surname>Tzemanaki</surname><given-names>Antonia</given-names></name><uri xlink:href="http://loop.frontiersin.org/people/439315"/><xref ref-type="corresp" rid="cor1"><sup>&#x002A;</sup></xref></contrib><contrib contrib-type="author"><name><surname>Al</surname><given-names>Gorkem Anil</given-names></name><uri xlink:href="http://loop.frontiersin.org/people/503015"/></contrib><contrib contrib-type="author"><name><surname>Melhuish</surname><given-names>Chris</given-names></name><uri xlink:href="http://loop.frontiersin.org/people/100884"/></contrib><contrib contrib-type="author"><name><surname>Dogramadzi</surname><given-names>Sanja</given-names></name><uri xlink:href="http://loop.frontiersin.org/people/384296"/></contrib><aff><institution>Bristol Robotics Laboratory, 
University of the West of England</institution>, <addr-line>Bristol</addr-line>, <country>United Kingdom</country></aff></contrib-group><author-notes><fn fn-type="edited-by"><p>Edited by: Kaspar Althoefer, Queen Mary University of London, United Kingdom</p></fn><fn fn-type="edited-by"><p>Reviewed by: Martin F. Stoelen, Plymouth University, United Kingdom; Surya Girinatha Nurzaman, Monash University Malaysia, Malaysia; Emanuele Lindo Secco, Liverpool Hope University, United Kingdom</p></fn><corresp id="cor1">&#x002A;Antonia Tzemanaki, <email>antonia.tzemanaki@brl.ac.uk</email></corresp><fn fn-type="other" id="fn001"><p>Specialty section: This article was submitted to Soft Robotics, a section of the journal Frontiers in Robotics and AI</p></fn></author-notes><pub-date pub-type="epub"><day>12</day><month>06</month><year>2018</year></pub-date><pub-date pub-type="collection"><year>2018</year></pub-date><volume>5</volume><elocation-id>62</elocation-id><history><date date-type="received"><day>30</day><month>11</month><year>2017</year></date><date date-type="accepted"><day>08</day><month>05</month><year>2018</year></date></history><permissions><copyright-statement>Copyright &#x00A9; 2018 Tzemanaki, Al, Melhuish and Dogramadzi</copyright-statement><copyright-year>2018</copyright-year><copyright-holder>Tzemanaki, Al, Melhuish and Dogramadzi</copyright-holder><license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the <uri xlink:href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution License (CC BY)</uri>. The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. 
No use, distribution or reproduction is permitted which does not comply with these terms.</p></license></permissions><abstract><p>This paper presents the development of a wearable Fingertip Haptic Device (FHD) that can provide cutaneous feedback via a Variable Compliance Platform (VCP). The FHD includes an inertial measurement unit, which tracks the motion of the user&#x2019;s finger while its haptic functionality relies on two parameters: pressure in the VCP and its linear displacement towards the fingertip. The combination of these two features results in various conditions of the FHD, which emulate the remote object or surface stiffness properties. Such a device can be used in tele-operation, including virtual reality applications, where rendering the level of stiffness of different physical or virtual materials could provide a more realistic haptic perception to the user. The FHD stiffness representation is characterised in terms of resulting pressure and force applied to the fingertip created through the relationship of the two functional parameters &#x2013; pressure and displacement of the VCP. The FHD was tested in a series of user studies to assess its potential to create a user perception of the object&#x2019;s variable stiffness. The viability of the FHD as a haptic device has been further confirmed by interfacing the users with a virtual environment. The developed virtual environment task required the users to follow a virtual path, identify objects of different hardness on the path and navigate away from &#x201C;no-go&#x201D; zones. The task was performed with and without the use of the variable compliance on the FHD. 
The results showed improved performance with the presence of the variable compliance provided by the FHD in all assessed categories and particularly in the ability to distinguish correctly between objects of different hardness.</p></abstract><kwd-group><kwd>surgical robotics</kwd><kwd>tele-operation</kwd><kwd>haptics</kwd><kwd>virtual environments</kwd><kwd>mechatronics</kwd></kwd-group><contract-num rid="cn01">732515</contract-num><contract-sponsor id="cn01">Horizon 2020 Framework Programme<named-content content-type="fundref-id">10.13039/100010661</named-content></contract-sponsor><counts><fig-count count="14"/><table-count count="6"/><equation-count count="2"/><ref-count count="39"/><page-count count="15"/><word-count count="8972"/></counts></article-meta></front><body><sec id="s1" sec-type="intro"><title>1.&#x00A0;Introduction</title><p>Master-slave robotic systems have found use in many applications, ranging from the games industry, assistive technologies and medicine (for practicing or training), e.g., in neuromuscular rehabilitation (<xref ref-type="bibr" rid="B12">Iqbal et al., 2014</xref>) or in surgery (<xref ref-type="bibr" rid="B10">Hagn et al., 2010</xref>; <xref ref-type="bibr" rid="B36">Tzemanaki et al., 2014</xref>; <xref ref-type="bibr" rid="B11">Intuitive Surgical, 2017</xref>), to areas where safety issues prevent the use of autonomous robots, such as underwater environments, space exploration or the nuclear industry (<xref ref-type="bibr" rid="B26">Nagatani et al., 2013</xref>; <xref ref-type="bibr" rid="B17">Kulakov et al., 2015</xref>; <xref ref-type="bibr" rid="B7">Garc&#x00ED;a et al., 2017</xref>).</p><p>Tracking or replicating hand/arm motion is a central part of tele-operation. We interact with our environment mainly using vision and touch.
Our hands, with their complex structure, high dexterity and precise manipulation ability, are critical instruments in recognising the shape, stiffness and weight of an object (<xref ref-type="bibr" rid="B2">Achibet, 2015</xref>). Their high touch sensitivity stems from mechanoreceptors embedded in the skin, which aid the detection of vibration, stretching and cutaneous stimuli.</p><p>The addition of haptics in tele-operation can add value and improve the performance of the user, for example in minimally invasive surgery (MIS) or microsurgery applications, as supported by <xref ref-type="bibr" rid="B1">Abushagur et al. (2014)</xref>. Contrary to techniques used in open surgery or even manual minimally invasive surgery (laparoscopy), one of the most frequent criticisms of robot-assisted MIS (R-A MIS) systems is their lack of haptic feedback (<xref ref-type="bibr" rid="B1">Abushagur et al., 2014</xref>). Use of master-slave systems requires extensive training to gain dexterity and efficiency (<xref ref-type="bibr" rid="B21">Ma et al., 2014</xref>). Practicing on actual R-A MIS systems is rarely possible when junior surgeons start their training as surgical assistants. Virtual reality (VR) environment applications can be a cost-efficient substitute that accelerates the initial phases of training. Simulators that include, e.g., virtual pick-and-place tasks or even a simulated patient&#x2019;s abdomen are already used in R-A MIS training and anatomy learning, among others (<xref ref-type="bibr" rid="B37">van der Meijden and Schijven, 2009</xref>; <xref ref-type="bibr" rid="B38">Vander Poorten et al., 2012</xref>; <xref ref-type="bibr" rid="B23">Meng et al., 2013</xref>). 
The addition of haptic feedback in VR environments creates more realistic scenarios, while providing trainees with a safe environment in which they can develop their skills (<xref ref-type="bibr" rid="B18">Lemole et al., 2007</xref>; <xref ref-type="bibr" rid="B14">Kirkman et al., 2014</xref>).</p><p>Haptic feedback is commonly categorised as kinaesthetic and cutaneous/tactile feedback. Various haptic devices are commercially available, e.g., the Geomagic Touch haptic device (USA, formerly Sensable Phantom Omni), whose simple and safe design has made its use popular (<xref ref-type="bibr" rid="B34">Srinivasan and Basdogan, 1997</xref>). Such devices can be classified as grounded devices providing mainly kinaesthetic feedback; they are portable but placed on a surface while the user operates their end-effector in 3D space. A Phantom Omni is used in the work by <xref ref-type="bibr" rid="B20">Li et al. (2015)</xref>, which examined palpation and tumour identification using force feedback and Pseudo-Haptic Feedback (PHF) in a virtual environment. In that work, PHF is based on visual cues of the virtual tissue deforming while the speed of the screen&#x2019;s cursor is slowed down to give the impression of stiffness. PHF is a good alternative when a force feedback device is not present or possible, while it can also complement and improve the results of haptic feedback devices (<xref ref-type="bibr" rid="B20">Li et al., 2015</xref>).</p><p>Similar devices include the Falcon (Novint, USA) or Sigma.7 (Force Dimension, Switzerland), both with a parallel robot configuration as opposed to the serial configuration of the Geomagic Touch. However, their prices can be high (<xref ref-type="bibr" rid="B31">Pacchierotti et al., 2017</xref>) and their workspace is usually restricted.</p><p>Moreover, grounded haptic devices have been designed specifically to support fingers. 
For example, MasterFinger-2 is a 6-DOF (Degree Of Freedom) haptic interface which can be operated using the index finger and the thumb to grasp virtual objects (<xref ref-type="bibr" rid="B39">Monroy et al., 2008</xref>; <xref ref-type="bibr" rid="B30">Pacchierotti et al., 2012</xref>). Such systems can provide both kinaesthetic and cutaneous feedback to the users; however, their workspace is limited and they are not meant to be portable.</p><p>Wearable Haptic Devices (WHDs) such as Hand Exoskeletons (HEs) can provide more freedom of movement, mimic the hand movements of the operator and potentially remove the cognitive gap in tele-operation (<xref ref-type="bibr" rid="B37">van der Meijden and Schijven, 2009</xref>). An example WHD is the combination of the CyberGlove and CyberGrasp (CyberGlove Systems, USA), which provides force feedback by pulling the fingertips via cables. The glove by <xref ref-type="bibr" rid="B27">Park et al. (2016)</xref> also uses cables: one to measure the position of each finger and one to exert force on the fingers. Other WHDs utilise rigid force transmission mechanisms attached to the hand to exert force on the fingers, such as the one by <xref ref-type="bibr" rid="B9">Fontana et al. (2013)</xref> (thumb and index finger) or the DEXMO exoskeleton by <xref ref-type="bibr" rid="B8">Gu et al. (2016)</xref>. Despite DEXMO&#x2019;s high motion accuracy, its major disadvantage is that it can only generate binary haptic feedback (<xref ref-type="bibr" rid="B8">Gu et al., 2016</xref>).</p><p>HEs are made using soft or rigid materials, covering all or some of the fingers. 
Some exoskeleton designs are bulky, while others cover just the fingertips; the latter can be especially effective for tactile applications and controllable cutaneous feedback (<xref ref-type="bibr" rid="B31">Pacchierotti et al., 2017</xref>) and can be classified according to the cutaneous sensation that they provide: normal indentation, tangential motion, lateral skin stretch or vibration (<xref ref-type="bibr" rid="B31">Pacchierotti et al., 2017</xref>). An example of the use of vibration for haptic feedback is the device by <xref ref-type="bibr" rid="B22">Maereg et al. (2017)</xref> with five vibro-tactile actuators, one for each fingertip of the user. Although wearable devices usually provide cutaneous stimuli, with most of the kinaesthetic feedback missing (<xref ref-type="bibr" rid="B29">Prattichizzo et al., 2013</xref>), it is possible to compensate for this deficiency without significant performance degradation (<xref ref-type="bibr" rid="B32">Pacchierotti et al., 2014</xref>).</p><p>Normal indentation is achieved by one or more moving parts that emulate contact with a soft/hard material or give a sense of curvature or pressure to the fingertip. Furthermore, lateral motion with respect to the fingertip can also apply cutaneous feedback to the fingertip. For example, the combination of the two methods (normal and lateral) has been utilised via three motors, cables and a parallel mechanism by <xref ref-type="bibr" rid="B29">Prattichizzo et al. (2013)</xref> and <xref ref-type="bibr" rid="B32">Pacchierotti et al. (2014)</xref>, or by a serial mechanism wrapped around the finger, actuated by a motor and a voice coil, by <xref ref-type="bibr" rid="B6">Gabardi et al. (2016)</xref> for surface exploration. In the work by <xref ref-type="bibr" rid="B32">Pacchierotti et al. (2014)</xref>, the motors adjust the length of cables using position encoders to move the platform towards the fingertip. 
A force sensor is attached to the platform&#x2019;s centre to measure forces perpendicular to the fingertip.</p><p><xref ref-type="bibr" rid="B13">Kim et al. (2016)</xref> propose a similar haptic fingertip device with the addition of four Inertial Measurement Unit (IMU) sensors to track the palm and index finger in a virtual environment. Although these platform devices can be used in different scenarios, their end-effectors are constantly in contact with the fingertip, which does not allow the possibility of intermittent touch with virtual objects. In this respect, <xref ref-type="bibr" rid="B4">Chinello et al. (2015)</xref> presented a wearable fingertip device with two platforms in a parallel configuration. Three servo motors are fixed to the upper part of the device and a mobile platform exerts forces on the volar skin surface of the fingertip. The motors actuate three legs connecting these two parts to render forces from the virtual environment. A virtual environment is also used for testing in the work by <xref ref-type="bibr" rid="B22">Maereg et al. (2017)</xref>, where tracking is done via a Leap Motion controller and PHF is also explored by visualising displacement in the virtual environment.</p><p>In addition to applying force to a fingertip using moving platforms, devices that are designed for lateral skin stretch can also be used to exert normal forces on the fingertip. <xref ref-type="bibr" rid="B24">Minamizawa et al. (2007)</xref> developed a wearable fingertip device to exert tangential and normal cutaneous feedback. When the two motors of the device rotate in opposite directions, the belt exerts vertical stress. Equally, rotation in the same direction results in shearing stress. A similar fabric-based WHD by <xref ref-type="bibr" rid="B3">Bianchi et al. 
(2016)</xref> emulates different levels of softness by stretching the fabric across the fingertip, applying tangential forces.</p><p>A 3RSR parallel mechanism located under the finger as a moving platform, described by <xref ref-type="bibr" rid="B19">Leonardis et al. (2015)</xref>, provides both position and orientation information. A fingertip delta-type parallel mechanism has been designed by <xref ref-type="bibr" rid="B33">Schorr and Okamura (2017)</xref> that exerts normal, lateral and longitudinal skin deformation, with a maximum normal force of 2 N.</p><p><xref ref-type="bibr" rid="B25">Murray et al. (2003)</xref> found that proportional haptic feedback, as opposed to binary feedback such as in the exoskeleton by <xref ref-type="bibr" rid="B8">Gu et al. (2016)</xref>, can facilitate user performance. As discussed earlier, such devices can facilitate medical diagnosis or training on diagnosis (e.g., tumour detection), while they can also improve safety during precision operations by using haptic feedback as a warning (e.g., when navigating through a narrow space with &#x201C;no-go&#x201D; zones).</p><p>In this paper, we present the design of a Fingertip Haptic Device (FHD) for motion tracking and cutaneous haptic feedback that can be used to interact with a virtual environment in such a way that the user can gather information about the compliance as well as the hardness of different objects. The variable compliance in the FHD differs from previous work, where fingertip indentations were achieved using objects of constant stiffness and shape or fabric materials that provide lateral deformation of the skin. Thus, variable compliance is designed to increase the sense of touch and to provide the sense of touching soft or hard materials using a single device.</p><p>The FHD is 3D printed and comprises an IMU for motion tracking and a soft fingertip platform of adjustable compliance which is linearly actuated. 
The design of the mechanism is presented in Section 2, while its characterisation and testing are demonstrated in Section 3.1. To validate the developed mechanism, a series of experiments were carried out in a virtual environment where users had to complete a task using the FHD&#x2019;s motion tracking system and haptic feedback. The results of this user study are presented in Section 3.2 and are discussed in Section 4, where a comparison between the various scenarios is made and conclusions are drawn.</p></sec><sec id="s2" sec-type="materials&#x007C;methods"><title>2.&#x00A0;Material and Methods</title><p>This section presents the materials used for the FHD design as well as the methods used for motion tracking and interaction with a virtual environment. The haptic functionality of the FHD relies on the two features of its Variable Compliance Platform (VCP): the VCP can be inflated with air and linearly displaced so that it pushes against the fingertip. The combination of these can provide both the sense of the material&#x2019;s hardness and of the normal force exerted on the fingertip. The forces generated by the VCP push against the user&#x2019;s fingertip, allowing the user to passively palpate the material hardness.</p><sec id="s2-1"><title>2.1. Components of the FHD</title><p><xref ref-type="fig" rid="F1">Figure 1</xref> shows the side and front views of the FHD, which consists of: (a) the VCP, (b) the Rack and Pinion (RP) mechanism and (c) the Support Structure (SS) with the IMU sensor. The RP mechanism adjusts the distance between the fingertip and the VCP. The FHD&#x2019;s dimensions are 38.6 mm (width) x [38.2&#x2013;53] mm (variable length due to the RP). The distance between the SS and the VCP ranges from 9.6 to 24.53 mm. 
The total weight of the FHD (including one motor, TowerPro MG92B) is 40 g.</p><fig id="F1" position="float"><label>Figure 1</label><caption><p> FHD: <bold>(</bold><bold>A</bold><bold>)</bold> side view with an index finger placed against the SS and held with a flexible strap; the RP mechanism lies behind the finger and the VCP below the finger, which can linearly move along the RP closer/farther from the finger, <bold>(</bold><bold>B</bold><bold>)</bold> Computer Aided Design drawing, <bold>(</bold><bold>C</bold><bold>&#x2013;</bold><bold>D</bold><bold>)</bold> front views with the <bold>(</bold><bold>C</bold><bold>)</bold> minimum and <bold>(</bold><bold>D</bold><bold>)</bold> maximum VCP displacement.</p></caption><graphic xlink:href="frobt-05-00062-g001.tif"/></fig><sec id="s2-1-1"><title>2.1.1. VCP Design and Functionality</title><p>The 3D printed (Polylactic Acid filament) VCP has an area of 478.5 mm<sup>2</sup> that corresponds to the average area of a fingertip as reported by <xref ref-type="bibr" rid="B28">Peters et al. (2002)</xref> (index finger, female average 360 mm<sup>2</sup>, male average 420 mm<sup>2</sup>).</p><p>As discussed in the previous section, fingertip haptic feedback is often achieved by pressing rigid (<xref ref-type="bibr" rid="B35">Tsetserukou et al., 2010</xref>; <xref ref-type="bibr" rid="B19">Leonardis et al., 2015</xref>; <xref ref-type="bibr" rid="B33">Schorr and Okamura, 2017</xref>) or soft [e.g., belt systems by <xref ref-type="bibr" rid="B24">Minamizawa et al. (2007)</xref>; <xref ref-type="bibr" rid="B32">Pacchierotti et al. (2014)</xref>, dielectric elastomer actuators by <xref ref-type="bibr" rid="B15">Koo et al. (2008)</xref>, <xref ref-type="bibr" rid="B5">Frediani et al. (2014)</xref>] objects either normal or lateral to the fingertip surface. However, these devices do not offer a true sense of indentation because their compliance cannot be changed. 
Our hypothesis is that variable compliance in a haptic device can provide a sense of indentation and of varied hardness/softness to the user. Consequently, the VCP consists of a rigid base (<xref ref-type="fig" rid="F2">Figure 2</xref>) with its top surface covered by a 1 mm thick layer of silicone rubber (DragonSkin, shore hardness 10 A, 475 psi). The lower surface (<xref ref-type="fig" rid="F2">Figure 2A</xref>) of the VCP is connected to a syringe pump via a 7 mm diameter air tube (<xref ref-type="fig" rid="F3">Figure 3</xref>). The VCP functionality is created by pumping air through 6 holes (<xref ref-type="fig" rid="F2">Figure 2B</xref>) into the gap between its rigid base and the soft silicone membrane of the VCP.</p><fig id="F2" position="float"><label>Figure 2</label><caption><p> Rigid base of the VCP <bold>(</bold><bold>A</bold><bold>)</bold> side view, <bold>(</bold><bold>B</bold><bold>)</bold> isometric view.</p></caption><graphic xlink:href="frobt-05-00062-g002.tif"/></fig><fig id="F3" position="float"><label>Figure 3</label><caption><p> Syringe pump actuation system.</p></caption><graphic xlink:href="frobt-05-00062-g003.tif"/></fig><p>The design of the syringe pump actuation system, shown in <xref ref-type="fig" rid="F3">Figure 3</xref>, utilises an RP mechanism (part 1, <xref ref-type="fig" rid="F3">Figure 3</xref>) and a 20 ml syringe (attached to part 2, <xref ref-type="fig" rid="F3">Figure 3</xref>). The pinion is attached to a motor (Turnigy 1258 TG, stall torque of 1.17 Nm) and the rack moves the syringe along the horizontal axis (0.8 mm of displacement per degree of rotation). 
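As an illustration of the syringe-pump kinematics described above (0.8 mm of piston travel per degree of motor rotation, driving a 20 ml syringe), the sketch below estimates the injected air volume for a given motor rotation. This is our own sketch, not code from the paper; the syringe bore diameter is an assumed value, and leakage and membrane compliance are ignored:

```python
import math

RACK_TRAVEL_PER_DEG = 0.8  # mm of piston travel per degree of rotation (from the text)
SYRINGE_BORE_MM = 19.0     # assumed inner diameter of a 20 ml syringe (hypothetical value)

def injected_volume_ml(motor_deg: float) -> float:
    """Ideal air volume pushed into the VCP for a given motor rotation."""
    piston_travel_mm = RACK_TRAVEL_PER_DEG * motor_deg
    bore_area_mm2 = math.pi * (SYRINGE_BORE_MM / 2) ** 2
    return bore_area_mm2 * piston_travel_mm / 1000.0  # mm^3 -> ml
```

Under these assumptions, a rotation of roughly 10 degrees displaces on the order of 2 ml of air, which is consistent with the 2&#x2013;4 ml working volumes used later in the paper.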
The maximum volume of air used for inflation of the VCP was 4 ml, equivalent to a pressure of 5.17 kPa, measured using a pressure sensor (HSCSAAN015PDAA5, Honeywell, USA, range of &#x00B1;103 kPa, accuracy of 0.25 kPa).</p><p>The extent of the VCP&#x2019;s deformation when inflated with 4 ml of air is 25 mm, while the index fingertip extent is 10.4 mm on average for women and 12.7 mm for men (<xref ref-type="bibr" rid="B28">Peters et al., 2002</xref>). The extent of the VCP is greater than that of the measured human fingertip because the contact area will be smaller when the soft membrane is inflated.</p></sec><sec id="s2-1-2"><title>2.1.2. RP Mechanism for Linear Displacement of the VCP</title><p>The chosen mechanism provides linear displacement of the VCP towards the fingertip and control of the indentation of the inflated membrane. The rack length is 30 mm (other dimensions are shown in <xref ref-type="fig" rid="F4">Figure 4</xref>). This design was preferred to a parallel mechanism (<xref ref-type="bibr" rid="B32">Pacchierotti et al., 2014</xref>; <xref ref-type="bibr" rid="B19">Leonardis et al., 2015</xref>) to keep the size of the FHD to a minimum.</p><fig id="F4" position="float"><label>Figure 4</label><caption><p> Linear mechanism of the VCP using an RP mechanism.</p></caption><graphic xlink:href="frobt-05-00062-g004.tif"/></fig><p>The shaft of the motor (TowerPro MG92B, stall torque of 0.3 Nm) is directly attached to the pinion. 
The linear displacement &#x03B4; of the VCP can be calculated as follows:</p><p><disp-formula id="E1"><label>(1)</label><mml:math id="M11"><mml:mi>&#x3B4;</mml:mi><mml:mo>=</mml:mo><mml:mfenced separators="|"><mml:mrow><mml:mn>2</mml:mn><mml:mi>&#x3C0;</mml:mi><mml:mfrac><mml:mrow><mml:mi>d</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:mfrac></mml:mrow></mml:mfenced><mml:mfrac><mml:mrow><mml:mi>&#x3B8;</mml:mi></mml:mrow><mml:mrow><mml:mn>360</mml:mn></mml:mrow></mml:mfrac></mml:math></disp-formula></p><p>where &#x03B8; is the angle of motor rotation and d is the diameter of the pinion. Due to the required teeth precision, both rack and pinion were laser-cut in acrylic.</p><p>The variable compliance is created by varying the pressure inside the VCP which is a function of the piston movement (x) and the movement of the RP (h). It can be approximated using (2):</p><p><disp-formula id="E2"><label>(2)</label><mml:math id="M12"><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mrow><mml:mi>p</mml:mi><mml:mo>=</mml:mo><mml:msub><mml:mi>p</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:msub><mml:mi>k</mml:mi><mml:mrow><mml:mi>p</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>x</mml:mi><mml:mo>+</mml:mo><mml:mi>h</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow></mml:mstyle></mml:math></disp-formula></p><p>where p is the pressure inside the VCP, p<sub>0</sub> is the initial pressure, k<sub>p</sub> is the air spring constant between the piston and the finger. The piston movement (x) is proportional to the air volume supply through the syringe. The perceived hardness will be tested by a range of combinations of x and h that will effectively create different indentations in the human finger. </p></sec></sec><sec id="s2-2"><title>2.2. Motion Tracking and VR Environment</title><p>For the motion tracking of the user&#x2019;s fingertip, an MPU 6050 (InvenSense, USA) IMU is used. 
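Equations (1) and (2) above can be evaluated with a short Python sketch. The pinion diameter below (&#x2248;11.9 mm) is inferred from the 0.52 mm-per-5-degree step reported in the user study and is not stated explicitly in the paper; p<sub>0</sub> and k<sub>p</sub> are placeholder values:

```python
import math

def vcp_displacement_mm(theta_deg: float, pinion_d_mm: float) -> float:
    """Equation (1): delta = (2*pi*d/2) * theta / 360."""
    return (2 * math.pi * pinion_d_mm / 2) * theta_deg / 360.0

def vcp_pressure_kpa(p0_kpa: float, k_p: float, x_mm: float, h_mm: float) -> float:
    """Equation (2): p = p0 + k_p * (x + h)."""
    return p0_kpa + k_p * (x_mm + h_mm)

# An assumed pinion diameter of ~11.9 mm reproduces the 0.52 mm displacement
# per 5-degree motor step quoted for the user study.
step = vcp_displacement_mm(5, 11.9)
```

The sketch makes explicit that both the piston movement x and the RP movement h contribute linearly to the resulting VCP pressure under this approximation.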
The IMU is integrated with the 3D printed support structure of the FHD (<xref ref-type="fig" rid="F1">Figure 1B</xref>). The IMU&#x2019;s raw data are sent to an Arduino MEGA 2560 board via I<sup>2</sup>C.</p><p>The motion of the user is tracked and fed into a virtual environment created in Unity 3D. The user moves virtual objects in the 3D VR environment using the IMU data, while interaction with virtual objects is emulated by the FHD by (a) moving the VCP using the RP and (b) inflating the VCP using the syringe pump actuation system.</p></sec><sec id="s2-3"><title>2.3. User Studies</title><sec id="s2-3-1"><title>2.3.1. Air Volume &#x2013; VCP Linear Displacement Relation</title><p>Initial tests with 1 female and 1 male participant were carried out to establish a relationship between the air pressure and the linear displacement of the VCP. The users placed their index finger as shown in <xref ref-type="fig" rid="F1">Figure 1A</xref>. The index finger was used as previous studies have shown that it is intuitively used by humans for palpation (<xref ref-type="bibr" rid="B16">Konstantinova et al., 2017</xref>). After the VCP was moved to the point of initial contact with the fingertip, the motor rotated in 5-degree steps (0.52 mm of linear displacement of the VCP). The tests were carried out with 2 ml, 3 ml and 4 ml of air in the VCP, shown in the graphs of <xref ref-type="fig" rid="F5">Figure 5</xref> for the female (red line) and male (blue line) user. Each participant repeated the process 3 times, with the results being repeatable to within the sensor&#x2019;s resolution (0.25 kPa). Due to the different fingertip sizes, the maximum pressure in the VCP was overall higher for the female user. The maximum VCP displacement was 8.32 mm for the male user, corresponding to a normal force of 3.85 N, and 9.36 mm for the female user, with a normal force of 4.63 N. 
The normal forces exerted were measured when the fingertip was not present, using a micro load cell (CZL635, Phidgets, 0&#x2013;49 N range).</p><fig id="F5" position="float"><label>Figure 5</label><caption><p> Pressure in the VCP vs maximum deformation of its surface.</p></caption><graphic xlink:href="frobt-05-00062-g005.tif"/></fig><p>These tests provided initial results for further investigation of the resulting VCP pressure under combinations of the supplied air volume and its linear displacement along the rack axis. This study was limited to two participants, and the dataset with the lower measured forces (the male participant&#x2019;s) was adopted for all subsequent tests to pre-empt potentially uncomfortably high forces between the VCP and the fingertip.</p></sec><sec id="s2-3-2"><title>2.3.2. Perception of Hardness Using the FHD</title><p>In order to represent and assess the hardness levels of the FHD, a wider user study was carried out with 15 participants (18&#x2013;34 years old, ratio of women/men 7/8, ratio of right/left dominant hand 13/2). The participants were asked to put the FHD on their dominant hand&#x2019;s index fingertip and score the hardness of the touch on a scale of 1&#x2013;5 (hard to soft). The experiments tested 10 different conditions created by varying the air volume inside the VCP as well as its linear displacement (and proximity to the fingertip). In one of the conditions, the VCP was not inflated while its linear displacement was 5.72 mm. The remaining 9 combinations are presented in <xref ref-type="table" rid="T1">Table 1</xref>. In this table, &#x201C;x&#x201D; means that for that specific volume of air, the corresponding level of pressure could not be achieved. 
Each condition was tested 5 times by each participant in a randomised order after a short &#x201C;training&#x201D; session in which the participants could experience the different hardness levels of the FHD.</p><table-wrap id="T1" position="float"><label>Table 1</label><caption><p>Resulting pressure caused by different combinations of air volumes in the VCP and its linear displacement, used to&#x00A0;test different hardness levels for the FHD.</p></caption><table frame="hsides" rules="rows"><thead><tr><td valign="bottom" colspan="2" rowspan="2"/><td align="center" valign="top" colspan="3">Air volume in the VCP</td></tr><tr><td align="center" valign="top">2 ml</td><td align="center" valign="top">3 ml</td><td align="center" valign="top">4 ml</td></tr></thead><tbody><tr><td valign="top" rowspan="4">Resulting Pressure</td><td align="center" valign="top">3.5 kPa</td><td align="center" valign="top">1.1 (5.2 mm)</td><td align="center" valign="top">1.2 (2.08 mm)</td><td align="center" valign="top">x</td></tr><tr><td align="center" valign="top">4.5 kPa</td><td align="center" valign="top">2.1 (7.28 mm)</td><td align="center" valign="top">2.2 (5.2 mm)</td><td align="center" valign="top">x</td></tr><tr><td align="center" valign="top">5 kPa</td><td align="center" valign="top">3.1 (7.8 mm)</td><td align="center" valign="top">3.2 (5.72 mm)</td><td align="center" valign="top">3.3 (2.08 mm)</td></tr><tr><td align="center" valign="top">7 kPa</td><td align="center" valign="top">x</td><td align="center" valign="top">4.2 (8.32 mm)</td><td align="center" valign="top">4.3 (6.76 mm)</td></tr></tbody></table></table-wrap><p>These experiments compared the hardness perception of different users for the same level of pressure at different volumes of air in the VCP. For example, 3.5 kPa can be achieved with 2 ml of air and 5.2 mm of displacement as well as with 3 ml of air and 2.08 mm of displacement of the VCP. 
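The condition set of Table 1 can be restated programmatically, which makes explicit that the same pressure can arise from several volume/displacement pairs. The values below are transcribed from Table 1; the code structure itself is our illustration, not part of the paper:

```python
# (air volume in ml, VCP displacement in mm) -> resulting pressure in kPa (Table 1)
CONDITIONS = {
    (2, 5.20): 3.5, (3, 2.08): 3.5,
    (2, 7.28): 4.5, (3, 5.20): 4.5,
    (2, 7.80): 5.0, (3, 5.72): 5.0, (4, 2.08): 5.0,
    (3, 8.32): 7.0, (4, 6.76): 7.0,
}

def pairs_for_pressure(p_kpa: float):
    """All (volume, displacement) combinations that yield a given pressure."""
    return sorted(k for k, v in CONDITIONS.items() if v == p_kpa)
```

For example, 5 kPa can be reached with three distinct combinations, which is exactly the redundancy the perception study exploits.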
The measurements used were those obtained for the male participant of the previous experiment (<xref ref-type="fig" rid="F5">Figure 5</xref>). Code names for each combination of air pressure and volume in the VCP used in this study, together with the measured normal force exerted (micro load cell CZL635, Phidgets), are shown in <xref ref-type="table" rid="T2">Table 2</xref>.</p><table-wrap id="T2" position="float"><label>Table 2</label><caption><p> Corresponding exerted normal force for each condition of&#x00A0;Table 1.</p></caption><table frame="hsides" rules="rows"><thead><tr><td valign="top"> Condition</td><td align="center" valign="top">Normal Force (N)</td><td align="center" valign="top">Name in Table 1</td><td align="center" valign="top">Air Volume in the VCP (ml)</td><td align="center" valign="top">Resulting Pressure (kPa)</td></tr></thead><tbody><tr><td valign="top">1</td><td valign="top">6.23</td><td valign="top">3.1</td><td valign="top">2</td><td valign="top">5</td></tr><tr><td valign="top">2</td><td valign="top">2.87</td><td valign="top">2.2</td><td valign="top">3</td><td valign="top">4.5</td></tr><tr><td valign="top">3</td><td valign="top">4.33</td><td valign="top">4.3</td><td valign="top">4</td><td valign="top">7</td></tr><tr><td valign="top">4</td><td valign="top">5.55</td><td valign="top">2.1</td><td valign="top">2</td><td valign="top">4.5</td></tr><tr><td valign="top">5</td><td valign="top">3.14</td><td valign="top">1.1</td><td valign="top">2</td><td valign="top">3.5</td></tr><tr><td valign="top">6</td><td valign="top">6.76</td><td align="center" valign="top" colspan="3">No air</td></tr><tr><td valign="top">7</td><td valign="top">7.31</td><td valign="top">4.2</td><td valign="top">3</td><td valign="top">7</td></tr><tr><td valign="top">8</td><td valign="top">1.68</td><td valign="top">3.3</td><td valign="top">4</td><td valign="top">5</td></tr><tr><td valign="top">9</td><td valign="top">0.93</td><td 
valign="top">1.2</td><td valign="top">3</td><td valign="top">3.5</td></tr><tr><td valign="top">10</td><td valign="top">3.14</td><td valign="top">3.2</td><td valign="top">3</td><td valign="top">5</td></tr></tbody></table></table-wrap></sec></sec></sec><sec id="s3" sec-type="results"><title>3.&#x00A0;Results</title><p>This section presents test results that characterise the performance of the FHD. Moreover, we performed a series of experiments to assess user perception of the VCP hardness levels. All experiments were carried out in accordance with the recommendations of the University&#x2019;s policy on research ethics, UWE Research Ethics Committee. The protocol was approved by the Faculty of Environment &#x0026; Technology Research Ethics Committee. All subjects gave written informed consent in accordance with the Declaration of Helsinki.</p><sec id="s3-1"><title>3.1. Characterization of the FHD Components</title><sec id="s3-1-1"><title>3.1.1. VCP Pressure and Deformation</title><p>The maximum deformation of the VCP was measured 5 times using a high-accuracy CCD laser displacement sensor (LK-G402, Keyence) while it was inflated with air volumes in the range of 0&#x2013;4 ml (<xref ref-type="fig" rid="F6">Figure 6A</xref>) and without contact with the user&#x2019;s finger. 
Above 1 ml, the VCP deformation increases more slowly with the supplied volume.</p><fig id="F6" position="float"><label>Figure 6</label><caption><p><bold>(</bold><bold>A</bold><bold>)</bold> Air volume (mm<sup>3</sup>) vs maximum VCP deformation and <bold>(B</bold><bold>)</bold> Pressure in the VCP vs supplied air volume.</p></caption><graphic xlink:href="frobt-05-00062-g006.tif"/></fig><p>The pressure was recorded when the VCP was inflated (blue line, <xref ref-type="fig" rid="F6">Figure 6B</xref>) and when it was inflated with the index finger placed against its surface until contact with the rigid base of the VCP was felt by the user (red line, <xref ref-type="fig" rid="F6">Figure 6B</xref>). A steep pressure change (1.5 to 6.2 kPa) occurs when the air volume increases from 1 to 2 ml and the fingertip applies pressure on the VCP. During this volume increase, the maximum deformation changes from 48 to 71&#x0025; of its highest value, i.e., a change of 23 percentage points, compared to a change of 48 percentage points for 0&#x2013;1 ml. That is, in the region where the deformation increases rapidly with volume, the pressure rises only slowly. The maximum pressure with the finger present is 9.3 kPa, compared to 5.1 kPa without a fingertip present. <xref ref-type="fig" rid="F7">Figure 7</xref> depicts the actual VCP in non-inflated and inflated states. For example, the maximum deformation at 4 ml is 8 mm.</p><fig id="F7" position="float"><label>Figure 7</label><caption><p> Inflation of the VCP with <bold>(</bold><bold>A</bold><bold>)</bold> no air <bold>(</bold><bold>B</bold><bold>)</bold> 2 ml and <bold>(</bold><bold>C</bold><bold>)</bold> 4 ml.</p></caption><graphic xlink:href="frobt-05-00062-g007.tif"/></fig><p>Combining the two graphs in <xref ref-type="fig" rid="F6">Figure 6</xref>, it is possible to identify a relationship between the pressure inside the VCP and its maximum deformation (<xref ref-type="fig" rid="F8">Figure 8</xref>). 
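Combining the two measured curves can be sketched numerically: invert the volume-deformation curve (Figure 6A) and compose it with the volume-pressure curve (Figure 6B). The sample points below are approximate values read off the figures and are illustrative only, not the measured dataset:

```python
# Hypothetical sample points (approximate, illustrative): volumes in ml,
# deformation in mm (Figure 6A), pressure in kPa (Figure 6B, no finger).
volume = [0.0, 0.5, 1.0, 2.0, 3.0, 4.0]
deformation = [0.0, 2.0, 3.8, 5.7, 7.0, 8.0]
pressure = [0.0, 0.3, 1.5, 3.6, 4.6, 5.1]

def interp(x, xs, ys):
    """Piecewise-linear interpolation of y at x over sample points (xs, ys)."""
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            t = (x - xs[i]) / (xs[i + 1] - xs[i])
            return ys[i] + t * (ys[i + 1] - ys[i])
    raise ValueError("x outside sampled range")

def pressure_at_deformation(d_mm):
    """Pressure vs maximum deformation (the Figure 8 curve)."""
    v = interp(d_mm, deformation, volume)  # invert Figure 6A
    return interp(v, volume, pressure)     # evaluate Figure 6B
```

With monotonic samples, inverting the first curve and composing with the second reproduces the pressure-deformation relationship at any intermediate deformation.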
The air pressure inside the VCP remains constant for deformations between 2 mm (0.5 ml) and 3.8 mm (1 ml). This is assumed to be due to the gap between the rigid (base) and flexible (membrane) parts of the VCP.</p><fig id="F8" position="float"><label>Figure 8</label><caption><p> Pressure inside the VCP vs its maximum deformation.</p></caption><graphic xlink:href="frobt-05-00062-g008.tif"/></fig></sec><sec id="s3-1-2"><title>3.1.2. User Study Assessment of FHD Hardness</title><p>The box plot of <xref ref-type="fig" rid="F9">Figure 9</xref> illustrates how participants scored the hardness of the VCP for each condition. While the hardness of condition 6 (the VCP was not inflated) was evaluated with a score of &#x201C;1&#x201D; (hard), conditions 8 (4 ml of air) and 9 (3 ml of air) were evaluated as the softest (scores of &#x201C;4&#x201D; and &#x201C;5&#x201D;). In both conditions, the VCP moved by 2.08 mm and pressed gently on the participants&#x2019; fingertips; the percentages of hardness scores &#x201C;4&#x201D; and &#x201C;5&#x201D; were similar, but the percentage of score &#x201C;5&#x201D; was higher in condition 9 (just above 35&#x0025;), which suggests that, for the same displacement, the VCP feels softer when filled with 3 ml of air. At 3 ml, the VCP is at medium capacity, which makes it more compliant than at 2 ml or 4 ml. This is also seen when comparing conditions 2 (3 ml of air) and 5 (2 ml of air), for which the percentage of score &#x201C;2&#x201D; is under 20&#x0025; and just above 35&#x0025;, respectively; hence, condition 2 is considered softer.</p><fig id="F9" position="float"><label>Figure 9</label><caption><p> Distribution of participants&#x2019; responses for each condition among a scale of 1&#x2013;5 (hard to soft).</p></caption><graphic xlink:href="frobt-05-00062-g009.tif"/></fig><p>The distribution of hardness scores for conditions 1, 4 and 7 was between &#x201C;2&#x201D; and &#x201C;3&#x201D;. 
The percentage of score &#x201C;2&#x201D; for conditions 1, 4 and 7 was approximately 50, 40 and 49&#x0025;, respectively, with condition 4 providing a slightly softer feeling than conditions 1 and 7. This was expected, as in condition 1 the VCP was displaced by 0.52 mm (1 step) more than in condition 4. A comparison between conditions 1 and 7 shows that in the latter the VCP has 1 ml more air and is displaced by 1 step more than in condition 1. As the hardness scores are similar for these conditions, this indicates that 1 step of increase in air volume cancels out 1 step of increase in displacement. Comparing conditions 5 (2 ml of air) and 3 (4 ml of air and 3 steps more displacement than condition 5), their percentages for the combined scores &#x201C;3&#x201D; and &#x201C;4&#x201D; are similar. However, condition 5 had a more even distribution between scores &#x201C;2&#x201D;, &#x201C;3&#x201D; and &#x201C;4&#x201D; than condition 3, which, as will be discussed later, prompted a more consistent response between participants.</p><p>For conditions 2 and 10, the distribution was similar due to only 1 step of displacement difference between them, lying mainly between scores &#x201C;3&#x201D; and &#x201C;4&#x201D;, with &#x201C;3&#x201D; being the prevailing score. However, conditions 3 and 10 seem to have a clearer tendency towards a score of &#x201C;3&#x201D;, with condition 10 (3 ml of air) considered slightly softer. 
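The per-condition score percentages quoted above can be tallied from the raw responses as follows; a minimal sketch, where the demo responses are hypothetical, not the study data:

```python
from collections import Counter

def score_distribution(responses):
    """Percentage of each hardness score (1-5, hard to soft) among all
    responses recorded for one condition (15 participants x 5 trials)."""
    counts = Counter(responses)
    n = len(responses)
    return {score: 100.0 * counts.get(score, 0) / n for score in range(1, 6)}

# Hypothetical raw responses for one condition (75 scores in total):
demo = [3] * 40 + [4] * 25 + [2] * 10
dist = score_distribution(demo)
```

The resulting dictionary maps each score to its share of all 75 recorded responses, which is the quantity plotted per condition in Figure 9.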
Finally, <xref ref-type="fig" rid="F10">Figure 10</xref> shows that there was no significant difference between the responses of men and women, with SDs for conditions 1&#x2013;10, respectively: 0.07, 0.11, 0.17, 0.15, 0.1, 0.04, 0.13, 0.14, 0.27, 0.11 (mean of 0.128).</p><fig id="F10" position="float"><label>Figure 10</label><caption><p> Mean scores for responses of women (red colour) and men (blue colour).</p></caption><graphic xlink:href="frobt-05-00062-g010.tif"/></fig><p>It is worth noting that user perception of hardness does not always correlate with the measured normal force exerted by the VCP. For example, condition 7 was considered softer than condition 6 despite the VCP exerting a higher normal force in the former. This is attributed to the VCP being inflated with 3 ml of air in condition 7.</p><p>The 2nd column of <xref ref-type="table" rid="T3">Table 3</xref> summarises the conditions that correspond to each score of the 1st column according to most of the participants&#x2019; answers. However, for conditions 1 and 10, the participants&#x2019; responses were not consistent (each condition was randomly repeated 5 times). For example, participant A scored condition 1 with &#x201C;2&#x201D;, &#x201C;3&#x201D;, &#x201C;4&#x201D;, &#x201C;2&#x201D;, &#x201C;3&#x201D; across the 5 repetitions of the test, while participant B scored condition 5 with &#x201C;2&#x201D;, &#x201C;3&#x201D;, &#x201C;3&#x201D;, &#x201C;2&#x201D;, &#x201C;2&#x201D;. This indicates that condition 5 receives more robust (consistent) responses than condition 1, as the participant assigns the same score to it more often (in this case, scores &#x201C;2&#x201D; and &#x201C;3&#x201D;, instead of &#x201C;2&#x201D;, &#x201C;3&#x201D; and &#x201C;4&#x201D;). Based on this criterion, conditions 2, 5, 6, 8 and 9 were the most robust, as shown in the 3<sup>rd</sup> column of <xref ref-type="table" rid="T3">Table 3</xref>. 
<xref ref-type="fig" rid="F11">Figure 11</xref> shows the distribution of the &#x201C;robustness percentage&#x201D; of all conditions, determined by whether a participant&#x2019;s set of (5) responses regarding a condition contained a maximum of 2 different scores (e.g., &#x201C;2&#x201D; and &#x201C;3&#x201D;).</p><table-wrap id="T3" position="float"><label>Table 3</label><caption><p> Results of user study: mapping each condition to a hardness score.</p></caption><table frame="hsides" rules="rows"><thead><tr><td valign="bottom">Score (1&#x2013;5, hard-soft)</td><td align="center" valign="bottom">Condition with highest percentage</td><td align="center" valign="bottom">Conditions with consistent responses</td></tr></thead><tbody><tr><td align="left" valign="top">&#x201C;1&#x201D;</td><td valign="top">6</td><td valign="top">6</td></tr><tr><td align="left" valign="top">&#x201C;2&#x201D;</td><td valign="top">1</td><td valign="top">5</td></tr><tr><td align="left" valign="top">&#x201C;3&#x201D;</td><td valign="top">10</td><td valign="top">2</td></tr><tr><td align="left" valign="top">&#x201C;4&#x201D;</td><td valign="top">8</td><td valign="top">8</td></tr><tr><td align="left" valign="top">&#x201C;5&#x201D;</td><td valign="top">9</td><td valign="top">9</td></tr></tbody></table></table-wrap><fig id="F11" position="float"><label>Figure 11</label><caption><p> Percentage of consistency (robustness) between participants&#x2019; responses.</p></caption><graphic xlink:href="frobt-05-00062-g011.tif"/></fig><p> It must be noted that responses of different participants can vary for a given condition. For example, participant C scored condition 9 with &#x201C;5&#x201D;, &#x201C;5&#x201D;, &#x201C;4&#x201D;, &#x201C;4&#x201D;, &#x201C;4&#x201D; while participant D scored it with &#x201C;3&#x201D;, &#x201C;3&#x201D;, &#x201C;2&#x201D;, &#x201C;2&#x201D;, &#x201C;2&#x201D;. 
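The consistency (robustness) criterion just described can be expressed as a short check; the function names are ours, not from the paper's software:

```python
def is_consistent(responses, max_distinct=2):
    """A participant's set of 5 responses for a condition counts as
    consistent (robust) if it contains at most 2 different scores."""
    return len(set(responses)) <= max_distinct

def robustness_percentage(per_participant_responses):
    """Share of participants whose response set for a condition is consistent."""
    n_consistent = sum(is_consistent(r) for r in per_participant_responses)
    return 100.0 * n_consistent / len(per_participant_responses)

# Examples from the text: participant B's scores for condition 5 are
# consistent, participant A's scores for condition 1 are not.
b_scores = [2, 3, 3, 2, 2]
a_scores = [2, 3, 4, 2, 3]
```

Note that, as discussed in the text, two participants can both be "consistent" while disagreeing on the absolute score; the criterion is deliberately within-participant.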
In the data analysis and in the experiments of the following section, such response sets were still treated as robust. Despite such discrepancies, and because responses are inherently subjective, the important requirement was that each condition evokes consistent responses within each individual. Consequently, the 3rd column of <xref ref-type="table" rid="T3">Table 3</xref> summarises the robust conditions chosen for further studies of the FHD, while the 1st column shows their corresponding hardness level. The results show that the FHD can offer at least 5 different hardness levels and could, therefore, render various hardness levels of different objects in VR applications.</p></sec></sec><sec id="s3-2"><title>3.2. Implementation of the FHD - Path Following and Identification of Object Hardness</title><p>Based on the results of the previous user study and the experimental comparison between various combinations of the two features of the FHD (linear displacement and air volume of the VCP), the &#x201C;robust&#x201D; conditions of <xref ref-type="table" rid="T3">Table 3</xref> were used to emulate different levels of hardness in a VR environment created in Unity 3D.</p><p><xref ref-type="fig" rid="F12">Figure 12</xref> shows a snapshot of the environment; it includes a path (white) with marked start and end points and 4 red objects placed at random points along the path (the size of each object carries no haptic information). This path was the basis of a user study evaluating the FHD and its effectiveness both in determining various levels of hardness and in distinguishing between a safe zone and a &#x201C;no-go zone&#x201D;. Testing the two features simultaneously provides a realistic scenario, e.g., a surgical operation where sensory information can be convoluted and the surgeon must be able to attribute each cutaneous signal to its stimulus. 
In total, 14 participants (24&#x2013;38 years old, ratio of women/men 1:1, ratio of right/left dominant hand 12/2) took part in this study.</p><fig id="F12" position="float"><label>Figure 12</label><caption><p> Task in Unity 3D showing a virtual path, moving sphere and lumps of various hardness levels.</p></caption><graphic xlink:href="frobt-05-00062-g012.tif"/></fig><p>Participants were asked to put the FHD on the index finger of their dominant hand, as shown in <xref ref-type="fig" rid="F1">Figure 1A</xref>, and use it to move the small white sphere (bottom part of <xref ref-type="fig" rid="F12">Figure 12</xref>) along the path. They did this by tilting their index finger (pitch) to control the forward/backward movement and by pointing in the direction parallel to the sphere&#x2019;s chosen path (yaw). The IMU tracks the change of direction and the virtual sphere moves accordingly. The goal of the task was to move the sphere from start to end as fast and as accurately as possible, while receiving haptic feedback from the FHD. Force feedback is initiated when the small white sphere deviates from the path, as well as when it touches a red object (lump). The participants also needed to discern which 2 of the 4 red lumps were the hardest.</p><p>A short &#x201C;training&#x201D; session allowed participants to get accustomed to navigating the VR environment along a path (different from the path of the main experiment) and to familiarize themselves with various hardness levels by interacting with virtual objects. Subsequently, each participant completed 3 sets of a total of 6 tasks in a random sequence. In each of the 6 tasks, the participants experienced various levels of haptic feedback when the sphere moved off-path (<xref ref-type="fig" rid="F13">Figure 13C</xref>) and when it touched the red lumps (<xref ref-type="fig" rid="F13">Figure 13B</xref>). 
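The paper does not give the control law of the pitch/yaw mapping; a minimal sketch of the scheme described above might look like the following, where the gain, time step and field names are hypothetical:

```python
import math

def step_sphere(x, z, pitch_deg, yaw_deg, dt, speed_gain=0.05):
    """One update step of the virtual sphere: the finger's pitch sets the
    forward/backward speed, its yaw sets the heading (hypothetical gains)."""
    speed = speed_gain * pitch_deg     # tilt magnitude -> speed
    heading = math.radians(yaw_deg)    # pointing direction -> heading
    x += speed * math.sin(heading) * dt
    z += speed * math.cos(heading) * dt
    return x, z

# e.g. tilting forward 20 deg while pointing along the path's z axis:
x, z = step_sphere(0.0, 0.0, pitch_deg=20.0, yaw_deg=0.0, dt=0.02)
```

Called once per frame with the latest IMU reading, such a mapping moves the sphere in the pointed direction at a speed proportional to the finger's tilt.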
The levels of hardness of the red lumps were chosen randomly and are summarised in <xref ref-type="table" rid="T4">Table 4</xref>. In task 4, the FHD provided no haptic feedback when the sphere deviated from the path or was on the lumps. Furthermore, the area surrounding the path was divided into 3 zones (inner, middle and outer), triggering levels of haptic feedback of increasing hardness as the sphere moves further from the path, as shown in <xref ref-type="table" rid="T4">Table 4</xref>.</p><fig id="F13" position="float"><label>Figure 13</label><caption><p> Snapshots from an experiment where the moving sphere is <bold>(</bold><bold>A</bold><bold>)</bold> on the path, <bold>(</bold><bold>B</bold><bold>)</bold> on a red lump and <bold>(</bold><bold>C</bold><bold>)</bold> off-path.</p></caption><graphic xlink:href="frobt-05-00062-g013.tif"/></fig><table-wrap id="T4" position="float"><label>Table 4</label><caption><p> Levels of hardness (1&#x2013;5 for hard-soft) of (a) the red lumps per experimental path (from the one proximal to the start towards the end of the path), (b) the zones surrounding the path and (c) success rate for detection of the hard red lumps per path.</p></caption><table frame="hsides" rules="rows"><thead><tr><td valign="bottom" rowspan="2"/><td valign="top" colspan="4">Hardness level of red lump</td><td valign="top" colspan="3">Level of haptic feedback in zones surrounding the path</td><td valign="top" rowspan="2">User success rate in detecting the hard red lumps</td></tr><tr><td align="center" valign="bottom">1st</td><td align="center" valign="bottom">2nd</td><td align="center" valign="bottom">3rd</td><td align="center" valign="bottom">4th</td><td align="center" valign="bottom">Inner</td><td align="center" valign="bottom">Middle</td><td align="center" valign="bottom">Outer</td></tr></thead><tbody><tr><td valign="top">Task 1</td><td valign="top">&#x201C;2&#x201D;</td><td valign="top">&#x201C;4&#x201D;</td><td 
valign="top">&#x201C;3&#x201D;</td><td valign="top">&#x201C;4&#x201D;</td><td valign="top">&#x201C;3&#x201D;</td><td valign="top">&#x201C;2&#x201D;</td><td valign="top">&#x201C;1&#x201D;</td><td align="center" valign="top">83.3&#x0025;</td></tr><tr><td valign="top">Task 2</td><td valign="top">&#x201C;3&#x201D;</td><td valign="top">&#x201C;5&#x201D;</td><td valign="top">&#x201C;5&#x201D;</td><td valign="top">&#x201C;3&#x201D;</td><td valign="top">&#x201C;5&#x201D;</td><td valign="top">&#x201C;4&#x201D;</td><td valign="top">&#x201C;3&#x201D;</td><td align="center" valign="top">80&#x0025;</td></tr><tr><td valign="top">Task 3</td><td valign="top">&#x201C;3&#x201D;</td><td valign="top">&#x201C;2&#x201D;</td><td valign="top">&#x201C;3&#x201D;</td><td valign="top">&#x201C;2&#x201D;</td><td valign="top">&#x201C;1&#x201D;</td><td valign="top">&#x201C;1&#x201D;</td><td valign="top">&#x201C;1&#x201D;</td><td align="center" valign="top">78.6&#x0025;</td></tr><tr><td valign="top">Task 4</td><td align="center" valign="top" colspan="7">No haptic feedback</td><td align="center" valign="top">n/a</td></tr><tr><td valign="top">Task 5</td><td valign="top">&#x201C;5&#x201D;</td><td valign="top">&#x201C;3&#x201D;</td><td valign="top">&#x201C;3&#x201D;</td><td valign="top">&#x201C;5&#x201D;</td><td valign="top">&#x201C;4&#x201D;</td><td valign="top">&#x201C;3&#x201D;</td><td valign="top">&#x201C;2&#x201D;</td><td align="center" valign="top">90.5&#x0025;</td></tr><tr><td valign="top">Task 6</td><td valign="top">&#x201C;3&#x201D;</td><td valign="top">&#x201C;3&#x201D;</td><td valign="top">&#x201C;4&#x201D;</td><td valign="top">&#x201C;4&#x201D;</td><td valign="top">&#x201C;2&#x201D;</td><td valign="top">&#x201C;1&#x201D;</td><td valign="top">&#x201C;1&#x201D;</td><td align="center" valign="top">84.5&#x0025;</td></tr></tbody></table></table-wrap><p>The duration of each task, how long the moving sphere was off-path and its distance from the path were recorded. 
The RMS error (RMSE) of the distance between the moving sphere and the path was calculated to evaluate the accuracy and efficiency of the device at each level of haptic feedback. Note that the RMSE was calculated in units of the VR environment, which do not correspond to physical units of measurement; for this reason, no units are given in the analysis of the RMSE metric below.</p><sec id="s3-2-1"><title>3.2.1. Detection of Lump Hardness</title><p>The success rate of hard red lump detection for each task is summarised in the last column of <xref ref-type="table" rid="T4">Table 4</xref>. Participants were most successful in task 5, with a success rate of 90.5&#x0025;. It is worth noting that tasks 5 and 2 involved red lumps with the same levels of hardness (&#x201C;3&#x201D; and &#x201C;5&#x201D;). However, the levels of haptic feedback for the path differ between the two tasks, with the inner zone represented by &#x201C;4&#x201D; and &#x201C;5&#x201D;, respectively. It is possible that the contrast between the red lump hardness and the haptic feedback of the inner zone in task 5 led to a better perception of the hardness of the lumps.</p><p>It can also be seen that successful detection was lowest during task 3, where the haptic feedback of all zones had a level of &#x201C;1&#x201D; (rated as the hardest in the previous study). It is possible that the abrupt change from no feedback (when the sphere is on the path) to maximum feedback (when the sphere goes off-path) confused the participants, as they could not clearly distinguish whether the received feedback related to the lump or the path.</p></sec><sec id="s3-2-2"><title>3.2.2. Learning Curve of FHD</title><p>To investigate the learning curve related to use of the FHD, the mean success rate of hard red lump detection was calculated for each set of the 6 tasks (3 sets in total). 
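The RMSE metric (in VR units, as noted above) amounts to the following computation; the distance log here is hypothetical:

```python
import math

def rmse(distances):
    """Root-mean-square of logged sphere-to-path distances (VR units)."""
    return math.sqrt(sum(d * d for d in distances) / len(distances))

# Hypothetical distance log sampled during one task:
error = rmse([0.0, 0.5, 1.0, 2.0, 0.5])
```

Averaging the squared distances before taking the root weights large deviations from the path more heavily than a plain mean distance would.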
It was found that there was an overall improvement from 81.4&#x0025; successful detections in the 1st set to 87.1&#x0025; in the final set of tasks, when participants were more accustomed to the FHD. The same trend holds for the other objective metrics, i.e., the RMS error of the distance between the moving sphere and the path, the duration of each trial and the total off-path time. These results are summarised in <xref ref-type="table" rid="T5">Table 5</xref>, showing 28&#x0025; increased accuracy in following the path, 23&#x0025; faster completion of each experiment and a 45&#x0025; reduction in the proportion of completion time spent off-path.</p><table-wrap id="T5" position="float"><label>Table 5</label><caption><p>Mean of objective metrics over all participants for each set of the 6 tasks.</p></caption><table frame="hsides" rules="rows"><thead><tr><td valign="bottom"/><td align="center" valign="top">1st Set</td><td align="center" valign="top">2nd Set</td><td align="center" valign="top">3rd Set</td></tr></thead><tbody><tr><td valign="bottom">User success rate in detecting the hard red lumps</td><td align="center" valign="top">81.4 &#x0025;</td><td align="center" valign="top">82.8 &#x0025;</td><td align="center" valign="top">87.1 &#x0025;</td></tr><tr><td valign="bottom">RMSE in distance from path</td><td align="center" valign="top">1.774</td><td align="center" valign="top">1.636</td><td align="center" valign="top">1.273</td></tr><tr><td valign="bottom">Completion Time</td><td align="center" valign="top">41.25 s</td><td align="center" valign="top">33.99 s</td><td align="center" valign="top">31.78 s</td></tr><tr><td valign="bottom">Off-path total time</td><td align="center" valign="top">5.5 s</td><td align="center" valign="top">3.74 s</td><td align="center" valign="top">2.26 s</td></tr><tr><td valign="bottom">Percentage of off-path total time/completion time</td><td align="center" valign="top">13.23 &#x0025;</td><td align="center" valign="top">9.66 &#x0025;</td><td align="center" 
valign="top">7.33 &#x0025;</td></tr></tbody></table></table-wrap></sec><sec id="s3-2-3"><title>3.2.3. Comparison of Tasks by Level of Haptic Feedback</title><p>The mean of the same objective metrics can be calculated for each task; these are given in <xref ref-type="table" rid="T6">Table 6</xref>, where the best result among the tasks is highlighted for each metric. Task 4 did not include any haptic feedback; instead, participants relied solely on vision. Although it was completed in the shortest time, it had the highest RMS error in the distance from the path and the highest percentage of off-path time out of the total completion time. In contrast, all other tasks were associated with higher accuracy despite a longer completion time. Task 5 had the highest accuracy, with a 23&#x0025; improvement compared to the accuracy without haptic feedback. Furthermore, when haptic feedback was available, the proportion of time spent off-path was reduced, e.g., by 45&#x0025; in task 3 and by 36&#x0025; in task 5.</p><table-wrap id="T6" position="float"><label>Table 6</label><caption><p> Mean of objective metrics over all participants for each task.</p></caption><table frame="hsides" rules="rows"><thead><tr><td valign="bottom"/><td align="center" valign="bottom">Task 1</td><td align="center" valign="bottom">Task 2</td><td align="center" valign="bottom">Task 3</td><td align="center" valign="bottom">Task 4</td><td align="center" valign="bottom">Task 5</td><td align="center" valign="bottom">Task 6</td></tr></thead><tbody><tr><td valign="top">Success rate of detection of hard red lumps</td><td align="center" valign="top">83.3&#x0025;</td><td align="center" valign="top">80&#x0025;</td><td align="center" valign="top">78.6&#x0025;</td><td align="center" valign="top">n/a</td><td align="center" valign="top">90.5&#x0025;</td><td align="center" valign="top">84.5&#x0025;</td></tr><tr><td valign="top">RMSE in distance from path</td><td align="center" valign="top">1.562</td><td 
align="center" valign="top">1.532</td><td align="center" valign="top">1.41</td><td align="center" valign="top">1.76</td><td align="center" valign="top">1.36</td><td align="center" valign="top">1.74</td></tr><tr><td valign="top">Completion Time</td><td align="center" valign="top">36.11 s</td><td align="center" valign="top">34.9 s</td><td align="center" valign="top">38.87 s</td><td align="center" valign="top">26.54 s</td><td align="center" valign="top">39.11 s</td><td align="center" valign="top">40.56 s</td></tr><tr><td valign="top">Off-path total time</td><td align="center" valign="top">3.12 s</td><td align="center" valign="top">3.22 s</td><td align="center" valign="top">3.29 s</td><td align="center" valign="top">3.99 s</td><td align="center" valign="top">4.22 s</td><td align="center" valign="top">3.99 s</td></tr><tr><td valign="top">Percentage of off-path total time/completion time</td><td align="center" valign="top">8.45&#x0025;</td><td align="center" valign="top">9.06&#x0025;</td><td align="center" valign="top">8.01&#x0025;</td><td align="center" valign="top">14.57&#x0025;</td><td align="center" valign="top">9.35&#x0025;</td><td align="center" valign="top">11&#x0025;</td></tr></tbody></table></table-wrap><p>A comparison of the results between men and women shows that, overall, women were more accurate and spent less time off-path (<xref ref-type="fig" rid="F14">Figure 14</xref>). This could be due to differences in fingertip thickness; however, as no fingertip measurements were taken, this is not conclusive.</p><fig id="F14" position="float"><label>Figure 14</label><caption><p> Comparison of RMSE and time off-path between men and women.</p></caption><graphic xlink:href="frobt-05-00062-g014.tif"/></fig></sec><sec id="s3-2-4"><title>3.2.4. 
User Experience Feedback</title><p>At the end of their participation in the study, participants could optionally give feedback on their experience with the FHD and the VR environment; 12 of them answered the following open-ended questions:</p><list list-type="bullet"><list-item><p>Do you have any comments about the &#x201C;path following&#x201D; experiment&#x003F;</p></list-item><list-item><p>Do you have any comments about the haptic feedback you experienced in general&#x003F;</p></list-item><list-item><p>Do you think that haptic feedback helped you complete the task&#x003F; Do you think you were faster/more efficient&#x003F;</p></list-item><list-item><p>You experienced various levels of haptic feedback. Do you consider:</p><list list-type="bullet"><list-item><p>Any of the levels you experienced as distracting&#x003F;</p></list-item><list-item><p>The levels generally adequate&#x003F;</p></list-item><list-item><p>Any or all levels too small/negligible&#x003F;</p></list-item></list></list-item></list><p>Of the 12 respondents answering the feedback form, 1 believed that his performance was better overall without the addition of haptic feedback. However, the objective metrics show that his accuracy in following the path improved with the presence of haptic feedback. Most respondents thought that the hardness levels were adequate, while 1 thought that continuous haptic feedback was distracting and 1 found the hardest level the most difficult to identify. 7 respondents thought that the haptic feedback they experienced was useful and helped them complete the task, 2 were unsure and 1 found it unhelpful (2 did not comment on this). 
4 mentioned that the task was difficult to complete: 2 of these respondents performed close to the average, while the other 2 performed above the average in accuracy and made no mistakes in identifying the lumps&#x2019; hardness levels.</p><p>When the virtual sphere was on the red lumps, 3 respondents found it difficult to distinguish between haptic feedback caused by deviating from the path and that caused by the lump&#x2019;s hardness. These respondents made more mistakes than average in identifying lump hardness levels; however, their path-following accuracy was above average. This is also reflected in the results of Task 3 in <xref ref-type="table" rid="T6">Table 6</xref>, where the success rate of hardness identification is the lowest.</p><p>4 respondents mentioned wrist and muscle fatigue during the use of the FHD, although their overall performance was above average. This fatigue was attributed to the hand movements necessary to control the 3D position of the virtual sphere using data from the IMU. 2 women respondents reported that having thin fingers might have prevented them from appropriately feeling the haptic feedback. Their hardness identification was average and their path-following accuracy was above average.</p></sec></sec></sec><sec id="s4" sec-type="discussion"><title>4.&#x00A0;Discussion</title><p>The FHD provides haptic feedback via the VCP, a soft deformable surface, which can convey information to the user about objects with variable stiffness, including soft and deformable surfaces. This differs from other works in the literature, where, for example, stiffness is modelled as a rigid spring (<xref ref-type="bibr" rid="B22">Maereg et al., 2017</xref>). 
Furthermore, the FHD allows for both &#x201C;pressing&#x201D; and &#x201C;tapping&#x201D; on the user&#x2019;s fingertip, depending on the task, unlike the haptic devices mentioned in the Introduction, which are continuously in contact with the user&#x2019;s fingertip.</p><p>As noted in <xref ref-type="table" rid="T2">Table 2</xref>, the FHD can provide a range of forces between 1 and 7&#x00A0;N. The conditions that were tested had a mean of 4.2&#x00A0;N (SD 2.1&#x00A0;N). The normal forces of the &#x201C;robust&#x201D; conditions had a mean of 3.7&#x00A0;N (SD 2.3&#x00A0;N), which is comparable to the 3.2&#x00A0;N mean of normal forces measured when participants palpated tissue directly with their index finger, as reported by <xref ref-type="bibr" rid="B16">Konstantinova et al. (2017)</xref>. The measured forces of <xref ref-type="table" rid="T2">Table 2</xref> are an estimate, as the precise magnitude depends on the thickness of the user&#x2019;s fingertip; future work needs to include this measurement as a parameter of the FHD.</p><p>The user study for the evaluation of the FHD was set up to test its efficacy on two different components: (a) using the haptic feedback as a warning of entering a &#x201C;no-go&#x201D; zone and, at the same time, (b) identifying lumps of different hardness. In a surgical scenario, the virtual path could represent a part of the human abdomen, while the lumps would represent tissue structures of various hardness. Different hardness can also indicate abnormal tissue, e.g., a tumour. From the results presented in <xref ref-type="table" rid="T5 T6">Tables 5, 6</xref>, it can be concluded that the presence of haptic feedback improves the positional accuracy of the participants by a mean of 23&#x0025;, while it also reduces the proportion of time spent in the &#x201C;no-go&#x201D; zone by 45&#x0025;. 
At the same time, participants were able to distinguish between different levels of object hardness, reaching an average of 90.5&#x0025; success in Task 5.</p><p>The conditions used in the experiments, described in <xref ref-type="table" rid="T1">Table 1</xref>, were created by varying the parameters of the FHD in order to derive the combinations that helped determine its functionality. While these conditions do not relate to specific material properties of objects of daily life or of a surgical scenario, this was necessary in order to develop a characterisation in terms of user perception of softness/hardness. In future work, the conditions of the FHD will be mapped to specific objects and further tested by users.</p><p>All participants finished the task faster when no haptic feedback was present. Most participants mentioned this in their feedback form after the experiments but believed that their accuracy was probably worse in that case. The experiments showed that when the VCP was inflated with 3 ml or 4 ml of air and touched the fingertip gently, participants gave the highest scores (indicating a feeling of softness). When the displacement of the VCP increased, the material was scored lower (i.e., perceived as harder). Furthermore, an important criterion used in determining the five different levels of hardness was their &#x201C;robustness&#x201D;, i.e., conditions that elicited consistent responses per participant. This was a necessary adaptation, since each user&#x2019;s perception of the same condition can differ due to various physical or psychological factors.</p><p>In general, women performed better than men in terms of accuracy, time spent off path and correct identification of &#x201C;harder&#x201D; lumps. As mentioned previously, this could be due to differences in fingertip width or thickness (also reported by respondents on their feedback form), which can be further explored in future work.
Software calibration based on fingertip size, in combination with a pressure sensor that detects initial contact with the fingertip, could mitigate these differences. Information from this sensor would be used to adjust the hardness levels; with the current setup, it is possible that users with larger fingertips experience saturation, i.e., levels &#x201C;1&#x201D; and &#x201C;2&#x201D; feel similar because the linear displacement of the VCP pushes against their finger too much. Adjusting the linear motion of the VCP according to the user&#x2019;s fingertip thickness would result in more consistent normal forces among users at each hardness level. Consequently, calibrating the functionality of the FHD according to fingertip size could improve the consistency of user perception for each condition (as summarised in <xref ref-type="table" rid="T2">Table 2</xref>).</p><p>Future work will also include re-designing the FHD to make it even more compact and portable, as well as replacing its tracking system (IMU) with other hand-tracking devices, such as the sensory hand exoskeleton of our previous work, which is also intended for application in a surgical scenario (<xref ref-type="bibr" rid="B36">Tzemanaki et al., 2014</xref>). Future experiments can include PHF to complement the functionality of the FHD by showing deformation in the virtual tissues when the user &#x201C;touches&#x201D; them, as in the work by <xref ref-type="bibr" rid="B20">Li et al. (2015)</xref>. However, this would only serve a training scenario, e.g., a surgical simulator, whereas the FHD could also provide haptic feedback during an actual surgical procedure.</p></sec><sec id="S5"><title>Ethics Statement</title><p>All experiments were carried out in accordance with the recommendations of the University&#x2019;s policy on research ethics, UWE Research Ethics Committee. The protocol was approved by the Faculty of Environment &#x0026; Technology Research Ethics Committee.
All subjects gave written informed consent in accordance with the Declaration of Helsinki.</p></sec><sec id="S6"><title>Author Contributions</title><p>AT contributed the majority of the text, made substantial contributions to project conception, design and analysis of data, and supervised the development process. GAA made substantial contributions to the design and development of the work and the acquisition of data, took part in the analysis of results and contributed to drafting the work. CM contributed to the conception of the work and revised it. SD made substantial contributions to the conception of the work, supervised the design and development and revised the manuscript. All authors approved the final version and agree to be accountable for ensuring that questions related to the accuracy or integrity of any part of the work are investigated and resolved.</p></sec><sec id="S7"><title>Conflict of Interest Statement</title><p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p></sec></body><back><fn-group><fn fn-type="financial-disclosure"><p><bold>Funding.</bold> This project has received funding from the European Union&#x2019;s Horizon 2020 research and innovation programme under grant agreement No 732515.</p></fn></fn-group><ref-list><title>References</title><ref id="B1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Abushagur</surname><given-names>A. A</given-names></name> <name><surname>Arsad</surname><given-names>N</given-names></name> <name><surname>Reaz</surname><given-names>M. I</given-names></name> <name><surname>Bakar</surname><given-names>A. A</given-names></name></person-group>. (<year>2014</year>).
<article-title>Advances in bio-tactile sensors for minimally invasive surgery using the fibre Bragg grating force sensor technique: a survey</article-title>. <source>Sensors</source> <volume>14</volume> (<issue>4</issue>), <fpage>6633</fpage>&#x2013;<lpage>6665</lpage>. <pub-id pub-id-type="doi">10.3390/s140406633</pub-id></citation></ref><ref id="B2"><citation citation-type="thesis"><person-group person-group-type="author"><name><surname>Achibet</surname><given-names>M</given-names></name></person-group>. (<year>2015</year>). <article-title>Contributions to the design of novel hand-based interaction techniques for virtual environments</article-title>. <source>PhD thesis</source>. <publisher-loc>France</publisher-loc>: <publisher-name>INSA de Rennes</publisher-name>.</citation></ref><ref id="B3"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Bianchi</surname><given-names>M</given-names></name> <name><surname>Battaglia</surname><given-names>E</given-names></name> <name><surname>Poggiani</surname><given-names>M</given-names></name> <name><surname>Ciotti</surname><given-names>S</given-names></name> <name><surname>Bicchi</surname><given-names>A</given-names></name></person-group>. (<year>2016</year>). &#x201C;<article-title>A wearable fabric-based display for haptic multi-cue delivery</article-title>,&#x201D; <comment>in</comment> <source>Haptics Symposium (HAPTICS), 2016 IEEE</source>. <publisher-name>IEEE</publisher-name> <fpage>277</fpage>&#x2013;<lpage>283</lpage>.</citation></ref><ref id="B4"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Chinello</surname><given-names>F</given-names></name> <name><surname>Malvezzi</surname><given-names>M</given-names></name> <name><surname>Pacchierotti</surname><given-names>C</given-names></name> <name><surname>Prattichizzo</surname><given-names>D</given-names></name></person-group>. (<year>2015</year>).
&#x201C;<article-title>Design and development of a 3RRS wearable fingertip cutaneous device</article-title>&#x201D; <conf-name>IEEE International Conference on Advanced Intelligent Mechatronics (AIM)</conf-name> <fpage>293</fpage>&#x2013;<lpage>298</lpage>.</citation></ref><ref id="B9"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fontana</surname><given-names>M</given-names></name> <name><surname>Salsedo</surname><given-names>F</given-names></name> <name><surname>Marcheschi</surname><given-names>S</given-names></name> <name><surname>Bergamasco</surname><given-names>M</given-names></name></person-group>. (<year>2013</year>). <article-title>Haptic hand exoskeleton for precision grasp simulation</article-title>. <source>J. Mech. Robot.</source> <volume>5</volume> (<issue>4</issue>):<elocation-id>041014</elocation-id>. <pub-id pub-id-type="doi">10.1115/1.4024981</pub-id></citation></ref><ref id="B5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Frediani</surname><given-names>G</given-names></name> <name><surname>Mazzei</surname><given-names>D</given-names></name> <name><surname>De Rossi</surname><given-names>D. E</given-names></name> <name><surname>Carpi</surname><given-names>F</given-names></name></person-group>. (<year>2014</year>). <article-title>Wearable wireless tactile display for virtual interactions with soft bodies</article-title>. <source>Front. Bioeng. Biotechnol.</source> <volume>2</volume>, <fpage>31</fpage>. <pub-id pub-id-type="doi">10.3389/fbioe.2014.00031</pub-id></citation></ref><ref id="B6"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Gabardi</surname><given-names>M</given-names></name> <name><surname>Solazzi</surname><given-names>M</given-names></name> <name><surname>Leonardis</surname><given-names>D</given-names></name> <name><surname>Frisoli</surname><given-names>A</given-names></name></person-group>.
(<year>2016</year>). &#x201C;<article-title>A new wearable fingertip haptic interface for the rendering of virtual shapes and surface features</article-title>&#x201D; <conf-name>2016 IEEE Haptics Symposium (HAPTICS)</conf-name> <fpage>140</fpage>&#x2013;<lpage>146</lpage>.</citation></ref><ref id="B7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Garc&#x00ED;a</surname><given-names>J. C</given-names></name> <name><surname>Patr&#x00E3;o</surname><given-names>B</given-names></name> <name><surname>Almeida</surname><given-names>L</given-names></name> <name><surname>P&#x00E9;rez</surname><given-names>J</given-names></name> <name><surname>Menezes</surname><given-names>P</given-names></name> <name><surname>Dias</surname><given-names>J</given-names></name> <etal/></person-group>. (<year>2017</year>). <article-title>A natural interface for remote operation of underwater robots</article-title>. <source>IEEE Comput. Graph. Appl.</source> <volume>37</volume> (<issue>1</issue>), <fpage>34</fpage>&#x2013;<lpage>43</lpage>. <pub-id pub-id-type="doi">10.1109/MCG.2015.118</pub-id></citation></ref><ref id="B8"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Gu</surname><given-names>X</given-names></name> <name><surname>Zhang</surname><given-names>Y</given-names></name> <name><surname>Sun</surname><given-names>W</given-names></name> <name><surname>Bian</surname><given-names>Y</given-names></name> <name><surname>Zhou</surname><given-names>D</given-names></name> <name><surname>Kristensson</surname><given-names>PO</given-names></name></person-group>. (<year>2016</year>). 
&#x201C;<article-title>Dexmo: An inexpensive and lightweight mechanical exoskeleton for motion capture and force feedback in VR</article-title>&#x201D; <conf-name>Proceedings of the CHI Conference on Human Factors in Computing Systems</conf-name> <fpage>1991</fpage>&#x2013;<lpage>1995</lpage>.</citation></ref><ref id="B10"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hagn</surname><given-names>U</given-names></name> <name><surname>Konietschke</surname><given-names>R</given-names></name> <name><surname>Tobergte</surname><given-names>A</given-names></name> <name><surname>Nickl</surname><given-names>M</given-names></name> <name><surname>J&#x00F6;rg</surname><given-names>S</given-names></name> <name><surname>K&#x00FC;bler</surname><given-names>B</given-names></name> <etal/></person-group>. (<year>2010</year>). <article-title>DLR MiroSurge: a versatile system for research in endoscopic telesurgery</article-title>. <source>Int. J. Comput. Assist. Radiol. Surg.</source> <volume>5</volume> (<issue>2</issue>), <fpage>183</fpage>&#x2013;<lpage>193</lpage>. <pub-id pub-id-type="doi">10.1007/s11548-009-0372-4</pub-id></citation></ref><ref id="B11"><citation citation-type="web"><collab>Intuitive Surgical</collab>. (<year>2017</year>). <article-title>Da Vinci Surgery - Robotic-Assisted Minimally Invasive Surgery</article-title>. Available at: <uri xlink:href="www.davincisurgery.com/davinci-surgery">www.davincisurgery.com/davinci-surgery</uri></citation></ref><ref id="B12"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Iqbal</surname><given-names>J</given-names></name> <name><surname>Khan</surname><given-names>H</given-names></name> <name><surname>Tsagarakis</surname><given-names>N. G</given-names></name> <name><surname>Caldwell</surname><given-names>D. G</given-names></name></person-group>. (<year>2014</year>).
<article-title>A novel exoskeleton robotic system for hand rehabilitation&#x2013;conceptualization to prototyping</article-title>. <source>Biocybern. Biomed. Eng.</source> <volume>34</volume> (<issue>2</issue>), <fpage>79</fpage>&#x2013;<lpage>89</lpage>. <pub-id pub-id-type="doi">10.1016/j.bbe.2014.01.003</pub-id></citation></ref><ref id="B13"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Kim</surname><given-names>M</given-names></name> <name><surname>Jang</surname><given-names>I</given-names></name> <name><surname>Lee</surname><given-names>Y</given-names></name> <name><surname>Lee</surname><given-names>Y</given-names></name> <name><surname>Lee</surname><given-names>D</given-names></name></person-group>. (<year>2016</year>). &#x201C;<article-title>Wearable 3-DoF cutaneous haptic device with integrated IMU-based finger tracking</article-title>&#x201D; <conf-name>13th IEEE International Conference on Ubiquitous Robots and Ambient Intelligence (URAI)</conf-name> <fpage>649</fpage>&#x2013;<lpage>649</lpage>.</citation></ref><ref id="B14"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kirkman</surname><given-names>M. A</given-names></name> <name><surname>Ahmed</surname><given-names>M</given-names></name> <name><surname>Albert</surname><given-names>A. F</given-names></name> <name><surname>Wilson</surname><given-names>M. H</given-names></name> <name><surname>Nandi</surname><given-names>D</given-names></name> <name><surname>Sevdalis</surname><given-names>N</given-names></name></person-group>. (<year>2014</year>). <article-title>The use of simulation in neurosurgical education and training. A systematic review</article-title>. <source>J. Neurosurg.</source> <volume>121</volume> (<issue>2</issue>), <fpage>228</fpage>&#x2013;<lpage>246</lpage>.
<pub-id pub-id-type="doi">10.3171/2014.5.JNS131766</pub-id></citation></ref><ref id="B15"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Koo</surname><given-names>I. M</given-names></name> <name><surname>Jung</surname><given-names>K</given-names></name> <name><surname>Koo</surname><given-names>J. C</given-names></name> <name><surname>Nam</surname><given-names>J. -D</given-names></name> <name><surname>Lee</surname><given-names>Y. K</given-names></name> <name><surname>Choi</surname><given-names>H. R</given-names></name></person-group>. (<year>2008</year>). <article-title>Development of soft-actuator-based wearable tactile display</article-title>. <source>IEEE Trans. Robot.</source> <volume>24</volume> (<issue>3</issue>), <fpage>549</fpage>&#x2013;<lpage>558</lpage>. <pub-id pub-id-type="doi">10.1109/TRO.2008.921561</pub-id></citation></ref><ref id="B16"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Konstantinova</surname><given-names>J</given-names></name> <name><surname>Cotugno</surname><given-names>G</given-names></name> <name><surname>Dasgupta</surname><given-names>P</given-names></name> <name><surname>Althoefer</surname><given-names>K</given-names></name> <name><surname>Nanayakkara</surname><given-names>T</given-names></name></person-group>. (<year>2017</year>). <article-title>Palpation force modulation strategies to identify hard regions in soft tissue organs</article-title>. <source>PLoS ONE</source> <volume>12</volume> (<issue>2</issue>):<elocation-id>e0171706</elocation-id>. <pub-id pub-id-type="doi">10.1371/journal.pone.0171706</pub-id></citation></ref><ref id="B17"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Kulakov</surname><given-names>F</given-names></name> <name><surname>Alferov</surname><given-names>G</given-names></name> <name><surname>Efimova</surname><given-names>P</given-names></name></person-group>. (<year>2015</year>). 
&#x201C;<article-title>Methods of remote control over space robots</article-title>,&#x201D; <comment>in</comment> <source>Mechanics-Seventh Polyakhov&#x2019;s Reading, 2015 International Conference on</source>. <publisher-name>IEEE</publisher-name> <fpage>1</fpage>&#x2013;<lpage>6</lpage>.</citation></ref><ref id="B18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lemole</surname><given-names>G. M</given-names></name> <name><surname>Banerjee</surname><given-names>P. P</given-names></name> <name><surname>Luciano</surname><given-names>C</given-names></name> <name><surname>Neckrysh</surname><given-names>S</given-names></name> <name><surname>Charbel</surname><given-names>F. T</given-names></name></person-group>. (<year>2007</year>). <article-title>Virtual reality in neurosurgical education: part-task ventriculostomy simulation with dynamic visual and haptic feedback</article-title>. <source>Neurosurgery</source> <volume>61</volume> (<issue>1</issue>), <fpage>142</fpage>&#x2013;<lpage>149</lpage>. <pub-id pub-id-type="doi">10.1227/01.neu.0000279734.22931.21</pub-id></citation></ref><ref id="B19"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Leonardis</surname><given-names>D</given-names></name> <name><surname>Solazzi</surname><given-names>M</given-names></name> <name><surname>Bortone</surname><given-names>I</given-names></name> <name><surname>Frisoli</surname><given-names>A</given-names></name></person-group>. (<year>2015</year>). 
&#x201C;<article-title>A wearable fingertip haptic device with 3 DoF asymmetric 3-RSR kinematics</article-title>&#x201D; <conf-name>IEEE World Haptics Conference (WHC)</conf-name> <fpage>388</fpage>&#x2013;<lpage>393</lpage>.</citation></ref><ref id="B20"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Li</surname><given-names>M</given-names></name> <name><surname>Konstantinova</surname><given-names>J</given-names></name> <name><surname>Secco</surname><given-names>E. L</given-names></name> <name><surname>Jiang</surname><given-names>A</given-names></name> <name><surname>Liu</surname><given-names>H</given-names></name> <name><surname>Nanayakkara</surname><given-names>T</given-names></name> <etal/></person-group>. (<year>2015</year>). <article-title>Using visual cues to enhance haptic feedback for palpation on virtual model of soft tissue</article-title>. <source>Med. Biol. Eng. Comput.</source> <volume>53</volume> (<issue>11</issue>), <fpage>1177</fpage>&#x2013;<lpage>1186</lpage>. <pub-id pub-id-type="doi">10.1007/s11517-015-1309-4</pub-id></citation></ref><ref id="B21"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Ma</surname><given-names>X</given-names></name> <name><surname>Guo</surname><given-names>S</given-names></name> <name><surname>Guo</surname><given-names>J</given-names></name> <name><surname>Wei</surname><given-names>W</given-names></name> <name><surname>Ji</surname><given-names>Y</given-names></name> <name><surname>Wang</surname><given-names>Y</given-names></name></person-group>. (<year>2014</year>).
&#x201C;<article-title>A developed robotic manipulation system for remote catheter operation</article-title>&#x201D; <conf-name>IEEE International Conference on Mechatronics and Automation (ICMA)</conf-name> <fpage>912</fpage>&#x2013;<lpage>917</lpage>.</citation></ref><ref id="B22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Maereg</surname><given-names>A. T</given-names></name> <name><surname>Nagar</surname><given-names>A</given-names></name> <name><surname>Reid</surname><given-names>D</given-names></name> <name><surname>Secco</surname><given-names>E. L</given-names></name></person-group>. (<year>2017</year>). <article-title>Wearable vibrotactile haptic device for stiffness discrimination during virtual interactions</article-title>. <source>Front. Robot. AI.</source> <volume>4</volume>:<elocation-id>42</elocation-id>. <pub-id pub-id-type="doi">10.3389/frobt.2017.00042</pub-id></citation></ref><ref id="B23"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Meng</surname><given-names>M</given-names></name> <name><surname>Fallavollita</surname><given-names>P</given-names></name> <name><surname>Blum</surname><given-names>T</given-names></name> <name><surname>Eck</surname><given-names>U</given-names></name> <name><surname>Sandor</surname><given-names>C</given-names></name> <name><surname>Weidert</surname><given-names>S</given-names></name></person-group>. (<year>2013</year>). &#x201C;<article-title>Kinect for interactive ar anatomy learning</article-title>,&#x201D; <comment>in</comment> <source>Mixed and Augmented Reality (ISMAR), 2013 IEEE International Symposium on</source>. 
<publisher-name>IEEE</publisher-name> <fpage>277</fpage>&#x2013;<lpage>278</lpage>.</citation></ref><ref id="B24"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Minamizawa</surname><given-names>K</given-names></name> <name><surname>Fukamachi</surname><given-names>S</given-names></name> <name><surname>Kajimoto</surname><given-names>H</given-names></name> <name><surname>Kawakami</surname><given-names>N</given-names></name> <name><surname>Tachi</surname><given-names>S</given-names></name></person-group>. (<year>2007</year>). &#x201C;<article-title>Gravity grabber: wearable haptic display to present virtual mass sensation</article-title>,&#x201D; <comment>in</comment> <source>ACM SIGGRAPH 2007 emerging technologies (ACM)</source> <fpage>p. 8</fpage>.</citation></ref><ref id="B39"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Monroy</surname><given-names>M</given-names></name> <name><surname>Oyarzabal</surname><given-names>M</given-names></name> <name><surname>Ferre</surname><given-names>M</given-names></name> <name><surname>Campos</surname><given-names>A</given-names></name> <name><surname>Barrio</surname><given-names>J</given-names></name></person-group>. (<year>2008</year>). &#x201C;<article-title>MasterFinger: Multi-finger Haptic Interface for Collaborative Environments</article-title>&#x201D; <conf-name>In International Conference on Human Haptic Sensing and Touch Enabled Computer Applications</conf-name> <fpage>411</fpage>&#x2013;<lpage>419</lpage>.</citation></ref><ref id="B25"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Murray</surname><given-names>A. M</given-names></name> <name><surname>Klatzky</surname><given-names>R. L</given-names></name> <name><surname>Khosla</surname><given-names>P. K</given-names></name></person-group>. (<year>2003</year>). 
&#x201C;<article-title>Psychophysical characterization and testbed validation of a wearable vibrotactile glove for telemanipulation</article-title>,&#x201D; <comment>in</comment> <source>Presence: Teleoperators and Virtual Environments</source>, <volume>Vol. 12</volume>, <fpage>156</fpage>&#x2013;<lpage>182</lpage>. <pub-id pub-id-type="doi">10.1162/105474603321640923</pub-id></citation></ref><ref id="B26"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nagatani</surname><given-names>K</given-names></name> <name><surname>Kiribayashi</surname><given-names>S</given-names></name> <name><surname>Okada</surname><given-names>Y</given-names></name> <name><surname>Otake</surname><given-names>K</given-names></name> <name><surname>Yoshida</surname><given-names>K</given-names></name> <name><surname>Tadokoro</surname><given-names>S</given-names></name> <etal/></person-group>. (<year>2013</year>). <article-title>Emergency response to the nuclear accident at the fukushima daiichi nuclear power plants using mobile rescue robots</article-title>. <source>J. Field Robotics</source> <volume>30</volume> (<issue>1</issue>), <fpage>44</fpage>&#x2013;<lpage>63</lpage>. <pub-id pub-id-type="doi">10.1002/rob.21439</pub-id></citation></ref><ref id="B30"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Pacchierotti</surname><given-names>C</given-names></name> <name><surname>Chinello</surname><given-names>F</given-names></name> <name><surname>Malvezzi</surname><given-names>M</given-names></name> <name><surname>Meli</surname><given-names>L</given-names></name> <name><surname>Prattichizzo</surname><given-names>D</given-names></name></person-group>. (<year>2012</year>). 
&#x201C;<article-title>Two finger grasping simulation with cutaneous and kinesthetic force feedback</article-title>&#x201D; <conf-name>International Conference on Human Haptic Sensing and Touch Enabled Computer Applications</conf-name> <fpage>373</fpage>&#x2013;<lpage>382</lpage>.</citation></ref><ref id="B31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pacchierotti</surname><given-names>C</given-names></name> <name><surname>Sinclair</surname><given-names>S</given-names></name> <name><surname>Solazzi</surname><given-names>M</given-names></name> <name><surname>Frisoli</surname><given-names>A</given-names></name> <name><surname>Hayward</surname><given-names>V</given-names></name> <name><surname>Prattichizzo</surname><given-names>D</given-names></name></person-group>. (<year>2017</year>). <article-title>Wearable haptic systems for the fingertip and the hand: taxonomy, review, and perspectives</article-title>. <source>IEEE Trans. Haptics</source> <volume>10</volume> (<issue>4</issue>):<elocation-id>580</elocation-id>&#x2013;<lpage>600</lpage>. <pub-id pub-id-type="doi">10.1109/TOH.2017.2689006</pub-id></citation></ref><ref id="B32"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pacchierotti</surname><given-names>C</given-names></name> <name><surname>Tirmizi</surname><given-names>A</given-names></name> <name><surname>Prattichizzo</surname><given-names>D</given-names></name></person-group>. (<year>2014</year>). <article-title>Improving transparency in teleoperation by means of cutaneous tactile force feedback</article-title>. <source>ACM Trans. Appl. Percept.</source> <volume>11</volume> (<issue>1</issue>):<elocation-id>4</elocation-id>&#x2013;<lpage>16</lpage>.
<pub-id pub-id-type="doi">10.1145/2604969</pub-id></citation></ref><ref id="B27"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Park</surname><given-names>Y</given-names></name> <name><surname>Jo</surname><given-names>I</given-names></name> <name><surname>Bae</surname><given-names>J</given-names></name></person-group>. (<year>2016</year>). &#x201C;<article-title>Development of a dual-cable hand exoskeleton system for virtual reality</article-title>&#x201D; <conf-name>IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)</conf-name> <fpage>1019</fpage>&#x2013;<lpage>1024</lpage>.</citation></ref><ref id="B28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Peters</surname><given-names>M</given-names></name> <name><surname>Mackenzie</surname><given-names>K</given-names></name> <name><surname>Bryden</surname><given-names>P</given-names></name></person-group>. (<year>2002</year>). <article-title>Finger length and distal finger extent patterns in humans</article-title>. <source>Am. J. Phys. Anthropol.</source> <volume>117</volume> (<issue>3</issue>), <fpage>209</fpage>&#x2013;<lpage>217</lpage>. <pub-id pub-id-type="doi">10.1002/ajpa.10029</pub-id></citation></ref><ref id="B29"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Prattichizzo</surname><given-names>D</given-names></name> <name><surname>Chinello</surname><given-names>F</given-names></name> <name><surname>Pacchierotti</surname><given-names>C</given-names></name> <name><surname>Malvezzi</surname><given-names>M</given-names></name></person-group>. (<year>2013</year>). <article-title>Towards wearability in fingertip haptics: a 3-dof wearable device for cutaneous force feedback</article-title>. <source>IEEE Trans. Haptics</source> <volume>6</volume> (<issue>4</issue>), <fpage>506</fpage>&#x2013;<lpage>516</lpage>. 
<pub-id pub-id-type="doi">10.1109/TOH.2013.53</pub-id></citation></ref><ref id="B33"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schorr</surname><given-names>S. B</given-names></name> <name><surname>Okamura</surname><given-names>A. M</given-names></name></person-group>. (<year>2017</year>). <article-title>Three-dimensional skin deformation as force substitution: wearable device design and performance during haptic exploration of virtual environments</article-title>. <source>IEEE Trans. Haptics</source> <volume>10</volume> (<issue>3</issue>):<elocation-id>418</elocation-id>&#x2013;<lpage>430</lpage>. <pub-id pub-id-type="doi">10.1109/TOH.2017.2672969</pub-id></citation></ref><ref id="B34"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Srinivasan</surname><given-names>M. A</given-names></name> <name><surname>Basdogan</surname><given-names>C</given-names></name></person-group>. (<year>1997</year>). <article-title>Haptics in virtual environments: Taxonomy, research status, and challenges</article-title>. <source>Comput. Graph.</source> <volume>21</volume> (<issue>4</issue>), <fpage>393</fpage>&#x2013;<lpage>404</lpage>. <pub-id pub-id-type="doi">10.1016/S0097-8493(97)00030-7</pub-id></citation></ref><ref id="B35"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Tsetserukou</surname><given-names>D</given-names></name> <name><surname>Sato</surname><given-names>K</given-names></name> <name><surname>Tachi</surname><given-names>S</given-names></name></person-group>. (<year>2010</year>). 
&#x201C;<article-title>Exo-interfaces: novel exoskeleton haptic interfaces for virtual reality, augmented sport and rehabilitation</article-title>&#x201D; <conf-name>Proceedings of the 1st Augmented Human International Conference (ACM).</conf-name></citation></ref><ref id="B36"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Tzemanaki</surname><given-names>A</given-names></name> <name><surname>Burton</surname><given-names>TM</given-names></name> <name><surname>Gillatt</surname><given-names>D</given-names></name> <name><surname>Melhuish</surname><given-names>C</given-names></name> <name><surname>Persad</surname><given-names>R</given-names></name> <name><surname>Pipe</surname><given-names>AG</given-names></name></person-group>. (<year>2014</year>). &#x201C;<article-title>&#x03BC;Angelo: A novel minimally invasive surgical system based on an anthropomorphic design</article-title>&#x201D; <conf-name>5th IEEE RAS &#x0026; EMBS International Conference on Biomedical Robotics and Biomechatronics</conf-name> <fpage>369</fpage>&#x2013;<lpage>374</lpage>.</citation></ref><ref id="B37"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>van der Meijden</surname><given-names>O. A. J</given-names></name> <name><surname>Schijven</surname><given-names>M. P</given-names></name></person-group>. (<year>2009</year>). <article-title>The value of haptic feedback in conventional and robot-assisted minimal invasive surgery and virtual reality training: a current review</article-title>. <source>Surg. Endosc.</source> <volume>23</volume> (<issue>6</issue>), <fpage>1180</fpage>&#x2013;<lpage>1190</lpage>. 
<pub-id pub-id-type="doi">10.1007/s00464-008-0298-x</pub-id></citation></ref><ref id="B38"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Vander Poorten</surname><given-names>E</given-names></name> <name><surname>Demeester</surname><given-names>E</given-names></name> <name><surname>Lammertse</surname><given-names>P</given-names></name></person-group>. (<year>2012</year>). &#x201C;<article-title>Haptic feedback for medical applications, a survey</article-title>,&#x201D; <comment>in</comment> <source>Proceedings Actuator 2012</source>.</citation></ref></ref-list></back></article>
