<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Robot. AI</journal-id>
<journal-title>Frontiers in Robotics and AI</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Robot. AI</abbrev-journal-title>
<issn pub-type="epub">2296-9144</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/frobt.2019.00124</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Robotics and AI</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Tactile Signatures and Hand Motion Intent Recognition for Wearable Assistive Devices</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Stefanou</surname> <given-names>Thekla</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="corresp" rid="c001"><sup>&#x0002A;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/527937/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Chance</surname> <given-names>Greg</given-names></name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/788169/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Assaf</surname> <given-names>Tareq</given-names></name>
<xref ref-type="aff" rid="aff3"><sup>3</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Dogramadzi</surname> <given-names>Sanja</given-names></name>
<xref ref-type="aff" rid="aff4"><sup>4</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/384296/overview"/>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>ZITI, Heidelberg University</institution>, <addr-line>Heidelberg</addr-line>, <country>Germany</country></aff>
<aff id="aff2"><sup>2</sup><institution>Bristol Robotics Laboratory, Department of Computer Science, University of Bristol</institution>, <addr-line>Bristol</addr-line>, <country>United Kingdom</country></aff>
<aff id="aff3"><sup>3</sup><institution>Department of Electronic and Electrical Engineering, University of Bath</institution>, <addr-line>Bath</addr-line>, <country>United Kingdom</country></aff>
<aff id="aff4"><sup>4</sup><institution>Bristol Robotics Laboratory, Department of Engineering Design and Mathematics, University of the West of England</institution>, <addr-line>Bristol</addr-line>, <country>United Kingdom</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Jungwon Yoon, Gwangju Institute of Science and Technology, South Korea</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Wajid Mumtaz, University of Technology Petronas, Malaysia; Gustavo Hern&#x000E1;ndez Melgarejo, CINVESTAV Unidad Guadalajara, Mexico</p></fn>
<corresp id="c001">&#x0002A;Correspondence: Thekla Stefanou <email>thekla.stefanou&#x00040;ziti.uni-heidelberg.de</email></corresp>
<fn fn-type="other" id="fn001"><p>This article was submitted to Biomedical Robotics, a section of the journal Frontiers in Robotics and AI</p></fn></author-notes>
<pub-date pub-type="epub">
<day>21</day>
<month>11</month>
<year>2019</year>
</pub-date>
<pub-date pub-type="collection">
<year>2019</year>
</pub-date>
<volume>6</volume>
<elocation-id>124</elocation-id>
<history>
<date date-type="received">
<day>22</day>
<month>04</month>
<year>2019</year>
</date>
<date date-type="accepted">
<day>04</day>
<month>11</month>
<year>2019</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2019 Stefanou, Chance, Assaf and Dogramadzi.</copyright-statement>
<copyright-year>2019</copyright-year>
<copyright-holder>Stefanou, Chance, Assaf and Dogramadzi</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p></license>
</permissions>
<abstract><p>Within the field of robotics and autonomous systems where there is a human in the loop, intent recognition plays an important role. This is especially true for wearable assistive devices used for rehabilitation, particularly post-stroke recovery. This paper reports results on the use of tactile patterns to detect weak muscle contractions in the forearm while at the same time associating these patterns with the muscle synergies of different grips. To investigate this concept, a series of experiments with healthy participants was carried out using a tactile arm brace (TAB) on the forearm while performing four different types of grip. The expected force patterns were established by analysing the muscle synergies of the four grip types and the forearm physiology. The results showed that the tactile signatures of the forearm recorded by the TAB align with the anticipated force patterns. Furthermore, linear separability of the data across all four grip types was identified. Using the TAB data, machine learning algorithms achieved a 99% classification accuracy. The TAB results were highly comparable to those of a similar commercial intent recognition system based on surface electromyography (sEMG) sensing.</p></abstract>
<kwd-group>
<kwd>motion intent</kwd>
<kwd>wearable sensors</kwd>
<kwd>upper-limb</kwd>
<kwd>tactile sensing</kwd>
<kwd>assistive devices</kwd>
</kwd-group>
<contract-sponsor id="cn001">Engineering and Physical Sciences Research Council<named-content content-type="fundref-id">10.13039/501100000266</named-content></contract-sponsor>
<counts>
<fig-count count="10"/>
<table-count count="5"/>
<equation-count count="0"/>
<ref-count count="45"/>
<page-count count="17"/>
<word-count count="10484"/>
</counts>
</article-meta>
</front>
<body>
<sec sec-type="intro" id="s1">
<title>1. Introduction</title>
<p>The motivation behind this work lies in empowering individuals with mobility impairments to rehabilitate after stroke or similar debilitating conditions. With an aging population (World Health Organisation, <xref ref-type="bibr" rid="B41">2014</xref>), keeping people active and independent for as long as possible is becoming increasingly important. The number of occupational therapists and physiotherapists in the UK is not sufficient to cover the needs of this aging demographic (McHugh and Swain, <xref ref-type="bibr" rid="B20">2013</xref>). Rehabilitation robots have shown potential to alleviate this problem by assisting in the controlled, repetitive movements typically provided by therapists. By recognizing patients&#x00027; motion intent, wearable rehabilitative devices can further assist in performing the desired movement. These devices should provide just enough force to move the limbs as intended, keeping the patient in the control loop (Warraich and Kleim, <xref ref-type="bibr" rid="B37">2010</xref>).</p>
<p>Intent recognition has been the subject of numerous studies and various sensing modalities have been used over the years. The consistency and accuracy of motion intent recognition devices varies, however, depending on conditions such as patient strength and skin moisture. Muscle contraction gives rise to two types of signals, electrical and mechanical. The former, in the form of electromyography (EMG), has been implemented in many commercial products for motion intent recognition (Thalmic Labs, <xref ref-type="bibr" rid="B35">2015b</xref>; Wearable Devices Ltd., <xref ref-type="bibr" rid="B38">2019</xref>). More recent research has shifted toward the observation of the mechanical signals produced during muscle contraction, namely mechanomyography (MMG) and force myography (FMG), also referred to as tactile imaging. Over the last few decades, EMG motion intent recognition has been widely implemented in assisted living (Kiguchi and Hayashi, <xref ref-type="bibr" rid="B13">2012</xref>), rehabilitation (Rehab-Robotics Company Limited, <xref ref-type="bibr" rid="B28">2017</xref>) and prosthetic systems (Atzori et al., <xref ref-type="bibr" rid="B2">2014</xref>). Nonetheless, EMG controlled systems have still not reached acceptable, consistent performance, as indicated by Farina et al. (<xref ref-type="bibr" rid="B8">2014</xref>). One of the main inconsistency factors is that the electric potential detected on the surface of the skin as a result of muscle contraction can be affected by skin impedance, while adipose tissue can induce crosstalk. Despite the acknowledged limitations of EMG devices, the electromyography device market is expected to grow in the years to come (Technavio, <xref ref-type="bibr" rid="B33">2018</xref>). This shows that there is a demand for understanding and measuring muscle activity that can be incorporated into motion intent recognition systems and integrated into wearable rehabilitation devices.</p>
<p>More recent works report results of integrating EMG sensors with other means of sensing, such as force sensing (Guo et al., <xref ref-type="bibr" rid="B10">2015</xref>; McIntosh et al., <xref ref-type="bibr" rid="B21">2016</xref>). Motion intent recognition studies have observed the mechanical signals produced as a result of muscle contraction (Yap et al., <xref ref-type="bibr" rid="B44">2016</xref>). Two different approaches have been implemented: MMG and FMG. MMG detects low frequency muscle vibrations and their velocity and intensity, usually through the use of accelerometers (Islam et al., <xref ref-type="bibr" rid="B12">2013</xref>). FMG observes muscle architectural changes during contraction that can be monitored through force or stiffness changes on the skin surface (Phillips and Craelius, <xref ref-type="bibr" rid="B25">2017</xref>). A plethora of research on the use of MMG considers the characterization of muscle activity and fatigue as well as the diagnosis of neuromuscular disorders (Islam et al., <xref ref-type="bibr" rid="B12">2013</xref>), sometimes in combination with sEMG (Tarata, <xref ref-type="bibr" rid="B32">2009</xref>). More recent research has focused on using MMG as a control input for prostheses and medical rehabilitation devices (Ding et al., <xref ref-type="bibr" rid="B6">2017a</xref>,<xref ref-type="bibr" rid="B7">b</xref>). In Ding et al. (<xref ref-type="bibr" rid="B7">2017b</xref>), the MMG-based system, which used an inertial measurement unit with an accelerometer and a gyroscope, achieved 94% accuracy in distinguishing which finger performed tapping motions.</p>
<p>The idea that the volumetric and shape changes that take place inside the muscle can be monitored on the skin surface and used as an indication of motion intent was first captured by Moromugi et al. (<xref ref-type="bibr" rid="B22">2004</xref>). They implemented push buttons with load sensors indented into the skin to capture &#x0201C;muscle stiffness&#x0201D; for the purpose of actuating a prosthetic hand. Wininger et al. (<xref ref-type="bibr" rid="B39">2008</xref>) performed one of the first studies implementing force sensitive resistor (FSR) sensing to predict grip force in hand prostheses. Using a grip dynamometer, they mapped the readings measured during gripping to the pressure exerted by the forearm on the force measuring cuff. After testing the concept on healthy young adults, they concluded that this is a useful alternative to EMG. Furthermore, a high-resolution tactile sensor system developed by Sch&#x000FC;rmann was used in a proof-of-concept study to create tactile images of the anterior forearm (Castellini and Koiva, <xref ref-type="bibr" rid="B3">2013</xref>). It was later shown that pressure sensing is not only a cheaper alternative to sEMG but also provides better measurement consistency (Ravindra and Castellini, <xref ref-type="bibr" rid="B27">2014</xref>). The same sensory technology was embedded and tested in a tactile sensor bracelet (Koiva et al., <xref ref-type="bibr" rid="B16">2015</xref>). A feasibility study on FMG technology performed by Cho et al. (<xref ref-type="bibr" rid="B4">2016</xref>) resulted in a classification accuracy of over 70%, while Xiao and Menon (<xref ref-type="bibr" rid="B43">2017</xref>) demonstrated its robustness during on-the-fly verification. However, to the authors&#x00027; knowledge, the majority of such systems have been tested under high muscle engagement conditions and have not demonstrated the ability to differentiate the subtle variations involved in different hand poses or in low-strength gripping.
Furthermore, existing motion intent recognition systems rely heavily on machine learning (Yu and Lee, <xref ref-type="bibr" rid="B45">2015</xref>) for data classification, which lacks transparency in the decision-making process; consequently, the control of these wearable devices can compromise user safety. This paper aims to fill the identified gaps in the body of knowledge toward further development of wearable, upper-limb, stroke rehabilitation devices.</p>
<p>The design and testing of a tactile arm brace (TAB) were reported in Stefanou et al. (<xref ref-type="bibr" rid="B31">2018b</xref>), where common tactile patterns were identified with healthy participants. The questions we set out to answer in this paper include:
<list list-type="bullet">
<list-item><p>What is the performance of tactile sensing as a means of understanding motion intent under low strength conditions? How does it compare with the current approaches?</p></list-item>
<list-item><p>How do tactile features relate to the muscle physiology? Are these features common within a population? Could a more transparent decision making system be developed?</p></list-item>
</list></p>
<p>Section 2 gives an overview of our user study and describes the grip types used and their muscle synergy analyses. The results of this study are presented in section 3 which compares the forearm&#x00027;s tactile signatures to its expected physiological states and employs machine learning techniques to classify the state of the hand. Section 4 compares the TAB performance to the Myo armband (Thalmic Labs, <xref ref-type="bibr" rid="B36">2015c</xref>) based on sEMG.</p></sec>
<sec sec-type="methods" id="s2">
<title>2. Methodology</title>
<p>One of the main aims of this study was to establish the potential of the TAB in distinguishing between different hand motions produced by different combinations of muscle engagement. This section introduces the experimental set-up and procedure of this participant study, followed by analyses of the muscle contractions associated with the grips performed.</p>
<sec>
<title>2.1. The Experimental Set-Up</title>
<p>The TAB is a low-cost sensorised arm brace (<xref ref-type="fig" rid="F1">Figure 1</xref>). It consists of an armband fitted with eight force sensitive resistive (FSR) sensors uniformly distributed around the user&#x00027;s forearm. Its purpose is to monitor the normal interaction forces, as detailed in Stefanou et al. (<xref ref-type="bibr" rid="B31">2018b</xref>), capturing tactile signatures of the forearm.</p>
<fig id="F1" position="float">
<label>Figure 1</label>
<caption><p>The experimental set-up included the tactile arm brace (TAB), the gripping device and a forearm support. An Arduino MEGA with a custom-made shield was used to capture the sensor data and transfer it to MATLAB in real-time.</p></caption>
<graphic xlink:href="frobt-06-00124-g0001.tif"/>
</fig>
<p>A gripping device was developed for these experiments. It comprises two load cells under a handle (<xref ref-type="fig" rid="F1">Figure 1</xref>) (Stefanou et al., <xref ref-type="bibr" rid="B30">2018a</xref>) and has a resolution of 0.27 N and a sensitivity of 0.17 N. An average error of 1.79% was calculated in the range of 0 N to 9.81 N. The TAB and gripping device data were acquired using an Arduino MEGA on which a custom-made board was mounted to provide 16-bit analog inputs at a sampling rate of 89 Hz. The TAB FSR sensors were calibrated prior to the experiments, with the force on each TAB sensor recorded as a function of time. The two load cells were also calibrated using a set of known weights. The communication between the Arduino, where all sensor data were captured, and the computer was synchronous.</p></sec>
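The calibration step above, which maps each sensor's raw 16-bit analog reading to a force using a set of known weights, can be sketched as follows. This is a minimal illustration, not code from the study; the calibration points (`ADC_COUNTS`, `FORCES_N`) are invented for the example.

```python
import numpy as np

# Hypothetical calibration pairs for one FSR channel: raw 16-bit ADC
# counts recorded while loading the sensor with known weights.
ADC_COUNTS = np.array([120, 5100, 11800, 21500, 39000])
FORCES_N   = np.array([0.0, 0.98, 2.45, 4.9, 9.81])

def adc_to_force(raw_count):
    """Map a raw ADC reading to force in newtons by piecewise-linear
    interpolation between the calibration points (clipped at the ends)."""
    return float(np.interp(raw_count, ADC_COUNTS, FORCES_N))
```

A lookup of this kind would be applied per channel, since each FSR has its own response curve.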
<sec>
<title>2.2. Participant Study</title>
<p>Experiments were performed with 20 healthy participants, 10 male and 10 female. Ethics approval from the University of the West of England Ethics Committee was acquired as well as informed written consent from all participants. The TAB sensing surfaces were placed three quarters of the way up the length of the forearm.</p>
<sec>
<title>2.2.1. Experimental Procedure</title>
<p>Four different types of static grips were chosen (<xref ref-type="fig" rid="F2">Figure 2</xref>), representative of grips typically used in activities of daily living and selected for the feasibility and comfort of the gripping device. All four were prismatic grips: three precision grips, with two (<xref ref-type="fig" rid="F2">Figure 2A</xref>), three (<xref ref-type="fig" rid="F2">Figure 2B</xref>) and five fingers (<xref ref-type="fig" rid="F2">Figure 2C</xref>), and a power grip using all five fingers (<xref ref-type="fig" rid="F2">Figure 2D</xref>). The forearm was placed on the support in the supination position as shown in <xref ref-type="fig" rid="F1">Figure 1</xref>.</p>
<fig id="F2" position="float">
<label>Figure 2</label>
<caption><p>Photographs of the four grip configurations used with the gripping device. During the experiments the forearm always rested on the arm support, shown in the supination position. <bold>(A)</bold> <italic>Grip1</italic>, the prismatic precision grip with two fingers. <bold>(B)</bold> <italic>Grip2</italic>, the prismatic precision grip with three fingers. <bold>(C)</bold> <italic>Grip3</italic>, the prismatic precision grip with five fingers. <bold>(D)</bold> <italic>Grip4</italic>, the prismatic power grip with five fingers.</p></caption>
<graphic xlink:href="frobt-06-00124-g0002.tif"/>
</fig>
<p>The participants were given visual instructions on a monitor and verbal guidance during trials. Their seat height was adjusted so that they could comfortably place their forearm on the 3D-printed support shown in <xref ref-type="fig" rid="F2">Figure 2</xref>. The experiment was repeated four times, once for each grip type, with each grip performed five times. To create variability in the sensor waveform profiles, the participants were instructed when to start gripping but were free to choose how quickly to grip and when, and how quickly, to let go. The instructions were given as follows:
<list list-type="bullet">
<list-item><p><italic>GRIP</italic> (<italic>t</italic>); initiate gripping</p></list-item>
<list-item><p><italic>PREPARE</italic> (<italic>t&#x0002B;7</italic>); warning to release the grip, if they have not done so already</p></list-item>
<list-item><p><italic>3s</italic> allowed for rest</p></list-item>
<list-item><p><italic>GRIP</italic> (<italic>t&#x0002B;10</italic>).</p></list-item>
</list></p></sec></sec>
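The cue timing above can be encoded as a simple schedule. The helper `cue_schedule` is hypothetical, written only to make the timing concrete: a GRIP cue at t, a PREPARE cue at t + 7 s, 3 s of rest, and the next GRIP at t + 10 s, repeated for each of the five repetitions.

```python
def cue_schedule(n_reps=5, cycle_s=10.0, prepare_offset_s=7.0):
    """Build the (time, cue) list for one grip type: GRIP at t,
    PREPARE at t + 7 s (then 3 s rest), next GRIP at t + 10 s."""
    cues = []
    for rep in range(n_reps):
        t0 = rep * cycle_s
        cues.append((t0, "GRIP"))
        cues.append((t0 + prepare_offset_s, "PREPARE"))
    return cues
```

One run of the experiment would iterate this schedule once per grip type.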
<sec>
<title>2.3. Grips and Muscle Synergy</title>
<p>This section analyses the four grip types shown in <xref ref-type="fig" rid="F2">Figure 2</xref>. The activity of the forearm muscles during each grip type determines the tactile signatures expected to be recorded by the TAB. The contribution of each forearm muscle to each grip type was determined based on its anatomical and physiological parameters. The hand configuration, the placement of the fingers, the wrist angle and the supination/pronation of the forearm were taken into account when analysing the TAB sensor readings and the features that can distinguish the <italic>gripping</italic> and <italic>relaxed</italic> states and the four grip types.</p>
<p>Previous studies have indicated that individual finger force contributions during 5-finger precision (trapezoid) gripping are proportional to each finger&#x00027;s strength (Radwin et al., <xref ref-type="bibr" rid="B26">1992</xref>). The first three gripping configurations used are precision grips. The fourth grip type used, <xref ref-type="fig" rid="F2">Figure 2D</xref>, is a power grip, with the thumb adducted. All four grip types have a prismatic shape and they were all performed with the forearm in the supination position resting on a 3D-printed support. The grips were selected according to the detailed taxonomy of human grasps presented by Feix et al. (<xref ref-type="bibr" rid="B9">2016</xref>). <xref ref-type="table" rid="T1">Table 1</xref> details these four grips and the percentage force contribution of each finger as found in the studies by Radwin et al. (<xref ref-type="bibr" rid="B26">1992</xref>) and Kinoshita et al. (<xref ref-type="bibr" rid="B15">1996</xref>)<xref ref-type="fn" rid="fn0001"><sup>1</sup></xref>. The grip force limits used during the experiment for each grip type are presented in <xref ref-type="table" rid="T1">Table 1</xref>.</p>
<table-wrap position="float" id="T1">
<label>Table 1</label>
<caption><p>The four grips performed during the experiment, the respective forces achieved with each one and the individual digit contributions (Radwin et al., <xref ref-type="bibr" rid="B26">1992</xref>; Kinoshita et al., <xref ref-type="bibr" rid="B14">1995</xref>, <xref ref-type="bibr" rid="B15">1996</xref>), as well as the maximum force limits the participants in this study were instructed to not exceed.</p></caption>
<table frame="hsides" rules="groups">
<thead><tr>
<th/>
<th/>
<th valign="top" align="center" colspan="4" style="border-bottom: thin solid #000000;"><bold>Contribution (%)</bold></th>
<th/>
<th/>
<th valign="top" align="center" colspan="2" style="border-bottom: thin solid #000000;"><bold>Trials: max force</bold></th>
</tr>
<tr>
<th valign="top" align="left"><bold>GRIP</bold></th>
<th valign="top" align="left"><bold>Grip type</bold></th>
<th valign="top" align="center"><bold>I</bold></th>
<th valign="top" align="left"><bold>M</bold></th>
<th valign="top" align="left"><bold>R</bold></th>
<th valign="top" align="left"><bold>L</bold></th>
<th valign="top" align="left"><bold>Thumb position</bold></th>
<th valign="top" align="left"><bold>Strength max. (N)</bold></th>
<th valign="top" align="left"><bold>Force</bold></th>
<th valign="top" align="center"><bold>% of strength</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left"><italic>Grip1</italic></td>
<td valign="top" align="left">Precision, prismatic</td>
<td valign="top" align="center">100</td>
<td valign="top" align="left">N/A</td>
<td valign="top" align="left">N/A</td>
<td valign="top" align="left">N/A</td>
<td valign="top" align="left">Abducted</td>
<td valign="top" align="left">152.2</td>
<td valign="top" align="left">0.5 kg/4.9 N</td>
<td valign="top" align="center">3.2</td>
</tr>
<tr>
<td valign="top" align="left"><italic>Grip2</italic></td>
<td valign="top" align="left">Precision, prismatic</td>
<td valign="top" align="center">43</td>
<td valign="top" align="left">57</td>
<td valign="top" align="left">N/A</td>
<td valign="top" align="left">N/A</td>
<td valign="top" align="left">Abducted</td>
<td valign="top" align="left">121.8</td>
<td valign="top" align="left">1.0 kg/9.8 N</td>
<td valign="top" align="center">8.0</td>
</tr>
<tr>
<td valign="top" align="left"><italic>Grip3</italic></td>
<td valign="top" align="left">Precision, prismatic</td>
<td valign="top" align="center">35</td>
<td valign="top" align="left">26</td>
<td valign="top" align="left">20</td>
<td valign="top" align="left">19</td>
<td valign="top" align="left">Abducted</td>
<td valign="top" align="left">100</td>
<td valign="top" align="left">1.5 kg/14.7 N</td>
<td valign="top" align="center">15</td>
</tr>
<tr>
<td valign="top" align="left"><italic>Grip4</italic></td>
<td valign="top" align="left">Power, prismatic</td>
<td valign="top" align="center">25</td>
<td valign="top" align="left">35</td>
<td valign="top" align="left">25</td>
<td valign="top" align="left">14</td>
<td valign="top" align="left">Adducted</td>
<td valign="top" align="left">402</td>
<td valign="top" align="left">2.0 kg/19.6 N</td>
<td valign="top" align="center">4.9</td>
</tr>
</tbody>
</table>
</table-wrap>
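The contribution percentages in Table 1 imply an expected per-finger force split for any measured total grip force. The sketch below encodes those tabulated percentages; the helper `finger_forces` is an illustrative construct, not part of the study's analysis pipeline.

```python
# Per-finger contribution percentages from Table 1
# (Radwin et al., 1992; Kinoshita et al., 1996).
CONTRIBUTIONS = {
    "Grip1": {"I": 100},
    "Grip2": {"I": 43, "M": 57},
    "Grip3": {"I": 35, "M": 26, "R": 20, "L": 19},
    "Grip4": {"I": 25, "M": 35, "R": 25, "L": 14},
}

def finger_forces(grip, total_force_n):
    """Split a total grip force (N) into expected per-finger forces
    according to the tabulated contribution percentages."""
    return {finger: total_force_n * pct / 100.0
            for finger, pct in CONTRIBUTIONS[grip].items()}
```

For example, at the 9.8 N limit used for <italic>Grip2</italic>, the index finger would be expected to contribute about 4.2 N and the middle finger about 5.6 N.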
<p>The pose and involvement of each hand digit and the wrist in each grip type determine the expected magnitude of the forearm muscle contractions. The contribution of each muscle and its proximity to individual TAB sensors, together with any damping effects caused by the soft tissue, determine the expected tactile signature of the forearm. A detailed diagram of the forearm cross-section anatomy, just below the elbow, was used to associate each TAB sensor with the muscles in its proximity (<xref ref-type="fig" rid="F3">Figure 3</xref>, each sensor annotated as S1-S8). The digit flexor muscles, the flexor digitorum superficialis (FDS) and the flexor digitorum profundus (FDP) (Nordin and Frankel, <xref ref-type="bibr" rid="B24">2001</xref>), flex the four fingers, and the diagram provides details of their correspondences to the TAB sensors. The cross-section area of each muscle shown in <xref ref-type="fig" rid="F3">Figure 3</xref> correlates directly with its strength capacity (Cutts et al., <xref ref-type="bibr" rid="B5">1991</xref>). The muscle compartments that control the middle finger joints are expected to produce the highest forces, followed by those that control the index and ring fingers. This is important as each of the four parts of the FDP and FDS muscles has a different proximity to the TAB sensors and is thus expected to produce a different tactile signature. The interosseous membrane (Skahen et al., <xref ref-type="bibr" rid="B29">1997</xref>) that divides the forearm into the anterior and posterior compartments is tensed when the forearm is in the supination position (McGinley and Kozin, <xref ref-type="bibr" rid="B19">2001</xref>). This leads to effective damping that confines muscle contraction effects (particularly in the inner forearm layer) to the compartment of their origin. This especially affects how non-superficial muscle contractions register on the TAB sensors.</p>
<fig id="F3" position="float">
<label>Figure 3</label>
<caption><p>These histograms present the force distributions detected by each TAB sensor when performing <italic>Grip1</italic>, the 2-finger precision grip. The blue histograms present the TAB forces when the hand is <italic>relaxed</italic> and the orange when it&#x00027;s <italic>gripping</italic>. The approximate sensor locations are also indicated.</p></caption>
<graphic xlink:href="frobt-06-00124-g0003.tif"/>
</fig>
<sec>
<title>2.3.1. Individual Grip Type Analysis</title>
<p>During <italic>Grip4</italic> (<xref ref-type="fig" rid="F2">Figure 2D</xref>), the thumb is adducted, which means that the extensor pollicis longus muscle is active. This does not happen with the three precision grips. The digit flexors show high activity; in order from highest to lowest, these are the FDS, the FDP and the flexor pollicis longus (FPL). Furthermore, the wrist extensor muscles engage to a smaller extent as they contract to tighten the digit flexor tendons. Given the nature of the grip and the wrist orientation, the ulnar wrist extensor is expected to play a bigger role during <italic>Grip4</italic> than the radial wrist extensor. In contrast to <italic>Grip4</italic> (the power grip), in <italic>Grip1</italic> (<xref ref-type="fig" rid="F2">Figure 2A</xref>), <italic>Grip2</italic> (<xref ref-type="fig" rid="F2">Figure 2B</xref>) and <italic>Grip3</italic> (<xref ref-type="fig" rid="F2">Figure 2C</xref>) the thumb is abducted and the ulnar deviation of the wrist is not as prominent. The hand configuration in <italic>Grip4</italic> engages the FDS more than in the other grips as there is a higher PIP (proximal interphalangeal) joint flexion (Nordin and Frankel, <xref ref-type="bibr" rid="B24">2001</xref>). DIP (distal interphalangeal) joint flexion, however, is smaller in this configuration than in the others, implying lower FDP engagement. Therefore, differences between the two are expected to be seen on the ventral radial part of the forearm. During <italic>Grip1</italic> and <italic>Grip2</italic> only some parts of the FDS and FDP muscles are active, flexing the index and middle fingers. The middle finger has a higher force contribution than the index in <italic>Grip2</italic> (<xref ref-type="table" rid="T1">Table 1</xref>), the 3-finger precision grip. It is therefore assumed that the parts of the muscles responsible for the actuation of the middle finger produce a greater contraction and hence a larger force on the TAB sensors in their proximity. The thumb abduction, actuated by the abductor pollicis longus (APL), is expected to be greater in <italic>Grip3</italic> than <italic>Grip2</italic> in order to position the thumb in opposition to both the index and middle fingers.</p>
<p>The expected sensor responses for each grip type are presented in <xref ref-type="table" rid="T2">Table 2</xref>; they are classified as Low/Medium/High.</p>
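Qualitative Low/Medium/High expectations of this kind can feed a transparent, rule-based check of the sensor readings, in line with the paper's aim of a more interpretable decision-making system. The sketch below is purely illustrative: the force ranges assigned to each level are assumptions, not values from the study.

```python
# Illustrative force ranges (N) assumed for each qualitative level;
# the actual thresholds are not specified in the paper.
LEVELS = {
    "Low":        (0.0, 0.5),
    "Low-Medium": (0.2, 1.0),
    "Medium":     (0.5, 2.0),
    "High":       (1.5, 5.0),
}

def agreement(readings, expected):
    """Fraction of sensors whose measured force change falls inside the
    range implied by its expected Low/Medium/High label."""
    hits = sum(LEVELS[label][0] <= readings[sensor] <= LEVELS[label][1]
               for sensor, label in expected.items())
    return hits / len(expected)
```

A grip hypothesis with the highest agreement score could then be reported together with the per-sensor rules that supported it, making the decision traceable.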

<table-wrap position="float" id="T2">
<label>Table 2</label>
<caption><p>Expected muscle contraction and TAB sensor responses for the four different grips, <italic>Grip1, Grip2, Grip3</italic>, and <italic>Grip4</italic> (<xref ref-type="fig" rid="F2">Figure 2</xref>).</p></caption>
<table frame="hsides" rules="groups">
<thead><tr>
<th/>
<th valign="top" align="center" colspan="2" style="border-right: thin solid #000000; border-left: thin solid #000000; border-bottom: thin solid #000000;"><bold>Grip1</bold></th>
<th valign="top" align="center" colspan="2" style="border-bottom: thin solid #000000;"><bold>Grip2</bold></th>
</tr>
<tr>
<th valign="top" align="left" style="border-right: thin solid #000000;"><bold>TAB sensor</bold></th>
<th valign="top" align="left"><bold>Muscle activation</bold></th>
<th valign="top" align="left" style="border-right: thin solid #000000;"><bold>Expected change</bold></th>
<th valign="top" align="left"><bold>Muscle activation</bold></th>
<th valign="top" align="left"><bold>Expected change</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">S1</td>
<td valign="top" align="left" style="border-left: thin solid #000000;">FDS(I),FPL</td>
<td valign="top" align="left" style="border-right: thin solid #000000;">Medium</td>
<td valign="top" align="left">FDS(I,M),FPL</td>
<td valign="top" align="left">Medium</td>
</tr>
<tr>
<td valign="top" align="left">S2</td>
<td valign="top" align="left" style="border-left: thin solid #000000;">ECR-longus, FPL</td>
<td valign="top" align="left" style="border-right: thin solid #000000;">Low-Medium</td>
<td valign="top" align="left">ECR-longus, FPL</td>
<td valign="top" align="left">Low-Medium</td>
</tr>
<tr>
<td valign="top" align="left">S3</td>
<td valign="top" align="left" style="border-left: thin solid #000000;">ECR-brevis</td>
<td valign="top" align="left" style="border-right: thin solid #000000;">Medium</td>
<td valign="top" align="left">ECR-brevis</td>
<td valign="top" align="left">Medium</td>
</tr>
<tr>
<td valign="top" align="left">S4</td>
<td valign="top" align="left" style="border-left: thin solid #000000;">APL</td>
<td valign="top" align="left" style="border-right: thin solid #000000;">Low-Medium</td>
<td valign="top" align="left">APL</td>
<td valign="top" align="left">Low-Medium</td>
</tr>
<tr>
<td valign="top" align="left">S5</td>
<td valign="top" align="left" style="border-left: thin solid #000000;">ECU</td>
<td valign="top" align="left" style="border-right: thin solid #000000;">Medium</td>
<td valign="top" align="left">ECU</td>
<td valign="top" align="left">Medium</td>
</tr>
<tr>
<td valign="top" align="left">S6</td>
<td valign="top" align="left" style="border-left: thin solid #000000;">ECU</td>
<td valign="top" align="left" style="border-right: thin solid #000000;">Low</td>
<td valign="top" align="left">FDP(M)</td>
<td valign="top" align="left">Low</td>
</tr>
<tr>
<td valign="top" align="left">S7</td>
<td valign="top" align="left" style="border-left: thin solid #000000;">FDP(I)</td>
<td valign="top" align="left" style="border-right: thin solid #000000;">Low</td>
<td valign="top" align="left">FDP(<underline>I</underline>,<underline>M</underline>)</td>
<td valign="top" align="left">Low-Medium</td>
</tr>
<tr style="border-bottom: thin solid #000000;">
<td valign="top" align="left">S8</td>
<td valign="top" align="left" style="border-left: thin solid #000000;">FDS(I)</td>
<td valign="top" align="left" style="border-right: thin solid #000000;">Medium</td>
<td valign="top" align="left">FDS(I,<underline>M</underline>)</td>
<td valign="top" align="left">High</td>
</tr> <tr>
<td/>
<td valign="top" align="center" colspan="2" style="border-left: thin solid #000000; border-right: thin solid #000000; border-bottom: thin solid #000000;"><bold>Grip3</bold></td>
<td valign="top" align="center" colspan="2" style="border-right: thin solid #000000;  border-bottom: thin solid #000000;"><bold>Grip4</bold></td>
</tr>
<tr style="border-bottom: thin solid #000000;">
<td valign="top" align="left"><bold>TAB sensor</bold></td>
<td valign="top" align="left" style="border-left: thin solid #000000;"><bold>Muscle activation</bold></td>
<td valign="top" align="left" style="border-right: thin solid #000000;"><bold>Expected change</bold></td>
<td valign="top" align="left"><bold>Muscle activation</bold></td>
<td valign="top" align="left"><bold>Expected change</bold></td>
</tr> <tr>
<td valign="top" align="left">S1</td>
<td valign="top" align="left" style="border-left: thin solid #000000;">FDS(I,M),FPL</td>
<td valign="top" align="left" style="border-right: thin solid #000000;">Medium</td>
<td valign="top" align="left">FDS(I,M),FPL</td>
<td valign="top" align="left">Low-Medium</td>
</tr>
<tr>
<td valign="top" align="left">S2</td>
<td valign="top" align="left" style="border-left: thin solid #000000;">ECR-longus, FPL</td>
<td valign="top" align="left" style="border-right: thin solid #000000;">Medium</td>
<td valign="top" align="left">ECR-longus, FPL</td>
<td valign="top" align="left">Low-Medium</td>
</tr>
<tr>
<td valign="top" align="left">S3</td>
<td valign="top" align="left" style="border-left: thin solid #000000;">ECR-brevis</td>
<td valign="top" align="left" style="border-right: thin solid #000000;">Medium</td>
<td valign="top" align="left">ECR-brevis</td>
<td valign="top" align="left">Medium</td>
</tr>
<tr>
<td valign="top" align="left">S4</td>
<td valign="top" align="left" style="border-left: thin solid #000000;">APL</td>
<td valign="top" align="left" style="border-right: thin solid #000000;">Low-Medium</td>
<td valign="top" align="left">APL</td>
<td valign="top" align="left">Low</td>
</tr>
<tr>
<td valign="top" align="left">S5</td>
<td valign="top" align="left" style="border-left: thin solid #000000;">ECU</td>
<td valign="top" align="left" style="border-right: thin solid #000000;">Medium</td>
<td valign="top" align="left">ECU</td>
<td valign="top" align="left">Low</td>
</tr>
<tr>
<td valign="top" align="left">S6</td>
<td valign="top" align="left" style="border-left: thin solid #000000;">FDP(M,R,L)</td>
<td valign="top" align="left" style="border-right: thin solid #000000;">Medium</td>
<td valign="top" align="left">FDP(R,L)</td>
<td valign="top" align="left">High</td>
</tr>
<tr>
<td valign="top" align="left">S7</td>
<td valign="top" align="left" style="border-left: thin solid #000000;">FDP(M,R), FDS(L)</td>
<td valign="top" align="left" style="border-right: thin solid #000000;">Medium-High</td>
<td valign="top" align="left">FDP(M,R), FDS(L)</td>
<td valign="top" align="left">Medium-High</td>
</tr>
<tr>
<td valign="top" align="left">S8</td>
<td valign="top" align="left" style="border-left: thin solid #000000;">FDP(I,M), FDS(I,M,R,L)</td>
<td valign="top" align="left" style="border-right: thin solid #000000;">Medium-High</td>
<td valign="top" align="left">FDP(I,M), FDS(I,M,R,L)</td>
<td valign="top" align="left">High</td>
</tr>
</tbody>
</table>
</table-wrap>


</sec></sec>
<sec>
<title>2.4. Data Labeling</title>
<p>To label the data with the state of the hand for each grip type, <italic>gripping</italic> or <italic>relaxed</italic>, the onsets and terminations of gripping were determined from the gripping force measurements. An algorithm was developed to detect gripping instances by checking whether the grip force threshold of 0.49 N had been exceeded and whether any grip/release events had occurred nearby in time. A release event is identified only after a grip event and when the grip force is &#x0003C;0.69 N. A further criterion for confirming a release event is whether, at the midpoint between the candidate release point and the corresponding grip point, the grip force exceeds a certain threshold (which varies between the grip types). All thresholds used in the algorithm were based on the sensitivity and accuracy of the gripping device and were determined by trial and error. Each data point was labeled with a number indicating the configuration of the hand (one of the four grip types presented in <xref ref-type="fig" rid="F2">Figure 2</xref>) and whether the hand was <italic>gripping</italic> or <italic>relaxed</italic>.</p></sec></sec>
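The thresholding procedure described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the 0.49 N onset and 0.69 N release thresholds come from the text, while `mid_threshold` stands in for the unspecified per-grip midpoint threshold.

```python
import numpy as np

GRIP_ON = 0.49   # N, grip onset threshold (from the text)
GRIP_OFF = 0.69  # N, release threshold, checked only after a grip event

def label_gripping(grip_force, mid_threshold=0.6):
    """Return a boolean array: True where the hand is labeled as gripping.

    mid_threshold is a hypothetical stand-in for the per-grip-type
    midpoint threshold, whose exact values are not given in the text.
    """
    labels = np.zeros(len(grip_force), dtype=bool)
    grip_start = None
    for i, f in enumerate(grip_force):
        if grip_start is None:
            if f > GRIP_ON:                      # grip onset detected
                grip_start = i
        elif f < GRIP_OFF:                       # candidate release point
            mid = (grip_start + i) // 2
            # confirm the release only if the force at the midpoint
            # between grip onset and candidate release was high enough
            if grip_force[mid] > mid_threshold:
                labels[grip_start:i] = True
                grip_start = None
    if grip_start is not None:                   # grip continues to end of trial
        labels[grip_start:] = True
    return labels
```

The midpoint check guards against spurious releases immediately after onset, where the force may still sit between the two thresholds.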
<sec id="s3">
<title>3. Results and Analysis</title>
<p>The TAB sensor readings were scaled across all grip types using min-max normalization between 0 and 1, for each participant individually. Even when the hand adopts each of the four finger configurations without exerting any force on the gripping device, the tactile signature on the TAB changes. However, the tactile signatures vary across participants, i.e., the <italic>relaxed</italic> signature of one user is different from the <italic>relaxed</italic> signature of another. This is due to the participants&#x00027; different forearm sizes and shapes, TAB placement and fit, as well as their muscle-to-adipose tissue ratios. All these factors that affect the signature on the TAB are, as would be expected, especially evident when the hand is in one of the <italic>relaxed</italic> states. Not only does the force magnitude of the detected changes vary, but the prominence of some force changes at certain parts of the forearm is also higher than at others in different participants across hand configurations.</p>
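Per-participant min-max scaling of this kind can be sketched as below; the function and argument names are ours, not from the paper.

```python
import numpy as np

def minmax_per_participant(readings, participant_ids):
    """Scale TAB sensor readings to [0, 1] separately for each participant.

    readings        -- (n_samples, n_sensors) raw sensor array
    participant_ids -- length-n_samples array of participant labels
    """
    scaled = np.empty_like(readings, dtype=float)
    for pid in np.unique(participant_ids):
        mask = participant_ids == pid
        block = readings[mask]
        lo = block.min(axis=0)
        rng = block.max(axis=0) - lo
        rng[rng == 0] = 1.0            # guard against constant channels
        scaled[mask] = (block - lo) / rng
    return scaled
```

Scaling each participant separately removes the baseline offsets caused by forearm size and TAB fit, while preserving each user's relative force patterns.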
<sec>
<title>3.1. Tactile Signatures&#x02014;A First Glance</title>
<p>The force measurements around the forearm in the <italic>gripping</italic> and <italic>relaxed</italic> states for all participants are visualized in <xref ref-type="fig" rid="F3">Figures 3</xref>&#x02013;<xref ref-type="fig" rid="F6">6</xref>, one for each grip type. At first glance, it is obvious that the data are not normally distributed and that generalizing for all participants may not be feasible. In order to further break down the data distribution and gain a better understanding of the results, bivariate histograms were produced (<xref ref-type="fig" rid="F7">Figure 7</xref>). These visualize the frequency density of the individual sensor readings with respect to the grip force for each grip type.</p>
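A bivariate (frequency-density) histogram of sensor reading against grip force, as used for Figure 7, can be built with `numpy.histogram2d`. The data and the bin count below are illustrative stand-ins, not the values used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
sensor = rng.uniform(0, 1, 1000)      # stand-in for scaled S6 readings
grip_force = rng.uniform(0, 1, 1000)  # stand-in for scaled grip force

# density=True normalizes the counts so the histogram integrates to 1,
# giving the frequency density visualized in Figure 7
hist, s_edges, f_edges = np.histogram2d(
    sensor, grip_force, bins=20, range=[[0, 1], [0, 1]], density=True)
```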
<fig id="F4" position="float">
<label>Figure 4</label>
<caption><p>These histograms present the force distributions detected by each TAB sensor when performing <italic>Grip2</italic>, the 3-finger precision grip. The blue histograms present the TAB forces when the hand is <italic>relaxed</italic> and the orange when it&#x00027;s <italic>gripping</italic>. The approximate sensor locations are also indicated.</p></caption>
<graphic xlink:href="frobt-06-00124-g0004.tif"/>
</fig>
<fig id="F5" position="float">
<label>Figure 5</label>
<caption><p>These histograms present the force distributions detected by each TAB sensor when performing <italic>Grip3</italic>, the 5-finger precision grip. The blue histograms present the TAB forces when the hand is <italic>relaxed</italic> and the orange when it&#x00027;s <italic>gripping</italic>. The approximate sensor locations are also indicated.</p></caption>
<graphic xlink:href="frobt-06-00124-g0005.tif"/>
</fig>
<fig id="F6" position="float">
<label>Figure 6</label>
<caption><p>These histograms present the force distributions detected by each TAB sensor when performing <italic>Grip4</italic>, the 5-finger power grip. The blue histograms present the TAB forces when the hand is <italic>relaxed</italic> and the orange when it&#x00027;s <italic>gripping</italic>. The approximate sensor locations are also indicated.</p></caption>
<graphic xlink:href="frobt-06-00124-g0006.tif"/>
</fig>
<fig id="F7" position="float">
<label>Figure 7</label>
<caption><p>These are the deconstruction of a single histogram from <xref ref-type="fig" rid="F5">Figure 5</xref>. They present the force distribution of the <italic>S6</italic> data against the gripping force recorded during <italic>Grip3</italic> when the hand is <bold>(A)</bold> <italic>relaxed</italic> and <bold>(B)</bold> <italic>gripping</italic>.</p></caption>
<graphic xlink:href="frobt-06-00124-g0007.tif"/>
</fig>
<p><xref ref-type="fig" rid="F3">Figures 3</xref>&#x02013;<xref ref-type="fig" rid="F6">6</xref> show how the muscles engage during each grip type (indicated by red circles) and the response of each of the TAB sensors when the hand is <italic>gripping</italic> (orange histograms) and <italic>relaxed</italic> (blue histograms). The force frequency distribution of some sensors is significantly different in the <italic>relaxed</italic> and <italic>gripping</italic> states for particular grip types. For example, there is a clear distinction between the <italic>relaxed</italic> and <italic>gripping</italic> force distributions of <italic>S6</italic> for <italic>Grip2, Grip3</italic>, and <italic>Grip4</italic>. A very strong bi-modal distribution is visible for <italic>Grip4</italic> (<xref ref-type="fig" rid="F6">Figure 6</xref>), while a clear shift of the median can be seen in <italic>Grip3</italic> (<xref ref-type="fig" rid="F5">Figure 5</xref>). <italic>Grip1</italic> and <italic>Grip2</italic>, which differ only in the use of the middle finger, have very similar tactile signatures when <italic>gripping</italic>, with the exception of the TAB sensors situated on the volar/ulnar side of the forearm, i.e., <italic>S7</italic> and <italic>S8</italic>. Their distributions have longer tails and higher frequencies around 0.8, which can be explained by the parts of the FDS and FDP muscles that flex the middle finger (annotated with &#x0201C;M&#x0201D; in the forearm cross-section diagrams) being located in the proximity of those two sensors. This is one example demonstrating that, despite variations across participants, the results agree with the physiological analysis performed and the expected tactile signatures tabulated in <xref ref-type="table" rid="T2">Table 2</xref>. It was expected that <italic>Grip1</italic> and <italic>Grip2</italic> would cause similar forces on all TAB sensors with the exception of <italic>S7</italic> and <italic>S8</italic>. 
Comparisons with <italic>Grip3</italic>, where the remaining parts of the FDP and FDS muscles, located in the inner/ulnar part of the forearm, are engaged as well, indicate not only that the <italic>S7</italic> and <italic>S8</italic> responses are higher but also that <italic>S6</italic>, located at the ulnar side of the forearm, is sensitive to <italic>Grip3</italic>, as expected (<xref ref-type="table" rid="T2">Table 2</xref>). Overall, the results agree with the analysis performed and the expected outcomes.</p>
<p><xref ref-type="fig" rid="F7">Figure 7</xref> shows the <italic>S6</italic> readings against the grip force during <italic>Grip3</italic> (scaled as indicated earlier). <italic>S6</italic> was chosen because of its location in a region of the forearm where prominent force changes take place; this can also be seen in the histogram of <italic>S6</italic> in <xref ref-type="fig" rid="F5">Figure 5</xref> (<italic>Grip 3</italic>). The <italic>relaxed</italic> state data are presented in <xref ref-type="fig" rid="F7">Figure 7A</xref> and the <italic>gripping</italic> data in <xref ref-type="fig" rid="F7">Figure 7B</xref>. While <italic>relaxed</italic>, the recorded grip force mainly ranges between 0 and 5% of the maximum recorded, and there is no correlation between the force measured by <italic>S6</italic> and the one measured by the gripping device. The variability of the forces recorded when the hand is <italic>relaxed</italic> could be due to the variability of TAB tightness on the participants&#x00027; forearms, sensor hysteresis, or increased blood flow in the forearm (as a result of gripping). This behavior was observed in all <italic>relaxed</italic> sensor data (<italic>S1</italic>&#x02013;<italic>S8</italic>). In <xref ref-type="fig" rid="F7">Figure 7B</xref>, the TAB readings correspond to gripping forces ranging between 20 and 60% of the maximum recorded force. In the 3D plot, a distribution closer to normal is visible, in contrast to the histograms presented in <xref ref-type="fig" rid="F3">Figures 3</xref>&#x02013;<xref ref-type="fig" rid="F6">6</xref>, which do not break down the gripping force. The visible diagonal indicates a linear correlation of the grip force with the TAB <italic>S6</italic> force readings. This holds despite the variability in the <italic>relaxed</italic> contact forces recorded across participants (<xref ref-type="fig" rid="F7">Figure 7A</xref>), caused mainly by TAB fit and forearm adipose tissue content. It could be a result of the general increase in the physiological cross-sectional area (PCSA) of the forearm and of the location of <italic>S6</italic> near the radius bone; more specifically, however, it is also an indication of muscle activity in the proximity of the sensor.</p></sec>
<sec>
<title>3.2. Separability of Tactile Signatures</title>
<p>Following the initial data analysis, statistical analysis was used to determine whether the four grip configurations of the hand while <italic>relaxed</italic> or <italic>gripping</italic> are statistically different and thus separable. This would provide some indication of the type of predictive algorithm that could be used to classify the data, as well as statistical evidence that the tactile signatures map to the muscle synergies involved in each grip type. The data were split into eight groups: the hand <italic>relaxed</italic> or <italic>gripping</italic> in each of the four configurations shown in <xref ref-type="fig" rid="F2">Figure 2</xref>. The separability of all these states is important for the development of predictive algorithms that are transparent and based on physiological cues. For the creation of a generalized model that performs well for all TAB users, it is paramount that there are similar patterns across participants.</p>
<p>The two-sample Kolmogorov-Smirnov (KS) test, run in Matlab using the <italic>kstest2</italic> function (MathWorks, <xref ref-type="bibr" rid="B18">2018</xref>), was chosen to make the necessary comparisons as it does not assume a Gaussian distribution. After testing the normality of each TAB sensor&#x00027;s data, it was concluded that the data do not adhere to a Gaussian distribution (this is also evident in <xref ref-type="fig" rid="F3">Figures 3</xref>&#x02013;<xref ref-type="fig" rid="F6">6</xref>). In this two-sample KS test, the hypothesis regarding the distribution of a dataset is rejected by comparing the <italic>p</italic>-value with the significance level <italic>Alpha</italic> (default value of 0.05). The test statistic <italic>D</italic> is an indication of the &#x0201C;distance,&#x0201D; or difference, between the two samples&#x00027; distributions when using the <italic>kstest2</italic> function. When <italic>D</italic> is 0, the two samples follow exactly the same distribution; the higher its value, the greater the difference between the two distributions.</p>
<p>No statistically significant similarity was found when analyzing and comparing the individual sensor datasets for each grip type across participants. This was attributed to differences in forearm shape and muscle-to-adipose tissue content ratio, as well as to experimental errors and confounding variables. The latter include the tightness of the TAB on each individual, the spacing of the TAB sensors, and the surface area covered by the eight sensors in relation to the total circumference of the forearm (the larger the circumference, the lower the coverage and thus the lower the sensing resolution). Nonetheless, statistically significant differences were found between the <italic>relaxed</italic> and <italic>gripping</italic> states across all 20 participants. The KS test statistic value, <italic>D</italic>, calculated for the <italic>relaxed</italic> state datasets across participants (for the individual grip types) was under 0.1. Therefore, 0.1 was set as a critical test value for determining statistically different distributions. Thus, any statistical difference between two data samples had to be confirmed both by rejection of the null hypothesis and by <italic>D</italic> exceeding the critical test value.</p>
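The analysis above used Matlab's <italic>kstest2</italic>; an equivalent in Python is `scipy.stats.ks_2samp`. Below is a sketch of the combined decision rule (null hypothesis rejected at Alpha = 0.05 AND <italic>D</italic> above the 0.1 critical value), run on illustrative stand-in data rather than the study's recordings.

```python
import numpy as np
from scipy.stats import ks_2samp

ALPHA = 0.05       # significance level, the kstest2 default noted in the text
CRITICAL_D = 0.1   # critical test-statistic value set from the relaxed-state data

def distributions_differ(sample_a, sample_b):
    """Two-sample KS test: the distributions count as different only when
    the null hypothesis is rejected AND D exceeds the critical value."""
    result = ks_2samp(sample_a, sample_b)
    return result.pvalue < ALPHA and result.statistic > CRITICAL_D

rng = np.random.default_rng(1)
relaxed = rng.normal(0.2, 0.05, 500)   # illustrative stand-in for sensor data
gripping = rng.normal(0.6, 0.10, 500)
```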
<sec>
<title>3.2.1. Finger Pose Signatures</title>
<p>The TAB data for the <italic>relaxed</italic> state were compared between the four hand grip configurations. Individual participant data were used with the two-sample KS test. The hypothesis tested was that the TAB sensor readings of different grip configurations belong to the same distribution. The test was run for each of the eight TAB sensors, and in each case the hypothesis was rejected for more than 99% of the data while the value of <italic>D</italic> always exceeded the critical value (0.1), with one exception. These results suggest that, even when no force is exerted on the gripping device, the position and cross-sectional area of the muscles that hold the joints in the four different hand grip configurations give rise to distinct tactile signatures. <italic>D</italic> values ranged from 0.14 to 0.999, indicating that the forearm undergoes greater changes in some areas than in others while transitioning between the four hand configurations. However, this observation was not consistent amongst all participants.</p>
<p>Even in the <italic>relaxed</italic> state, where the force differences are subtle, the results agree with the grip analysis and muscle contractions. For example, between <italic>Grip3</italic> and <italic>Grip4</italic>, the ventral radial part of the forearm was expected to provide the information needed to distinguish between the two (section 2.3.1). <italic>S1</italic> and <italic>S2</italic> would be the features to look out for, rather than the sensors at the ulnar part of the forearm (in particular <italic>S6</italic>, which is in the proximity of the FDP). Looking at individual participant <italic>S1</italic>/<italic>S2</italic> data, the statistical distance (<italic>D</italic>) between the two distributions was over 0.5, while for <italic>S6</italic> it was, in most cases, under 0.2. However, the importance of these features was not clear when all participant data were considered: <italic>D</italic> values were lower than 0.2 in most cases for the force distributions of <italic>S1</italic> or <italic>S2</italic> at the two different hand configurations. Furthermore, the tactile signatures of <italic>Grip1</italic> and <italic>Grip4</italic> in the <italic>relaxed</italic> state were the most similar, despite the two hand configurations being very different.</p>
<p>Overall, the force density distributions on the TAB sensors for each hand configuration and across participants were different. This is an indication that the creation of a generalized model is probably unfeasible with the current data. Nonetheless, the tactile signature differences between the four hand poses in the <italic>relaxed</italic> state were found to be more significant than the variations across participants, as the KS statistic value, <italic>D</italic>, was much higher. The sensors with the tighter distributions and higher <italic>D</italic> values would be the most useful features for classifying the four hand grip configurations. However, given the variability of the tactile signatures across participants, a combination of all TAB sensor responses would be needed to determine the hand pose.</p></sec>
<sec>
<title>3.2.2. Relaxed vs. Gripping</title>
<p>How can the TAB differentiate a hand that is <italic>gripping</italic> from one that is <italic>relaxed</italic>? What is the TAB sensors&#x00027; response to different grips? Are these features common across participants? The two-sample KS test was used to compare the <italic>relaxed</italic> and <italic>gripping</italic> sensor data. <xref ref-type="fig" rid="F8">Figure 8</xref> presents the force readings of all participant data for each of the four grip types. <italic>S3</italic> records distinct force changes across all grip types. This sensor is not near the digit flexors, which are responsible for gripping, but is in the proximity of the radius, where low adipose tissue content forms a more rigid contact surface. As the large digit flexor muscles contract, their stiffness and PCSA increase (PCSA being proportional to muscle force), in turn increasing the forearm&#x00027;s cross-sectional area. This generates higher normal and shear forces on the arm brace, the former being monitored by the TAB sensors. These forces are transmitted more efficiently to the brace by the more rigid parts of the forearm. Thus, <italic>S3</italic> can be a consistent general indicator of <italic>gripping</italic>, but possibly not as good a feature for differentiating grips.</p>
<fig id="F8" position="float">
<label>Figure 8</label>
<caption><p>The figure presents the force readings of each TAB sensor (using all participant data) for the four different hand configurations; <bold>(A)</bold> <italic>Grip1</italic>, <bold>(B)</bold> <italic>Grip2</italic>, <bold>(C)</bold> <italic>Grip3</italic>, and <bold>(D)</bold> <italic>Grip4</italic>.</p></caption>
<graphic xlink:href="frobt-06-00124-g0008.tif"/>
</fig>
<p>The force distributions at <italic>S8, S7</italic>, and <italic>S6</italic> agree with the grip analysis (section 2.3.1). The combination of these three sensors provides good features for predicting the grip type. When different parts of the FDP and FDS muscles engage during <italic>gripping</italic>, there is a clear separability in the recorded force distributions. As expected (<xref ref-type="table" rid="T2">Table 2</xref>), the <italic>S6</italic> data have the best separability during <italic>Grip4</italic> (<xref ref-type="fig" rid="F2">Figure 2D</xref>). This is evident in <xref ref-type="fig" rid="F8">Figure 8D</xref> and in the higher KS statistic values, <italic>D</italic>, calculated when the little and ring fingers engage. Despite the broad force density distributions across participants, there is a clear separability of the <italic>relaxed</italic> and <italic>gripping</italic> states in the <italic>S7</italic> and <italic>S8</italic> data. These two, as expected, are better predictors of <italic>Grip2</italic> and <italic>Grip3</italic> than <italic>S6</italic>, owing to their proximity to the corresponding muscles. For <italic>Grip1</italic>, the similarity of the <italic>S1 relaxed</italic> and <italic>gripping</italic> distributions was surprising. The results of the remaining sensors are in agreement with the expected outcomes presented in <xref ref-type="table" rid="T2">Table 2</xref>.</p>
<p>Comparing <xref ref-type="fig" rid="F9">Figure 9</xref> with <xref ref-type="fig" rid="F8">Figure 8C</xref> shows how much clearer the separability of the <italic>Grip3</italic> data is within a single participant&#x00027;s data. The force variance observed on <italic>S4</italic> between the <italic>gripping</italic> and <italic>relaxed</italic> states is probably due to the lower adipose tissue content in that region of the forearm. The same applies to <italic>S1</italic>, but despite that sensor&#x00027;s proximity to the flexor muscles this was not the case for all participants.</p>
<fig id="F9" position="float">
<label>Figure 9</label>
<caption><p>The force density distributions, <italic>relaxed</italic> and <italic>gripping</italic>, from a single participant&#x00027;s data when performing <italic>Grip3</italic>.</p></caption>
<graphic xlink:href="frobt-06-00124-g0009.tif"/>
</fig>
<p>Overall, the results indicate that most of the sensors show a statistically significant difference between the states, which suggests that a data-driven, linear algorithm may prove sufficient for identifying these states despite the broad force distributions. Using the overall tactile signature of the forearm for motion intent recognition would work better than focusing only on particular regions. Also, as evidenced by the state separation in <xref ref-type="fig" rid="F9">Figure 9</xref>, a personalized system would work better than a generalized one.</p></sec></sec>
<sec>
<title>3.3. Principal Component Analysis (PCA)</title>
<p>Given the evidence of linear separability in the data, PCA was performed using the IBM SPSS software (IBM, <xref ref-type="bibr" rid="B11">2019</xref>) to determine the extent to which the TAB data variability of each grip type across states can be expressed by linear components. Eight components describing the variance in the data were produced. Using the Kaiser criterion, only those with an eigenvalue greater than 1 were considered. The <italic>Varimax</italic> rotation method was used, and a maximum of 200 iterations was allowed for convergence. It is recommended that the chosen components describe 70&#x02013;80% of the data variance; however, the principal components (PCs) explain 60.5, 68.7, 65.7, and 77.4% of the variance of <italic>Grip1&#x02013;4</italic>, respectively. The lower values are due to the physiological differences between the participants. The results are in line with the grip analysis (section 2.3.1, <xref ref-type="table" rid="T2">Table 2</xref>). The PCs&#x00027; highest correlations are observed with the sensors placed in the proximity of the digit flexors, <italic>S6, S7</italic>, and <italic>S8</italic>, as well as <italic>S3</italic>. <italic>S3</italic> was found to be a good indicator of <italic>gripping</italic> but not of the differences between grip types (section 3.2.2).</p></sec>
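The eigenvalue-greater-than-1 selection can be sketched in Python as follows. This illustrates the Kaiser criterion only; the SPSS analysis additionally applied Varimax rotation to the retained loadings, which is omitted here, and the function name is ours.

```python
import numpy as np

def kaiser_pcs(X):
    """PCA via the correlation matrix, keeping components with
    eigenvalue > 1 (Kaiser criterion).

    X -- (n_samples, n_sensors) TAB data for one grip type.
    Returns the retained loadings and the fraction of variance explained.
    """
    corr = np.corrcoef(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)
    order = np.argsort(eigvals)[::-1]           # sort descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    keep = eigvals > 1.0                        # Kaiser criterion
    explained = eigvals[keep].sum() / eigvals.sum()
    loadings = eigvecs[:, keep] * np.sqrt(eigvals[keep])
    return loadings, explained
```

Working on the correlation matrix (rather than the covariance matrix) matches the convention under which an eigenvalue of 1 corresponds to one sensor's worth of variance.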
<sec>
<title>3.4. Tactile Signatures&#x02014;Expectations vs. Results</title>
<p>During the multi-participant experiments, the forearm tactile signature on the TAB mostly aligned with the muscle physiology and muscle synergies. Some observed incoherence in the data suggests that personalization of the TAB would greatly improve its accuracy. The variability between participants, which affects the baseline readings and accentuates certain features over others, can be attributed to:
<list list-type="bullet">
<list-item><p>Forearm shape &#x02013; causes variations in the baseline readings</p></list-item>
<list-item><p>TAB fit &#x02013; a tighter fit elevates the baseline readings, especially in the proximity of the ulna and radius, while a loose fit may not detect muscle activity</p></list-item>
<list-item><p>Adipose tissue content &#x02013; lowers the baseline readings and dampens the output</p></list-item>
<list-item><p>Forearm strength/muscle mass &#x02013; raises the baseline and makes the output more prominent</p></list-item>
<list-item><p>Forearm circumference &#x02013; affects the distance between the sensors and their proximity to individual muscles.</p></list-item>
</list></p>
<p>The tactile signatures of individual forearm muscle contractions can be used to create a map from the TAB response to each muscle contraction. This can be implemented in a motion intent recognition system to actuate an exoskeleton, instead of (or in combination with) training a machine learning model on particular motions. The advantage of this approach is that even unknown sensory inputs will provide a reliable control input, increasing the robustness of the exoskeleton control.</p></sec></sec>
<sec id="s4">
<title>4. Tab State Prediction</title>
<p>Machine learning techniques were employed to create generalized models from the TAB data recorded in the experiments. First, a model was created for each grip type individually to predict whether the hand is <italic>gripping</italic> or <italic>relaxed</italic> (binary classification). All grip types&#x00027; <italic>relaxed</italic> data were then grouped together into a single class to predict which of the four grip types is being performed when the hand is <italic>gripping</italic> (5 classes). Following that, the algorithms were run on all <italic>relaxed</italic> state data to create a model that distinguishes between the four grip configurations and the subtle differences between the four relaxed hand poses (4 classes). Finally, all TAB data were used to classify the four <italic>relaxed</italic> and four <italic>gripping</italic> states (8 classes).</p>
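The four labeling schemes can be summarized as simple mappings from a sample's (grip type, state) pair to a class code. This is a sketch; the numeric codes are ours, not the authors'.

```python
# grip is 1..4 (Figure 2A-D); state is "relaxed" or "gripping"

def binary_label(state):
    """Per-grip-type binary model: gripping vs. relaxed."""
    return 1 if state == "gripping" else 0

def five_class_label(grip, state):
    """4 gripping classes plus one pooled relaxed class (code 0)."""
    return grip if state == "gripping" else 0

def four_class_label(grip):
    """Relaxed-pose model: which of the four configurations is held."""
    return grip

def eight_class_label(grip, state):
    """All states: 1..4 relaxed poses, 5..8 gripping states."""
    return grip + (4 if state == "gripping" else 0)
```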
<sec>
<title>4.1. Features</title>
<p>Engineered features were created and ranked to improve the performance of the predictive algorithms for individual grip types as well as for the overall dataset. Both the temporal and the spatial tactile patterns can characterize different hand movements despite the differences in the TAB response for each participant. It is, therefore, important to take into consideration the history of each sensor&#x00027;s readings (considering the change of force within about 0.5 s) as well as the spatial features, which were generated from the differences between the TAB sensor force readings. The differences between sensor readings could provide information on antagonistic muscle pairs or combinations of muscle contractions. Concatenating these features with the TAB sensor readings, a total of 44 features were established: the first eight are the TAB sensor readings [<italic>S</italic><sub>1</sub>, &#x02026;,<italic>S</italic><sub>8</sub>], features 9&#x02013;16 are their derivatives [<italic>dS</italic><sub>1</sub>, &#x02026;,<italic>dS</italic><sub>8</sub>], and features 17&#x02013;44 are the force reading differences between each pair of sensors, i.e., [<italic>S</italic><sub>1</sub> &#x02212; <italic>S</italic><sub>2</sub>, <italic>S</italic><sub>1</sub> &#x02212; <italic>S</italic><sub>3</sub>, &#x02026;,<italic>S</italic><sub>7</sub> &#x02212; <italic>S</italic><sub>8</sub>].</p>
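The 44-feature construction can be sketched as follows. The sampling period `dt` is an assumed value: the text only states that roughly 0.5 s of history informs the temporal features.

```python
import numpy as np
from itertools import combinations

def build_features(S, dt=0.1):
    """Build the 44-element feature vector sequence from raw TAB readings.

    S  -- (n_samples, 8) array of sensor readings S1..S8
    dt -- sample period in seconds (illustrative value, not from the paper)
    Features: 8 readings, 8 temporal derivatives, 28 pairwise differences.
    """
    dS = np.gradient(S, dt, axis=0)                  # temporal features 9-16
    pairs = list(combinations(range(8), 2))          # the 28 sensor pairs
    diffs = np.stack([S[:, i] - S[:, j] for i, j in pairs], axis=1)
    return np.hstack([S, dS, diffs])                 # (n_samples, 44)
```

The pairwise-difference block encodes the spatial pattern around the forearm, while the gradient block encodes each sensor's recent force change.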
<p>The most important of these features were extracted for grip classification. Two approaches were taken to determine feature importance. In the first, the features were ranked using the Random Forests (RF) algorithm and the top 10 features were chosen for each classification scenario. The second approach transformed the engineered features into the principal components established using PCA. All individual sensor data, <italic>S1</italic>&#x02013;<italic>S8</italic>, and some spatial features were found to be important. For <italic>Grip1</italic> these included [<italic>S</italic><sub>7</sub>&#x02212;<italic>S</italic><sub>3</sub>, <italic>S</italic><sub>8</sub>&#x02212;<italic>S</italic><sub>2</sub>], for <italic>Grip2</italic> [<italic>S</italic><sub>8</sub>&#x02212;<italic>S</italic><sub>3</sub>, <italic>S</italic><sub>5</sub>&#x02212;<italic>S</italic><sub>3</sub>], for <italic>Grip3</italic> [<italic>S</italic><sub>6</sub> &#x02212; <italic>S</italic><sub>3</sub>, <italic>S</italic><sub>7</sub> &#x02212; <italic>S</italic><sub>6</sub>], and for <italic>Grip4</italic> [<italic>S</italic><sub>4</sub> &#x02212; <italic>S</italic><sub>3</sub>, <italic>S</italic><sub>8</sub> &#x02212; <italic>S</italic><sub>4</sub>]. The temporal features, on the other hand, did not provide any useful information. These results are in line with our initial analysis, as the spatial features that ranked high on the importance list for individual grips mainly involved <italic>S3</italic>; <italic>S6, S7</italic>, and <italic>S8</italic> were also key sensors in these spatial features. As seen earlier, <italic>S3</italic> and <italic>S6</italic> are good indicators of <italic>gripping</italic> due to the high variability in their data distributions during gripping, while <italic>S8, S7</italic>, and <italic>S6</italic> data are useful in distinguishing between the four grip types. A number of machine learning techniques were employed to create models and make state predictions. The models were trained on 80% of the data and tested on the remaining 20% of participants&#x00027; data.</p></sec>
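The two feature-selection approaches above can be sketched with scikit-learn. The synthetic data, number of trees, and number of retained components are illustrative assumptions:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 44))        # 44 engineered features (synthetic stand-in)
y = rng.integers(0, 2, size=200)      # gripping / relaxed labels (synthetic)

# Approach 1: rank features by RF impurity-based importance, keep the top 10
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
top10 = np.argsort(rf.feature_importances_)[::-1][:10]
X_rf = X[:, top10]

# Approach 2: project the engineered features onto principal components
pca = PCA(n_components=10).fit(X)
X_pca = pca.transform(X)

assert X_rf.shape == (200, 10) and X_pca.shape == (200, 10)
```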
<sec>
<title>4.2. Learning the Tactile Features</title>
<p>Using the most important features calculated by the RF algorithm and the PCA-transformed data, supervised learning algorithms were implemented with a 10-fold cross-validation. There are multiple learning algorithms that can be used for classification purposes, both parametric and non-parametric. The k-nearest-neighbor (kNN) (Akhlaghi et al., <xref ref-type="bibr" rid="B1">2016</xref>), logistic regression, random forests (RF) and support vector machines (SVM) (Wolf et al., <xref ref-type="bibr" rid="B40">2013</xref>) were all implemented using the python scikit-learn library.</p>
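A minimal sketch of this comparison, using scikit-learn as the text states; the synthetic dataset stands in for the 10 selected TAB features and is an assumption:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Synthetic stand-in for the 10 selected features (assumption)
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

models = {
    "kNN": KNeighborsClassifier(),
    "logistic regression": LogisticRegression(max_iter=1000),
    "RF": RandomForestClassifier(random_state=0),
    "SVM": SVC(kernel="rbf"),  # Gaussian kernel, as used in the paper
}
# 10-fold cross-validation accuracy (mean +/- std) for each algorithm
results = {name: cross_val_score(m, X, y, cv=10) for name, m in models.items()}
for name, scores in results.items():
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```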
<p>The prediction accuracy was higher when using the chosen engineered features than when using solely the <italic>S1</italic>&#x02013;<italic>S8</italic> readings. Using all engineered features instead of only those found to be important had no statistically significant effect on the results (h = 0). Being selective with the features therefore reduces the computation time without compromising accuracy. The results indicated that although the accuracy differences between the algorithms were not significant, the SVM had the best overall performance. When attempting to distinguish between all grip types and <italic>gripping</italic>/<italic>relaxed</italic>, a total of 8 classes, only about a quarter of the data were correctly classified. The kNN algorithm was not able to capture the complexity of the features as well as an ensemble method like RF. The SVM models, which used a Gaussian kernel, also resulted in higher prediction accuracy than kNN and performed marginally better than RF in determining the state of the individual grips and in classifying all grips and positions. The prediction accuracy using the principal components of the TAB data was highest in the classification of <italic>Grip3</italic>. This could be because the ratio of the maximum force allowed during the experiment to the maximum that can be generated using that grip type is highest for <italic>Grip3</italic> (as evident in <xref ref-type="table" rid="T1">Table 1</xref>).</p>
<p>The high standard deviation of the accuracy of all machine learning algorithms indicates considerable variation of the TAB response within the participant pool. The training and validation prediction accuracies differed, and only about a quarter of the data were correctly classified. Using the default parameters of the scikit-learn library, the SVM model generated with the features chosen using RF performed better than the one with the PCA-transformed features. An exhaustive search over the <italic>gamma</italic> and <italic>C</italic> parameter values of the SVM was performed using grid search. This revealed that the model performs best when <italic>gamma</italic> is 1.0 and <italic>C</italic> is 1. Using the optimal parameter values, the SVM model&#x00027;s performance improved but still failed to reach the desired accuracy, as evident in <xref ref-type="table" rid="T3">Table 3</xref>.</p>
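The exhaustive search over <italic>gamma</italic> and <italic>C</italic> can be sketched with scikit-learn's GridSearchCV; the candidate value ranges and synthetic data here are assumptions:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Candidate grids for the RBF-SVM hyperparameters (illustrative ranges)
param_grid = {"gamma": [0.01, 0.1, 1.0, 10.0], "C": [0.1, 1, 10, 100]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=10)
search.fit(X, y)
# The paper reports the best values as gamma = 1.0, C = 1 on its own data
print(search.best_params_, search.best_score_)
```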
<table-wrap position="float" id="T3">
<label>Table 3</label>
<caption><p>The classification accuracies (mean and standard deviation) achieved using the SVM models trained on the PCA transformed features and the RF selected features, with a 10-fold cross validation.</p></caption>
<table frame="hsides" rules="groups">
<thead><tr>
<th valign="top" align="left"><bold>Feature acquisition</bold></th>
<th valign="top" align="center"><bold>PCA</bold></th>
<th valign="top" align="center"><bold>RF</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">Grip1</td>
<td valign="top" align="center">77.10 &#x000B1; 9.01</td>
<td valign="top" align="center">81.42 &#x000B1; 10.03</td>
</tr>
<tr>
<td valign="top" align="left">Grip2</td>
<td valign="top" align="center">84.78 &#x000B1; 7.4</td>
<td valign="top" align="center">85.76 &#x000B1; 8.70</td>
</tr>
<tr>
<td valign="top" align="left">Grip3</td>
<td valign="top" align="center">89.30 &#x000B1; 6.06</td>
<td valign="top" align="center">93.43 &#x000B1; 5.03</td>
</tr>
<tr>
<td valign="top" align="left">Grip4</td>
<td valign="top" align="center">83.54 &#x000B1; 11.27</td>
<td valign="top" align="center">88.44 &#x000B1; 8.55</td>
</tr>
<tr>
<td valign="top" align="left">All grips, all positions</td>
<td valign="top" align="center">22.37 &#x000B1; 9.86</td>
<td valign="top" align="center">27.14 &#x000B1; 9.26</td>
</tr>
<tr>
<td valign="top" align="left">All grips and relaxed</td>
<td valign="top" align="center">54.06 &#x000B1; 11.35</td>
<td valign="top" align="center">56.13 &#x000B1; 10.38</td>
</tr>
<tr>
<td valign="top" align="left">All positions (relaxed)</td>
<td valign="top" align="center">22.37 &#x000B1; 9.86</td>
<td valign="top" align="center">23.97 &#x000B1; 7.03</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>The model did not perform well when tested on the data of the 4 &#x0201C;unseen&#x0201D; participants (20% of the data). This indicated that the variability across the population is not adequately captured with such a small number of participants to allow for generalization. The model was then trained on partial data from all participants. The data were shuffled prior to being split into the hold-out groups, which meant that the training sample would most probably include some data from all 20 participants. This raised the model performance, as seen in <xref ref-type="table" rid="T4">Table 4</xref>. The most notable improvements were made with 8-way classification (4 <italic>gripping</italic> and 4 <italic>relaxed</italic> states) and with classifying the 4 relaxed hand configurations, where the differences are subtle. The accuracy for both increased to over 95%. This indicates that, using these learning algorithms, the model would need to be personalized to the user.</p>
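The difference between the two evaluation protocols (holding out whole participants vs. shuffling all samples before splitting) can be sketched as follows; participant counts match the study, but the data and variable names are assumptions:

```python
import numpy as np
from sklearn.model_selection import GroupShuffleSplit, train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 10))
participant = np.repeat(np.arange(20), 100)   # 20 participants, 100 samples each

# Protocol A: hold out whole participants (4 of 20 unseen at test time)
gss = GroupShuffleSplit(n_splits=1, test_size=0.2, random_state=0)
train_a, test_a = next(gss.split(X, groups=participant))
assert set(participant[train_a]).isdisjoint(participant[test_a])

# Protocol B: shuffle all samples first, so the training set most probably
# contains some data from every participant (the personalized setting)
train_b, test_b = train_test_split(np.arange(len(X)), test_size=0.2,
                                   shuffle=True, random_state=0)
```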
<table-wrap position="float" id="T4">
<label>Table 4</label>
<caption><p>Training on all participants&#x00027; data.</p></caption>
<table frame="hsides" rules="groups">
<thead><tr>
<th/>
<th valign="top" align="center" colspan="2" style="border-bottom: thin solid #000000;"><bold>Classification accuracy (%)</bold></th>
</tr>
<tr>
<th valign="top" align="left"><bold>Features</bold></th>
<th valign="top" align="center"><bold>PCA</bold></th>
<th valign="top" align="center"><bold>RF</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">Grip1</td>
<td valign="top" align="center">96.19 &#x000B1; 0.20</td>
<td valign="top" align="center">97.06 &#x000B1; 0.11</td>
</tr>
<tr>
<td valign="top" align="left">Grip2</td>
<td valign="top" align="center">97.61 &#x000B1; 0.16</td>
<td valign="top" align="center">98.07 &#x000B1; 0.17</td>
</tr>
<tr>
<td valign="top" align="left">Grip3</td>
<td valign="top" align="center">98.17 &#x000B1; 0.10</td>
<td valign="top" align="center">98.96 &#x000B1; 0.11</td>
</tr>
<tr>
<td valign="top" align="left">Grip4</td>
<td valign="top" align="center">99.10 &#x000B1; 0.10</td>
<td valign="top" align="center">99.44 &#x000B1; 0.06</td>
</tr>
<tr>
<td valign="top" align="left">All grips, all positions</td>
<td valign="top" align="center">97.60 &#x000B1; 0.04</td>
<td valign="top" align="center">93.75 &#x000B1; 0.15</td>
</tr>
<tr>
<td valign="top" align="left">All positions (relaxed)</td>
<td valign="top" align="center">95.89 &#x000B1; 0.26</td>
<td valign="top" align="center">96.75 &#x000B1; 0.12</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>The classification accuracies (mean and standard deviation) achieved using the SVM models trained on the PCA transformed features and the RF selected features, with a 10-fold cross validation</italic>.</p>
</table-wrap-foot>
</table-wrap>
<p>The confusion matrix (<xref ref-type="fig" rid="F10">Figure 10</xref>), shows the results of the 10-fold cross-validation of the SVM model which was trained using the principal component transformed features of partial unseen data from all participants. The labels <italic>relaxed1</italic>/<italic>gripping1</italic> represent the <italic>relaxed</italic>/<italic>gripping</italic> states of the <italic>Grip1</italic> hand configuration and the same applies for the other three grip types. It can be observed that the highest false negatives in the &#x0201C;one-vs.-rest&#x0201D; (training a single classifier per class) SVM approach were found when trying to distinguish between:
<list list-type="bullet">
<list-item><p><italic>Grip3</italic> and <italic>Grip4</italic> which use all fingers; 1.6% of <italic>gripping3</italic> was incorrectly classified as <italic>gripping4</italic> and 1.5% of <italic>gripping4</italic> as <italic>gripping3</italic></p></list-item>
<list-item><p><italic>Grip2</italic> and <italic>Grip3</italic> with similar trapezoid shapes; 1.8% of <italic>gripping2</italic> was incorrectly classified as <italic>gripping3</italic> and 1.2% of <italic>gripping3</italic> as <italic>gripping2</italic>.</p></list-item>
</list></p>
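A confusion matrix for a one-vs.-rest SVM under 10-fold cross-validation can be produced as sketched below; the synthetic 8-class dataset standing in for the four <italic>relaxed</italic> and four <italic>gripping</italic> states is an assumption:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import cross_val_predict
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

# 8 synthetic classes standing in for relaxed1-4 / gripping1-4
X, y = make_classification(n_samples=800, n_features=10, n_informative=8,
                           n_classes=8, random_state=0)

# One-vs.-rest: a single RBF-SVM classifier is trained per class
clf = OneVsRestClassifier(SVC(kernel="rbf"))
y_pred = cross_val_predict(clf, X, y, cv=10)
cm = confusion_matrix(y, y_pred)                  # rows: true, cols: predicted
row_pct = 100 * cm / cm.sum(axis=1, keepdims=True)  # per-class percentages
```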
<fig id="F10" position="float">
<label>Figure 10</label>
<caption><p>The confusion matrix presenting results of the 10-fold cross-validation of the SVM model which was trained using the principal component transformed features of partial unseen data from all participants.</p></caption>
<graphic xlink:href="frobt-06-00124-g0010.tif"/>
</fig>
<p><xref ref-type="table" rid="T1">Table 1</xref> shows that the ring and little finger do not contribute as much force as the index or middle finger during <italic>Grip3</italic>, which contributes to its misclassification as <italic>Grip2</italic>. Overall, <italic>Grip3</italic> has the highest false positive rate, at 4.7%; <italic>Relaxed4</italic> has the lowest, at 0.9%. The relaxed states were, overall, classified slightly better than the gripping states, possibly due to some difference in muscle engagement between the participants. In summary, the highest misclassification of a state is for <italic>gripping3</italic>, 4.2%, the lowest for <italic>relaxed4</italic> and <italic>relaxed2</italic>, 1.6%, with an average of 2.5% across all states.</p>
<p>The number of participants involved in this study was not sufficient to allow the creation of a generalizable intent recognition model that performs accurately (e.g., 90%) on unseen participants. Nonetheless, the TAB system achieved an average classification accuracy of 99.96 &#x000B1; 0.08%, which is comparable to the state-of-the-art motion intent recognition systems not tested on weak muscle activations (Wolf et al., <xref ref-type="bibr" rid="B40">2013</xref>; Xiao et al., <xref ref-type="bibr" rid="B42">2014</xref>; Li et al., <xref ref-type="bibr" rid="B17">2017</xref>).</p>
<sec>
<title>4.3. TAB vs. Myo Armband</title>
<sec>
<title>4.3.1. Comparative Study</title>
<p>The two sensorised armbands, the TAB and the Myo, offer two different means of detecting muscle contractions. The first uses interaction forces on the forearm while the other detects electrical signals that reach the skin surface. The TAB experiments were replicated with the same participants using the Myo armband (Thalmic Labs, <xref ref-type="bibr" rid="B36">2015c</xref>). The Thalmic Labs&#x00027; Myo Gesture Control Armband (Thalmic Labs, <xref ref-type="bibr" rid="B36">2015c</xref>) features an Inertial Measurement Unit (IMU) and eight sEMG sensors. These were accessed using the provided Windows SDK. The Myo was chosen for comparison with the TAB for its ease of use, its adaptability to various forearm sizes, and its number of measurement points (8), which is the same as in the TAB. The sEMG data were sampled at around 100 Hz. Positioning of the Myo EMG sensors on the forearm was done according to the manufacturer&#x00027;s instructions (Thalmic Labs, <xref ref-type="bibr" rid="B34">2015a</xref>).</p></sec>
<sec>
<title>4.3.2. sEMG: Features and State Prediction</title>
<p>The same 44 engineered features as earlier were produced for the Myo data. The temporal features included voltage changes over 0.5 s windows, as in Wolf et al. (<xref ref-type="bibr" rid="B40">2013</xref>). The RF algorithm was used to rank the features and the 10 most important were chosen. The temporal features ranked high, in contrast to the TAB where they ranked the lowest. In both cases, however, the radial part of the forearm provided key features. All participant data were shuffled and a 10-fold cross-validation was performed using the kNN, SVM, and RF learning algorithms. The best performance was achieved with the RF model (<xref ref-type="table" rid="T5">Table 5</xref>). The highest prediction accuracy with the Myo data was achieved for <italic>Grip4</italic>, where the force limit set for the experiment was the highest amongst the grip types (<xref ref-type="table" rid="T1">Table 1</xref>). Higher forces engage more muscle fibers, creating a higher potential difference and therefore stronger signals to be detected by the sEMG sensors. Overall, the classification accuracy for individual grip types seems to depend on the grip force being used, since <italic>Grip</italic>1<sub><italic>accuracy</italic></sub> &#x0003C; <italic>Grip</italic>2<sub><italic>accuracy</italic></sub> &#x0003C; <italic>Grip</italic>3<sub><italic>accuracy</italic></sub> &#x0003C; <italic>Grip</italic>4<sub><italic>accuracy</italic></sub>.</p>
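The windowed temporal features for the sEMG channels can be sketched as below; the 100 Hz sampling rate follows the text, but the difference-based definition and function name are assumptions:

```python
import numpy as np

def temporal_features(emg, fs=100, window_s=0.5):
    """Voltage change over ~0.5 s windows for each of 8 sEMG channels.

    emg is an (n_samples, 8) array; fs is the sampling rate in Hz.
    Returns the change in each channel across a window_s-long window.
    """
    w = int(fs * window_s)            # samples per window (50 at 100 Hz)
    return emg[w:] - emg[:-w]         # per-channel change across the window

emg = np.cumsum(np.random.randn(1000, 8), axis=0)  # synthetic 8-channel sEMG
assert temporal_features(emg).shape == (950, 8)
```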
<table-wrap position="float" id="T5">
<label>Table 5</label>
<caption><p>The classification accuracy achieved using RF on the Myo EMG (10-fold cross-validation) data.</p></caption>
<table frame="hsides" rules="groups">
<thead><tr>
<th/>
<th valign="top" align="center"><bold>Random forest</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">Grip1</td>
<td valign="top" align="center">65.12 &#x000B1; 6.92</td>
</tr>
<tr>
<td valign="top" align="left">Grip2</td>
<td valign="top" align="center">69.23 &#x000B1; 10.00</td>
</tr>
<tr>
<td valign="top" align="left">Grip3</td>
<td valign="top" align="center">73.73 &#x000B1; 11.19</td>
</tr>
<tr>
<td valign="top" align="left">Grip4</td>
<td valign="top" align="center">78.81 &#x000B1; 9.11</td>
</tr>
<tr>
<td valign="top" align="left">All grips, all positions</td>
<td valign="top" align="center">26.01 &#x000B1; 2.96</td>
</tr>
<tr>
<td valign="top" align="left">All grips and relaxed</td>
<td valign="top" align="center">40.21 &#x000B1; 11.22</td>
</tr>
<tr>
<td valign="top" align="left">All positions (relaxed)</td>
<td valign="top" align="center">40.56 &#x000B1; 4.07</td>
</tr>
</tbody>
</table>
</table-wrap></sec>
<sec>
<title>4.3.3. Performance: TAB vs. Myo</title>
<p>The tactile and sEMG data have different types of key features. Temporal features were important when classifying sEMG data, which confirmed results from Wolf et al. (<xref ref-type="bibr" rid="B40">2013</xref>). With the TAB tactile data the most prominent features were the eight raw sensor readings followed by a number of spatial features. The PCA-transformed features performed at similar levels to the RF-selected features. The best performing algorithm was the SVM. It is worth noting that more complex and computationally expensive algorithms, such as a recurrent neural network (LSTM), were also implemented with similar accuracy to the SVM. SVMs are commonly used in this type of application, with examples like NASA&#x00027;s BioSleeve project (Wolf et al., <xref ref-type="bibr" rid="B40">2013</xref>) and McIntosh&#x00027;s SVM implementation in a multimodal sensing (sEMG and FSR) system (McIntosh et al., <xref ref-type="bibr" rid="B21">2016</xref>).</p>
<p>Myo EMG data classification accuracy is highly dependent on the gripping force, which is not the case with the TAB data classification. The TAB system confusion matrix was indicative of its high performance (over 90%). The corresponding Myo system matrix indicated its inability to correctly identify the data. In the classification of the four <italic>relaxed</italic> hand configurations, the majority of the data were classified as <italic>Grip1</italic>. Only 0.4% of the <italic>relaxed</italic> power grip was classified correctly. The weak electrical signals generated by low gripping forces seemed unable to reach the skin surface. Cross-talk may be another reason for this low accuracy.</p>
<p>The results show that a motion intent recognition system that uses tactile sensing can achieve high classification accuracy (&#x0003E;90%) when personalized. When distinguishing between four different static hand positions and four grips in the same positions (a total of 8 classes), &#x0003E;99% classification accuracy can be achieved. This is comparable to the state-of-the-art systems that use sEMG, such as NASA&#x00027;s BioSleeve (Wolf et al., <xref ref-type="bibr" rid="B40">2013</xref>). There were not enough participants in this study to create a generalized TAB system. The need for personalization is not unique to the TAB or the general force/pressure approach when a small number of participants is used; this has been recognized in similar studies that achieved comparably high accuracy (Wolf et al., <xref ref-type="bibr" rid="B40">2013</xref>; McIntosh et al., <xref ref-type="bibr" rid="B21">2016</xref>). Under the same conditions the Myo sEMG data classification did not perform as well. The Myo data trained models were unable to distinguish between the different classes. A more sophisticated sEMG system may have been more sensitive to the low-strength motions used and thus more capable of making distinctions between the states.</p>
<p>This study shows that the TAB can achieve hand motion classification accuracy as high as the state-of-the-art sEMG systems (Phillips and Craelius, <xref ref-type="bibr" rid="B25">2017</xref>). A more extensive study with a broader range of motions would reveal the limitations of the TAB and confirm the results obtained in this study. More testing is required to identify the extent to which adipose tissue content affects the TAB performance. Moreover, the sensor placement with respect to the distance from the proximal elbow joint and the ability to monitor particular muscles should be explored. Algorithms such as SVMs have the potential to correctly classify different data from the same participants. However, if a higher accuracy is needed for a generalized model, then a much larger population is required in order to capture the distribution of forearm sizes, shapes, and adipose tissue and muscle content, especially since the resolution of the tactile signature depended strongly on the forearm circumference, as the sensing area was always constant.</p></sec></sec></sec>
<sec id="s5">
<title>5. Conclusions and Further Work</title>
<p>The integration of the TAB in rehabilitative and assistive devices is well justified by the experiments performed, which emulated low-strength arm/hand conditions. It has been shown that the forearm tactile signatures can be mapped to particular muscle contractions. These could be used as a control input to a wearable device, giving it the ability to respond to all sensory inputs appropriately without having to &#x0201C;learn&#x0201D; motion primitives. Achieving such transparency in the system would improve the safety of wearable autonomous devices. The TAB was able to detect the tactile signature differences under such conditions and achieve higher classification accuracies than the sEMG of the Myo armband. The personalized model trained on partial PC-transformed data from all participants achieved an accuracy of 95.9% when distinguishing between the four <italic>relaxed</italic> hand positions. The 8-state (four <italic>relaxed</italic> and four <italic>gripping</italic>) classification accuracy was 97.6%.</p>
<p>In this study the mechanical cues that arise as a result of weak muscle contractions were captured by recording the normal forces generated on an arm brace. The use of higher-sensitivity sensors, arranged in an array covering the entire TAB circumference, could improve the prediction accuracy, and the resolution would no longer be dependent on the forearm circumference. This could potentially also provide better data for the generalization of the system. Furthermore, incorporating shear force sensing in addition to the normal force monitoring could further improve the TAB performance. To conclude, this study with the TAB system has shown that force myography is a promising motion intent recognition technique that could be potentially useful for upper-limb rehabilitation devices, providing the transparency required for inherent safety.</p></sec>
<sec sec-type="data-availability-statement" id="s6">
<title>Data Availability Statement</title>
<p>The datasets generated for this study are available on request to the corresponding author.</p></sec>
<sec id="s7">
<title>Ethics Statement</title>
<p>This study was carried out in accordance with the recommendations of the Faculty Research Ethics Committee at the University of the West of England (UWE REC REF No: FET.18.02.028) with written informed consent from all subjects. All subjects gave written informed consent in accordance with the Declaration of Helsinki. The protocol was approved by the Faculty Research Ethics Committee.</p></sec>
<sec id="s8">
<title>Author Contributions</title>
<p>TS conceived the presented idea and performed the experiments, the data analysis, and the interpretation under the supervision of SD, GC, and TA. TS drafted the article, which SD and GC discussed and commented on.</p>
<sec>
<title>Conflict of Interest</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p></sec></sec>
</body>
<back>
<ack><p>We acknowledge financial support by Deutsche Forschungsgemeinschaft within the funding programme Open Access Publishing, by the Baden-W&#x000FC;rttemberg Ministry of Science, Research and the Arts and by Ruprecht-Karls-Universit&#x000E4;t Heidelberg.</p>
</ack><sec sec-type="supplementary-material" id="s10">
<title>Supplementary Material</title>
<p>The Supplementary Material for this article can be found online at: <ext-link ext-link-type="uri" xlink:href="https://www.frontiersin.org/articles/10.3389/frobt.2019.00124/full#supplementary-material">https://www.frontiersin.org/articles/10.3389/frobt.2019.00124/full#supplementary-material</ext-link></p>
<supplementary-material xlink:href="Table_1.pdf" id="SM1" mimetype="application/pdf" xmlns:xlink="http://www.w3.org/1999/xlink"/>
<supplementary-material xlink:href="Image_1.pdf" id="SM2" mimetype="application/pdf" xmlns:xlink="http://www.w3.org/1999/xlink"/></sec>
<ref-list>
<title>References</title>
<ref id="B1">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Akhlaghi</surname> <given-names>N.</given-names></name> <name><surname>Baker</surname> <given-names>C. A.</given-names></name> <name><surname>Lahlou</surname> <given-names>M.</given-names></name> <name><surname>Zafar</surname> <given-names>H.</given-names></name> <name><surname>Murthy</surname> <given-names>K. G.</given-names></name> <name><surname>Rangwala</surname> <given-names>H. S.</given-names></name> <etal/></person-group>. (<year>2016</year>). <article-title>Real-time classification of hand motions using ultrasound imaging of forearm muscles</article-title>. <source>IEEE Trans. Biomed. Eng.</source> <volume>63</volume>, <fpage>1687</fpage>&#x02013;<lpage>1698</lpage>. <pub-id pub-id-type="doi">10.1109/TBME.2015.2498124</pub-id><pub-id pub-id-type="pmid">26560865</pub-id></citation></ref>
<ref id="B2">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Atzori</surname> <given-names>M.</given-names></name> <name><surname>Gijsberts</surname> <given-names>A.</given-names></name> <name><surname>Castellini</surname> <given-names>C.</given-names></name> <name><surname>Caputo</surname> <given-names>B.</given-names></name> <name><surname>Hager</surname> <given-names>A.-G. M.</given-names></name> <name><surname>Elsig</surname> <given-names>S.</given-names></name> <etal/></person-group>. (<year>2014</year>). <article-title>Electromyography data for non-invasive naturally-controlled robotic hand prostheses</article-title>. <source>Sci. Data</source> <volume>1</volume>:<fpage>140053</fpage>. <pub-id pub-id-type="doi">10.1038/sdata.2014.53</pub-id><pub-id pub-id-type="pmid">25977804</pub-id></citation></ref>
<ref id="B3">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Castellini</surname> <given-names>C.</given-names></name> <name><surname>Koiva</surname> <given-names>R.</given-names></name></person-group> (<year>2013</year>). <article-title>Using a high spatial resolution tactile sensor for intention detection</article-title>, in <source>2013 IEEE 13th International Conference on Rehabilitation Robotics (ICORR)</source> (<publisher-loc>Seattle, WA</publisher-loc>: <publisher-name>IEEE</publisher-name>), <fpage>7</fpage>.</citation></ref>
<ref id="B4">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cho</surname> <given-names>E.</given-names></name> <name><surname>Chen</surname> <given-names>R.</given-names></name> <name><surname>Merhi</surname> <given-names>L.-k.</given-names></name> <name><surname>Xiao</surname> <given-names>Z. G.</given-names></name> <name><surname>Pousett</surname> <given-names>B.</given-names></name></person-group> (<year>2016</year>). <article-title>Force myography to control robotic upper extremity prostheses: a feasibility study</article-title>. <source>Front. Bioeng. Biotechnol</source>. <volume>4</volume>:<fpage>18</fpage>. <pub-id pub-id-type="doi">10.3389/fbioe.2016.00018</pub-id><pub-id pub-id-type="pmid">27014682</pub-id></citation></ref>
<ref id="B5">
<citation citation-type="other"><person-group person-group-type="author"><name><surname>Cutts</surname> <given-names>A.</given-names></name> <name><surname>McN. Alexander</surname> <given-names>R.</given-names></name> <name><surname>Ker</surname> <given-names>R. F.</given-names></name></person-group> (<year>1991</year>). <source>Ratios of Cross-Sectional Areas of Muscles and Their Tendons in a Healthy Human Forearm</source>. Technical report. <pub-id pub-id-type="pmid">1917668</pub-id></citation></ref>
<ref id="B6">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ding</surname> <given-names>H.</given-names></name> <name><surname>He</surname> <given-names>Q.</given-names></name> <name><surname>Zeng</surname> <given-names>L.</given-names></name> <name><surname>Zhou</surname> <given-names>Y.</given-names></name> <name><surname>Shen</surname> <given-names>M.</given-names></name> <name><surname>Dan</surname> <given-names>G.</given-names></name></person-group> (<year>2017a</year>). <article-title>Motion intent recognition of individual fingers based on mechanomyogram</article-title>. <source>Patt. Recogn. Lett.</source> <volume>88</volume>, <fpage>41</fpage>&#x02013;<lpage>48</lpage>. <pub-id pub-id-type="doi">10.1016/j.patrec.2017.01.012</pub-id></citation></ref>
<ref id="B7">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ding</surname> <given-names>H.</given-names></name> <name><surname>He</surname> <given-names>Q.</given-names></name> <name><surname>Zhou</surname> <given-names>Y.</given-names></name> <name><surname>Dan</surname> <given-names>G.</given-names></name> <name><surname>Cui</surname> <given-names>S.</given-names></name></person-group> (<year>2017b</year>). <article-title>An individual finger gesture recognition system based on motion-intent analysis using mechanomyogram signal</article-title>. <source>Front. Neurol.</source> <volume>8</volume>:<fpage>573</fpage>. <pub-id pub-id-type="doi">10.3389/fneur.2017.00573</pub-id><pub-id pub-id-type="pmid">29167655</pub-id></citation></ref>
<ref id="B8">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Farina</surname> <given-names>D.</given-names></name> <name><surname>Jiang</surname> <given-names>N.</given-names></name> <name><surname>Rehbaum</surname> <given-names>H.</given-names></name> <name><surname>Holobar</surname> <given-names>A.</given-names></name> <name><surname>Graimann</surname> <given-names>B.</given-names></name> <name><surname>Dietl</surname> <given-names>H.</given-names></name> <etal/></person-group>. (<year>2014</year>). <article-title>The extraction of neural information from the surface EMG for the control of upper-limb prostheses: emerging avenues and challenges</article-title>. <source>IEEE Trans. Neural Syst. Rehabil. Eng</source>. <volume>22</volume>, <fpage>797</fpage>&#x02013;<lpage>809</lpage>. <pub-id pub-id-type="doi">10.1109/TNSRE.2014.2305111</pub-id><pub-id pub-id-type="pmid">24760934</pub-id></citation></ref>
<ref id="B9">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Feix</surname> <given-names>T.</given-names></name> <name><surname>Romero</surname> <given-names>J.</given-names></name> <name><surname>Schmiedmayer</surname> <given-names>H. B.</given-names></name> <name><surname>Dollar</surname> <given-names>A. M.</given-names></name> <name><surname>Kragic</surname> <given-names>D.</given-names></name></person-group> (<year>2016</year>). <article-title>The GRASP taxonomy of human grasp types</article-title>. <source>IEEE Trans. Hum. Mach. Syst</source>. <volume>46</volume>, <fpage>66</fpage>&#x02013;<lpage>77</lpage>. <pub-id pub-id-type="doi">10.1109/THMS.2015.2470657</pub-id></citation></ref>
<ref id="B10">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Guo</surname> <given-names>W.</given-names></name> <name><surname>Sheng</surname> <given-names>X.</given-names></name> <name><surname>Zhang</surname> <given-names>D.</given-names></name> <name><surname>Zhu</surname> <given-names>X.</given-names></name></person-group> (<year>2015</year>). <article-title>Development of a hybrid surface EMG and MMG acquisition system for human hand motion analysis</article-title>, in <source>Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)</source> (<publisher-loc>Portsmouth</publisher-loc>).</citation></ref>
<ref id="B11">
<citation citation-type="web"><person-group person-group-type="author"><collab>IBM</collab></person-group> (<year>2019</year>). <source>IBM SPSS Software | IBM</source>.</citation></ref>
<ref id="B12">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Islam</surname> <given-names>M. A.</given-names></name> <name><surname>Sundaraj</surname> <given-names>K.</given-names></name> <name><surname>Ahmad</surname> <given-names>R. B.</given-names></name> <name><surname>Ahamed</surname> <given-names>N. U.</given-names></name></person-group> (<year>2013</year>). <article-title>Mechanomyogram for muscle function assessment: a review</article-title>. <source>PLoS ONE</source> <volume>8</volume>:<fpage>e58902</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0058902</pub-id><pub-id pub-id-type="pmid">23536834</pub-id></citation></ref>
<ref id="B13">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kiguchi</surname> <given-names>K.</given-names></name> <name><surname>Hayashi</surname> <given-names>Y.</given-names></name></person-group> (<year>2012</year>). <article-title>An EMG-based control for an upper-limb power-assist exoskeleton robot</article-title>. <source>IEEE Trans. Syst. Man Cybern. B (Cybernet)</source> <volume>42</volume>, <fpage>1064</fpage>&#x02013;<lpage>1071</lpage>. <pub-id pub-id-type="doi">10.1109/TSMCB.2012.2185843</pub-id><pub-id pub-id-type="pmid">22334026</pub-id></citation></ref>
<ref id="B14">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kinoshita</surname> <given-names>H.</given-names></name> <name><surname>Kawai</surname> <given-names>S.</given-names></name> <name><surname>Ikuta</surname> <given-names>K.</given-names></name></person-group> (<year>1995</year>). <article-title>Contributions and co-ordination of individual fingers in multiple finger prehension</article-title>. <source>Ergonomics</source> <volume>38</volume>, <fpage>1212</fpage>&#x02013;<lpage>1230</lpage>. <pub-id pub-id-type="doi">10.1080/00140139508925183</pub-id><pub-id pub-id-type="pmid">7758447</pub-id></citation></ref>
<ref id="B15">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kinoshita</surname> <given-names>H.</given-names></name> <name><surname>Murase</surname> <given-names>T.</given-names></name> <name><surname>Bandou</surname> <given-names>T.</given-names></name> <name><surname>Banmu</surname> <given-names>T.</given-names></name></person-group> (<year>1996</year>). <article-title>Grip posture and forces during holding cylindrical objects with circular grips</article-title>. <source>Ergonomics</source> <volume>39</volume>, <fpage>1163</fpage>&#x02013;<lpage>1176</lpage>. <pub-id pub-id-type="doi">10.1080/00140139608964536</pub-id><pub-id pub-id-type="pmid">8681936</pub-id></citation></ref>
<ref id="B16">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Koiva</surname> <given-names>R.</given-names></name> <name><surname>Riedenklau</surname> <given-names>E.</given-names></name> <name><surname>Viegas</surname> <given-names>C.</given-names></name> <name><surname>Castellini</surname> <given-names>C.</given-names></name></person-group> (<year>2015</year>). <article-title>Shape conformable high spatial resolution tactile bracelet for detecting hand and wrist activity</article-title>, in <source>2015 IEEE International Conference on Rehabilitation Robotics (ICORR)</source> (<publisher-loc>Singapore</publisher-loc>: <publisher-name>IEEE</publisher-name>).</citation></ref>
<ref id="B17">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Li</surname> <given-names>X.</given-names></name> <name><surname>Fang</surname> <given-names>P.</given-names></name> <name><surname>Tian</surname> <given-names>L.</given-names></name> <name><surname>Li</surname> <given-names>G.</given-names></name></person-group> (<year>2017</year>). <article-title>Increasing the robustness against force variation in EMG motion classification by common spatial patterns</article-title>, in <source>Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS</source> (<publisher-loc>Seogwipo</publisher-loc>), <fpage>406</fpage>&#x02013;<lpage>409</lpage>.</citation></ref>
<ref id="B18">
<citation citation-type="web"><person-group person-group-type="author"><collab>MathWorks</collab></person-group> (<year>2018</year>). <source>kstest2 MATLAB</source>.</citation></ref>
<ref id="B19">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>McGinley</surname> <given-names>J. C.</given-names></name> <name><surname>Kozin</surname> <given-names>S. H.</given-names></name></person-group> (<year>2001</year>). <article-title>Interosseous membrane anatomy and functional mechanics</article-title>. <source>Clin. Orthopaed. Relat. Res.</source> <volume>383</volume>, <fpage>108</fpage>&#x02013;<lpage>122</lpage>. <pub-id pub-id-type="doi">10.1097/00003086-200102000-00013</pub-id></citation></ref>
<ref id="B20">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>McHugh</surname> <given-names>G.</given-names></name> <name><surname>Swain</surname> <given-names>I. D.</given-names></name></person-group> (<year>2013</year>). <article-title>A comparison between reported and ideal patient-to-therapist ratios for stroke rehabilitation</article-title>. <source>Health</source> <volume>5</volume>, <fpage>105</fpage>&#x02013;<lpage>112</lpage>. <pub-id pub-id-type="doi">10.4236/health.2013.56A2016</pub-id></citation></ref>
<ref id="B21">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>McIntosh</surname> <given-names>J.</given-names></name> <name><surname>McNeill</surname> <given-names>C.</given-names></name> <name><surname>Fraser</surname> <given-names>M.</given-names></name> <name><surname>Kerber</surname> <given-names>F.</given-names></name> <name><surname>L&#x000F6;chtefeld</surname> <given-names>M.</given-names></name> <name><surname>Kr&#x000FC;ger</surname> <given-names>A.</given-names></name></person-group> (<year>2016</year>). <article-title>EMPress: practical hand gesture classification with wrist-mounted EMG and pressure sensing</article-title>, in <source>34th Annual Chi Conference on Human Factors in Computing Systems, Chi 2016</source> (<publisher-loc>San Jose, CA</publisher-loc>), <fpage>2332</fpage>&#x02013;<lpage>2342</lpage>. <pub-id pub-id-type="doi">10.1145/2858036.2858093</pub-id></citation></ref>
<ref id="B22">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Moromugi</surname> <given-names>S.</given-names></name> <name><surname>Koujina</surname> <given-names>Y.</given-names></name> <name><surname>Ariki</surname> <given-names>S.</given-names></name> <name><surname>Okamoto</surname> <given-names>A.</given-names></name> <name><surname>Tanaka</surname> <given-names>T.</given-names></name> <name><surname>Feng</surname> <given-names>M. Q.</given-names></name> <etal/></person-group>. (<year>2004</year>). <article-title>Muscle stiffness sensor to control an assistance device for the disabled</article-title>. <source>Artif. Life Robot</source>. <volume>8</volume>, <fpage>42</fpage>&#x02013;<lpage>45</lpage>. <pub-id pub-id-type="doi">10.1007/s10015-004-0286-8</pub-id></citation></ref>
<ref id="B23">
<citation citation-type="journal"><person-group person-group-type="author"><collab>NCD Risk Factor Collaboration (NCD-RisC)</collab> <name><surname>Adair</surname> <given-names>L.</given-names></name> <name><surname>Fall</surname> <given-names>C.</given-names></name> <name><surname>Osmond</surname> <given-names>C.</given-names></name> <name><surname>Stein</surname> <given-names>A.</given-names></name> <name><surname>Martorell</surname> <given-names>R.</given-names></name> <etal/></person-group>. (<year>2016</year>). <article-title>A century of trends in adult human height</article-title>. <source>eLife</source> <volume>5</volume>, <fpage>525</fpage>&#x02013;<lpage>534</lpage>. <pub-id pub-id-type="doi">10.7554/eLife.13410</pub-id></citation></ref>
<ref id="B24">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Nordin</surname> <given-names>M.</given-names></name> <name><surname>Frankel</surname> <given-names>V. H.</given-names></name></person-group> (<year>2001</year>). <source>Basic Biomechanics of the Musculoskeletal System</source>. <publisher-loc>Philadelphia, PA</publisher-loc>: <publisher-name>Lippincott Williams &#x00026; Wilkins</publisher-name>.</citation></ref>
<ref id="B25">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Phillips</surname> <given-names>S. L.</given-names></name> <name><surname>Craelius</surname> <given-names>W.</given-names></name></person-group> (<year>2017</year>). <article-title>Residual kinetic imaging: a versatile interface for prosthetic control</article-title>. <source>Robotica</source> <volume>23</volume>, <fpage>277</fpage>&#x02013;<lpage>282</lpage>. <pub-id pub-id-type="doi">10.1017/S0263574704001298</pub-id></citation></ref>
<ref id="B26">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Radwin</surname> <given-names>R. G.</given-names></name> <name><surname>Oh</surname> <given-names>S.</given-names></name> <name><surname>Jensen</surname> <given-names>T. R.</given-names></name> <name><surname>Webster</surname> <given-names>J. G.</given-names></name></person-group> (<year>1992</year>). <article-title>External finger forces in submaximal five-finger static pinch prehension</article-title>. <source>Ergonomics</source> <volume>35</volume>, <fpage>275</fpage>&#x02013;<lpage>288</lpage>. <pub-id pub-id-type="doi">10.1080/00140139208967813</pub-id><pub-id pub-id-type="pmid">1572337</pub-id></citation></ref>
<ref id="B27">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ravindra</surname> <given-names>V.</given-names></name> <name><surname>Castellini</surname> <given-names>C.</given-names></name></person-group> (<year>2014</year>). <article-title>A comparative analysis of three non-invasive human-machine interfaces for the disabled</article-title>. <source>Front. Neurorobot.</source> <volume>8</volume>:<fpage>24</fpage>. <pub-id pub-id-type="doi">10.3389/fnbot.2014.00024</pub-id><pub-id pub-id-type="pmid">25386135</pub-id></citation></ref>
<ref id="B28">
<citation citation-type="journal"><person-group person-group-type="author"><collab>Rehab-Robotics Company Limited</collab></person-group> (<year>2017</year>). <collab>Hand of Hope</collab>.</citation></ref>
<ref id="B29">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Skahen</surname> <given-names>J. R.</given-names></name> <name><surname>Palmer</surname> <given-names>A. K.</given-names></name> <name><surname>Werner</surname> <given-names>F. W.</given-names></name> <name><surname>Fortino</surname> <given-names>M. D.</given-names></name></person-group> (<year>1997</year>). <article-title>The interosseous membrane of the forearm: anatomy and function</article-title>. <source>J. Hand Surg.</source> <volume>22</volume>, <fpage>981</fpage>&#x02013;<lpage>985</lpage>. <pub-id pub-id-type="doi">10.1016/S0363-5023(97)80036-6</pub-id><pub-id pub-id-type="pmid">9471064</pub-id></citation></ref>
<ref id="B30">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Stefanou</surname> <given-names>T.</given-names></name> <name><surname>Chance</surname> <given-names>G.</given-names></name> <name><surname>Assaf</surname> <given-names>T.</given-names></name> <name><surname>Dogramadzi</surname> <given-names>S.</given-names></name></person-group> (<year>2018a</year>). <article-title>Response times of a tactile motion intent recognition system</article-title>, in <source>The Hamlyn Symposium on Medical Robotics Proceedings</source> (<publisher-loc>London, UK</publisher-loc>: <publisher-name>Hamlyn Centre</publisher-name>), <fpage>131</fpage>&#x02013;<lpage>132</lpage>.</citation></ref>
<ref id="B31">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Stefanou</surname> <given-names>T.</given-names></name> <name><surname>Chance</surname> <given-names>G.</given-names></name> <name><surname>Assaf</surname> <given-names>T.</given-names></name> <name><surname>Lenz</surname> <given-names>A.</given-names></name> <name><surname>Dogramadzi</surname> <given-names>S.</given-names></name></person-group> (<year>2018b</year>). <article-title>Wearable tactile sensor brace for motion intent recognition in upper-limb rehabilitation</article-title>, in <source>Proceedings of the IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics</source> (<publisher-loc>Twente</publisher-loc>).</citation></ref>
<ref id="B32">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Tarata</surname> <given-names>M.</given-names></name></person-group> (<year>2009</year>). <article-title>The electromyogram and mechanomyogram in monitoring neuromuscular fatigue: techniques, results, potential use within the dynamic effort</article-title>, in <source>Proceedings of the 7th International Conference Measurement</source> (<publisher-loc>Smolenice</publisher-loc>), <fpage>67</fpage>&#x02013;<lpage>77</lpage>.</citation></ref>
<ref id="B33">
<citation citation-type="other"><person-group person-group-type="author"><collab>Technavio</collab></person-group> (<year>2018</year>). <source>Global Electromyography (EMG) Devices Market 2018-2022</source>. Technical report.</citation></ref>
<ref id="B34">
<citation citation-type="web"><person-group person-group-type="author"><collab>Thalmic Labs</collab></person-group> (<year>2015a</year>). <source>How to Wear the Myo Armband &#x02013; Welcome to Myo Support</source>. Available online at: <ext-link ext-link-type="uri" xlink:href="https://support.getmyo.com/hc/en-us/articles/201169525-How-to-wear-the-Myo-armband">https://support.getmyo.com/hc/en-us/articles/201169525-How-to-wear-the-Myo-armband</ext-link></citation></ref>
<ref id="B35">
<citation citation-type="web"><person-group person-group-type="author"><collab>Thalmic Labs</collab></person-group> (<year>2015b</year>). <source>Myo Gesture Control Armband | Wearable Technology</source>. Available online at: <ext-link ext-link-type="uri" xlink:href="https://www.myo.com/">https://www.myo.com/</ext-link></citation></ref>
<ref id="B36">
<citation citation-type="web"><person-group person-group-type="author"><collab>Thalmic Labs</collab></person-group> (<year>2015c</year>). <source>Thalmic Labs - Makers of Myo Gesture Control Armband</source>. Available online at: <ext-link ext-link-type="uri" xlink:href="https://developer.thalmic.com/downloads">https://developer.thalmic.com/downloads</ext-link></citation></ref>
<ref id="B37">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Warraich</surname> <given-names>Z.</given-names></name> <name><surname>Kleim</surname> <given-names>J. A.</given-names></name></person-group> (<year>2010</year>). <article-title>Neural plasticity: the biological substrate for neurorehabilitation</article-title>. <source>PM and R</source> <volume>2</volume>(<supplement>12 Suppl. 2</supplement>):<fpage>S208</fpage>&#x02013;<lpage>S219</lpage>. <pub-id pub-id-type="doi">10.1016/j.pmrj.2010.10.016</pub-id><pub-id pub-id-type="pmid">21172683</pub-id></citation></ref>
<ref id="B38">
<citation citation-type="web"><person-group person-group-type="author"><collab>Wearable Devices Ltd</collab></person-group>. (<year>2019</year>). <source>Wearable Devices - Mudra Inspire</source>. Available online at: <ext-link ext-link-type="uri" xlink:href="http://wearabledevices.co.il/">http://wearabledevices.co.il/</ext-link> (accessed July 22, 2019).</citation></ref>
<ref id="B39">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wininger</surname> <given-names>M.</given-names></name> <name><surname>Kim</surname> <given-names>N.-H.</given-names></name> <name><surname>Craelius</surname> <given-names>W.</given-names></name></person-group> (<year>2008</year>). <article-title>Pressure signature of forearm as predictor of grip force</article-title>. <source>J. Rehabil. Res. Dev.</source> <volume>45</volume>, <fpage>883</fpage>&#x02013;<lpage>892</lpage>. <pub-id pub-id-type="doi">10.1682/JRRD.2007.11.0187</pub-id><pub-id pub-id-type="pmid">19009474</pub-id></citation></ref>
<ref id="B40">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Wolf</surname> <given-names>M. T.</given-names></name> <name><surname>Assad</surname> <given-names>C.</given-names></name> <name><surname>Stoica</surname> <given-names>A.</given-names></name> <name><surname>You</surname> <given-names>K.</given-names></name> <name><surname>Jethani</surname> <given-names>H.</given-names></name> <name><surname>Vernacchia</surname> <given-names>M. T.</given-names></name> <etal/></person-group>. (<year>2013</year>). <article-title>Decoding static and dynamic arm and hand gestures from the JPL BioSleeve</article-title>, in <source>IEEE Aerospace Conference Proceedings</source> (<publisher-loc>Big Sky, MT</publisher-loc>).</citation></ref>
<ref id="B41">
<citation citation-type="book"><person-group person-group-type="author"><collab>World Health Organisation</collab></person-group> (<year>2014</year>). <source>Ageing and Life-Course</source>. <publisher-loc>Geneva</publisher-loc>: <publisher-name>World Health Organisation</publisher-name>.</citation></ref>
<ref id="B42">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Xiao</surname> <given-names>Z. G.</given-names></name> <name><surname>Elnady</surname> <given-names>A. M.</given-names></name> <name><surname>Menon</surname> <given-names>C.</given-names></name></person-group> (<year>2014</year>). <article-title>Control an exoskeleton for forearm rotation using FMG</article-title>, in <source>5th IEEE RAS/EMBS International Conference on Biomedical Robotics and Biomechatronics</source> (<publisher-name>IEEE</publisher-name>), <fpage>591</fpage>&#x02013;<lpage>596</lpage>.</citation></ref>
<ref id="B43">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Xiao</surname> <given-names>Z. G.</given-names></name> <name><surname>Menon</surname> <given-names>C.</given-names></name></person-group> (<year>2017</year>). <article-title>Counting grasping action using force myography: an exploratory study with healthy individuals</article-title>. <source>JMIR Rehabil. Assist. Technol</source>. <volume>4</volume>:<fpage>e5</fpage>. <pub-id pub-id-type="doi">10.2196/rehab.6901</pub-id><pub-id pub-id-type="pmid">28582263</pub-id></citation></ref>
<ref id="B44">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Yap</surname> <given-names>H. K.</given-names></name> <name><surname>Mao</surname> <given-names>A.</given-names></name> <name><surname>Goh</surname> <given-names>J. C.</given-names></name> <name><surname>Yeow</surname> <given-names>C. H.</given-names></name></person-group> (<year>2016</year>). <article-title>Design of a wearable FMG sensing system for user intent detection during hand rehabilitation with a soft robotic glove</article-title>, in <source>Proceedings of the IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics</source> (<publisher-loc>Singapore</publisher-loc>).</citation></ref>
<ref id="B45">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yu</surname> <given-names>Z.</given-names></name> <name><surname>Lee</surname> <given-names>M.</given-names></name></person-group> (<year>2015</year>). <article-title>Human motion based intent recognition using a deep dynamic neural model</article-title>. <source>Robot. Auton. Syst</source>. <volume>71</volume>, <fpage>134</fpage>&#x02013;<lpage>149</lpage>. <pub-id pub-id-type="doi">10.1016/j.robot.2015.01.001</pub-id></citation></ref>
</ref-list>
<fn-group>
<fn id="fn0001"><p><sup>1</sup>The maximum strength and the individual fingers&#x00027; force contributions for <italic>Grip3</italic>, a prismatic two-finger grip (grip No. 8 in <xref ref-type="bibr" rid="B9">Feix et al., 2016</xref>), were not available; the values for the tripod grip (grip No. 14) from Kinoshita&#x00027;s study (Kinoshita et al., <xref ref-type="bibr" rid="B15">1996</xref>) were therefore used.</p></fn>
</fn-group>
<fn-group>
<fn fn-type="financial-disclosure"><p><bold>Funding.</bold> This research was supported by the Engineering and Physical Sciences Research Council (EPSRC), UK.</p></fn>
</fn-group>
</back>
</article>