Real time decoding of hand grasping signals from macaque premotor and parietal cortex
Institute of Neuroinformatics, ETHZ, Switzerland
A brain-machine interface (BMI) for visually guided grasping would provide significant benefits to paralyzed patients, given the crucial role these movements play in everyday life. We are developing a BMI to decode grasp shape online in macaque monkeys. Neural activity is evaluated using chronically implanted electrodes in the anterior intraparietal area (AIP) and ventral premotor cortex (F5), areas known to be involved in the transformation of visual signals into hand grasping instructions.

Macaque monkeys were trained in a delayed grasping task: they first placed their hands at rest and fixated a red LED before a grasping handle was presented in one of 5 different orientations, and the color of an additional LED instructed the animals to grasp the handle with either a power grip or a precision grip. After a short delay the fixation LED dimmed, instructing the monkey to perform the required grasp. Correct trials were rewarded with a small amount of juice.

After successful training, 5 floating micro-electrode arrays (FMA; MicroProbe Inc.) were implanted in one animal: 2 in AIP and 3 in F5. Each array comprised 16 platinum-iridium electrodes (length 1.0-4.5 mm, spacing 0.5 mm). This configuration was chosen to facilitate the recording of neuronal activity within cortical sulci rather than on the cortical surface. Neural signals were sampled using a Cerebus Neural Signal Processor (NSP; Cyberkinetics Inc., Foxborough, MA) and streamed to a dedicated decoding PC via UDP. Spike sorting was conducted manually online by setting time-amplitude discrimination windows. Decoding was implemented using maximum likelihood estimation (a sketch follows this abstract). Before commencing BMI experiments, we benchmarked decoder performance offline using a spike simulator tool capable of creating artificial Poisson-distributed spike trains as well as loading and replaying previous neuronal recordings (also sketched below). Both decoder and simulator were implemented in C++, incorporating the Neuroshare library for reading files, the Cerebus UDP network protocol, and a graphical user interface.

In grasp decoding trials, spike data were sampled during the planning phase and the planned grasp was decoded in real time. The decoded grasp was presented to the monkey visually during the grasp phase; if correct, the animal received a small juice reward. In initial experiments we decoded grasp type and grasp orientation (handle tilted to the left or right) with a mean accuracy of 91%, 6 conditions (2 grasp types and 3 orientations) with a mean accuracy of 72%, and the full 10 conditions with a mean accuracy of 49%.

Furthermore, in a preliminary analysis, we used local field potential (LFP) signals recorded from F5 to decode the behavioural state of the animal (baseline, cue period, planning, pre-movement execution) in a simulated offline decoding using Bayesian classification (see the classifier sketch below). LFP data recorded from a single electrode correctly classified the behavioural state with a mean accuracy of 83%; taking data from 10 F5 electrodes, the mean accuracy was 98%. Similar results were obtained for AIP.

These results are a proof of concept for a BMI for visually guided grasping. This BMI could be extended to a larger number of grip types and orientations, and to include real-time state decoding, as needed for prosthetic applications.
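A minimal sketch of how such a maximum likelihood decoder can be realised, assuming spike counts in the planning window are Poisson distributed and units are conditionally independent given the grasp condition. The abstract states only that maximum likelihood estimation was used, so the distributional model, function name, and data layout here are illustrative assumptions, not the original implementation:

#include <cmath>
#include <cstddef>
#include <limits>
#include <vector>

// lambda[c][u]: mean spike count of unit u under grasp condition c,
//               estimated from training trials (assumed available).
// counts[u]:    observed spike count of unit u in the current planning window.
// Returns the index of the condition with the highest log likelihood.
int decodeGrasp(const std::vector<std::vector<double> >& lambda,
                const std::vector<int>& counts)
{
    int best = -1;
    double bestLogL = -std::numeric_limits<double>::infinity();
    for (std::size_t c = 0; c < lambda.size(); ++c) {
        double logL = 0.0;
        for (std::size_t u = 0; u < counts.size(); ++u) {
            double rate = lambda[c][u] > 1e-6 ? lambda[c][u] : 1e-6; // guard log(0)
            // Log Poisson likelihood; the log(n!) term is constant across
            // conditions and can be dropped from the comparison.
            logL += counts[u] * std::log(rate) - rate;
        }
        if (logL > bestLogL) {
            bestLogL = logL;
            best = static_cast<int>(c);
        }
    }
    return best;
}

With 2 grip types and 5 orientations, lambda would simply hold 10 rows, one per condition, and decodeGrasp would be called once per trial at the end of the planning phase.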
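The core of the spike simulator, generating artificial Poisson-distributed spike trains, can be sketched as draws of exponentially distributed inter-spike intervals. This too is a sketch under our own assumptions; the original tool also loaded and replayed recorded files, which is omitted here:

#include <random>
#include <vector>

// Draw one artificial Poisson spike train: inter-spike intervals are
// exponentially distributed with mean 1/rateHz, so spike counts in any
// window are Poisson distributed. Returns spike times in seconds.
std::vector<double> poissonSpikeTrain(double rateHz, double durationSec,
                                      std::mt19937& rng)
{
    std::exponential_distribution<double> isi(rateHz);
    std::vector<double> spikeTimes;
    for (double t = isi(rng); t < durationSec; t += isi(rng))
        spikeTimes.push_back(t);
    return spikeTimes;
}

Calling this once per unit, with rates chosen per condition, yields the kind of artificial population activity suitable for benchmarking the decoder offline before live experiments.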
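For the LFP-based state decoding, one standard realisation of Bayesian classification is a Gaussian naive Bayes classifier over per-electrode features such as log band power. The abstract does not specify the features or model, so everything below is an illustrative assumption:

#include <cmath>
#include <cstddef>
#include <limits>
#include <vector>

// mu[s][f], var[s][f]: per-state mean and (positive) variance of feature f
// (e.g. log band power of one electrode), estimated from labelled training
// trials. feat[f]: features of the trial to classify. Flat prior over the
// behavioural states; returns the index of the most probable state.
int classifyState(const std::vector<std::vector<double> >& mu,
                  const std::vector<std::vector<double> >& var,
                  const std::vector<double>& feat)
{
    const double TWO_PI = 6.283185307179586;
    int best = -1;
    double bestLogP = -std::numeric_limits<double>::infinity();
    for (std::size_t s = 0; s < mu.size(); ++s) {
        double logP = 0.0;
        for (std::size_t f = 0; f < feat.size(); ++f) {
            double d = feat[f] - mu[s][f];
            // Log Gaussian density of feature f under state s.
            logP -= 0.5 * (d * d / var[s][f] + std::log(TWO_PI * var[s][f]));
        }
        if (logP > bestLogP) { bestLogP = logP; best = static_cast<int>(s); }
    }
    return best;
}

Moving from 1 to 10 electrodes simply extends the feature vector, which is consistent with the accuracy gain from 83% to 98% reported above.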
Conference:
Neuroinformatics 2008, Stockholm, Sweden, 7 Sep - 9 Sep, 2008.
Presentation Type:
Poster Presentation
Topic:
Brain Machine Interface
Citation:
Townsend B, Subasi E, Lehmann S and Scherberger H (2008). Real time decoding of hand grasping signals from macaque premotor and parietal cortex. Front. Neuroinform. Conference Abstract: Neuroinformatics 2008. doi: 10.3389/conf.neuro.11.2008.01.093
Copyright:
The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers.
They are made available through the Frontiers publishing platform as a service to conference organizers and presenters.
The copyright in the individual abstracts is owned by the author of each abstract or his/her employer unless otherwise stated.
Each abstract, as well as the collection of abstracts, are published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may thus be reproduced, translated, adapted and be the subject of derivative works provided the authors and Frontiers are attributed.
For Frontiers’ terms and conditions please see https://www.frontiersin.org/legal/terms-and-conditions.
Received:
28 Jul 2008;
Published Online:
28 Jul 2008.
* Correspondence:
Benjamin Townsend, Institute of Neuroinformatics, ETHZ, Zürich, Switzerland, brt@ini.phys.ethz.ch