Parag Patil, M.D., Ph.D. Grant Support

Advanced Neuro Targeting System for Deep Brain Stimulation Surgery

Patil P, Chestek C
Coulter Translational Research Partnership Program
9/1/2020 - 6/20/2022


NCS-FR - Elucidating the relationship between motor cortex neural firing rates and dexterous finger movement EMG for use in brain-computer interfaces

Patil P, Chestek C, Temmar H
National Science Foundation
9/1/2019 - 8/31/2023

Prosthetic hands controlled directly by the nervous system have been the subject of science fiction for decades and could lead to dramatic quality-of-life improvements for people with upper limb amputations or paralysis. The brain is the only known controller capable of moving a 5-fingered robot with high precision to use a wide variety of tools and objects. While much is known about the control signals in the brain and about the movement of the fingers, we do not have a good understanding of how one gives rise to the other. Here, we will generate a large dataset, which will be publicly distributed to students and other scientists everywhere. It will include many channels of brain activity recorded simultaneously with muscle activity to help work out the transformation between the two, and we will attempt to replicate this control system using artificial neural networks. A strong demonstration of brain-controlled prostheses could lead to human studies and to a clinical system that improves the quality of life for hundreds of thousands of people with amputations or paralysis, as well as generating insights for smarter robotic systems. Beyond the direct output of the research, brain-machine interfaces can inspire a large number of students, including those from underrepresented groups, to pursue careers in science and technology by showing a young audience clearly how this kind of education can help people.

This project proposes to simultaneously record brain activity, muscle activity, and hand kinematics in the primate during complex finger movements, in order to replicate this control system. Specifically, the objective of this application is to establish the first such dataset in a nonhuman primate, recording 200 channels from motor cortex, 12 channels of EMG from the muscles, and precise kinematics during the acquisition of finger targets across 4 degrees of freedom. Our central hypothesis is that firing rates can be transformed to EMG using a single-layer neural nonlinearity followed by a regularized linear regression, and then transformed into finger kinematics through nonlinear but well-characterized anatomy. This differs from upper limb signals, which can appear to be linearly modulated by endpoint velocity regardless of posture. We will complete this project with three objectives. In Objective 1, we will establish a world-class surgical team to create a nonhuman primate animal model with simultaneous chronic brain and EMG recording. In Objective 2, we will develop an algorithmic approach to map motor cortex firing rates to EMG as well as to kinematics, with both offline and online testing. In Objective 3, we will explore low-power circuitry to extract this information in real time, at a power consumption appropriate for an implantable medical device. The overall scientific philosophy of this project is that the brain provides an example of a low-power neural network, relatively shallow between motor cortex and EMG, for controlling a complex soft robotic system. Uncovering this relationship will enable us to use this approach in brain-machine interfaces for paralysis, as well as to guide future human-made robotic systems.
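
As a rough illustration of the hypothesized mapping (not the project's actual pipeline), the following Python sketch applies a pointwise nonlinearity to firing rates and fits a ridge (regularized linear) regression to predict EMG. The synthetic data, the softplus nonlinearity, and the regularization strength are illustrative assumptions; only the channel counts (200 cortical channels, 12 EMG channels) come from the abstract.

    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_samples, n_neural, n_emg = 5000, 200, 12

    # Placeholder data standing in for binned firing rates and rectified,
    # smoothed EMG; a real analysis would load the recorded dataset instead.
    firing_rates = rng.poisson(lam=5.0, size=(n_samples, n_neural)).astype(float)
    mixing = rng.normal(size=(n_neural, n_emg)) * 0.01

    def pointwise_nonlinearity(x):
        # Single-layer pointwise nonlinearity; softplus is an illustrative choice.
        return np.log1p(np.exp(x - 5.0))

    emg = pointwise_nonlinearity(firing_rates) @ mixing
    emg += rng.normal(scale=0.05, size=emg.shape)  # measurement noise

    # Transform firing rates, then fit a regularized linear map to EMG.
    X = pointwise_nonlinearity(firing_rates)
    X_train, X_test, y_train, y_test = train_test_split(
        X, emg, test_size=0.2, random_state=0)

    model = Ridge(alpha=1.0)  # regularized linear regression
    model.fit(X_train, y_train)
    print("held-out R^2 (illustrative):", round(model.score(X_test, y_test), 3))

In the actual experiments, the inputs would be binned motor cortex firing rates and the targets the simultaneously recorded EMG envelopes; the ridge penalty is one way to stabilize the regression across hundreds of correlated cortical channels.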