Hey! I'm Jane, a professional master's student in biomedical engineering at Poly

Controlling machines with imagination

Introduction:

A variety of movement types can be decoded from brain signals recorded during movement execution, such as wrist flexion and extension, grasping, or individual finger movements (Volkova et al., 2019). These decoded signals can then be used to control external devices, such as a screen cursor, a mouse, or a prosthetic limb. Certain populations, such as paralyzed people and amputees, could benefit greatly from controlling external devices this way. However, since they cannot execute movements, the brain signals associated with movement execution are not available, and other ways of controlling the device are needed. Fortunately, studies have shown that motor imagery (imagining a movement) and motor execution (actually performing it) share neural mechanisms, activating similar brain regions (Guillot et al., 2009).

Hence, the question is: can we decode movement types from the brain signals of imagined movement?

There are many ways of recording brain signals. The least invasive, and in fact not invasive at all, is EEG (electroencephalography); however, because the electrodes sit on the scalp, the signal is distorted by the scalp and skull. More invasive techniques include ECoG (electrocorticography), in which the electrodes are placed directly on the surface of the cortex, and which has been shown to yield better signal quality.

The question then becomes: can we decode movement types from ECoG signals of imagined movement?

A group of students laid the foundations for answering this question during an online computational neuroscience course, [Neuromatch](https://compneuro.neuromatch.io/). Among other things, they developed a classifier that decodes movement types from ECoG signals of both imagined and executed movements. This classifier will be the foundation of my project.
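To make this starting point concrete, here is a minimal sketch of the kind of decoding pipeline such a classifier could use. It is not the Neuromatch group's actual classifier: the high-gamma frequency band, the array shapes, and the synthetic stand-in data are all assumptions for illustration.

```python
# Minimal decoding sketch (NOT the Neuromatch classifier): classify
# movement types from per-channel band-power features of ECoG trials.
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def band_power(trials, fs, band=(70.0, 100.0)):
    """Mean power per channel in a frequency band, via Welch's method."""
    freqs, psd = welch(trials, fs=fs, nperseg=min(256, trials.shape[-1]))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[..., mask].mean(axis=-1)  # -> (n_trials, n_channels)

# Synthetic stand-in for real ECoG data: assumed (hypothetical) layout
# is (n_trials, n_channels, n_samples) with integer movement-type labels.
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((40, 46, 1000))
y = rng.integers(0, 2, size=40)

X = band_power(X_raw, fs=1000)
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
print(cross_val_score(clf, X, y, cv=5).mean())
```

A linear model on band power is just a common, easily interpreted baseline for ECoG decoding; the notebook can swap in the course classifier and real trials without changing this overall structure.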

Main Objectives:

Data:

The raw data comes from a public source (Miller, 2019). The data used for my project was downloaded not from the original source (the raw data) but from a preprocessed version hosted for the Neuromatch Academy computational neuroscience course (https://osf.io/ksqv8).
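As a rough idea of what the data-loading step could look like in the notebook: the OSF address comes from the link above, while the `/download` suffix, the local file name, and the `"dat"` key are assumptions based on how Neuromatch datasets are typically packaged, not facts stated in this README.

```python
# Minimal loading sketch (assumptions marked in comments).
import os
import numpy as np
import requests

URL = "https://osf.io/ksqv8/download"  # OSF id from above; "/download" is assumed
FNAME = "motor_imagery.npz"            # hypothetical local file name

# Download once, then reuse the local copy.
if not os.path.isfile(FNAME):
    r = requests.get(URL, timeout=60)
    r.raise_for_status()
    with open(FNAME, "wb") as f:
        f.write(r.content)

# Assumed layout: a pickled array of per-subject dictionaries under "dat".
alldat = np.load(FNAME, allow_pickle=True)["dat"]
print(type(alldat))
```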

Deliverables:

A Jupyter notebook containing the data processing, classifiers, and data visualizations.

Tools:


Methods:


Results:


Conclusion:


References:

Volkova K, Lebedev MA, Kaplan A, Ossadtchi A. Decoding Movement From Electrocorticographic Activity: A Review. Front Neuroinform. 2019 Dec 3;13:74. doi: 10.3389/fninf.2019.00074. PMID: 31849632; PMCID: PMC6901702.
Miller KJ. A library of human electrocorticographic data and analyses. Nat Hum Behav. 2019 Nov;3(11):1225-1235. doi: 10.1038/s41562-019-0678-3. Epub 2019 Aug 26. PMID: 31451738.
Guillot A, Collet C, Nguyen VA, Malouin F, Richards C, Doyon J. Brain activity during visual versus kinesthetic imagery: an fMRI study. Hum Brain Mapp. 2009 Jul;30(7):2157-72. doi: 10.1002/hbm.20658. PMID: 18819106; PMCID: PMC6870928.