
Merge pull request #5 from brainhack-school2024/iss1
Iss1
janeabdo authored Jun 16, 2024
2 parents 5b1268b + fcd91ea commit 8d4216a
Showing 3 changed files with 62 additions and 0 deletions.
14 changes: 14 additions & 0 deletions Actual_Movement_Condition.html

Large diffs are not rendered by default.

14 changes: 14 additions & 0 deletions Imagined_Movement_Condition.html

Large diffs are not rendered by default.

34 changes: 34 additions & 0 deletions README.md
@@ -5,3 +5,37 @@
</a>

Hey! I'm Jane, a student in the professional master's program in biomedical engineering at Polytechnique Montréal. I'm super excited to start my project! :)

<h1> Controlling machines with imagination </h1>
<h3> <strong>Introduction:</strong> </h3>
A variety of movement types can be decoded from brain signals recorded during movement execution, e.g. wrist flexion and extension, grasping, finger movements… (<a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6901702/">Volkova et al., 2019</a>). The decoded signals can then be used to control external devices such as a screen cursor, a mouse, or a prosthetic limb. People with motor impairments, such as paralyzed people and amputees, could benefit greatly from such control. However, because they cannot execute movements, their brains do not produce the signals associated with movement execution, so other ways of driving the external device are needed. Fortunately, studies have shown that motor imagery (imagining a movement) and motor execution (performing it) share neural mechanisms, activating similar brain regions (<a href="https://pubmed.ncbi.nlm.nih.gov/18819106/">Guillot et al., 2009</a>).
<br> Hence, the question: can we decode movement types from brain signals recorded during imagined movement?
<br> There are many ways of recording brain signals, the least invasive of which (in fact, not invasive at all) is EEG (electro<strong>encephalo</strong>graphy). However, because the electrodes sit on the scalp, the signal is attenuated and distorted by the skull and scalp. A more invasive technique is ECoG (electro<strong>cortico</strong>graphy), in which the electrodes are placed directly on the surface of the cortex; it has been shown to yield better signal quality.
<br> The question thus becomes: can we decode movement types from ECoG signals recorded during imagined movement?
<br> A group of students laid the foundations for answering this question during an online computational neuroscience course called <a href="https://compneuro.neuromatch.io/">Neuromatch</a>. Among other things, they developed a classifier that decodes movement types from ECoG signals recorded during both imagined and executed movement. This classifier will be the foundation of my project.
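To give a concrete sense of the kind of pipeline involved, here is a minimal sketch of decoding a movement type from ECoG epochs using high-gamma band power and a linear classifier. All data shapes, names, and parameters below are illustrative placeholders, not the Neuromatch group's actual code.

```python
# Illustrative sketch only, not the Neuromatch group's actual classifier.
# Assumes X: ECoG epochs of shape (n_trials, n_channels, n_samples) at fs Hz,
# and y: movement-type labels (e.g. 0 = hand, 1 = tongue).
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def band_power(X, fs, band=(70, 100)):
    """Mean log high-gamma power per channel, a common ECoG motor feature."""
    freqs, psd = welch(X, fs=fs, nperseg=fs // 2, axis=-1)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.log(psd[..., mask].mean(axis=-1))  # -> (n_trials, n_channels)

fs = 1000                              # assumed sampling rate
X = np.random.randn(100, 46, 3000)     # placeholder ECoG epochs
y = np.random.randint(0, 2, size=100)  # placeholder labels

features = band_power(X, fs)
clf = LogisticRegression(max_iter=1000)
print(cross_val_score(clf, features, y, cv=5).mean())
```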
<h3> <strong>Main Objectives:</strong> </h3>
<ul>
<li>Demonstrate better classifier accuracy through improved data processing </li>
<li>Implement other classifiers and compare their performance (see the sketch after this list) </li>
<li>Apply acquired data visualization concepts to electrophysiological data </li>
</ul>
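As a rough illustration of the classifier-comparison objective, the hedged sketch below evaluates several candidate models under identical cross-validation; the feature matrix, labels, and model choices are all placeholders of my own, not a fixed plan.

```python
# Hedged sketch: compare candidate classifiers under the same cross-validation.
# The feature matrix and labels here are random placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
features = rng.normal(size=(100, 46))  # stand-in for real ECoG features
y = rng.integers(0, 2, size=100)       # stand-in movement-type labels

candidates = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "SVM (RBF kernel)": SVC(),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, clf in candidates.items():
    scores = cross_val_score(clf, features, y, cv=5)
    print(f"{name}: {scores.mean():.2f} +/- {scores.std():.2f}")
```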
<h3> <strong>Personal Objectives:</strong> </h3>
<ul>
<li>Develop an understanding of ECoG data (features, filtering, processing…)</li>
<li>Get familiar with GitHub/Git</li>
<li>Gain an understanding of open science concepts</li>
</ul>
<h3> <strong>Data:</strong> </h3>
The raw data come from a public library of human ECoG recordings (<a href="https://pubmed.ncbi.nlm.nih.gov/31451738/">Miller, 2019</a>). For this project, the data were downloaded not from the original (raw) source but from a preprocessed version hosted on OSF by the Neuromatch Academy computational neuroscience course (<a href="https://osf.io/ksqv8">https://osf.io/ksqv8</a>).
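For reference, a minimal sketch of fetching the preprocessed file is shown below. The <code>motor_imagery.npz</code> filename and the <code>["dat"]</code> loading key follow the Neuromatch materials; treat both as assumptions worth verifying against the course notebook.

```python
# Hedged sketch of fetching the preprocessed dataset from OSF.
# Filename and the ["dat"] key follow the Neuromatch materials (assumption).
import os

import numpy as np
import requests

fname = "motor_imagery.npz"
url = "https://osf.io/ksqv8/download"

if not os.path.isfile(fname):
    response = requests.get(url)
    response.raise_for_status()
    with open(fname, "wb") as f:
        f.write(response.content)

alldat = np.load(fname, allow_pickle=True)["dat"]
print(alldat.shape)  # subjects x experiments (real vs. imagined movement)
```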
<h3> <strong>Deliverables:</strong> </h3>
A Jupyter notebook containing the data processing, the classifiers, and the data visualizations.
<h3> <strong>Tools:</strong> </h3>
<h3> <strong>Methods:</strong> </h3>
<h3> <strong>Results:</strong> </h3>
<h3> <strong>Conclusion:</strong> </h3>
<h3> <strong>References:</strong> </h3>
Volkova K, Lebedev MA, Kaplan A, Ossadtchi A. Decoding Movement From Electrocorticographic Activity: A Review. Front Neuroinform. 2019 Dec 3;13:74. doi: 10.3389/fninf.2019.00074. PMID: 31849632; PMCID: PMC6901702.
<br>Miller KJ. A library of human electrocorticographic data and analyses. Nat Hum Behav. 2019 Nov;3(11):1225-1235. doi: 10.1038/s41562-019-0678-3. Epub 2019 Aug 26. PMID: 31451738.
<br>Guillot A, Collet C, Nguyen VA, Malouin F, Richards C, Doyon J. Brain activity during visual versus kinesthetic imagery: an fMRI study. Hum Brain Mapp. 2009 Jul;30(7):2157-72. doi: 10.1002/hbm.20658. PMID: 18819106; PMCID: PMC6870928.

