From fc10384240f3407bb20c1a84b501d95c4e14dc66 Mon Sep 17 00:00:00 2001 From: jane abdo Date: Fri, 14 Jun 2024 07:41:21 -0400 Subject: [PATCH 1/5] Add HTML file for Plotly visualizations --- Actual_Movement_Condition.html | 14 ++++++++++++++ Imagined_Movement_Condition.html | 14 ++++++++++++++ 2 files changed, 28 insertions(+) create mode 100644 Actual_Movement_Condition.html create mode 100644 Imagined_Movement_Condition.html diff --git a/Actual_Movement_Condition.html b/Actual_Movement_Condition.html new file mode 100644 index 0000000..21c830d --- /dev/null +++ b/Actual_Movement_Condition.html @@ -0,0 +1,14 @@ + + + +
+
+ + \ No newline at end of file diff --git a/Imagined_Movement_Condition.html b/Imagined_Movement_Condition.html new file mode 100644 index 0000000..cc1f743 --- /dev/null +++ b/Imagined_Movement_Condition.html @@ -0,0 +1,14 @@ + + + +
+
+ + \ No newline at end of file From bd7c932b1ea98337837113a58d66b8809cc3b229 Mon Sep 17 00:00:00 2001 From: jane abdo Date: Sat, 15 Jun 2024 21:54:31 -0400 Subject: [PATCH 2/5] Added project sections --- README.md | 27 +++++++++++++++++++++++++++ 1 file changed, 27 insertions(+) diff --git a/README.md b/README.md index cc9b3c0..8f1608d 100644 --- a/README.md +++ b/README.md @@ -5,3 +5,30 @@ Hey! I'm Jane, a professional master's in biomedical engineering student at Polytechnique Montréal. I'm super excited to start my project! :) + +

Controlling machines with imagination

+

Introduction:

+A variety of movement types can be decoded from brain signals during movement execution, e.g., wrist flexion and extension, grasping, or individual finger movements (Volkova et al., 2019). These decoded signals can then be used to control external devices such as a screen cursor, a mouse or a prosthetic limb. People with paralysis or limb loss could benefit greatly from such control, but since they lack the brain signals associated with executed movement, other ways of driving the external device are needed. Fortunately, studies have shown that motor imagery (imagining a movement) and motor execution share neural mechanisms, activating similar brain regions (Guillot et al., 2009).
+Hence, the question is: can we decode movement types from brain signals recorded during imagined movement?
+There are many ways of recording brain signals; the least invasive (in fact, entirely non-invasive) is EEG (electroencephalography). However, because the electrodes sit on the scalp, the signal is attenuated and distorted by the scalp and skull. A more invasive technique is ECoG (electrocorticography), in which the electrodes are placed directly on the surface of the cortex; it has been shown to yield better signal quality.
+The question therefore becomes: can we decode movement types from ECoG signals recorded during imagined movement?
+A group of students has laid the foundations for answering this question during an online computational neuroscience course (Neuromatch). Among other contributions, they developed a classifier to decode movement types from both imagined and executed movement ECoG signals. This classifier will be the foundation of my project. +

Main Objectives:

+Demonstrate better classifier accuracy through improved data processing +Implement other classifiers and compare their performance +Apply acquired data visualization concepts to electrophysiological data + +

Personal Objectives:

+Develop an understanding of ECoG data (features, filtering, processing…) +Get familiar with GitHub/Git +Gain an understanding of open science notions +

Data:

+The raw data comes from a public source (Miller KJ. A library of human electrocorticographic data and analyses. Nat Hum Behav. 2019 Nov;3(11):1225-1235. doi: 10.1038/s41562-019-0678-3. Epub 2019 Aug 26. PMID: 31451738.). The data used for my project was downloaded not from the original raw source but from a preprocessed version provided by Neuromatch Academy, an online computational neuroscience course (https://osf.io/ksqv8). +

Deliverables:

+Jupyter notebook containing data processing, classifiers and data visualization +Tools: +Methods: +Results: +Conclusion: + + From e41ccf35e830c0128fc16b1099473f507cc01919 Mon Sep 17 00:00:00 2001 From: jane abdo Date: Sat, 15 Jun 2024 22:02:47 -0400 Subject: [PATCH 3/5] modif --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index 8f1608d..fbfe8f8 100644 --- a/README.md +++ b/README.md @@ -12,7 +12,7 @@ A variety of movement types can be decoded from brain signals during movement ex Hence, the question is: Can we decode movement types based on brain signals from imagined movement? There are many ways of extracting brain signals, the least invasive of which (which is actually not invasive at all) is EEG (electroencephalography). However, because of the location of the electrodes on the scalp, the signal is distorted by the scalp and skull. Other more invasive techniques include EcoG (electrocorticography), in which the electrodes are placed on the surface of the cortex. This technique has been shown to yield better signal quality. The question becomes: Can we decode movement types based on EcoG brain signals from imagined movement? -A group of students have laid the foundations for answering this question during an online computational neuroscience course (Neuromatch). Amongst other creations, they have developed a classifier to decode movement types from imagined as well as executed movement EcoG signals. This classifier will be the foundation of my project. +A group of students have laid the foundations for answering this question during an online computational neuroscience course ([Neuromatch](https://compneuro.neuromatch.io/)). Amongst other creations, they have developed a classifier to decode movement types from imagined as well as executed movement EcoG signals. This classifier will be the foundation of my project.

Main Objectives:

Demonstrate better classifier accuracy through improved data processing Implement other classifiers and compare performances From 0227869127a156f67686c3ff73c5529585e127e8 Mon Sep 17 00:00:00 2001 From: jane abdo Date: Sat, 15 Jun 2024 22:11:51 -0400 Subject: [PATCH 4/5] modif --- README.md | 24 +++++++++++++----------- 1 file changed, 13 insertions(+), 11 deletions(-) diff --git a/README.md b/README.md index fbfe8f8..197a10e 100644 --- a/README.md +++ b/README.md @@ -9,19 +9,21 @@ Hey! I'm Jane, a professional master's in biomedical engineering student at Poly

Controlling machines with imagination

Introduction:

A variety of movement types can be decoded from brain signals during movement execution, ex: wrist flexion and extension, grabbing, finger moving… (Volkova et al, 2019). These decoded signals can then be used to control external devices, such as a screen cursor, a mouse or a prosthetic limb. Certain handicapped populations, like paralyzed and amputated people, could largely benefit from the control of external devices. As they do not have brain signals associated with the execution of movement, other ways of controlling the external device are needed. Fortunately, studies have shown that motor imagery (imagining executing movement) and motor control (executing movement) share neural mechanisms, by activating similar brain regions (Guillot et al., 2009). -Hence, the question is: Can we decode movement types based on brain signals from imagined movement? -There are many ways of extracting brain signals, the least invasive of which (which is actually not invasive at all) is EEG (electroencephalography). However, because of the location of the electrodes on the scalp, the signal is distorted by the scalp and skull. Other more invasive techniques include EcoG (electrocorticography), in which the electrodes are placed on the surface of the cortex. This technique has been shown to yield better signal quality. -The question becomes: Can we decode movement types based on EcoG brain signals from imagined movement? -A group of students have laid the foundations for answering this question during an online computational neuroscience course ([Neuromatch](https://compneuro.neuromatch.io/)). Amongst other creations, they have developed a classifier to decode movement types from imagined as well as executed movement EcoG signals. This classifier will be the foundation of my project. +
Hence, the question is: Can we decode movement types based on brain signals from imagined movement? +
There are many ways of extracting brain signals, the least invasive of which (which is actually not invasive at all) is EEG (electroencephalography). However, because of the location of the electrodes on the scalp, the signal is distorted by the scalp and skull. Other more invasive techniques include EcoG (electrocorticography), in which the electrodes are placed on the surface of the cortex. This technique has been shown to yield better signal quality. +
The question becomes: Can we decode movement types based on EcoG brain signals from imagined movement? +
A group of students have laid the foundations for answering this question during an online computational neuroscience course called [Neuromatch](https://compneuro.neuromatch.io/). Amongst other creations, they have developed a classifier to decode movement types from imagined as well as executed movement EcoG signals. This classifier will be the foundation of my project.

Main Objectives:

-Demonstrate better classifier accuracy through improved data processing -Implement other classifiers and compare performances -Apply acquired data visualization concepts on electrophysiological data - +
• Demonstrate better classifier accuracy through improved data processing
• Implement other classifiers and compare their performance
• Apply acquired data visualization concepts to electrophysiological data

Personal Objectives:

-Develop an understanding of EcoG data (features, filtering, processing…) -Get familiar with Github/Git -Gain an understanding of open science notions +
• Develop an understanding of ECoG data (features, filtering, processing…)
• Get familiar with GitHub/Git
• Gain an understanding of open science notions
Data:

    The raw data comes from a public source (Miller KJ. A library of human electrocorticographic data and analyses. Nat Hum Behav. 2019 Nov;3(11):1225-1235. doi: 10.1038/s41562-019-0678-3. Epub 2019 Aug 26. PMID: 31451738.). The data used for my project has been downloaded not from the original source (raw data), but from a preprocessed source coming from the Neuromatch Academy website for computational neuroscience (https://osf.io/ksqv8).

    Deliverables:

    From 77176f0fed8388e6a85c4a3db49c2abe1a38120b Mon Sep 17 00:00:00 2001 From: jane abdo Date: Sat, 15 Jun 2024 22:48:07 -0400 Subject: [PATCH 5/5] modif --- README.md | 25 +++++++++++++++---------- 1 file changed, 15 insertions(+), 10 deletions(-) diff --git a/README.md b/README.md index 197a10e..abc1edd 100644 --- a/README.md +++ b/README.md @@ -8,11 +8,11 @@ Hey! I'm Jane, a professional master's in biomedical engineering student at Poly

    Controlling machines with imagination

    Introduction:

    -A variety of movement types can be decoded from brain signals during movement execution, ex: wrist flexion and extension, grabbing, finger moving… (Volkova et al, 2019). These decoded signals can then be used to control external devices, such as a screen cursor, a mouse or a prosthetic limb. Certain handicapped populations, like paralyzed and amputated people, could largely benefit from the control of external devices. As they do not have brain signals associated with the execution of movement, other ways of controlling the external device are needed. Fortunately, studies have shown that motor imagery (imagining executing movement) and motor control (executing movement) share neural mechanisms, by activating similar brain regions (Guillot et al., 2009). -
    Hence, the question is: Can we decode movement types based on brain signals from imagined movement? -
    There are many ways of extracting brain signals, the least invasive of which (which is actually not invasive at all) is EEG (electroencephalography). However, because of the location of the electrodes on the scalp, the signal is distorted by the scalp and skull. Other more invasive techniques include EcoG (electrocorticography), in which the electrodes are placed on the surface of the cortex. This technique has been shown to yield better signal quality. -
    The question becomes: Can we decode movement types based on EcoG brain signals from imagined movement? -
A group of students have laid the foundations for answering this question during an online computational neuroscience course called [Neuromatch](https://compneuro.neuromatch.io/). Amongst other creations, they have developed a classifier to decode movement types from imagined as well as executed movement EcoG signals. This classifier will be the foundation of my project. +A variety of movement types can be decoded from brain signals during movement execution, e.g., wrist flexion and extension, grasping, or individual finger movements (Volkova et al., 2019). These decoded signals can then be used to control external devices such as a screen cursor, a mouse or a prosthetic limb. People with paralysis or limb loss could benefit greatly from such control, but since they lack the brain signals associated with executed movement, other ways of driving the external device are needed. Fortunately, studies have shown that motor imagery (imagining a movement) and motor execution share neural mechanisms, activating similar brain regions (Guillot et al., 2009). +
Hence, the question is: can we decode movement types from brain signals recorded during imagined movement? +
There are many ways of recording brain signals; the least invasive (in fact, entirely non-invasive) is EEG (electroencephalography). However, because the electrodes sit on the scalp, the signal is attenuated and distorted by the scalp and skull. A more invasive technique is ECoG (electrocorticography), in which the electrodes are placed directly on the surface of the cortex; it has been shown to yield better signal quality. +
The question therefore becomes: can we decode movement types from ECoG signals recorded during imagined movement? +
A group of students has laid the foundations for answering this question during an online computational neuroscience course called [Neuromatch](https://compneuro.neuromatch.io/). Among other contributions, they developed a classifier to decode movement types from both imagined and executed movement ECoG signals. This classifier will be the foundation of my project.
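The decoding idea above can be illustrated with a toy example. This is not the students' actual classifier (its features and model are not specified here); it is a minimal sketch assuming each trial is summarized by per-electrode band-power features, with synthetic data and a simple nearest-centroid rule:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the real pipeline: each trial is summarized by the
# mean high-frequency band power on each electrode, and trials of the same
# movement type are assumed to cluster around a distinct power pattern.
n_electrodes, n_trials = 16, 40
hand_template = rng.normal(0.0, 1.0, n_electrodes)
tongue_template = rng.normal(0.0, 1.0, n_electrodes)

X = np.vstack([
    hand_template + 0.3 * rng.normal(size=(n_trials, n_electrodes)),
    tongue_template + 0.3 * rng.normal(size=(n_trials, n_electrodes)),
])
y = np.array([0] * n_trials + [1] * n_trials)  # 0 = hand, 1 = tongue

# Nearest-centroid decoding: label each trial with the class whose mean
# feature vector (centroid) is closest in Euclidean distance.
centroids = np.stack([X[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(
    np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2), axis=1
)
accuracy = (pred == y).mean()
print(f"decoding accuracy on synthetic trials: {accuracy:.2f}")
```

With real ECoG trials, the synthetic features would be replaced by actual band-power estimates, and accuracy would be assessed on held-out trials rather than on the data used to compute the centroids.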

    Main Objectives:

    • Demonstrate better classifier accuracy through improved data processing
    • @@ -24,13 +24,18 @@ A variety of movement types can be decoded from brain signals during movement ex
• Develop an understanding of ECoG data (features, filtering, processing…)
• Get familiar with GitHub/Git
• Gain an understanding of open science notions
    • +

    Data:

-The raw data comes from a public source (Miller KJ. A library of human electrocorticographic data and analyses. Nat Hum Behav. 2019 Nov;3(11):1225-1235. doi: 10.1038/s41562-019-0678-3. Epub 2019 Aug 26. PMID: 31451738.). The data used for my project has been downloaded not from the original source (raw data), but from a preprocessed source coming from the Neuromatch Academy website for computational neuroscience (https://osf.io/ksqv8). +The raw data comes from a public source (Miller, 2019). The data used for my project was downloaded not from the original raw source but from a preprocessed version provided by Neuromatch Academy, an online computational neuroscience course (https://osf.io/ksqv8).
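As an aside on handling the preprocessed download: the exact keys and array layout of the OSF file are not documented here, so the structure below (a continuous per-electrode voltage array plus trial onset times in samples) is an assumption modelled on common ECoG epoching workflows, demonstrated on synthetic data:

```python
import numpy as np

# Assumed layout: continuous recording of shape (n_samples, n_electrodes),
# a sampling rate in Hz, and trial onsets in samples. All values here are
# synthetic stand-ins for the real download.
srate = 1000
n_electrodes = 8
V = np.random.default_rng(1).normal(size=(60 * srate, n_electrodes))
t_on = np.arange(2 * srate, 50 * srate, 4 * srate)  # one onset every 4 s

def extract_epochs(V, t_on, srate, window_s=3.0):
    """Cut a fixed-length window after each onset -> (trials, time, electrodes)."""
    win = int(window_s * srate)
    return np.stack([V[t:t + win] for t in t_on])

epochs = extract_epochs(V, t_on, srate)
print(epochs.shape)  # → (12, 3000, 8)
```

Cutting the continuous signal into per-trial epochs like this is the usual first step before computing features (e.g., band power) for the classifier.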

    Deliverables:

    Jupyter notebook containing data processing, classifiers and data visualization -Tools: -Methods: -Results: -Conclusion: +

Tools:

    +

Methods:

    +

Results:

    +

Conclusion:

    +

References:

    +Volkova K, Lebedev MA, Kaplan A, Ossadtchi A. Decoding Movement From Electrocorticographic Activity: A Review. Front Neuroinform. 2019 Dec 3;13:74. doi: 10.3389/fninf.2019.00074. PMID: 31849632; PMCID: PMC6901702. +
    Miller KJ. A library of human electrocorticographic data and analyses. Nat Hum Behav. 2019 Nov;3(11):1225-1235. doi: 10.1038/s41562-019-0678-3. Epub 2019 Aug 26. PMID: 31451738. +
    Guillot A, Collet C, Nguyen VA, Malouin F, Richards C, Doyon J. Brain activity during visual versus kinesthetic imagery: an fMRI study. Hum Brain Mapp. 2009 Jul;30(7):2157-72. doi: 10.1002/hbm.20658. PMID: 18819106; PMCID: PMC6870928.