We have a few examples to help you get started. You can find them in the examples directory. Please be sure to adjust these examples to suit your purposes.
To run the existing examples, you will need to do a few things:
- You will need an EMOTIV headset. You can purchase a headset in an online store.
- Next, download and install the Cortex service. Please note that currently, the Cortex service is only available for Windows and macOS.
- We have updated our Terms of Use, Privacy Policy, and EULA to comply with GDPR. Please log in via the EMOTIV Launcher to read and accept our latest policies in order to proceed with using the following examples.
- Next, to get a client ID and a client secret, you must connect to your EMOTIV account on emotiv.com and create a Cortex app. If you don't have an EmotivID, you can register here.
- Then, if you have not already, log in with your EmotivID in the EMOTIV Launcher.
- Finally, the first time you run these examples, you will also need to authorize them in the EMOTIV Launcher.
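Behind these steps, every example talks to the locally running Cortex service over a WebSocket using JSON-RPC 2.0. As a rough sketch of what the client ID and secret are used for, here is how the first two requests of the handshake can be built; this assumes the `requestAccess` and `authorize` method names and parameter spellings from the Cortex API, and `your_client_id` / `your_client_secret` are hypothetical placeholders for your own credentials:

```python
import json

# Default local Cortex endpoint (assumption based on the Cortex service docs).
CORTEX_URL = "wss://localhost:6868"

def build_request(req_id, method, params):
    """Build a JSON-RPC 2.0 request for the Cortex API."""
    return {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}

# Step 1: ask the user (via the EMOTIV Launcher) to approve this application.
request_access = build_request(1, "requestAccess", {
    "clientId": "your_client_id",         # hypothetical placeholder
    "clientSecret": "your_client_secret", # hypothetical placeholder
})

# Step 2: once approved, exchange the credentials for a Cortex token,
# which later calls (subscribe, createRecord, ...) must include.
authorize = build_request(2, "authorize", {
    "clientId": "your_client_id",
    "clientSecret": "your_client_secret",
})

payload = json.dumps(authorize)  # this string is what gets sent over the WebSocket
```

In the real examples these payloads are sent over a `wss://` connection and the `authorize` response carries the Cortex token used by all subsequent calls.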
Here are some examples to get you started. Please be sure to refer to the documentation for more information.
subscribe.py
shows data streams from the headset: EEG, Motion, Performance Metrics, and Band Power. For more details, please refer to the documentation.
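A subscription boils down to one JSON-RPC `subscribe` request listing the desired streams. A minimal sketch of that payload, assuming the stream names used by the Cortex API (`eeg`, `mot`, `met`, `pow`) and with `<token>` / `<session-id>` as hypothetical placeholders for the values returned by `authorize` and `createSession`:

```python
def build_subscribe(cortex_token, session_id, streams):
    """JSON-RPC request to subscribe to Cortex data streams."""
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "subscribe",
        "params": {
            "cortexToken": cortex_token,
            "session": session_id,
            "streams": streams,
        },
    }

# "eeg" = raw EEG, "mot" = motion, "met" = performance metrics, "pow" = band power
msg = build_subscribe("<token>", "<session-id>", ["eeg", "mot", "met", "pow"])
```

After this request succeeds, Cortex pushes one JSON message per sample on the same WebSocket, keyed by stream name.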
facial_expression.py
shows facial expression training. For more details, please refer to the documentation.

mental_command.py
shows mental command training. For more details, please refer to the documentation.
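Both training examples drive the same Cortex `training` method, differing only in the detection type. A sketch of that request, assuming the `training` method and its `detection` / `action` / `status` parameters from the Cortex API; `<token>` and `<session-id>` are hypothetical placeholders:

```python
def build_training(token, session_id, detection, action, status):
    """JSON-RPC request for the Cortex 'training' method."""
    return {"jsonrpc": "2.0", "id": 1, "method": "training", "params": {
        "cortexToken": token,
        "session": session_id,
        "detection": detection,  # "mentalCommand" or "facialExpression"
        "action": action,        # e.g. "push" for mental commands, "smile" for faces
        "status": status,        # e.g. "start", "accept", "reject", "reset", "erase"
    }}

# Start training the "push" mental command; a later call with status "accept"
# would keep the trial, or "reject" would discard it.
start_push = build_training("<token>", "<session-id>", "mentalCommand", "push", "start")
```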
live_advance.py
shows how to get and set the sensitivity of mental command actions in live mode. For more details, please refer to the documentation.
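Getting and setting sensitivity both go through one Cortex method that switches on a `status` parameter. A sketch of the two payloads, assuming the `mentalCommandActionSensitivity` method name and a `profile` / `values` parameter shape from the Cortex API (the exact parameters may differ; check the documentation); `<token>` and `<profile-name>` are hypothetical placeholders:

```python
def build_sensitivity(token, profile, status, values=None):
    """JSON-RPC request for 'mentalCommandActionSensitivity' (get or set)."""
    params = {"cortexToken": token, "profile": profile, "status": status}
    if status == "set":
        # One sensitivity value per trained action (assumed range: 1..10).
        params["values"] = values
    return {"jsonrpc": "2.0", "id": 1,
            "method": "mentalCommandActionSensitivity", "params": params}

get_req = build_sensitivity("<token>", "<profile-name>", "get")
set_req = build_sensitivity("<token>", "<profile-name>", "set", [7, 7, 5, 5])
```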
record.py
shows how to create a record and export data to CSV or EDF format. For more details, please refer to the documentation.
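Recording is two calls: create the record on an open session, then export it to disk once it is stopped. A sketch of both payloads, assuming the `createRecord` / `exportRecord` method and parameter names from the Cortex API (CSV exports may additionally require a `version` parameter; check the documentation); the `<token>`, `<session-id>`, and `<record-id>` placeholders are hypothetical:

```python
def build_create_record(token, session_id, title):
    """JSON-RPC request to start a record on an active session."""
    return {"jsonrpc": "2.0", "id": 1, "method": "createRecord",
            "params": {"cortexToken": token, "session": session_id, "title": title}}

def build_export_record(token, record_ids, folder, fmt):
    """JSON-RPC request to export finished records to disk."""
    return {"jsonrpc": "2.0", "id": 2, "method": "exportRecord",
            "params": {"cortexToken": token,
                       "recordIds": record_ids,  # list of record IDs to export
                       "folder": folder,         # destination directory on disk
                       "format": fmt,            # "CSV" or "EDF"
                       "streamTypes": ["EEG", "MOTION"]}}

create_req = build_create_record("<token>", "<session-id>", "my-first-record")
export_req = build_export_record("<token>", ["<record-id>"], "./out", "CSV")
```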
marker.py
shows how to inject a marker during a recording. For more details, please refer to the documentation.
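A marker is a single timestamped JSON-RPC request sent while a record is running. A sketch of that payload, assuming the `injectMarker` method and its `label` / `value` / `port` / `time` parameters from the Cortex API; `<token>` and `<session-id>` are hypothetical placeholders:

```python
import time

def build_inject_marker(token, session_id, label, value, port):
    """JSON-RPC request to inject a marker into an active recording."""
    return {"jsonrpc": "2.0", "id": 1, "method": "injectMarker", "params": {
        "cortexToken": token,
        "session": session_id,
        "label": label,                  # your own tag for the event
        "value": value,                  # a string or integer payload
        "port": port,                    # identifies the marker source, e.g. "python-app"
        "time": int(time.time() * 1000), # marker timestamp in epoch milliseconds
    }}

msg = build_inject_marker("<token>", "<session-id>", "stimulus", 42, "python-app")
```

The exported CSV/EDF record then carries the marker at the given timestamp, which is useful for aligning EEG data with external events.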