![Logo](/Kelvin0023/PhysioLabXR/raw/master/physiolabxr/_media/readme/PhysioLabXR Overview.png)
A Python Platform for Real-Time, Multi-modal, Brain–Computer Interfaces and Extended Reality Experiments
Explore the docs »
View Demo
·
Report Bug
·
Request Feature
PhysioLabXR is a Python-based app for visualizing, recording, and processing (i.e., making predictions from) data streams. PhysioLabXR can help you build novel interaction interfaces such as brain–computer interfaces (BCIs), and it can aid you in running experiments. It works best with multi-modal (e.g., EEG combined with eye tracking, or camera with speech), high-throughput (~500 Mbps), real-time data streams.
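To get a sense of what "high-throughput" means in practice, here is a back-of-the-envelope bandwidth estimate for one possible multi-modal setup. The channel counts, sampling rates, and sample widths below are illustrative assumptions, not PhysioLabXR requirements:

```python
# Rough bandwidth estimate for a hypothetical multi-modal recording setup.
# All stream parameters below are illustrative assumptions.
streams = {
    # name: (channels, sampling_rate_hz, bytes_per_sample)
    "EEG":          (64, 2048, 4),           # 64-channel EEG, float32
    "eye_tracking": (10, 250, 4),            # gaze + pupil channels, float32
    "camera":       (640 * 480 * 3, 30, 1),  # raw RGB frames, uint8
}

def stream_mbps(channels, rate_hz, bytes_per_sample):
    """Bandwidth of one stream in megabits per second (1 Mbps = 1e6 bits/s)."""
    return channels * rate_hz * bytes_per_sample * 8 / 1e6

total = sum(stream_mbps(*params) for params in streams.values())
for name, params in streams.items():
    print(f"{name}: {stream_mbps(*params):.1f} Mbps")
print(f"total: {total:.1f} Mbps")
```

Even with a raw VGA camera feed dominating the total, this hypothetical setup stays well under the ~500 Mbps figure above.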
Download the latest release executable from here, available for Windows, macOS, and Linux.
Install PhysioLabXR's PyPI distribution with

```shell
pip install physiolabxr
```

Then run it with

```shell
physiolabxr
```
Alternatively, you can clone the repo and run from source:

```shell
git clone https://github.com/PhysioLabXR/PhysioLabXR.git
cd PhysioLabXR
pip install -r requirements.txt
```

The entry point to PhysioLabXR is `physiolabxr.py`, located in the folder named `physiolabxr`. From the root folder, run it with:

```shell
python physiolabxr/physiolabxr.py
```
For more examples, please refer to the tutorials in the documentation. The tutorials include:
- Real-time fixation detection (link)
- Real-time multi-modal event-related potential (ERP) classification with EEG and pupillometry
- P300 speller in Unity
- Stroop task with PsychoPy ([link](https://physiolabxrdocs.readthedocs.io/en/latest/PsychoPy.html))
More are coming soon!
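To illustrate the kind of processing the fixation-detection tutorial covers: one common approach to real-time fixation detection is the dispersion-threshold (I-DT) algorithm. The sketch below is a minimal, hypothetical implementation for offline arrays, not PhysioLabXR's actual code; the thresholds and the assumption that gaze is in degrees of visual angle are illustrative:

```python
import numpy as np

def detect_fixations(gaze, t, max_dispersion=1.0, min_duration=0.1):
    """Dispersion-threshold (I-DT) fixation detection.

    gaze: (N, 2) array of gaze positions (assumed degrees of visual angle).
    t:    (N,)  monotonically increasing timestamps in seconds.
    Returns a list of (start, end) sample-index pairs (end exclusive).
    """
    fixations = []
    n = len(t)
    start = 0
    while start < n:
        # Grow a window until it spans at least min_duration.
        end = start
        while end < n and t[end] - t[start] < min_duration:
            end += 1
        if end >= n:
            break  # not enough samples left for a full window
        # Dispersion = (max x - min x) + (max y - min y) over the window.
        if np.sum(np.ptp(gaze[start:end + 1], axis=0)) <= max_dispersion:
            # Extend the window while dispersion stays under threshold.
            while end + 1 < n and np.sum(np.ptp(gaze[start:end + 2], axis=0)) <= max_dispersion:
                end += 1
            fixations.append((start, end + 1))
            start = end + 1
        else:
            start += 1
    return fixations
```

For example, feeding it 0.5 s of steady gaze followed by 0.5 s of smooth movement (at 100 Hz) yields a single fixation covering the steady segment. A real-time version would apply the same dispersion test to a sliding buffer of incoming samples.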
Distributed under the 3-Clause BSD License. See LICENSE.txt for more information.
Ziheng 'Leo' Li - [email protected]
We would like to express our gratitude for the support from our colleagues at Columbia University and Worcester Polytechnic Institute. We would also like to thank all the community members who have contributed to PhysioLabXR.