
Seeing is Believing: Connecting VR Environments to EEG Output Interpretation #15

Open
Remi-Gau opened this issue Nov 30, 2022 · 0 comments

Comments

@Remi-Gau
Member

Added as an issue for bookkeeping.

Source: https://brainhack.psychoinformatics.de/projects.html

Leaders: Siew-Wan Ohl; JiaHua Xu
[email protected]; [email protected]
Edu2VR
OpenBCI

For this project, we would like to see what EEG signals are generated when different VR environments or scenarios are presented to the user. It is up to the creativity of the programmer to either generate interesting VR environments or directly import existing 360-degree photos or videos from online databases (e.g., YouTube). We will record EEG signals and attempt to interpret them in accordance with the VR scenarios being watched.
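
The project description does not name a recording toolchain, so the snippet below is only a minimal sketch of the recording side, assuming the EEG comes from an OpenBCI board accessed through the BrainFlow Python library. The scenario names and durations are placeholders, and BrainFlow's synthetic board is used so the sketch runs without hardware; swap in the Cyton board ID and a serial port for a real headset.

```python
# Sketch: stream EEG via BrainFlow, mark the stream when the VR scenario
# changes, then summarise average band powers per recording.
# Assumption: BrainFlow + OpenBCI; the project does not specify a library.
import time

from brainflow.board_shim import BoardShim, BoardIds, BrainFlowInputParams
from brainflow.data_filter import DataFilter

# Synthetic board so this runs without hardware; for a real OpenBCI Cyton use
# BoardIds.CYTON_BOARD.value and set params.serial_port (e.g. "/dev/ttyUSB0").
board_id = BoardIds.SYNTHETIC_BOARD.value
params = BrainFlowInputParams()
board = BoardShim(board_id, params)

board.prepare_session()
board.start_stream()

# Hypothetical VR scenarios: insert a marker each time a new one is presented,
# so the EEG can later be segmented by scenario.
for scenario_id, name in enumerate(["forest_360", "city_night", "ocean_dive"], start=1):
    print(f"Presenting scenario: {name}")
    board.insert_marker(scenario_id)   # lands in the board's marker channel
    time.sleep(5)                      # stand-in for the scenario's duration

data = board.get_board_data()          # channels x samples, marker row included
board.stop_stream()
board.release_session()

# Average band powers (delta, theta, alpha, beta, gamma) across EEG channels,
# as a first rough feature to compare between scenarios.
eeg_channels = BoardShim.get_eeg_channels(board_id)
sampling_rate = BoardShim.get_sampling_rate(board_id)
bands, _ = DataFilter.get_avg_band_powers(data, eeg_channels, sampling_rate, True)
print("delta/theta/alpha/beta/gamma:", bands)
```

Segmenting the recording by the inserted markers (one epoch per scenario) and comparing band powers between epochs would be one simple way to relate the EEG output to the VR scenario being watched.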
