To try the ERP recording scenarios, you need to build OpenViBE from a git version dated 11 Jan 2017 or later. The scenarios are also included in the OpenViBE 2.0 beta.
Event-Related Potentials (ERPs) are brain responses that occur in reaction to some external event. For example, the user might be presented with a picture or a sound, which then affects the potentials measured by EEG. ERPs can be used to implement BCIs with paradigms such as the P300, or they can be studied in their own right to gain further understanding of brain function.
Here we describe how you can easily record ERPs with OpenViBE. The actual analysis and use of the recorded data are beyond the scope of this example. In general, you can design further scenarios that attempt to classify the data (e.g. by epoching around a chosen event marker of interest), or analyze the data with tools outside OpenViBE that support one of the file formats OpenViBE can export.
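As a sketch of what such a follow-up analysis step could look like, the snippet below cuts fixed-length epochs around event markers. The function name, array layout, and parameters are illustrative, not part of OpenViBE:

```python
import numpy as np

def epoch(signal, marker_samples, pre, post):
    """Cut fixed-length epochs around each event marker.

    signal: (n_samples, n_channels) array of EEG data
    marker_samples: sample indices of the event onsets
    pre, post: number of samples to keep before/after each onset
    Returns an (n_epochs, pre + post, n_channels) array; markers too
    close to the edges of the recording are skipped.
    """
    epochs = [signal[m - pre:m + post]
              for m in marker_samples
              if m - pre >= 0 and m + post <= len(signal)]
    if not epochs:
        return np.empty((0, pre + post, signal.shape[1]))
    return np.stack(epochs)

# Toy usage: 1000 samples, 4 channels, markers at samples 100, 500, 990.
sig = np.random.randn(1000, 4)
ep = epoch(sig, [100, 500, 990], pre=50, post=200)
print(ep.shape)  # (2, 250, 4) -- the marker at 990 is too close to the end
```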
How to record ERPs with OpenViBE
The scenarios in the folder bci-examples/erp-recording/ illustrate this. If you are on Windows, copy the scenario folder to a location on disk where you have write access before trying the recording scenario.
The ERP recording itself is carried out by the second of the following scenarios:
erp-1-monitor-signal.xml: A simple scenario to check that the signal looks alright.
erp-2-record-signal.xml: The main scenario presenting images and sounds and recording data.
erp-3-replay-file.xml: A simple scenario to look at a recorded file.
The main scenario is shown below:
Image: erp-2-record-signal.xml
To change what is presented to the user, simply edit the Display Cue Image and Sound Player boxes in the scenario, and modify the experiment-timeline.lua script that controls what is presented to the user and when.
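To illustrate the kind of timing logic such a script implements, here is a sketch in Python (the actual script is Lua, and all function and parameter names below are hypothetical, not those used by experiment-timeline.lua): for each trial, a cue is drawn at random and scheduled with a jittered inter-trial interval.

```python
import random

def build_timeline(cue_ids, n_trials, cue_duration=1.0,
                   iti_min=1.5, iti_max=2.5, seed=0):
    """Sketch of an ERP presentation timeline.

    Returns a list of (onset_time, cue_id, offset_time) tuples, with a
    jittered inter-trial interval drawn uniformly from [iti_min, iti_max].
    """
    rng = random.Random(seed)
    t, events = 0.0, []
    for _ in range(n_trials):
        cue = rng.choice(cue_ids)
        events.append((round(t, 3), cue, round(t + cue_duration, 3)))
        t += cue_duration + rng.uniform(iti_min, iti_max)
    return events

for onset, cue, offset in build_timeline(["face", "house"], 3):
    print(f"{onset:6.3f}s show {cue!r}, hide at {offset:6.3f}s")
```

Randomizing cue order and jittering the interval are both standard in ERP work, as they keep the user from anticipating stimulus onsets.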
The Display Cue Image box can also be configured to scale the presented images, or to display them fullscreen (recommended). To add or remove images, right-click the box and select ‘modify settings’.
If you want to play several audio files, use several Sound Player boxes.
In the recorded file, the events are combined with the acquired signal. This combination is made using TCP Tagging (software tagging): each event is sent to the Acquisition Server for alignment right after the stimulus presentation (stimulus onset).
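For reference, a TCP Tagging message consists of three 64-bit little-endian integers (flags, event id, timestamp) sent to the Acquisition Server's tagging port. A minimal Python sketch follows; treat the port number, flag value, and stimulation code as assumptions to be checked against the TCP Tagging documentation for your OpenViBE version:

```python
import socket
import struct

TCP_TAGGING_PORT = 15361  # assumed default Acquisition Server tagging port

def make_tag(stimulus_id, timestamp=0, flags=0):
    """Build one TCP Tagging message: three 64-bit little-endian
    integers (flags, event id, timestamp). A timestamp of 0 asks the
    Acquisition Server to timestamp the event on arrival."""
    return struct.pack("<QQQ", flags, stimulus_id, timestamp)

def send_tag(stimulus_id, host="localhost"):
    """Send one software tag to a running Acquisition Server."""
    with socket.create_connection((host, TCP_TAGGING_PORT)) as s:
        s.sendall(make_tag(stimulus_id))

# 0x8100 is OVTK_StimulationId_Label_00 in OpenViBE's stimulation table
# (verify against your version's stimulation codes).
payload = make_tag(0x8100)
print(len(payload))  # 24 bytes per event
```

In the provided scenarios this tagging is already wired up inside the presentation boxes, so no extra code is needed; the sketch only shows what travels over the wire.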
Stimulus-driven recordings in general
With small modifications, the ERP recording scenario becomes a handy little tool for rigorously collecting data from the user under different conditions. Simply replace the default images with instruction images of your choice, and set a wait duration (in the .lua script) long enough for the user to perform the task. Examples:
- Artifact data: instruct the user to produce different kinds of artifacts (eye blinks, etc.).
- Mental tasks: instruct the user to perform specific mental tasks.
- And so on.