OpenViBE Tracker (a prototype)

  • NB: document updated on 27.Nov.2018.


OpenViBE Tracker is a new OpenViBE desktop application to complement the Designer and the Acquisition Server. Before the OpenViBE 2.2.0 release, the Tracker is available only on git.

The purpose of the Tracker is to bring user-friendly batch processing and multi-file data analysis to OpenViBE. Users of traditional OpenViBE tools such as Designer may feel constrained by its streaming architecture, where data is typically processed and shown as short segments. For example, Signal Display in Designer shows a real-time view of the data as it's being passed through, and you cannot “move back in time” to view some past event of interest except by restarting the playback. OpenViBE Tracker addresses these shortcomings by providing a global view of your data and datasets. These can be freely explored in the Tracker, as well as subjected to computational processing and analysis provided by a variety of existing OpenViBE components, plugins and scenarios.

The Tracker design was inspired by multitrack audio software and should be immediately familiar to users who have used audio editors in the past. In OpenViBE Tracker, EEG recordings (.ov files) are visualized as “Tracks” that can contain one or more streams of different types. For example, a track can consist of a signal stream and a stimulation stream. Multiple EEG recordings can be imported as tracks to the same workspace to be handled together. These tracks can then be visually inspected and panned around, or you can apply some specific OpenViBE processing to them. Working on subsets of the tracks is also possible.


Early screenshot of the Tracker on Ubuntu. There are two tracks (.ov files); both have one stimulation and one signal stream.

Current Features:

  • Import one or more OpenViBE multistream EEG files (.ov) to the same ‘workspace’
  • Visualize and ‘browse around’ EEG signal data and other OpenViBE streams (stimulations, spectrum, …)
  • Examine the structure of your .ov recordings: get details of stream types, chunk structure, chunk timestamps
  • Apply “Box” and “Tracker” plugins to the currently loaded recordings
  • Route selected streams to Designer scenarios and get results back. You can make your own processing scenarios.
  • Process multiple files independently from each other, or concatenated together as one long file
  • Multithreaded! Multiple cores can be used simultaneously to process data more efficiently.
  • Memory efficient: An optional memory save mode allows files to be only kept in memory while they are processed

Minor features:

  • Add new streams and tracks as results of processing
  • Option to select only a subset of streams for analysis
  • Option to get plugin results as new streams or to replace the originals in the workspace
  • Organize tracks and streams by moving, and move streams from one track (.ov file) to another
  • Remove streams from .ov files


NOTE: The Tracker is currently in a prototype stage. This means that its error handling is not very thorough, and various features have not been rigorously tested. The code base is also constantly changing. For users concerned with robustness, OpenViBE Designer and Acquisition Server remain the trusted workhorses.

If you are interested in giving the Tracker a try before an official release, you can obtain it from the git branch master in the openvibe extras repository (git instructions here). This will require you to compile OpenViBE. We would be delighted to hear feedback and/or feature requests. If you have any suggestions, you can post them here.


Tl;dr: show me some examples?

We bundle a few examples with the Tracker. You can get these by selecting File/Open Workspace. After opening a workspace, you should see a track with a few streams added. Press Play and Designer should appear to do the data processing. Look at the comments in the scenarios.

workspaces/workspace-example-motor-imagery.ovw : This is an example that contains one real EEG motor imagery track. By pressing Play, the data should be filtered and displayed in Designer, after which the filtered result is sent back to the Tracker. It is an example of a ‘bidirectional’ use case (processor) where the Designer scenario both receives data from the Tracker and sends data to it.

workspaces/workspace-example-erp.ovw : This example features an artificial file that contains two types of trials. The processing scenario computes the averages inside each trial and visualizes them with the ERP plot. When the plotting box receives the end-of-file stimulation, it draws the current plot to a file. It is an example of a one-directional use case.

We also bundle a few other example processors which don’t have a corresponding example workspace. The file processor-example-sink.xml shows how to send data to a processor from the Tracker, and processor-example-source.xml shows how to receive data in Tracker from a processor.

Getting started

To get started, Add Track(s) from the File menu. These tracks should be previously recorded .ov files. After you have a collection of tracks you would like to process or study, you can save this set as a Workspace. Again, this is done from the File menu. On its next startup, the Tracker will restore the workspace that was previously open, in its last saved state. You can also load some other previously saved workspace.

Note: The Tracker will not modify your original .ov files. Instead, it will write copies into a workspace-specific path.

Exploring datasets and streams

When you have some tracks added, the Tracker will show you their basic visualizations (see the image above). You can use the widgets below the tracks to change the position and time scale. Holding the right mouse button down on top of a signal display and moving the mouse will allow you to scale the y axis, i.e. zoom in and out.

From the Tracks menu you can find the Show stream structure selection under each stream. This shows you how the stream content is segmented into chunks. Note that this segmentation is always present in OpenViBE: the Tracker simply makes a table of what's encoded in the .ov file. If the same file is loaded into Designer, the boxes will receive the data with the chunking as displayed by the Tracker (unless re-epoched by the scenario).


Structure of the beginning of a stimulation stream in a P300 experiment timeline as visualized by the Tracker. Inspecting the stream structures can sometimes be useful in debugging your experiments.
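For a continuous signal that is chunked uniformly, the chunk timestamps shown in such a table follow directly from the sample counts. A hypothetical helper (the function name is an assumption for illustration, not Tracker code) could compute them in OpenViBE's 32:32 fixed-point time:

```cpp
#include <cstdint>

using ovtime_t = uint64_t; // OpenViBE 32:32 fixed-point timestamp

// For a continuous, uniformly chunked signal, chunk i covers samples
// [i*samplesPerChunk, (i+1)*samplesPerChunk), so its start time is
// (i * samplesPerChunk / samplingRate) seconds, expressed in 32:32
// fixed point. Illustrative only; may overflow for very long recordings.
ovtime_t chunkStartTime(uint64_t chunkIndex, uint64_t samplesPerChunk, uint64_t samplingRate)
{
    return ((chunkIndex * samplesPerChunk) << 32) / samplingRate;
}
```

For example, at 128 Hz with 64-sample chunks, chunk 1 starts at 0.5 s, i.e. at fixed-point value 1ull << 31.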


Simple processing

Box plugins. To make some simple modifications to the track data, you can find various plugins to do this under the Apply Box Plugin option in the Tracks menu. These plugins are called “box plugins” because they are made by delegating the processing to existing OpenViBE plugins called “boxes”. Each such plugin works on a specific type of stream, as indicated in its name (e.g. “Signal: Temporal Filter”). Applying a box plugin will process all the currently selected streams that match the stream type that the plugin supports. The stream selection can be changed from the Edit menu.

For example, using Independent Component Analysis (or PCA) to examine your EEG data can be done easily by selecting it from the Apply Box Plugin menu. The processing will work just as it would by using that box in Designer. The estimated independent components will appear as a new stream in your workspace.

As another example, you can do manual artifact removal: you can manually disable EEG trials with artifacts. Once you have located a contaminated trial by visual inspection (‘eyeballing’), you can simply filter out the stimulations of the corresponding trial using the Stimulation Filter plugin (from ‘Apply Box Plugin‘). For example, if you wish to exclude artifacts from BCI classifier training, the removed stimulation should be the one that triggers the Stimulation Based Epoching in the corresponding training scenario. As a result, although the signal data remains, it won't be taken into account by the BCI scenarios, which train models only on the extracted epochs. Admittedly, using the Stimulation Filter here is not very convenient. In the future the Tracker might be developed further to allow disabling stimulations simply by clicking on them.

Currently the box plugins only process one stream of a specific type at a time. This limitation is because we haven’t implemented a user interface in the Tracker that’d allow the user to declare how the streams should be routed to the boxes, and boxes in OpenViBE cannot generally take an arbitrary set of input streams. But this kind of routing is already available in the OpenViBE Designer. If you want to do processing that requires multiple data sources (streams) or chaining of plugins, please see the Complex Processing section.

Tracker plugins. OpenViBE Tracker also supports another type of plugin called ‘Tracker plugins’. These plugins are native to the Tracker and can potentially be very powerful. The main difference is that box plugins wrap OpenViBE box code, and consequently their code has to deal with the classical OpenViBE objects such as encoders, decoders, and so on. Also, a box plugin is only able to look at data sequentially, unless it caches all of it before starting the processing. In contrast, Tracker plugins can read and modify the loaded data freely, dealing largely with intuitive, typed C++ data representations. For example, a Signal stream can be accessed via a construct resembling std::vector< typename T::Buffer >. Tracker plugins can have two capabilities:

  • Process tracks: For a plugin that declares to be able to process tracks, on Apply, the Tracker will call its process() function, once for each currently loaded track
  • Process workspaces: If a plugin declares ability to handle workspaces, the Tracker will call its process() giving a reference to the current workspace object. The plugin can then freely modify all tracks and track content in it, as well as add and remove them.

There are currently two example Tracker plugins included to illustrate how the ITrackerPlugin API works, but you can add more. The Tracker plugin programming interface is arguably simpler than the corresponding box interface. As a downside, Tracker plugins do not currently have visual configuration unless you implement it yourself in the plugin.

Complex processing

Simple processing is limited in that you cannot automatically chain several plugins, and box plugins can only handle one stream at a time. However, you can also pass the data through more complex DSP chains that feature several processing steps and stream types at the same time. This is achieved by passing data between the Tracker and OpenViBE scenarios that you can fully customize. To route data to a scenario, you must configure a “processor”. A processor is simply an OpenViBE scenario that contains one or two “External Processing” boxes. The data from the Tracker is sent to one of them, and output data is received from the other. From Edit/Preferences you can find some example processors by clicking the ‘Open’ button next to the processor field. Note that the output connectors of the External Processing box in the processor scenario must match the streams that you have selected. For example, if you select a signal and a stimulation stream, the receiving External Processing box outputs must be configured likewise in the scenario. There is currently no automatic mechanism to generate the right set of connectors. You can configure the processor in Designer simply by pressing the ‘crossed tools’ button next to the processor field (or Edit, in the workspace tab).

Once a processing scenario is set and properly configured, workspace tracks can be routed to it by pressing Play. The normal play mode tries to simulate real-time playback, whereas the Fast Forward mode will try to pass data in as quickly as possible — just as in Designer. The play will process only the currently selected streams, going from top to bottom in the track list.

For complex processing, the Tracker can either send data to the scenario as one long catenated set of streams, or each track independently. In the catenation mode, the timestamps of the streams in the second track will be incremented by the duration of the longest stream in the first track, and so on. This mode can be changed from Edit/Preferences. In the catenation mode, the result of the processing will be recorded into one very long track. In the separate mode, each source track will result in a corresponding new track. The default mode is the non-catenating mode. Note that the catenating mode does not try to address the likely discontinuities in the signal or chunking when moving from one track to the next.
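The timestamp rule of the catenation mode can be sketched as follows (an illustrative function, not the Tracker's implementation; all names are made up):

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

using ovtime_t = uint64_t; // 32:32 fixed-point time, as elsewhere in OpenViBE

// Illustration of the catenation rule described above: the streams of each
// track are shifted by the accumulated duration of the longest stream in
// every preceding track.
std::vector<ovtime_t> catenationOffsets(const std::vector<std::vector<ovtime_t>>& streamDurationsPerTrack)
{
    std::vector<ovtime_t> offsets;
    ovtime_t accumulated = 0;
    for (const auto& durations : streamDurationsPerTrack)
    {
        offsets.push_back(accumulated); // this track starts where the previous ones end
        if (!durations.empty())
        {
            accumulated += *std::max_element(durations.begin(), durations.end());
        }
    }
    return offsets;
}
```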

Processing scenarios can be either one- or bidirectional: you can use a scenario as a data source, a data sink, or both (i.e. pass-through, similar to the Apply Box plugins). You can configure the first TCP/IP port p used to communicate with Designer. The Tracker may try to open multiple ports starting from p. If you have set multiple threads to process several tracks, the first thread will use port p to send and p+1 to receive, the next thread will use ports p+2 and p+3, and so on.
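The port assignment scheme can be sketched as follows (an illustrative helper, not the Tracker's actual code; the function name is made up):

```cpp
#include <cstdint>
#include <utility>

// Illustration of the port scheme described above: thread k (1-based)
// sends on p + 2*(k-1) and receives on the next port, p + 2*(k-1) + 1.
std::pair<uint16_t, uint16_t> portsForThread(uint16_t p, unsigned int k)
{
    const uint16_t send = static_cast<uint16_t>(p + 2 * (k - 1));
    const uint16_t receive = static_cast<uint16_t>(send + 1);
    return { send, receive };
}
```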

The Tracker will pass several configuration tokens to the processing scenario that can be used to configure boxes. At the moment these tokens are:


  • Tracker_Workspace_File: The filename of the workspace file, “None” if unset
  • Tracker_Workspace_Path: The path of the workspace working directory, “None” if unset
  • Tracker_CurrentTrack_Number: The number of the current track being played
  • Tracker_CurrentTrack_Source: The filename of the current track
  • Tracker_CatenatePlayback: Is the Tracker running in the catenating mode?
  • Tracker_Port_Send: TCP/IP port for the External Processing box. Direction: from Tracker to scenario
  • Tracker_Port_Receive: TCP/IP port for the External Processing box. Direction: from scenario to Tracker

As with usual OpenViBE configuration tokens, these can be used as values in box settings. For example, a writer box might have its output filename parameter specified as "C:/myfolder/result-${Tracker_CurrentTrack_Number}.ov". Note that since the filename of each track is passed to the scenario as a token, you could in principle configure boxes to read from these files instead of passing data to/from External Processing boxes. However, we do not recommend this, as the file content may not be the same as displayed in the Tracker unless the workspace has been saved since the last modification of the tracks. Also, the scenarios for the Tracker are often designed to stop at the timeout of the External Processing box that reads the input. If you read a file instead, you need to stop the processing according to that file.

You can also specify your own custom configuration tokens for the processor in the Tracker preferences. These are passed as command line switches to Designer that does the processing. The syntax is “--define token value” per token.

Revision handling

OpenViBE Tracker supports very basic revision handling. Basically, you can use the menu choice Files/Increment revision and save. This has the effect that the revision number will be incremented by one, and all tracks will be saved in their current states as new .ov files to the workspace directory. The files will have new postfixes indicating the new revision. A backup of the workspace configuration file will also be placed in the workspace directory. Then, if you wish to return to some previous revision, you can simply open the backup file.

More advanced use-case sketches

These use-cases will require you to write the corresponding scenarios that we don’t yet provide with the Tracker. Nevertheless, they should work as illustrations of what can be done. Basically you can think of the Tracker as a tool that extends processing by OpenViBE scenarios to span multiple files.

Multirecording analysis (e.g. ERP)

You can import multiple recordings from different users and route these to a scenario featuring aggregation and the ERP Plot box. If you use the catenate mode in the Tracker, the resulting ERP plot can be an aggregate of all the files. In most cases, the EEG headset configurations, stream configurations, etc. must be compatible for multi-recording analysis to make sense.

Training EEG classifiers from multiple datasets

The same approach as in multirecording analysis works if you want to train a BCI classifier, CSP or EOG artifact removal model using multiple datasets. In this case, the processing scenario should feature the corresponding Trainer box and epoching as needed. The model (e.g. classifier parameters) saved by the scenario can then be later used in another online scenario in Designer.

Testing classifiers in a multi-user study

If you wish to train/test EEG/BCI classification or DSP for several datasets, the workflow using Tracker could be imagined as follows. Here we assume that you have one training recording and one testing recording per user.

  1. Make a workspace for the training files. One track per user. Configure the processor to be a training scenario. If you need to train multiple models (e.g. CSP+Classifier), you could make one workspace for each purpose. Make the trainer box in the scenario save the model it estimated, using the token ${Tracker_CurrentTrack_Number} in the filename to uniquely identify which track the model belongs to. On play, Tracker will sequentially walk through all the tracks and train the models which will then be saved to files.
  2. Make a workspace for testing files, the users in the same order as in the training scenario. Now, make another processor that uses the previously built models, again using the tokens to tell the corresponding boxes to load the correct model. On play, Tracker will now walk through the tracks again, but this time testing the models.

In this use case, the scenarios can be one-directional, that is, they only receive data from the Tracker. With the current state of the Tracker features, you also need to implement some approach to get the scenario to export its results. One way is to use a CSV writer and the tokens to write per-file classifier predictions or accuracies (e.g. using a Lua box). You will need to assemble these results elsewhere.

We’re currently thinking of ways to export/aggregate the results more automatically, as well as the possibility to add cross-validation to the Tracker, and further simplify this workflow.

Dataset format conversions

If you wish to convert a lot of files from the .ov format to some other format, you can make a scenario with the corresponding writer box for the new format and then use a configuration token to specify the filename. Note that this may not be the most convenient way to achieve this result (you may instead consider the openvibe-convert script included in OpenViBE).

Tracker Preferences

The Tracker has a few settings the user can configure in “Edit/Preferences“. Among these are:


  • Memory save mode: If on, the Tracker will load tracks (.ov files) from disk on demand, one at a time, and never visualize them. Note that in this case, plugin Apply will affect the files on disk.
  • Modify tracks in place: When applying plugins or processing streams, the result will replace the originals in the workspace (and on disk, if you choose Save). Use with caution, as there is no Undo. If this setting is disabled, results will appear as new streams or tracks.
  • Catenate mode: All selected tracks will be processed by the scenario one after another, as one long track.
  • Workspace path: Where the Tracker will save all the tracks as .ov files when the user chooses the menu option ‘Save‘.
  • Start port: TCP/IP port used to send data to the processor scenario's External Processing box. The Tracker will assign communication ports to each processing (scenario playing) thread, so that thread K will have ports “start+2*(K-1)” and “start+2*(K-1)+1” to send and receive data, respectively. Unfortunately the code currently cannot test if the required ports are actually available.
  • Number of threads: The Tracker can use multiple threads to run plugins and to process data. However, this has not been thoroughly tested. Use ‘1’ if you encounter trouble.
  • Do Send?: Whether the processing scenario receives data. If not, pressing Play will only record data from the scenario.
  • Do Receive?: Whether the processing scenario returns data. If not, pressing Play will only send data to the scenario.
  • Processor arguments: Command line switches passed to the Designer instance playing back the processor scenario. You can use switches such as “--define MyHappyToken MyGreatValue” to pass custom configurations to the scenario.
  • No GUI: If unticked, Designer will be shown on Play and you will have to exit it manually. Enabling the GUI can be useful for debugging.

These settings are saved in the workspace’s configuration file (suffix “.ovw“) and hence can be different per workspace. Note that the sending and receiving settings must be matched in the scenario with the corresponding External Processing boxes with their ports configured by Tracker_Port_Send and Tracker_Port_Receive tokens. Some examples have been bundled with the Tracker to make this more clear.

A few words about using multiple threads

If more than one thread has been enabled in the preferences, that many threads will run in parallel. The actual granularity of the computation depends on the operation.

  • When applying Box Plugins processing “single streams” independently, one thread gets one stream to process at a time.
  • When applying Tracker Plugins processing “Tracks”, one thread gets to process one track at a time.
  • When applying Tracker Plugins processing “Workspaces”, the plugin code receives a “Parallel Executor” (thread manager) which it can then use as it likes
  • For Scenario-based processing (pressing Play)
    • In normal mode, each track gets routed to a different instance of Designer by a different thread
    • In catenation mode, everything will be processed by a single thread as all later data will depend on previous data
    • In recording mode (processor configured only as ‘Receive’), only single thread will be used

Note that the threading aspect has not been very well tested so far. A good starting point for the number of threads is to match the number of cores your CPU has.

Good Questions

  • I imported a file but the signal display is just black?! Some .ov recordings have very large DC offsets or scales. In such cases, the signal is likely outside the range that the track display tries to draw. This can be addressed by applying Temporal Filter on the stream in question (under ‘Tracks/Apply Box Plugin‘).
  • Where is the EEG data kept? When the workspace is saved, each modified track will be stored as a separate .ov file in the workspace directory you chose. In memory save mode, each track is saved immediately after it has been recorded. Currently there is no ‘track export’ option implemented in the Tracker, but you can either get the tracks from the workspace directory, or obtain them by routing them to a processing scenario that does the export.
  • Where is UNDO? There is no undo feature implemented. Be careful! However, you can save the specific state of the Workspace as a revision, and then restore that (i.e. you first save a revision, then do some work, then restore the revision to revert — you cannot restore a revision you have not saved).
  • Can I edit my data? Currently this is only possible with plugins and processors you can apply. Whereas editing data (e.g. copying, pasting, cutting, or doing some local modifications) makes a lot of sense for data types such as audio, it is less clear if there are scientifically solid reasons to do such operations on recorded EEG. If you’d like some editing features, let us know (+ the rationale).
  • Does Tracker visualize non-continuous data correctly? No. The .ov streams can contain overlapping chunks (e.g. from Time Based Epoching box) or chunks that have gaps between them. Although the Tracker processes this data correctly (with Apply and by scenarios), it is not currently able to draw it meaningfully. In such cases, “About Workspace” will declare such tracks as ‘non-continuous’. Normal EEG data as recorded from the Acquisition Client box should always be continuous.
  • Pressing Play freezes. Tracker bugs aside, it is possible that your scenario .xml was not compatible with the expectations of the Tracker. For example, the stream connector sets in the External Processing boxes might have been incompatible with the stream selection you are sending from the Tracker, or they might have been in a wrong order. Note that you can change the order of the tracks and streams in the Tracker. Another possibility is that there are errors in the processor command line switches. To debug this, you can disable “No GUI” in Preferences, as well as look at the log file generated by Designer trying to run the scenario. This log will be “$APPDATA/openvibe-x.y.z/log/tracker-processor-dump.txt” on Windows, similarly located under “.config/” on Linux. Note that since the communication is over TCP/IP, the ports may already be in use or even have been blocked by an overeager firewall.
  • I try to add a lot of tracks and it crashes! If you are running a 32-bit version of the Tracker, it is possible that the Tracker was not able to allocate enough memory for all the tracks. It will behave inelegantly if that happens. This can be mitigated by operating in the memory save mode. You can also try using the 64-bit version of the Tracker.
  • How do I write a new box plugin? Simply make a normal OpenViBE box that has one input and one output stream (the documentation has instructions on how to write boxes). Once the box has been compiled into a plugin .dll that the Tracker loads on startup (check that you can see it in Designer), you can make it visible in the Tracker by registering it in the BoxPlugins.cpp file.
  • How do I write a new Tracker plugin? These plugins are native to the Tracker and can be registered in TrackerPlugins.cpp.  Unlike box plugins which are loaded from dlls, the Tracker plugins are simply compiled into the Tracker executable. The Tracker plugin interface is also very simple. Looking at the existing plugin file(s) should illustrate how they work.

Implementation details for developers

The following bits of information may be of interest to developers wishing to do something with the Tracker source code.

  • The Tracker is currently in a prototype stage, meaning that many of the features have been developed as proofs of concept, trying to minimize ‘dev time to usable example’ while keeping at least an elementary level of ‘design quality’. Code reuse from elsewhere in OpenViBE has been attempted to be maximized, even when this has not led to the cleanest possible solution.
  • The Tracker tries to reuse OpenViBE code as much as possible, with minimal modifications and no copy-pasting. For this reason, when the Tracker is compiled, its CMakeLists.txt will grab a lot of cpp/h files from the OpenViBE SDK and Designer. This is not pretty, but since the SDK and Designer do not currently expose the necessary methods via API calls, grabbing the code during the build was the remaining feasible option short of outright copy-pasting the code.
  • Examples of re-use: The box plugins are wrapped OpenViBE boxes. The complex processing via scenarios is done by Designer (and communication from sdk/modules/communication/ to talk with External Processing Boxes over TCP/IP). The visualizations are wrapped from Mensia Advanced Visualization algorithms.
  • Despite these shortcuts, the Tracker also introduces novel code and interfaces. For example, the OpenViBE concepts such as a ‘StreamBundle’ (corresponds to a set of streams in an .ov file), ‘Stream’, ‘Type’ and ‘Chunk’ now for the first time have actual C++ class implementations. This will hopefully make it easier to handle such objects on the C++ level.

Classes introduced in the Tracker

The Tracker works as an incubation laboratory for several classes that may (or may not) be later integrated into the OpenViBE kernel. Here we describe these classes in more detail.

First some background. In OpenViBE, all data is processed in small, streamed chunks. The content of these chunks depends on the stream type. The streams inherit from each other as described in Stream Structures. However, on the C++ level, the chunks and the streams have traditionally had only an implicit representation. Let's consider an example: an OpenViBE plugin (box) that sends out a signal stream. To pass data out from the box, you need to make a stream-type specific encoder for each outputted stream, specify chunk data and parameters to it, and finally tell the encoder to encode either header, buffer or end, after which you can tell the kernel to send the encoded result downstream. The decoder works the opposite way around. Even this is a lot of template magic, building on lower-level implicit representations, e.g. in EBML. The key point here is that the chunks and the stream types do not exist as classes or even structs in classic OpenViBE code!

The OpenViBE Tracker has taken another track (no pun intended), trying to define these concepts as actual objects. The assumed benefit is that as objects, these concepts are easier to understand, manipulate, encapsulate and pass around. For example, instead of managing encoders and their parameters, the developer could write something like “TypeSignal::Buffer buffer; fill(buffer); output[0].push(buffer);“. When we think of looking at files offline (i.e. all data already available), it is very convenient to have explicit class representations for the whole data. In the following we describe the classes the Tracker uses to achieve this.

TypeX. The starting point to build typed streams is a typed chunk. Streams in OpenViBE support three categories of chunks: Header, Buffer and End chunks. Since an aggregate of these three categories does not make a reasonable typed object (for example, we don’t want each and every Signal buffer to include a Header and an End as well), we’ve chosen to define three classes per stream type: Header, Buffer, and End which are encapsulated inside an abstract class. To replicate how the streams inherit from each other, we’ve made the inheritance on the chunk category level. This is best explained by the following example class definition.

class TypeSignal {
public:
   class Header : public TypeMatrix::Header {
   public:
      uint32_t samplingRate;
   };
   class Buffer : public TypeMatrix::Buffer { };
   class End    : public TypeMatrix::End    { };
};
So the abstract class TypeSignal has 3 concrete (data) subclasses that inherit from the subclasses of the similarly abstract parent TypeMatrix. Since a Header for the Signal type is a composition of “Matrix Header + Sampling Rate”, we define the Signal Header to extend the header of the parent class (Matrix) and just add the missing samplingRate variable. At the very top of this chunk inheritance hierarchy is the class Chunk that each Header, Buffer and End ultimately derive from. The Chunk base class has just two members: timestamps start and end, which are then accessible in all the derived classes.

Examples of TypeX are TypeMatrix, TypeSignal, and TypeStimulations, among others. Since the types are data classes that have no code, we have not implemented them as templates such as Type<X>, as then we'd just need to write a specialization for each. Instead, we have made the Stream consisting of chunks a template.
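To make the scheme concrete, the hierarchy described above could be sketched as self-contained C++ along the following lines (member names are illustrative assumptions; the Tracker's actual headers differ):

```cpp
#include <cstdint>
#include <vector>

using ovtime_t = uint64_t;

// The root of the chunk hierarchy: every chunk carries two timestamps.
class Chunk
{
public:
    ovtime_t m_startTime = 0;
    ovtime_t m_endTime = 0;
};

// The parent type: matrix chunks.
class TypeMatrix
{
public:
    class Header : public Chunk { public: std::vector<uint32_t> m_dimensionSizes; };
    class Buffer : public Chunk { public: std::vector<double> m_buffer; };
    class End    : public Chunk { };
};

// Signal extends Matrix per chunk category; Header adds the sampling rate.
class TypeSignal
{
public:
    class Header : public TypeMatrix::Header { public: uint32_t m_samplingRate = 0; };
    class Buffer : public TypeMatrix::Buffer { };
    class End    : public TypeMatrix::End    { };
};
```

A TypeSignal::Header thus exposes the matrix dimensions of its parent plus the sampling rate, along with the timestamps of the Chunk base.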

Stream<T>. Stream is a template container for the typed chunks defined above. In particular, T must be one of the TypeX classes: it has to have the implicit interface containing the subclasses Header, Buffer and End. Internally the stream will refer to the concrete types such as T::Buffer. In the Tracker, a Stream object is used to contain the data of a stream from its beginning to its end. A Stream can also be used as a FIFO in the sense that it has a position, you can read from it while advancing the position, and you can add chunks to the end. However, the Tracker mostly uses Stream as an offline container allowing random access. Note that it would have been possible to implement an even more efficient way to handle EEG data by simply decoding the whole stream into one block of contiguous memory (a matrix). In this case we chose to follow the classical OpenViBE approach of small chunks so that the Tracker is more compatible with the existing OV conventions and code. As a result, it allows an explicit and rather exact look at the structure of the data contained in .ov files.
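A minimal sketch of such a container, assuming only that T provides the Header/Buffer/End subclasses (the method names here are illustrative, not the Tracker's actual API):

```cpp
#include <cstddef>
#include <vector>

// Sketch of a Stream<T> container as described above. T is assumed to
// fulfil the implicit TypeX interface (Header, Buffer, End subclasses).
template <typename T>
class Stream
{
public:
    typename T::Header m_header;
    typename T::End m_end;

    void push(const typename T::Buffer& buffer) { m_buffers.push_back(buffer); }

    // FIFO-style access: return the chunk at the current position and
    // advance, or nullptr when the stream has been exhausted.
    const typename T::Buffer* getNextChunk()
    {
        if (m_position >= m_buffers.size()) { return nullptr; }
        return &m_buffers[m_position++];
    }

    // Random access, as used when browsing offline data.
    const typename T::Buffer& getChunk(std::size_t index) const { return m_buffers[index]; }
    std::size_t getChunkCount() const { return m_buffers.size(); }

private:
    std::vector<typename T::Buffer> m_buffers; // the whole stream, chunk by chunk
    std::size_t m_position = 0;
};

// A stand-in type fulfilling the implicit interface, for demonstration only.
struct DemoType
{
    struct Header { };
    struct Buffer { std::vector<double> samples; };
    struct End { };
};
```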

StreamBundle. StreamBundle is a collection of Streams, corresponding to an .ov file. In Tracker, it implements a multistream concept of a single Track.

ovtime_t. The Tracker typedefs uint64_t as ovtime_t. This is simply a timestamp in the 32:32bit fixed point time used commonly all around OpenViBE. Giving it a typename should help to clarify the intent of the variables to the reader.
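The 32:32 convention means the high 32 bits hold whole seconds and the low 32 bits the fractional part, so conversions to and from seconds can be sketched as follows (illustrative helpers, not Tracker API):

```cpp
#include <cstdint>

using ovtime_t = uint64_t; // 32:32 fixed-point timestamp

// Illustrative conversions for the 32:32 fixed-point convention:
// multiplying or dividing by 2^32 moves between seconds and ovtime_t.
ovtime_t secondsToOvTime(double seconds)
{
    return static_cast<ovtime_t>(seconds * 4294967296.0); // seconds * 2^32
}

double ovTimeToSeconds(ovtime_t t)
{
    return static_cast<double>(t) / 4294967296.0; // t / 2^32
}
```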

The hierarchy of these classes in the Tracker is as follows: a Workspace contains one or more StreamBundles (Tracks). A Track contains one or more Streams of different types. A Stream contains a Header, a vector of Buffers, and an End. Each Header, Buffer, and End inherits from Chunk and consequently contains two timestamps. However, this hierarchy still does not define ‘EEG channel’ as a class or object. Instead, for the Signal type for example, each Buffer will contain a matrix, and it is the rows of the matrix that are interpreted as channels (and its columns as samples) by components that work with the Signal type, for example the Stream Renderer for Signal.

The Tracker also features some other convenience classes:

EncodedChunk. This is a type containing encoded data and two timestamps. It is used to pass data between classical OpenViBE components (and .ov files) that deal with encoded data. It does not fulfil the implicit interface of TypeX, since it lacks Header, Buffer and End; instead it features a vector of encoded bytes.
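As a sketch, such a type could look like the following (member names are illustrative assumptions, not the actual code):

```cpp
#include <cstdint>
#include <vector>

using ovtime_t = uint64_t;

// Sketch of the EncodedChunk idea: two timestamps plus the raw encoded
// bytes of one chunk, as passed between encoded-data components.
struct EncodedChunk
{
    ovtime_t m_startTime = 0;
    ovtime_t m_endTime = 0;
    std::vector<uint8_t> m_buffer; // EBML-encoded payload
};
```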


In this short manual we've briefly described OpenViBE Tracker, a new tool for batch processing and analysis of multiple datasets. We hope that the Tracker will become useful to the EEG community and enrich the OpenViBE experience with new possibilities.

Feedback is welcome!

