
Re: Project: Open Vibe, OpenBCI, Robotic Arm

Posted: Tue Nov 08, 2016 3:28 pm
by jtlindgren
Hi Zach,

Laurent Bougrain and colleagues, at least, have used OpenViBE to control a JACO robotic arm in real time using motor imagery. Here are some links to their materials:

http://openvibe.inria.fr/openvibe/wp-co ... ticArm.pdf
https://hal.inria.fr/hal-00759545/document

Dieter Devlaminck did it earlier with P300:

http://www.ijbem.org/volume13/number1/2 ... _02-04.pdf

Basically, you could adapt the Graz motor imagery paradigms in OpenViBE to send the feedback to the robotic arm. The online, real-time control is nothing special: you train a classifier offline using the training scenario, and then in the online scenario you process the data in real time and direct the commands to the arm over a communication channel of your choice. All the BCI pipeline demos shipped with OpenViBE are meant for real-time use.
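
Just to make that last step concrete, below is a rough Python sketch of what the receiving end could look like, under the assumption that the online scenario forwards its classifier decisions as raw 64-bit stimulation codes over TCP (for example via the TCP Writer box; the exact framing and byte order depend on your setup and OpenViBE version). The port number and the RoboticArm class are placeholders, to be replaced by your own transport and arm SDK (JACO API, ROS, etc.).

import socket
import struct

# Standard GDF stimulation codes used by the Graz motor imagery scenarios.
OVTK_GDF_LEFT = 0x301   # 769, "left hand" imagery
OVTK_GDF_RIGHT = 0x302  # 770, "right hand" imagery

class RoboticArm:
    """Hypothetical stand-in for your actual arm interface."""
    def move_left(self):
        print("arm: move left")

    def move_right(self):
        print("arm: move right")

def recv_exact(sock, n):
    """Read exactly n bytes, or return b'' if the stream has ended."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            return b""
        buf += chunk
    return buf

def run(host="localhost", port=5678):
    arm = RoboticArm()
    with socket.create_connection((host, port)) as sock:
        while True:
            data = recv_exact(sock, 8)           # one stimulation code per message (assumed framing)
            if not data:
                break
            code = struct.unpack("<Q", data)[0]  # little-endian uint64 (assumed byte order)
            if code == OVTK_GDF_LEFT:
                arm.move_left()
            elif code == OVTK_GDF_RIGHT:
                arm.move_right()

if __name__ == "__main__":
    run()

VRPN (or LSL, if your build has it) would work just as well; the point is only that the classifier output is a stream of stimulation codes you can map to arm commands.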


Happy hacking,
Jussi

Re: Project: Open Vibe, OpenBCI, Robotic Arm

Posted: Wed Mar 22, 2017 5:39 am
by madi
Hi All,

To add to this topic, I am also working on a similar project.
1. I would like to know whether this is achievable using the BrainMaster Discovery hardware.
2. How do we train the classifier? Do we need to set threshold values for each band (e.g., mu, beta), or do we keep the threshold value of each rhythm at the values from the literature, i.e., 8-12 Hz for mu and 13-30 Hz for beta?
3. Is there any tutorial or video that explicitly demonstrates how to train a classifier?

I am relatively new to this field, so any help would be very much appreciated.

Thanks
Madi