
NeuroSky MindSet arm movement recognition

Posted: Sun Nov 14, 2010 11:51 pm
by asrinivasan
Hello,

I am new to OpenViBE; I am considering it as an alternative to MATLAB for EEG signal processing, and from what I've seen so far it looks very promising.

However, I have two questions: one concerning hardware and one concerning OpenViBE.

a. Is it possible, using the MindSet and OpenViBE, to recognize a subject's arm movement? For example, if I were to give a cue and the subject moved his/her arm up or down, or imagined doing so, would this register as a deviation from the "normal" signal? I ask because I realize that the MindSet has only a single electrode.

b. If the above is possible, are there instructions somewhere (or could you provide them) on how to interface the MindSet with OpenViBE? My goal is to achieve a result similar to the "museum" video (where the subject imagined moving his feet and the character in the application moved forward correspondingly), except that rather than making a character move in an application, the software would type a corresponding ASCII character. OpenViBE would also need to "learn" the pattern associated with the movement, just as in the video.

My end goal is this: when a subject imagines moving his/her arm, the software types a character. From there, I intend to use that character as a signal to a microcontroller that moves a robotic arm.
I am a student with very limited funds :), so it would be very helpful if the MindSet would suffice in this situation.
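
For context, the glue on the robot side would be something like the sketch below. It is only an illustration: the serial port name, baud rate, and the assumption that the BCI software can call a function once per detected event are all placeholders of mine, not anything OpenViBE provides out of the box.

```python
# Hypothetical glue code: forward one character per detected imagery
# event to a microcontroller over a serial link (requires pyserial).
# The port name, baud rate, and character protocol are placeholders.
import serial

ser = serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=1)

def on_movement_detected():
    """Invoked whenever the BCI pipeline reports an imagined arm movement."""
    ser.write(b"a")  # 'a' = "move arm"; the microcontroller decodes it
```

On the microcontroller side, the firmware would simply read one byte at a time and map each character to a motor command.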

Thank you

Re: NeuroSky MindSet arm movement recognition

Posted: Mon Nov 15, 2010 8:59 am
by lbonnet
Hello asrinivasan, and welcome aboard!
asrinivasan wrote:
a. Is it possible, using the MindSet and OpenViBE, to recognize a subject's arm movement? For example, if I were to give a cue and the subject moved his/her arm up or down, or imagined doing so, would this register as a deviation from the "normal" signal? I ask because I realize that the MindSet has only a single electrode.
Indeed, one electrode is not enough. Moreover, it is placed at a frontal position, far from the motor cortex areas involved in hand movements, so I don't think you will find a recognizable pattern in the EEG.
asrinivasan wrote:
My goal is to achieve a result similar to the "museum" video (where the subject imagined moving his feet and the character in the application moved forward correspondingly), except that rather than making a character move in an application, the software would type a corresponding ASCII character.
For the museum application, we used a certified EEG headset and amplifier with 16 electrodes. You will surely be able to work out the right signal processing and implement it in OpenViBE, but the hardware problem remains. To build such a BCI (motor imagery), you absolutely need more electrodes, placed at the right positions on the scalp (over the motor cortex, as the video shows). In my opinion, the MindSet won't be enough for that purpose.
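
To give you an idea of what that signal processing looks for: during imagined movement, the power of the mu rhythm (roughly 8-12 Hz) recorded over the motor cortex decreases (event-related desynchronization). Below is a minimal offline sketch of that band-power feature, assuming you had multichannel EEG over the motor cortex as a NumPy array; the sampling rate and band edges are only illustrative.

```python
# Minimal sketch of the mu-band log power feature used in motor
# imagery BCIs. Assumes `eeg` is a (channels x samples) NumPy array
# from electrodes over the motor cortex (e.g. C3/C4); the sampling
# rate and band edges below are illustrative, not prescriptive.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 512.0  # sampling rate in Hz (illustrative)
b, a = butter(4, [8 / (fs / 2), 12 / (fs / 2)], btype="band")  # mu band

def mu_band_power(eeg):
    """Log band power per channel; it drops during imagined movement."""
    filtered = filtfilt(b, a, eeg, axis=1)  # band-pass 8-12 Hz
    return np.log(np.mean(filtered ** 2, axis=1))
```

A classifier (e.g. LDA) trained on such features from electrodes over each hemisphere is the usual way to "learn" the pattern, and that is exactly what the single frontal electrode of the MindSet cannot provide.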

Laurent-