Using Emotiv Insight and Arduino to read mental commands and control motors
11-13-2015, 05:56 AM
Hi there, Puzzlebox staff

Firstly, I would like to express my gratitude for this innovative website, and I look forward to contributing to the sharing of biomedical knowledge. I am currently doing my Master's degree in Biomedical Engineering and have started working on a project that aims to assist MCA stroke survivors. Part of the project involves an assistive device that delivers stimulation to the patient's limbs: in this case, four vibration motors (left, right, top and bottom) mounted on a brace worn on the limb(s).

The idea is to use the new Emotiv Insight, an Arduino controller and a laptop to control the vibration motors by reading the basic mental commands left, right, top and bottom. I have read the contents of the Electric Wheelchair project thoroughly, but I still feel I need some guidance on where to start with such a large project. So far I know of the following tasks:

(1).   Build and compile software that provides a GUI with visual feedback to the user and issues control commands to the Arduino hardware.
(2).   Install the standard drivers and software included with the Emotiv EEG headset.
(3).   Write an Arduino sketch, a simple program for the Arduino controller (which I already have).
(4).   Use the interface circuit (which I have already built) to activate the vibration motors.
(5).   Find an application that allows the headset wearer to have characters "typed" on the laptop when a learning algorithm matches the user's current brainwave patterns to previously trained sequences.
(6).   Have these "typed" characters activate buttons in the Puzzlebox Brainstorms GUI (or the backend control script), which tells the Arduino to drive the interface circuit and activate the vibration motors.
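The keystroke-to-Arduino steps above can be sketched on the laptop side in Python. This is only an illustrative sketch: it assumes EmoKey (or a similar tool) has been configured to "type" the single characters l, r, t and b for the four trained commands, and that the Arduino listens for one command byte per motor. The port name, baud rate and character choices are all assumptions, not part of any Emotiv or Puzzlebox API.

```python
# Hypothetical bridge: characters "typed" by the headset software are
# translated into single command bytes and written to the Arduino's serial
# port. The l/r/t/b mapping and port settings are illustrative assumptions.

# Map typed characters to single command bytes the Arduino sketch can
# switch() on when it calls Serial.read().
KEY_TO_COMMAND = {
    "l": b"L",  # left vibration motor
    "r": b"R",  # right vibration motor
    "t": b"T",  # top vibration motor
    "b": b"B",  # bottom vibration motor
}


def encode_command(key: str) -> bytes:
    """Translate one typed character into the byte sent over serial.

    Raises ValueError for characters that were not configured."""
    try:
        return KEY_TO_COMMAND[key.lower()]
    except KeyError:
        raise ValueError(f"no motor mapped to key {key!r}")


def main() -> None:
    # pyserial is imported here so the mapping above can be tested
    # without the library or any hardware attached.
    import serial  # pip install pyserial

    # "/dev/ttyUSB0" is an assumption; on Windows it might be e.g. COM3.
    with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
        while True:
            key = input("command key (l/r/t/b, q to quit): ").strip()
            if key == "q":
                break
            port.write(encode_command(key))
```

Calling `main()` would open the port and forward keys interactively; the `encode_command` mapping itself can be exercised without any hardware.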

If someone could point me to a first step for receiving and translating the mental commands from the Emotiv Insight (which software to use, what might help me design a GUI for this specific project, and so on), I would really appreciate it.

Thanking you in advance,

Kind regards
CM Heunis
12-15-2015, 02:37 AM
RE: Using Emotiv Insight and Arduino to read mental commands and control motors
Hello, maybe we can help each other. I've also made a wheelchair controlled by the Insight; I've mapped keystrokes to Arduino servos connected through an RC TX/RX link.
It works, but I need help with an Arduino sketch that turns serial keystrokes into Sabertooth motor control commands.
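One way to think about the serial-keystroke-to-Sabertooth step: the host forwards a character over serial, and the Arduino sketch translates it into Sabertooth command bytes. Below is a Python model of that translation, under two stated assumptions: the Sabertooth is configured for simplified serial mode (where byte 0 stops both motors, 1-127 drives motor 1 with 64 as stop, and 128-255 drives motor 2 with 192 as stop), and the w/a/s/d keystroke layout is purely hypothetical.

```python
# Model of the byte mapping an Arduino sketch would implement for a
# Sabertooth driver in simplified serial mode (an assumption about the
# DIP-switch setup): 0 stops both motors, 1-127 is motor 1 (64 = stop),
# 128-255 is motor 2 (192 = stop).


def sabertooth_byte(motor: int, speed: float) -> int:
    """Return the simplified-serial command byte for one motor.

    motor: 1 or 2; speed: -1.0 (full reverse) through 1.0 (full forward)."""
    if motor not in (1, 2):
        raise ValueError("motor must be 1 or 2")
    if not -1.0 <= speed <= 1.0:
        raise ValueError("speed must be within [-1, 1]")
    center = 64 if motor == 1 else 192  # the "stop" byte for each motor
    return center + round(speed * 63)


# Hypothetical keystroke layout: w/s = forward/back, a/d = pivot, space = stop.
KEYSTROKE_TO_BYTES = {
    "w": (sabertooth_byte(1, 1.0), sabertooth_byte(2, 1.0)),
    "s": (sabertooth_byte(1, -1.0), sabertooth_byte(2, -1.0)),
    "a": (sabertooth_byte(1, -0.5), sabertooth_byte(2, 0.5)),
    "d": (sabertooth_byte(1, 0.5), sabertooth_byte(2, -0.5)),
    " ": (sabertooth_byte(1, 0.0), sabertooth_byte(2, 0.0)),
}
```

On the Arduino, the same table would be a switch over the byte returned by Serial.read(), followed by writing the two command bytes to the Sabertooth's serial input.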

The thing you're looking for is EmoKey. It's in their SDK.

Please reach out to me at armenvegas@yahoo or armenvegas@gmail.

Thank you, I hope to hear back from you.
12-15-2015, 03:10 AM
RE: Using Emotiv Insight and Arduino to read mental commands and control motors
Hi there

Thank you so much for your response. I have had quite a bit of difficulty finding someone who could help me out. I am currently waiting for my Insight to arrive, but as I said, I would like to start programming the controller and researching the Insight SDK as soon as possible. You mentioned that I need the Insight's EmoKey. Do I then basically just connect the Insight and assign certain keys to certain emotions?

If so, how do I then get those key outputs picked up by the Arduino over the serial port?
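On that question: EmoKey only simulates keystrokes on the laptop, so something there still has to forward each character to the serial port (pyserial is one common option), and the Arduino sketch then reads each byte with Serial.read() and acts on it. The following Python sketch models the Arduino side's dispatch logic so it can be reasoned about without hardware; the pin numbers and the L/R/T/B protocol are assumptions for illustration only.

```python
# Python model of what the Arduino sketch's loop() would do with bytes
# arriving on the serial port: read one byte, look up which motor pin it
# selects, and (on real hardware) pulse that pin. Pin assignments and the
# L/R/T/B protocol are illustrative assumptions.

COMMAND_TO_PIN = {
    b"L": 3,  # left vibration motor on digital pin 3 (assumed)
    b"R": 5,  # right motor
    b"T": 6,  # top motor
    b"B": 9,  # bottom motor
}


def dispatch(stream: bytes) -> list:
    """Return the sequence of pins that would be pulsed for a byte stream,
    silently ignoring bytes outside the protocol (noise, line endings)."""
    pins = []
    for i in range(len(stream)):
        pin = COMMAND_TO_PIN.get(stream[i : i + 1])
        if pin is not None:
            pins.append(pin)
    return pins
```

The equivalent Arduino loop() would check Serial.available(), read one byte, and digitalWrite() the matching pin, ignoring anything it does not recognize.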

Thank you again,

Kind regards
