Monday, October 18, 2010

Emotiv Headset and Locked In Syndrome

In March, a salesman working for IBM had a stroke that left him completely paralysed and unable to speak. His brain, however, is working fine - a condition called Locked-In Syndrome. His only means of communication is his eyes - looking up for yes, and down for no. He has to wait for someone to ask whether he would like to speak before he can do so. Then, using a letter chart, that person must point at letters one by one until a confirmation is received - giving the first letter of the sentence. The process is repeated for every subsequent letter until the full sentence is spelt out.

The salesman in question is called Shah, and Sarah (my wife) is his Occupational Therapist (OT). She had seen the mind-reading headset from Emotiv that I was using at IBM and thought about the potential it could have for Shah. The headset was designed for the gaming industry: it measures facial expressions and excitement/boredom levels, and can be trained to recognise particular thoughts, each of which can trigger an action on the computer.

Sarah asked me to demonstrate it to the consultant, speech therapist, psychologist and hospital ward manager, who agreed it might have some potential and took it to the ethics committee. Shah was told about the headset and what it can do - and that we weren't aware of anyone with the condition who had tried it before. Shah, being a bit of a techie, was up for pioneering it, so I met him last week for his first go.

Just like the many other people I have demonstrated the headset to, Shah was instantly able to train the system so that one action - the "push" action - would push the floating "Emotiv cube" into the distance. As we trained, his ability to push (and to stop pushing) at will improved.

Adding a second action introduces a complication. Now the system has to distinguish not just whether you are thinking the trained thought or not, but which thought you are having - much harder! And, as for everyone, it will take a bit of practice to get good enough at this to use it reliably. However, Shah is up for the challenge, and has a fantastically supportive family who are willing to help him train.
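
To give a feel for why the second action is harder, here is a tiny, purely illustrative Python sketch. The action names, scores and thresholds are hypothetical stand-ins for whatever confidence values the headset software reports (this is not the Emotiv SDK): with one trained action you only check a single score against a threshold, but with two you also have to decide which of the competing scores to believe.

```python
# Purely illustrative - these scores stand in for whatever confidence
# values a headset's detection software might report; not the Emotiv API.

THRESHOLD = 0.6  # hypothetical confidence needed to accept an action
MARGIN = 0.2     # hypothetical gap required between competing actions

def detect_one_action(push_score):
    """Single trained action: is the user 'pushing' or not?"""
    return "push" if push_score >= THRESHOLD else None

def detect_two_actions(push_score, pull_score):
    """Two trained actions: not just 'thinking or not', but which thought.
    The winning score must clear the threshold AND beat its rival by a
    clear margin, otherwise we report nothing."""
    name, score = max([("push", push_score), ("pull", pull_score)],
                      key=lambda pair: pair[1])
    rival = min(push_score, pull_score)
    if score >= THRESHOLD and (score - rival) >= MARGIN:
        return name
    return None

# A strong, unambiguous push is accepted...
print(detect_two_actions(push_score=0.8, pull_score=0.3))   # -> "push"
# ...but two muddled, competing thoughts are ignored.
print(detect_two_actions(push_score=0.7, pull_score=0.65))  # -> None
```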

Last week, after one week of use, Shah had managed to train the headset very well - though when he is tired the Emotiv skill can go down as well as up. Working with the speech therapist, we have connected the output of the headset to the input of "The Grid 2" - a piece of software by Sensory Software which allows a user, normally via eye tracking, keyboard, mouse or switches, to control their environment, write emails and surf the internet. We have initially set up three different menu items, and thinking "push" starts highlighting these options in turn at 10-second intervals. Thinking "push" again selects whichever option is highlighted at that moment. Sounds easy perhaps - but if I tell you *not* to think of a rabbit, you can't help but think of a rabbit! It's hard to think about when to start and stop thinking, and then switch to the thought you are supposed to be thinking about! I think this will take some practice to achieve, but once achieved it could be widened to many more options with very little extra practice.

People often ask about using the headset for people with different conditions and in different places. With Shah being a fellow IBMer who likes gadgets, his OT being my wife, and his particular medical condition, the circumstances were ideal for trying out the headset. It will take time for us all to get it right - but for now it is looking good. I will try to keep you updated!