My wife works with people who have had head injuries or strokes, who cannot move from the neck down and sometimes cannot talk; some remain mentally sharp, which must make it all the more frustrating. Even in this day of great technology, letters are drawn on paper, and letter by letter the person moves their head from side to side to indicate whether it is the next letter in the sentence. As you can imagine, this takes a very long time, and it cannot be used to initiate a conversation.
Recently, I've been playing with a very good piece of software called Dasher, originally aimed at eye tracking. Letters flow in from the right of the screen and are selected one by one by moving the mouse up and down, gradually forming a sentence. It is also predictive, so if, for example, you have selected H and then e, the letters of "llo" appear large and close together. It runs on multiple operating systems.
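To give a feel for the predictive part (this is a toy illustration, not Dasher's actual language model, which is far more sophisticated): you can rank the likely next letters by how often each character follows the current one in some sample text, and then give the likelier letters more screen space. The function name and corpus below are made up for the example.

```python
from collections import Counter, defaultdict

def next_letter_weights(corpus, prefix):
    """Toy predictor: given some sample text and the prefix typed so
    far, return each candidate next character with the fraction of
    times it followed the prefix's last character in the corpus."""
    follows = defaultdict(Counter)
    for a, b in zip(corpus, corpus[1:]):
        follows[a][b] += 1
    if not prefix or not follows[prefix[-1]]:
        return {}
    counts = follows[prefix[-1]]
    total = sum(counts.values())
    return {ch: n / total for ch, n in counts.items()}

# After "He", 'l' dominates; Dasher would draw it as a big target.
print(next_letter_weights("hello hello help", "he"))  # {'l': 1.0}
```

In Dasher itself the weights come from a proper adaptive language model over whole contexts, but the principle is the same: probable continuations get the big, easy-to-hit boxes.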
Secondly, my colleague Dave CJ has created a fantastic Python script to control the mouse pointer on an Ubuntu PC using the accelerometer on the Nokia N95 via Bluetooth.
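The core idea can be sketched in a few lines (this is my own hypothetical reconstruction, not Dave's actual script): assume the phone streams "ax,ay,az" accelerometer lines over a Bluetooth serial link, convert the tilt into pointer deltas with a dead zone so the pointer holds still when the phone is level, and nudge the pointer with xdotool.

```python
import subprocess

def tilt_to_delta(ax, ay, dead_zone=40, gain=0.05):
    """Map raw accelerometer x/y readings (roughly -1000..1000, where
    ~1000 is about 1 g) to pointer deltas in pixels. Readings inside
    the dead zone are ignored so small hand tremors don't move the
    pointer."""
    def scale(v):
        if abs(v) < dead_zone:
            return 0
        return int((abs(v) - dead_zone) * gain) * (1 if v > 0 else -1)
    return scale(ax), scale(ay)

def move_pointer(dx, dy):
    # xdotool is available on Ubuntu; "--" lets negative deltas through.
    if dx or dy:
        subprocess.call(
            ["xdotool", "mousemove_relative", "--", str(dx), str(dy)])

def run(lines):
    # `lines` is any iterable of "ax,ay,az" strings, e.g. a file-like
    # wrapper around the Bluetooth RFCOMM socket.
    for line in lines:
        ax, ay, _ = (int(v) for v in line.strip().split(","))
        move_pointer(*tilt_to_delta(ax, ay))
```

The dead zone and gain would need tuning per person; too small a dead zone makes the pointer drift, too large makes selection sluggish.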
So... by combining the two, I can tilt the N95 from side to side to move the mouse on the PC and select letters in a sentence. OK, balancing a phone on your head to do this looks slightly stupid, but if it opens up a means of communication without any specialist equipment, or the help of someone else, it surely must be a good thing!