Eye blinks, specifically deliberate ones, can be isolated and detected in EEG signals (https://ieeexplore.ieee.org/document/5164828). Because both the opening and closing of the eyelids can be detected, the duration of a given blink can be estimated from the EEG readings. Morse code consists of a series of dots and dashes, and it has been communicated through blinks before, notably by prisoners of war (https://www.archives.gov/exhibits/eyewitness/html.php?section=8). Short blinks correspond to dots and longer ones to dashes.
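The dot/dash mapping above can be sketched in a few lines. The 0.4-second cutoff here is an illustrative assumption, not a value from the cited paper; a real system would calibrate this threshold per user.

```python
# Hypothetical cutoff between a "short" and a "long" blink, in seconds.
# This value is an assumption for illustration and would need per-user tuning.
DASH_THRESHOLD_S = 0.4

def classify_blinks(durations):
    """Map a sequence of blink durations (in seconds) to Morse symbols."""
    return "".join("." if d < DASH_THRESHOLD_S else "-" for d in durations)

# Two short blinks and one long blink become dot, dot, dash.
print(classify_blinks([0.1, 0.15, 0.6]))  # → "..-"
```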
Stephen Hawking, upon losing his ability to speak, was given a communication device by Intel that used movements in his cheek muscles to navigate a keyboard. However, according to one of the developers, much of the system's design "hinged on Stephen. [They] had to point a laser to study one individual" (https://www.wired.com/2015/01/intel-gave-stephen-hawking-voice/). There is therefore a need for a cheaper, less individualized communication system for people who have quadriplegia and are mute, have certain types of cerebral palsy, or have a form of motor neuron disease. Many people with these conditions can still blink or control their eye movements and retain full cognitive function, so an EEG can detect their blinks and translate them into Morse code. The Morse code can then be translated into English words, which can be fed into a text-to-speech program.
Our project will aid the community by providing a simple means of communication for those who are unable to speak or move their limbs, and it can be used more widely than existing systems tailored to one specific person. This makes it a cheaper, more accessible form of AAC (augmentative and alternative communication). Currently, technology-based AAC systems are very expensive, and systems based on communication boards/books fail to capture everything necessary for effective communication. By giving users the means to communicate any word of their choosing, our project can aid in their treatment and allow them to use this speech in their everyday lives. These benefits will grow as users become more adept with Morse code through continued use.
This can be done using the EMOTIV Insight 5 Channel Mobile Brainwear, which comes with accompanying software that reports baseline information from the headset. However, the data must be fed to a computing device (an Arduino), which requires the BCI-OSC program. Using this software, data from the headset can be transmitted to an Arduino over an OSC stream, and the Arduino can monitor a specific channel from the headset for a pattern of deliberate blinks. From this pattern, the Arduino can use predetermined timing thresholds to determine which blinks are dots or dashes and where the gaps between letters and words fall. A signal to start and stop the program is also needed, but that is of lower priority because of the wide variety of facial gestures that could serve this purpose. Once the Morse code data has been collected, it can be translated into letters and words. From there, a text-to-speech program with a speaker can read the message out loud.
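The decoding step described above can be sketched as follows, assuming the upstream stage has already extracted each blink's duration and the pause that follows it. All timing thresholds here are illustrative assumptions, not measured values, and the `decode_blinks` helper is a hypothetical name for this sketch.

```python
# International Morse code table for letters.
MORSE_TO_LETTER = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E", "..-.": "F",
    "--.": "G", "....": "H", "..": "I", ".---": "J", "-.-": "K", ".-..": "L",
    "--": "M", "-.": "N", "---": "O", ".--.": "P", "--.-": "Q", ".-.": "R",
    "...": "S", "-": "T", "..-": "U", "...-": "V", ".--": "W", "-..-": "X",
    "-.--": "Y", "--..": "Z",
}

# Assumed timing thresholds (seconds); a real system would calibrate these.
DASH_THRESHOLD_S = 0.4  # a blink longer than this counts as a dash
LETTER_GAP_S = 1.0      # a pause longer than this ends the current letter
WORD_GAP_S = 2.5        # a pause longer than this also ends the current word

def decode_blinks(events):
    """Decode a list of (blink_duration, pause_after) tuples into text.

    Each blink becomes a dot or dash; pauses between blinks delimit
    letters and words according to the thresholds above.
    """
    words, letters, symbols = [], [], []
    for duration, pause in events:
        symbols.append("." if duration < DASH_THRESHOLD_S else "-")
        if pause >= LETTER_GAP_S:
            letters.append(MORSE_TO_LETTER.get("".join(symbols), "?"))
            symbols = []
        if pause >= WORD_GAP_S:
            words.append("".join(letters))
            letters = []
    # Flush any trailing symbols and letters at the end of the message.
    if symbols:
        letters.append(MORSE_TO_LETTER.get("".join(symbols), "?"))
    if letters:
        words.append("".join(letters))
    return " ".join(words)

# Four short blinks (H), a letter gap, then two short blinks (I):
events = [(0.1, 0.2), (0.1, 0.2), (0.1, 0.2), (0.1, 1.2),
          (0.1, 0.2), (0.1, 3.0)]
print(decode_blinks(events))  # → "HI"
```

In practice the same structure would run on the Arduino in C++, but the logic is identical: classify each blink by duration, then split the symbol stream into letters and words by pause length.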