About 1 in 4 people with cerebral palsy have difficulty speaking. It is therefore essential to provide them with up-to-date, convenient communication technology that enables inclusion and a better quality of life. This research focuses on transforming accessible communication devices to make them wearable, fast and easy to use.
Currently available assistive communication devices often lack privacy, convenience, portability and social acceptability. To overcome these issues, this project is developing a novel wearable device that combines three innovative technologies: Mixed Reality, eye-gaze tracking and a brain-computer interface.
This PhD project started with a systematic review of Augmentative and Alternative Communication (AAC) using virtual reality. This review, in turn, informed a novel design for an AAC device that utilised Mixed Reality (MR), Eye-Gaze Technology (EGT) and a Brain-Computer Interface (BCI). The system was based on an MR environment provided by Microsoft HoloLens. The first generation of the prototype adopted Pupil Labs as the eye-tracking system, whilst the second generation used the eye-tracking system embedded in HoloLens 2. The initial prototype was piloted and revised based on feedback from people with cerebral palsy and engineering students, collected in an online survey.
The AAC device was developed to enable BCI by capturing an electroencephalogram (EEG) using a dry-electrode EEG system (g.Nautilus) from g.tec. Although the Signal-to-Noise Ratio (SNR) of this dry-electrode EEG system was lower than expected, feedback from study participants who completed the experiments suggested an overall positive user experience. Datasets from the MEDICON 2019 Scientific Challenge were used to verify novel online and offline BCI algorithms.
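To make the offline verification step concrete, the following is a minimal illustrative sketch of an offline BCI classification pipeline, not the project's actual algorithm. It assumes epoched EEG of shape (trials, channels, samples) and uses synthetic data in place of the MEDICON 2019 recordings; the evoked-response template and all parameter values are hypothetical.

```python
# Hedged sketch: offline classification of epoched EEG, assuming a
# (trials, channels, samples) layout. Synthetic data only; this is
# NOT the project's algorithm or the MEDICON 2019 dataset.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 200, 8, 128

# Synthetic epochs: half the trials carry a small evoked deflection
# (a crude stand-in for an ERP-like response) on top of noise.
X = rng.normal(0.0, 1.0, size=(n_trials, n_channels, n_samples))
y = np.repeat([0, 1], n_trials // 2)
evoked = np.sin(np.linspace(0, np.pi, n_samples))  # template deflection
X[y == 1] += 0.5 * evoked  # add the response to target trials

# Flatten each epoch into a feature vector and classify with LDA,
# a common baseline classifier for ERP-based BCIs.
features = X.reshape(n_trials, -1)
clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, features, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```

Cross-validated accuracy here stands in for the kind of offline performance metric reported against the challenge datasets; a real pipeline would add filtering, artifact rejection and channel selection before classification.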
Furthermore, additional offline analysis indicated that conductive saline could improve the accuracy to an acceptable level (60% on average). Nevertheless, an ongoing time synchronisation problem was identified as the major issue with this system. This ongoing project will investigate the problem further and make the device more robust and reliable.
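To illustrate the kind of time synchronisation issue described above (not the project's diagnostic code), consider two devices, such as an EEG amplifier and a headset, stamping the same events on independent clocks. A constant clock offset can be estimated and removed by comparing paired event times; all numeric values below are hypothetical.

```python
# Hedged sketch: estimating a constant clock offset between two devices
# that timestamp the same events. Synthetic timestamps; the 250 ms
# offset and 2 ms jitter are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

# True event times on a shared reference clock (seconds).
events = np.sort(rng.uniform(0, 60, size=50))

# Device A stamps events with small jitter; device B's clock runs
# 0.250 s ahead, plus its own jitter.
stamps_a = events + rng.normal(0, 0.002, size=events.size)
stamps_b = events + 0.250 + rng.normal(0, 0.002, size=events.size)

# With paired events, the offset is simply the median difference,
# which is robust to a few mismatched or dropped events.
offset = np.median(stamps_b - stamps_a)
aligned_b = stamps_b - offset

residual = np.abs(aligned_b - stamps_a).max()
print(f"estimated offset: {offset:.3f} s, max residual: {residual * 1000:.1f} ms")
```

In practice the harder cases involve clock drift (a slowly changing offset) and unpaired events, which is why synchronisation remains an open problem for this system rather than a one-line fix.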