A PROPOSED SIGN LANGUAGE MODEL FOR SPEECHLESS PERSONS USING EEG SIGNALS
DOI: https://doi.org/10.31987/ijict.4.3.148

Keywords: BCI, EEG, PCA, Deep learning

Abstract
Recently, machine learning algorithms have been widely used in the field of electroencephalography (EEG)-based Brain-Computer Interfaces (BCI). In this paper, a sign language software model based on EEG brain signals was implemented to help speechless persons communicate their thoughts to others. Preprocessing of the EEG signals was performed by applying the Principal Component Analysis (PCA) algorithm to extract the important features and reduce data redundancy. A model was designed to classify ten classes of EEG signals, covering Facial Expression (FE) and several Motor Execution (ME) processes. A deep learning classifier consisting of a neural network with three hidden layers was used in this work. Datasets from four different subjects were collected using a 14-channel Emotiv EPOC+ device, and a classification accuracy of 95.75% was obtained for the collected samples. An optimization step was then performed on the predicted class with the aid of the user, after which the sign class is mapped to its specified sentence through a predesigned lookup table.
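The abstract names two core computational steps — PCA-based feature reduction and a three-hidden-layer network classifying ten classes — without implementation details. The following is a minimal NumPy sketch of that pipeline shape, not the authors' implementation: the synthetic data, window length, reduced dimension `k`, and layer widths are all assumptions chosen only for illustration (the channel count of 14 matches the Emotiv EPOC+ mentioned in the abstract).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for recorded EEG epochs: 200 samples, each a
# flattened 14-channel window (14 channels as on the Emotiv EPOC+;
# 32 time points per channel is an assumption).
n_samples, n_features = 200, 14 * 32
X = rng.standard_normal((n_samples, n_features))

# --- PCA via SVD: keep the top-k principal components ---
k = 40                                  # reduced dimension (assumed)
Xc = X - X.mean(axis=0)                 # center the data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
X_reduced = Xc @ Vt[:k].T               # project onto first k components

# --- Forward pass of a network with three hidden layers, ten classes ---
def relu(z):
    return np.maximum(0.0, z)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

sizes = [k, 64, 32, 16, 10]             # hidden widths are assumptions
params = [(rng.standard_normal((a, b)) * 0.1, np.zeros(b))
          for a, b in zip(sizes[:-1], sizes[1:])]

h = X_reduced
for W, b in params[:-1]:
    h = relu(h @ W + b)                 # three hidden ReLU layers
W_out, b_out = params[-1]
probs = softmax(h @ W_out + b_out)      # per-sample class probabilities

print(X_reduced.shape)                  # (200, 40)
print(probs.shape)                      # (200, 10)
```

In a real system the network weights would be trained on labeled EEG epochs, and the predicted class index would drive the lookup table mapping each sign class to its sentence.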