Abstract
This research proposes and develops a method for sentiment classification from electroencephalogram (EEG) signals, the Optimized Generative Pre-training EEG to Sentiment Classification (OGPTSC) model. The approach uses Generative Pre-Training 2 (GPT-2) as the underlying language model and applies an intelligent hyper-parameter selection method to fine-tune and optimize the model's performance. The OGPTSC is evaluated on a widely used benchmark dataset, where it demonstrates strong sentiment classification performance. The same hyper-parameter selection technique is also applied to established models, namely BERT, BART, and Multi-Layer Perceptrons, improving their overall reliability and generalizability.

A central contribution of this research is the refinement of model structure through a comprehensive tuning process that dynamically adapts each model's configuration based on classification loss and error during training. The refined models achieve higher accuracy and robustness than their untuned counterparts. Experimental results show that the OGPTSC model outperforms existing approaches evaluated on the same dataset, including the BERT, BART, and Multi-Layer Perceptron models, and sets a new benchmark for EEG-based sentiment classification. The optimized versions of the BERT, BART, and Multi-Layer Perceptron models likewise show marked improvements over their baselines, providing researchers and practitioners with a practical toolkit for EEG-based sentiment analysis tasks.

In summary, this research contributes to EEG analysis and sentiment classification through the development and evaluation of the OGPTSC model and the enhancement of existing state-of-the-art models. The optimized models demonstrate strong performance and versatility and lay a foundation for future work in natural language processing, sentiment analysis, and related areas.
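To make the described tuning strategy concrete, the following is a minimal illustrative sketch, not the thesis implementation, of hyper-parameter selection for a GPT-2 sentiment classifier driven by validation classification loss. It assumes the EEG recordings have already been converted into token sequences; the function names, dataset variables, and search grid are hypothetical.

    # Illustrative sketch only: hyper-parameter selection for a GPT-2 sentiment
    # classifier, selected by validation classification loss. The EEG-to-token
    # preprocessing, variable names, and search grid below are hypothetical.
    import torch
    from torch.utils.data import DataLoader, TensorDataset
    from transformers import GPT2TokenizerFast, GPT2ForSequenceClassification

    def make_dataset(texts, labels, tokenizer, max_len=64):
        # Tokenize sequences (assumed to be derived from EEG recordings).
        enc = tokenizer(texts, padding="max_length", truncation=True,
                        max_length=max_len, return_tensors="pt")
        return TensorDataset(enc["input_ids"], enc["attention_mask"],
                             torch.tensor(labels))

    def train_and_score(train_ds, val_ds, lr, batch_size, num_labels=3, epochs=2):
        model = GPT2ForSequenceClassification.from_pretrained("gpt2",
                                                              num_labels=num_labels)
        model.config.pad_token_id = model.config.eos_token_id  # GPT-2 has no pad token
        optimizer = torch.optim.AdamW(model.parameters(), lr=lr)
        model.train()
        for _ in range(epochs):
            for ids, mask, y in DataLoader(train_ds, batch_size=batch_size, shuffle=True):
                optimizer.zero_grad()
                loss = model(input_ids=ids, attention_mask=mask, labels=y).loss
                loss.backward()
                optimizer.step()
        # Validation classification loss is the selection criterion.
        model.eval()
        total, count = 0.0, 0
        with torch.no_grad():
            for ids, mask, y in DataLoader(val_ds, batch_size=batch_size):
                batch_loss = model(input_ids=ids, attention_mask=mask, labels=y).loss
                total += batch_loss.item() * y.size(0)
                count += y.size(0)
        return total / count, model

    def select_configuration(train_ds, val_ds):
        # Simple grid search; keep the configuration with the lowest validation loss.
        best_loss, best_cfg, best_model = float("inf"), None, None
        for lr in (1e-5, 3e-5, 5e-5):
            for batch_size in (8, 16):
                loss, model = train_and_score(train_ds, val_ds, lr, batch_size)
                if loss < best_loss:
                    best_loss, best_cfg, best_model = loss, (lr, batch_size), model
        return best_cfg, best_model

Before building the datasets, the tokenizer would need a pad token assigned, e.g. tokenizer = GPT2TokenizerFast.from_pretrained("gpt2") followed by tokenizer.pad_token = tokenizer.eos_token.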
Library of Congress Subject Headings
Electroencephalography--Data processing; Sentiment analysis; Natural language processing (Computer science)
Publication Date
8-2023
Document Type
Thesis
Student Type
Graduate
Degree Name
Electrical Engineering (MS)
Department, Program, or Center
Electrical Engineering
Advisor
Ali Sayyad
Advisor/Committee Member
Abdulla Ismail
Advisor/Committee Member
Jinane Mounsef
Recommended Citation
Hassan, Amira, "Optimized Generative Pre-training EEG to Sentiment Classification (OGPTSC)" (2023). Thesis. Rochester Institute of Technology. Accessed from
https://repository.rit.edu/theses/11672
Campus
RIT Dubai
Plan Codes
EEEE-MS