Facial action units (AUs) are numerical representations of facial features derived from facial landmark localization; even subtle muscular movements of the face are reflected in AU values. This project investigates, for the first time, the use of AUs to diagnose atrial fibrillation (AFib) in real time, and shows that AUs are associated with AFib occurrence. Unlike ECG-, PPG-, or VPG-based approaches, the machine-learning-enabled AFib detector proposed in this project enables continuous, contactless, real-time monitoring of AFib in home settings while maintaining detection accuracy. The dataset is formed by synchronizing ECG recordings with tablet image snapshots: AUs extracted from the images serve as predictors, and the ECG data provide a binary AFib label at each time instant. Because data are limited, the original dataset is balanced and augmented, and AU features are engineered via a feature selection method. The AFib data are partitioned into a training/validation set and a test set, and modern machine-learning classifiers are fitted to the data. Under five-fold cross-validation, the random forest (RF) stands out and is further fine-tuned, achieving the best performance curve and the highest area under the curve (AUC). The fine-tuned RF attains an F2-score of 0.811, a recall of 0.782, and an accuracy of 0.969. On test-set subjects not seen during training, RF is also shown to outperform a random classifier. As more training data become available, the RF classifier for AFib detection is promising for future real-life applications.
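The evaluation pipeline described in the abstract (train/test split, five-fold cross-validation, random forest, F2/recall/accuracy metrics) can be sketched as follows. This is a minimal illustration using scikit-learn with a synthetic, class-imbalanced dataset standing in for the AU feature matrix and ECG-derived labels; the feature count, class weights, and hyperparameters are illustrative assumptions, not the thesis's actual configuration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_predict, train_test_split
from sklearn.metrics import fbeta_score, recall_score, accuracy_score

# Synthetic stand-in for the dataset: rows are image snapshots described by
# AU intensities (17 features here is an illustrative count), and the binary
# label marks AFib occurrence at the synchronized ECG time instant.
# The class imbalance (weights) mimics AFib being the minority class.
X, y = make_classification(n_samples=1000, n_features=17,
                           weights=[0.9, 0.1], random_state=0)

# Partition into a training/validation set and a held-out test set.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

# Five-fold cross-validation on the training/validation set.
rf = RandomForestClassifier(n_estimators=200, random_state=0)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
y_cv = cross_val_predict(rf, X_train, y_train, cv=cv)
print(f"CV F2-score: {fbeta_score(y_train, y_cv, beta=2):.3f}")

# Fit on the full training set and evaluate on held-out data with the
# metrics the abstract reports: F2-score (recall-weighted), recall, accuracy.
rf.fit(X_train, y_train)
pred = rf.predict(X_test)
print(f"Test F2-score: {fbeta_score(y_test, pred, beta=2):.3f}")
print(f"Test recall:   {recall_score(y_test, pred):.3f}")
print(f"Test accuracy: {accuracy_score(y_test, pred):.3f}")
```

The F2-score weights recall four times as heavily as precision, which matches the clinical priority in the abstract: missing an AFib episode (a false negative) is costlier than a false alarm.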
Library of Congress Subject Headings
Atrial fibrillation--Diagnosis; Facial expression; Face--Abnormalities; Machine learning
Electrical Engineering (MS)
Department, Program, or Center
Electrical Engineering (KGCOE)
Zhu, Chongyu, "Machine-learning Classifier to Detect Atrial Fibrillation based on Facial Action Units" (2022). Thesis. Rochester Institute of Technology. Accessed from
RIT – Main Campus