Ensuring Road Safety By Reading Driver's Facial Emotions Using Deep Learning

Sharon Sofia S S, S. Jasmine Mary, Dr. T. C. Subbu Lakshmi

Abstract

Researchers studying facial expression detection with deep neural networks have shown in recent years that these methods overcome the drawbacks of traditional machine learning techniques. In this paper, a VGG-19 network, trained using the similarity of the sample data, is used to monitor a driver's emotions and improve recognition accuracy. Data preparation comes first, followed by facial landmark detection and the extraction of geometric features from the input photos. These feature vectors were fed into the proposed VGG-19 classifier for facial expression classification, and its performance was compared with state-of-the-art techniques. The findings show that the proposed approach performs comparably to existing deep learning techniques, achieving 97.83% accuracy, 97.73% precision, 96.5% recall, and a 96.97% F1-score.
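The geometric-feature step in the pipeline above can be illustrated with a minimal sketch. This is an assumption about one common way to build such features, not the paper's actual configuration: given 2-D facial landmark coordinates, compute all pairwise distances and normalise them by the inter-ocular distance so the vector is scale-invariant. The landmark indices and coordinates below are made up for illustration.

```python
import math

def pairwise_distances(landmarks):
    """Return all pairwise Euclidean distances between landmark points."""
    feats = []
    for i in range(len(landmarks)):
        for j in range(i + 1, len(landmarks)):
            (x1, y1), (x2, y2) = landmarks[i], landmarks[j]
            feats.append(math.hypot(x2 - x1, y2 - y1))
    return feats

def geometric_features(landmarks, left_eye=0, right_eye=1):
    """Scale-invariant feature vector: distances / inter-ocular distance.

    left_eye / right_eye are hypothetical indices into the landmark list;
    a real detector (e.g. a 68-point model) would fix these positions.
    """
    (lx, ly), (rx, ry) = landmarks[left_eye], landmarks[right_eye]
    iod = math.hypot(rx - lx, ry - ly)  # inter-ocular distance
    return [d / iod for d in pairwise_distances(landmarks)]

# Toy example: five made-up landmarks (two eyes, nose tip, mouth corners)
lm = [(30.0, 40.0), (70.0, 40.0), (50.0, 60.0), (38.0, 80.0), (62.0, 80.0)]
vec = geometric_features(lm)
print(len(vec))  # 5 landmarks -> C(5, 2) = 10 pairwise distances
```

A feature vector like this would then be combined with the image input to the VGG-19 classifier; how the paper fuses the two is not specified in the abstract.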
