
International Journal on Science and Technology
E-ISSN: 2229-7677
Emotion Recognition Using Facial Expressions with Deep Learning: A Comprehensive Overview
| Author(s) | Kondaplli Lakshminarsimha Rao Omsrivathsa, Sarraju Haripriya, Dilipkumar |
| --- | --- |
| Country | India |
| Abstract | Emotion recognition plays a pivotal role in enhancing human-computer interaction (HCI), mental health diagnosis, surveillance, and entertainment applications. Among the various modalities for detecting emotional states, facial expressions remain one of the most natural and widely studied. This paper presents an overview of the methodologies, challenges, and advancements in emotion recognition through facial expression analysis. The integration of computer vision, machine learning, and deep learning technologies has significantly improved recognition accuracy, although challenges related to variations in expression, occlusion, and cultural differences remain. |
| Keywords | Mental Health Diagnosis, Machine Learning, Multimodal Emotion Detection |
| Field | Computer > Artificial Intelligence / Simulation / Virtual Reality |
| Published In | Volume 16, Issue 2, April-June 2025 |
| Published On | 2025-04-23 |
| Cite This | Emotion Recognition Using Facial Expressions with Deep Learning: A Comprehensive Overview - Kondaplli Lakshminarsimha Rao Omsrivathsa, Sarraju Haripriya, Dilipkumar - IJSAT Volume 16, Issue 2, April-June 2025. DOI 10.71097/IJSAT.v16.i2.3902 |
| DOI | https://doi.org/10.71097/IJSAT.v16.i2.3902 |
| Short DOI | https://doi.org/g9gp4j |
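The abstract above describes deep-learning pipelines for facial expression analysis. As an illustration only, and not the paper's implementation, the following PyTorch sketch shows the typical shape of such a classifier: a small convolutional network mapping a face crop to basic emotion classes. The architecture, the 48x48 grayscale input size, and the seven emotion labels are assumptions modeled on common FER-2013-style setups, not details drawn from this paper.

```python
# Minimal sketch of a CNN-based facial expression classifier.
# Assumptions (not from the paper): 48x48 grayscale face crops,
# seven basic emotion classes, FER-2013-style setup.
import torch
import torch.nn as nn

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

class EmotionCNN(nn.Module):
    def __init__(self, num_classes: int = len(EMOTIONS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 48x48 -> 24x24
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 24x24 -> 12x12
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 12x12 -> 6x6
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 6 * 6, 256), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(256, num_classes),           # one logit per emotion
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = EmotionCNN()
    face = torch.randn(1, 1, 48, 48)               # a dummy grayscale face crop
    probs = torch.softmax(model(face), dim=1)      # class probabilities
    print(EMOTIONS[probs.argmax(dim=1).item()])
```

In practice such a model would be trained on labeled face crops produced by a face detector, and the challenges the abstract lists (expression variation, occlusion, cultural differences) are typically addressed through data augmentation and more diverse training data.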
