
International Journal on Science and Technology
E-ISSN: 2229-7677 • Impact Factor: 9.88
A Widely Indexed Open Access Peer Reviewed Multidisciplinary Bi-monthly Scholarly International Journal
Real-Time Emotion Recognition from Facial Expressions Using Deep CNNs
Author(s) | Ms. Manjula K, Dr. Yuvaraja A, Dr. Gayathri A C, Dr. Pooja P Yalavathi, Dr. Punya Patil G M |
---|---|
Country | India |
Abstract | Emotion recognition from facial expressions has emerged as a powerful tool in fields ranging from mental health assessment to intelligent tutoring systems and targeted marketing. Recent advances in deep learning, particularly convolutional neural networks (CNNs), have significantly improved both the accuracy and the speed of emotion detection in real-time scenarios. This paper describes the design and implementation of a real-time facial emotion recognition system using deep neural networks. The system captures facial images, preprocesses them for landmark detection, and classifies emotions such as happiness, sadness, anger, surprise, and neutrality. The proposed approach leverages modern architectures trained on large-scale datasets such as FER2013 to achieve high recognition rates. The study highlights the potential of AI-powered emotion detection for building adaptive, human-centric applications. |
Keywords | Emotion Recognition, Facial Expressions, Deep Learning, Convolutional Neural Networks (CNNs), Real-time Detection, FER2013 Dataset, Landmark Detection, Human-Computer Interaction, Mental Health Monitoring, Intelligent Tutoring Systems, Computer Vision, Adaptive Applications |
Field | Computer Applications |
Published In | Volume 16, Issue 4, October-December 2025 |
Published On | 2025-10-18 |
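The abstract describes a pipeline that ends with a CNN assigning one of several emotion classes to a detected face. As an illustration only (not the authors' implementation), the following minimal Python sketch shows that final classification step over FER2013's seven standard emotion classes, assuming a trained CNN has already produced raw class scores; the `logits` values here are made up for demonstration.

```python
import math

# FER2013's seven emotion classes, in the dataset's standard index order.
FER2013_LABELS = ["Angry", "Disgust", "Fear", "Happy", "Sad", "Surprise", "Neutral"]

def softmax(logits):
    """Convert raw CNN output scores into a probability distribution."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify_emotion(logits, labels=FER2013_LABELS):
    """Return (label, confidence) for the highest-scoring class."""
    probs = softmax(logits)
    idx = max(range(len(probs)), key=probs.__getitem__)
    return labels[idx], probs[idx]

# Illustrative logits a trained CNN might emit for a smiling face.
label, confidence = classify_emotion([0.1, -2.0, 0.3, 4.2, 0.0, 1.1, 0.5])
print(label, round(confidence, 3))
```

In a real-time deployment, each video frame would first pass through face detection and landmark-based preprocessing (e.g. cropping and resizing to FER2013's 48×48 grayscale format) before the CNN forward pass that produces the logits consumed above.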
A CrossRef DOI is assigned to each research paper published in our journal. The IJSAT DOI prefix is 10.71097/IJSAT.
All research papers published on this website are licensed under Creative Commons Attribution-ShareAlike 4.0 International License, and all rights belong to their respective authors/researchers.
