Capstone Project Report CO6I
ON
AI-Based Emotion Recognition
SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE
DIPLOMA IN COMPUTER ENGINEERING
BY
Website: www.gpthane.org.in
Email: principal.gpthane@dtemaharashtra.gov.in
CERTIFICATE
This is to certify that the following Third Year Computer Engineering
students have successfully and satisfactorily completed their project
work, entitled “AI-Based Emotion Recognition”, in partial fulfilment
of the requirements for the Diploma in Computer Engineering for the
academic year 2024-2025.
Principal
(Dr. D. R. Mahajan)
External Examiner
1……………………………….
ACKNOWLEDGEMENT
We have put considerable effort into this project. However, it would not have
been possible without the kind support and help of many individuals and
organizations. We would like to extend our sincere thanks to all of them.
We are highly indebted to Mr. J. R. Nikhade (HOD) for his guidance and
constant supervision, for providing the necessary information regarding the
project, and for his support in completing it. We would like to express our
gratitude towards the members of our project group for their kind cooperation
and encouragement, which helped us complete this project.
We would also like to express our special gratitude and thanks to the subject
staff for giving us their attention and time.
Our thanks and appreciation also go to our colleagues who contributed to
developing the project and to everyone who willingly helped us with their
abilities.
ABSTRACT
Emotion recognition using artificial intelligence (AI) is an emerging
field that combines psychology, computer science, and machine
learning to analyse and interpret human emotions through various data
inputs. This technology primarily leverages deep learning algorithms
and neural networks to process data from facial expressions. The goal is
to classify emotional states such as happiness, sadness, anger, surprise,
and more, by detecting patterns in these data.
AI-based emotion recognition has significant applications in various
domains, including healthcare, marketing, customer service, and
human-computer interaction. In marketing, it helps in assessing
customer reactions to products or advertisements, enabling businesses
to tailor strategies based on emotional feedback. Human-computer
interaction systems use emotion recognition to create more
personalized and adaptive interfaces, improving user experience.
However, the technology faces challenges such as ethical concerns
regarding privacy, bias in training data, and the complexity of
accurately interpreting emotions across diverse cultural and individual
differences. Moreover, the subtlety and ambiguity of human emotions
present limitations to the precision of current models. Despite these
challenges, advances in AI continue to enhance the accuracy and
applicability of emotion recognition systems, making it a vital tool in
understanding and responding to human emotions in real time.
INDEX
1. Introduction
2. Literature Review
3. Problem Statement
3.1 Introduction
3.2 Problem Description
3.3 Scope of Analysis
4. Model and Analysis
4.1 Model Overview
4.2 CNN Architecture
4.3 Model Training Optimization
4.4 Model Performance Analysis
4.5 Challenges and Limitations
4.6 Challenges with the Dataset
5. Methodology
5.1 System Architecture
5.2 Tools and Technologies
5.3 Algorithm Implementation
6. Details of Design, Working, Process and Its Implementation
7. Applications and Future Scope
8. Conclusion
9. Reference
CHAPTER: 1
INTRODUCTION
Emotion recognition using artificial intelligence (AI) is a rapidly advancing technology that
focuses on understanding and interpreting human emotions through data analysis. This process
involves the use of AI algorithms to detect emotional cues from facial expressions. In this
report, we explore the development of an AI-based emotion recognition system, aiming to
enhance human-computer interaction by allowing machines to sense and respond to human
emotions. The proposed system utilizes machine learning techniques, with deep learning
playing a critical role in processing complex data from multiple sources. Convolutional Neural
Networks (CNNs) are employed to analyse facial features. These methods enable the system to
classify emotional states such as joy, anger, sadness, and surprise, among others. By integrating
data from various input sources, the AI system can recognize emotions more accurately, even
in complex scenarios.
AI-based emotion recognition has numerous potential applications. For example, it can be used
in educational environments to assess students’ emotional engagement, enabling teachers to
adjust their methods based on real-time emotional feedback. Additionally, it has practical use
in customer service, allowing companies to better understand and respond to customer
emotions during interactions, ultimately improving user satisfaction. Despite its promising
potential, the development and deployment of emotion recognition technology face challenges.
This report discusses these challenges and explores potential solutions, emphasizing the
importance of ethical AI development. Overall, this project aims to create a system that is not
only technologically advanced but also socially responsible, ensuring that AI-based emotion
recognition enhances human interactions in meaningful and ethical ways. The system’s ability
to detect and respond to emotions could significantly transform the way we interact with
machines, making human-computer interaction more intuitive, personalized, and responsive.
CHAPTER: 2
LITERATURE REVIEW
Facial emotion recognition using MATLAB
Abstract: One of the most important cognitive functions our brain performs is detecting and
interpreting faces and facial expressions during communication, with minimal or no effort.
The boom in artificial intelligence and machine learning in recent years has led to the
development of various intelligent systems that can learn and extract knowledge from the
variety of data fed to them. This has led to the emergence of intelligent systems that can learn,
identify, and understand human emotion via verbal communication, i.e. speech or text, or
non-verbal communication, i.e. facial expressions and body language. The goal is to develop
a facial emotion recognition model that can understand human facial expressions and detect
the mood and mental state of a person from input data. It uses computer vision and machine
learning to identify a person's emotions based on his or her facial expression.
FACIAL EMOTION RECOGNITION
We humans can expand our knowledge to adapt to a changing environment, and to accomplish
this we must learn. Learning is the process of acquiring knowledge about something through
study, experience, or being taught by an external agent, and it is a continuous process by which
a system improves its performance with experience. Learning is one of the most important
attributes of all living animals and is most well developed in human beings, but it is not
inherent in computers, which work on binary data, i.e. 0s and 1s. This raises a big question:
how do we make computers learn?
The advent of machine learning and artificial intelligence in recent years has brought in a new
era of computing systems with advanced capabilities for self-learning by analysing the data
sets given as input. When we say a machine learns, we mean that the machine is able to make
predictions from past data and to process information on the basis of that data. The increase
in hardware capability in recent years has led to a tremendous increase in computing power,
which has helped in the development of systems capable of performing such tasks with speed
and precision. Machine learning has brought computers one step closer to mimicking
human-like intelligence, giving them the ability to process knowledge from different types of
input data such as images, videos, and text.
Facial emotion recognition (FER) is an important topic in the fields of computer vision and
artificial intelligence owing to its significant academic and commercial potential. According
to different surveys, verbal components convey one-third of human communication, while
nonverbal components convey two-thirds. Therefore, FER has applications not only in the
perceptual and cognitive sciences but also in affective computing and computer animation. Its
use has also been increasing with the rapid development of artificial intelligence techniques,
including in human-computer interaction (HCI), virtual reality (VR), augmented reality (AR),
advanced driver assistance systems (ADAS), and entertainment. It can also be used in
day-to-day life, for example in video-calling applications that detect emotions, or in selfie
applications such as Snapchat that apply filters to a detected face.
OBJECTIVES
The study was carried out with the following objectives:
a) To explore the present status of usage and applications of facial emotion recognition
systems in industry.
b) To find out the areas in which facial emotion recognition systems are being used in
industry and in day-to-day life.
c) To identify the merits and demerits of using facial emotion recognition systems.
d) To examine how different facial expressions can be used to predict the emotion of the
person in a picture given as input to the model.
CHAPTER: 3
PROBLEM STATEMENT
3.1. Introduction
Emotion recognition is a crucial aspect of human communication, allowing individuals to
express and interpret feelings through facial expressions. However, detecting emotions
accurately is a complex task due to the variability in human expressions influenced by
cultural, psychological, and contextual factors. Traditional emotion analysis methods rely on
manual observation, which is often subjective and inconsistent.
With advancements in artificial intelligence, machine learning, and deep learning, AI-based
emotion recognition aims to automate and enhance emotion detection. This technology has
applications in multiple domains, including healthcare, customer service, education, and
security. However, despite its potential, the accuracy, ethical concerns, and privacy issues
associated with AI-driven emotion recognition present significant challenges.
3.2 Problem Description
The core problem addressed in this study is to build an emotion recognition system that can:
• Generalize across diverse individuals, languages, and cultural backgrounds without bias.
• Ensure ethical use by addressing concerns related to privacy, consent, and data security.
• Provide real-time and scalable solutions suitable for integration into real-world applications.
3.3 Scope of Analysis
This study examines AI-based emotion recognition along the following dimensions:
• Data Collection and Processing: Analysing how AI models gather and process data
from images, audio, and text.
• Challenges and Limitations: Identifying issues like bias, misclassification, and ethical
concerns.
• Applications and Use Cases: Evaluating its impact in healthcare, security, education,
marketing, and entertainment.
• Future Directions: Proposing improvements to enhance accuracy, fairness, and ethical
compliance in emotion recognition systems.
By addressing these aspects, this study aims to provide insights into the effectiveness and
limitations of AI-based emotion recognition and propose strategies for responsible
implementation.
CHAPTER: 4
MODEL AND ANALYSIS
4.1. Model Overview
For this project, we implemented a Convolutional Neural Network (CNN) to classify human
emotions from facial images. A CNN is a deep learning model designed specifically for image
processing, which makes it well suited to emotion recognition tasks.
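As a rough illustration, the sketch below shows how such a CNN could be defined in Keras for the 48x48 grayscale images and 7 emotion classes used in this project. The layer sizes and hyperparameters are illustrative assumptions, not the exact architecture of our final model.

from tensorflow.keras import layers, models

def build_model(num_classes=7):
    # Small CNN for 48x48 grayscale face crops, one softmax output per emotion.
    model = models.Sequential([
        layers.Input(shape=(48, 48, 1)),
        layers.Conv2D(32, (3, 3), activation='relu'),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation='relu'),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(128, (3, 3), activation='relu'),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation='relu'),
        layers.Dropout(0.5),  # regularisation to limit overfitting
        layers.Dense(num_classes, activation='softmax'),
    ])
    model.compile(optimizer='adam',
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])
    return model

build_model().summary()  # prints the layer-by-layer structure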
4.5. Challenges and Limitations
• Class Imbalance: Some emotions (e.g., Disgust) had fewer samples, affecting prediction
accuracy.
• Dataset Limitations: The images were low-resolution (48x48), which could impact fine
detail recognition.
• Expression Variability: Some emotions, such as fear and surprise, have subtle differences,
making them hard to distinguish.
• Cultural Bias: Facial expressions can vary across cultures, which may impact the model's
accuracy for diverse populations.
• Low Resolution: The small image size (48x48) may lead to loss of finer facial details that
could be useful for accurate classification.
CHAPTER: 5
METHODOLOGY
1. Data Collection:
o Facial images are collected from the Face Expression Recognition dataset on the Kaggle
website (35,900 images covering 7 emotions).
2. Preprocessing:
o Facial images are prepared as 48x48 grayscale inputs for the network.
3. Feature Extraction:
o Convolutional Neural Networks (CNNs) extract key facial features from the image data.
4. Model Training:
o A deep learning model (CNN) is trained for facial emotion recognition using
convolutional layers. A minimal end-to-end sketch is given after this list.
5. Emotion Classification:
o The trained model predicts emotions such as happy, sad, angry, surprised, and neutral.
o The final emotion output is determined using classification techniques and probability scores.
6. Performance Evaluation:
o The trained model is evaluated on the validation and test portions of the dataset.
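The following end-to-end sketch ties these steps together with TensorFlow/Keras. It assumes the Kaggle dataset has been extracted into images/train and images/validation folders with one sub-folder per emotion; the folder names, batch size, and number of epochs are assumptions for illustration.

from tensorflow.keras import layers, models
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Preprocessing: rescale pixel values to [0, 1]; images are read as 48x48 grayscale.
gen = ImageDataGenerator(rescale=1.0 / 255)
train_data = gen.flow_from_directory('images/train', target_size=(48, 48),
                                     color_mode='grayscale',
                                     class_mode='categorical', batch_size=64)
val_data = gen.flow_from_directory('images/validation', target_size=(48, 48),
                                   color_mode='grayscale',
                                   class_mode='categorical', batch_size=64)

# A compact CNN in the same style as the sketch in Section 4.1.
model = models.Sequential([
    layers.Input(shape=(48, 48, 1)),
    layers.Conv2D(32, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation='relu'),
    layers.Dropout(0.5),
    layers.Dense(train_data.num_classes, activation='softmax'),
])
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])

# Training and evaluation; the saved model can be reused for real-time detection later.
model.fit(train_data, validation_data=val_data, epochs=30)
model.save('emotion_cnn.h5')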
CHAPTER: 6
DETAILS OF DESIGN, WORKING, PROCESS AND ITS IMPLEMENTATION
1. Dataset Overview
For this project, we used the Face Expression Recognition Dataset from Kaggle, which consists
of 35,900 images depicting human facial expressions across various emotional states. This
dataset is widely used for training machine learning models in facial emotion detection.
2. Data Composition
The dataset contains grayscale images of human faces categorized into different emotional
labels. The primary emotions included in the dataset are:
[Angry, Disgust, Fear, Happy, Neutral, Sad, Surprise]
Each image is labelled accordingly, allowing supervised learning models to be trained for
emotion classification.
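Since the number of images differs between emotions (the class imbalance noted in Section 4.5), it is useful to count the images per label before training. A minimal sketch, assuming one sub-folder per emotion under an images/train directory (the path is an assumption):

from pathlib import Path

train_dir = Path('images/train')  # assumed layout: one sub-folder per emotion label
counts = {folder.name: sum(1 for _ in folder.glob('*.jpg'))
          for folder in sorted(train_dir.iterdir()) if folder.is_dir()}

for label, n in counts.items():
    print(f'{label:10s} {n}')  # e.g. 'Disgust' is expected to have far fewer images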
4. Model Training Considerations
• Feature Extraction: CNN-based architectures are used for feature extraction and
classification.
• Splitting the Data: The dataset is typically divided into training (80%), validation
(10%), and test (10%) sets for model evaluation, as sketched below.
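If the images are loaded into NumPy arrays, this 80/10/10 split can be done in two steps with scikit-learn. The sketch below uses random placeholder arrays for X and y; in practice they would hold the real images and labels.

import numpy as np
from sklearn.model_selection import train_test_split

X = np.random.rand(1000, 48, 48, 1)      # placeholder for the image array
y = np.random.randint(0, 7, size=1000)   # placeholder for the integer emotion labels

# First hold out 20% of the data, then split that half-and-half into validation and test.
X_train, X_rest, y_train, y_rest = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)
X_val, X_test, y_val, y_test = train_test_split(
    X_rest, y_rest, test_size=0.5, stratify=y_rest, random_state=42)

print(len(X_train), len(X_val), len(X_test))  # 800 100 100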
DEPLOYMENT AND REAL-TIME IMPLEMENTATION OF PROJECT
Once trained, the model can be deployed for real-time emotion recognition.
Real-Time Emotion Detection Workflow
1. Capture facial expressions from a webcam.
2. Capture speech audio via a microphone.
3. Process and classify emotions using the trained models.
4. Display detected emotions.
Using VS Code, we executed the project with OpenCV, an open-source library for computer
vision, image processing, and machine learning.
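A minimal sketch of this real-time loop is shown below. The Haar cascade file ships with OpenCV; the saved model name (emotion_cnn.h5) and the label order (taken from the dataset labels listed above) are assumptions.

import cv2
import numpy as np
from tensorflow.keras.models import load_model

LABELS = ['Angry', 'Disgust', 'Fear', 'Happy', 'Neutral', 'Sad', 'Surprise']
model = load_model('emotion_cnn.h5')
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + 'haarcascade_frontalface_default.xml')

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        # Crop the face, resize to the 48x48 input size, and normalise.
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
        probs = model.predict(face.reshape(1, 48, 48, 1), verbose=0)[0]
        label = LABELS[int(np.argmax(probs))]
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 255, 0), 2)
    cv2.imshow('Emotion Recognition', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):  # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()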
Output: sample screenshots of the real-time emotion detection results.
The application also records a log of the date and time at which the camera captured each
detected emotion.
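One simple way to produce such a log is to append each detection with a timestamp to a CSV file; the sketch below could be called from inside the detection loop above (the file name is an assumption).

import csv
from datetime import datetime

def log_emotion(label, path='emotion_log.csv'):
    # Append one row per detection, e.g. "2025-01-01 12:00:00,Happy"
    with open(path, 'a', newline='') as f:
        csv.writer(f).writerow(
            [datetime.now().isoformat(sep=' ', timespec='seconds'), label])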
CHAPTER: 7
APPLICATIONS AND FUTURE SCOPE
Applications
AI-based emotion recognition using facial images has various real-world applications:
Future Scope
As technology advances, image-based emotion recognition will improve in several ways:
• Emotion-based photo filters and stickers in social media apps.
• AI-powered content recommendations based on user emotions in video streaming
platforms.
CHAPTER: 8
CONCLUSION
The AI-Based Emotion Recognition project successfully integrates artificial intelligence
techniques to analyze and classify human emotions based on facial expressions. With the
increasing role of AI in everyday life, emotion recognition has emerged as a crucial
technology that enhances human-computer interaction, mental health monitoring, security,
and customer engagement. By leveraging deep learning models such as Convolutional Neural
Networks (CNNs) for image-based emotion detection, this system effectively identifies
emotions like happiness, sadness, anger, surprise, and neutrality with high accuracy. One of
the key strengths of this project is its ability to process and interpret complex emotional cues,
which traditional rule-based systems fail to capture. The integration of facial expression
analysis provides a more holistic approach to emotion detection, improving the overall
accuracy of the system. Additionally, the use of machine learning libraries such as
TensorFlow, Keras, and OpenCV ensures efficient data processing and model training, making
the system robust and scalable for real-world applications.
Looking ahead, transformer-based deep learning models, generative AI techniques, and
improved real-time processing capabilities are expected to contribute to higher accuracy and
better performance of emotion recognition systems.
In conclusion, this project demonstrates the feasibility and impact of AI-based emotion
recognition in transforming human-computer interactions and decision-making processes.
While there are challenges to overcome, the continuous evolution of AI, deep learning, and
ethical AI frameworks will pave the way for more efficient, accurate, and responsible
emotion recognition technologies in the future.
CHAPTER: 9
REFERENCE
2. https://www.youtube.com/live/m0fWjP3yIEo?si=k76mBDXVQVx59P16 (emotion recognition)
4. https://www.frontiersin.org/journals/computerscience/articles/10.3389/fcomp.2024.13594