
A CAPSTONE PROJECT REPORT

ON
AI-Based Emotion Recognition
SUBMITTED IN PARTIAL FULFILLMENT OF THE

REQUIREMENT FOR THE DIPLOMA IN

COMPUTER ENGINEERING
BY

Roll no Name Enrollment no


02 Disha Hanumantha Chalwadi 2201160001
06 Avani Nitesh Mahendrakar 2201160005
07 Rashi Yashwant Bambugade 2201160006
14 Maithili Pravin Sonawane 2201160013

UNDER THE GUIDANCE OF

Mr. J.R. Nikhade

GOVERNMENT POLYTECHNIC, THANE


Phadke Pada, Bharat Gears, Mumbra-Shil Rd, Diva, Thane 400612

Website: www.gpthane.org.in

Email: principal.gpthane@dtemaharashtra.gov.in

CERTIFICATE
This is to certify that the following Third Year Computer Engineering
students have successfully and satisfactorily completed their project
work, entitled “AI-Based Emotion Recognition”, in partial fulfilment
of the requirement for the Diploma in Computer Engineering for the
academic year 2024-2025.

Roll no Name Enrollment no


02 Disha Hanumantha Chalwadi 2201160001
06 Avani Nitesh Mahendrakar 2201160005
07 Rashi Yashwant Bambugade 2201160006
14 Maithili Pravin Sonawane 2201160013

Project Guide H.O.D.


(Mr. J. R. Nikhade) (Mr. J. R. Nikhade)

Principal
(Dr. D. R. Mahajan)

External Examiner
1……………………………….

ACKNOWLEDGEMENT
We have put considerable effort into this project. However, it would
not have been possible without the kind support and help of many
individuals and organizations. We would like to extend our sincere
thanks to all of them.
We are highly indebted to Mr. J. R. Nikhade (HOD) for his guidance
and constant supervision, for providing necessary information
regarding the project, and for his support in completing it. We would
like to express our gratitude towards the members of our project group
for their kind co-operation and encouragement, which helped us
complete this project.
We would like to express our special gratitude and thanks to the
subject staff for giving us their attention and time.
Our thanks and appreciation also go to our colleagues and to everyone
who willingly helped us with their abilities in developing the project.

ABSTRACT
Emotion recognition using artificial intelligence (AI) is an emerging
field that combines psychology, computer science, and machine
learning to analyse and interpret human emotions through various data
inputs. This technology primarily leverages deep learning algorithms
and neural networks to process data from facial expressions. The goal is
to classify emotional states such as happiness, sadness, anger, and surprise by
detecting patterns in the data.
AI-based emotion recognition has significant applications in various
domains, including healthcare, marketing, customer service, and
human-computer interaction. In marketing, it helps in assessing
customer reactions to products or advertisements, enabling businesses
to tailor strategies based on emotional feedback. Human-computer
interaction systems use emotion recognition to create more
personalized and adaptive interfaces, improving user experience.
However, the technology faces challenges such as ethical concerns
regarding privacy, bias in training data, and the complexity of
accurately interpreting emotions across diverse cultural and individual
differences. Moreover, the subtlety and ambiguity of human emotions
present limitations to the precision of current models. Despite these
challenges, advances in AI continue to enhance the accuracy and
applicability of emotion recognition systems, making it a vital tool in
understanding and responding to human emotions in real time.

INDEX

Sr. No. Title


1. Introduction

2. Literature Review

3. Problem Statement
3.1 Introduction
3.2 Problem Description
3.3 Scope of Analysis

4. Model and Analysis
4.1 Model Overview
4.2 CNN Architecture
4.3 Model Training and Optimization
4.4 Model Performance Analysis
4.5 Challenges and Limitations
4.6 Challenges with the Dataset

5. Methodology
5.1 System Architecture
5.2 Tools and Technologies
5.3 Algorithm Implementation

6. Design and Implementation

7. Application and Future Scope

8. Conclusion

9. Reference

CHAPTER:1

INTRODUCTION

INTRODUCTION
Emotion recognition using artificial intelligence (AI) is a rapidly advancing technology that
focuses on understanding and interpreting human emotions through data analysis. This process
involves the use of AI algorithms to detect emotional cues from facial expressions. In this
report, we explore the development of an AI-based emotion recognition system, aiming to
enhance human-computer interaction by allowing machines to sense and respond to human
emotions. The proposed system utilizes machine learning techniques, with deep learning
playing a critical role in processing complex data from multiple sources. Convolutional Neural
Networks (CNNs) are employed to analyse facial features. These methods enable the system to
classify emotional states such as joy, anger, sadness, and surprise, among others. By integrating
data from various input sources, the AI system can recognize emotions more accurately, even
in complex scenarios.
AI-based emotion recognition has numerous potential applications. For example, it can be used
in educational environments to assess students’ emotional engagement, enabling teachers to
adjust their methods based on real-time emotional feedback. Additionally, it has practical use
in customer service, allowing companies to better understand and respond to customer
emotions during interactions, ultimately improving user satisfaction. Despite its promising
potential, the development and deployment of emotion recognition technology face challenges.
This report discusses these challenges and explores potential solutions, emphasizing the
importance of ethical AI development. Overall, this project aims to create a system that is not
only technologically advanced but also socially responsible, ensuring that AI-based emotion
recognition enhances human interactions in meaningful and ethical ways. The system’s ability
to detect and respond to emotions could significantly transform the way we interact with
machines, making human computer interaction more intuitive, personalized, and responsive.

CHAPTER:2

LITERATURE REVIEW

LITERATURE REVIEW
Facial emotion recognition using MATLAB
Abstract: One of the most important cognitive functions our brain performs is
detecting and interpreting faces and facial expressions during communication, with
minimal or no effort. The boom in artificial intelligence and machine learning in
recent years has led to the development of intelligent systems that can learn and
extract knowledge from the variety of data fed to them. This, in turn, has led to the
emergence of intelligent systems that can learn, identify, and understand human
emotion via verbal communication, i.e. speech or text, or non-verbal communication,
i.e. facial expression and body language. The goal is to develop a facial emotion
recognition model that can understand human facial expressions and infer a person's
mood and state of mind from input data. It uses computer vision and machine learning
to identify a person's emotions from his or her facial expression.
FACIAL EMOTION RECOGNITION
We humans can expand our knowledge to adapt to a changing environment, and to
accomplish this we must "learn". Learning is the process of acquiring knowledge about
something through study, experience, or being taught by an external agent; it is a
continuous process by which a system improves its performance. Learning is an
important attribute of all living animals and is most developed in human beings, but it
is not inherent in computers, which operate on binary data, i.e. 0s and 1s. This raises a
big question: how do we make computers learn? The advent of machine learning and
artificial intelligence in recent years has brought in a new era of computing systems
that can learn by analysing the data sets given as input. When we say a machine learns,
we mean that it can make predictions from past data and process new information on
the basis of that data. The increase in hardware capability in recent years has produced
a tremendous increase in computing power, enabling systems that perform such tasks
with speed and precision. Machine learning has brought computers one step closer to
mimicking human-like intelligence, giving them the ability to extract knowledge from
different types of input data such as images, videos, and text.
Facial emotion recognition (FER) is an important topic in the fields of computer vision
and artificial intelligence owing to its significant academic and commercial potential.
According to different surveys, verbal components convey one-third of human
communication, while nonverbal components convey two-thirds. FER therefore has
applications not only in the perceptual and cognitive sciences but also in affective
computing and computer animation. Interest has also been increasing with the rapid
development of artificial intelligence techniques, including human-computer
interaction (HCI), virtual reality (VR), augmented reality (AR), advanced driver
assistance systems (ADAS), and entertainment. It can also be used in day-to-day life,
for example in video-calling applications to detect emotions, or in selfie applications
such as Snapchat that apply filters to a detected face.

OBJECTIVES
The study was carried out with the following objectives:
a) To explore the present status of usage and applications of facial emotion recognition system
in the industries.
b) To find out the areas in which the facial emotion recognition system is being used in the
industry and day to day life.
c) To identify the merits and demerits of using facial emotion recognition system.
d) To check how different facial expressions can be used to predict the emotion of the
person in a picture given as input to the model.

CHAPTER:3

PROBLEM STATEMENT

3.1. Introduction
Emotion recognition is a crucial aspect of human communication, allowing individuals to
express and interpret feelings through facial expressions. However, detecting emotions
accurately is a complex task due to the variability in human expressions influenced by
cultural, psychological, and contextual factors. Traditional emotion analysis methods rely on
manual observation, which is often subjective and inconsistent.

With advancements in artificial intelligence, machine learning, and deep learning, AI-based
emotion recognition aims to automate and enhance emotion detection. This technology has
applications in multiple domains, including healthcare, customer service, education, and
security. However, despite its potential, the accuracy, ethical concerns, and privacy issues
associated with AI-driven emotion recognition present significant challenges.

3.2. Problem Description


The core challenge in AI-based emotion recognition lies in developing an intelligent system
that can:

• Accurately detect and classify human emotions from facial expressions.

• Generalize across diverse individuals, languages, and cultural backgrounds without bias.

• Handle variations in emotional expressions due to external factors such as lighting,
background noise, or environmental influences.

• Ensure ethical use by addressing concerns related to privacy, consent, and data security.

• Provide real-time and scalable solutions suitable for integration into real-world applications.

3.3. Scope of Analysis


This study focuses on exploring AI-based emotion recognition through:

• Data Collection and Processing: Analysing how AI models gather and process data
from images, audio, and text.

• Machine Learning Techniques: Examining supervised and deep learning approaches
such as CNNs for facial recognition and NLP for sentiment analysis.

• Challenges and Limitations: Identifying issues like bias, misclassification, and ethical
concerns.

• Applications and Use Cases: Evaluating its impact in healthcare, security, education,
marketing, and entertainment.
• Future Directions: Proposing improvements to enhance accuracy, fairness, and ethical
compliance in emotion recognition systems.

By addressing these aspects, this study aims to provide insights into the effectiveness and
limitations of AI-based emotion recognition and propose strategies for responsible
implementation.

CHAPTER:4

MODEL AND ANALYSIS

4.1. Model Overview
For this project, we implemented a Convolutional Neural Network (CNN) to classify human
emotions from facial images. A CNN is a deep learning model designed specifically for
image processing, making it well suited to emotion recognition tasks.

4.2. CNN Model Architecture


Our CNN model consists of multiple layers designed to extract and analyse facial features
efficiently. The architecture includes:
• Input Layer: Accepts grayscale facial images (48x48 pixels) from the dataset.
• Convolutional Layers: Apply filters to extract important spatial features such as edges,
contours, and textures of facial expressions.
• Pooling Layers (Max Pooling): Reduce dimensionality while retaining essential features,
improving computational efficiency.
• Dropout Layers: Prevent overfitting by randomly deactivating neurons during training.
• Fully Connected Layers: Flatten extracted features and pass them through dense layers to
learn complex patterns.
• SoftMax Output Layer: Classifies images into one of the seven emotional categories:
Angry, Disgust, Fear, Happy, Neutral, Sad, and Surprise.
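The layer stack described above can be sketched in Keras as follows. This is an illustrative sketch only: the filter counts, dense-layer size, and dropout rates are assumed placeholder values, not the exact hyperparameters used in the project.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Sketch of the described architecture; filter counts and dropout
# rates are illustrative assumptions, not the project's exact values.
def build_emotion_cnn(num_classes: int = 7) -> tf.keras.Model:
    model = models.Sequential([
        layers.Input(shape=(48, 48, 1)),          # grayscale 48x48 input
        layers.Conv2D(32, (3, 3), activation="relu", padding="same"),
        layers.MaxPooling2D((2, 2)),              # 48x48 -> 24x24
        layers.Conv2D(64, (3, 3), activation="relu", padding="same"),
        layers.MaxPooling2D((2, 2)),              # 24x24 -> 12x12
        layers.Dropout(0.25),                     # regularization
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),  # 7 emotion classes
    ])
    return model

model = build_emotion_cnn()
print(model.output_shape)  # (None, 7)
```

The softmax output gives one probability per emotion class; the predicted emotion is the class with the highest probability.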

4.3. Model Training and Optimization


• Data Augmentation: Applied transformations like flipping, rotation, and zooming to
improve model generalization and performance.
• Training Configuration: The dataset was split into 80% training, 10% validation, and
10% testing.
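The 80/10/10 split described above can be implemented as a shuffled index partition. A minimal NumPy sketch (the dataset size is the 35,900 figure from this report; the seed is an arbitrary choice):

```python
import numpy as np

def split_indices(n_samples: int, seed: int = 42):
    """Shuffle sample indices and partition them into
    80% training, 10% validation, and 10% test sets."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    n_train = int(0.8 * n_samples)
    n_val = int(0.1 * n_samples)
    return (idx[:n_train],
            idx[n_train:n_train + n_val],
            idx[n_train + n_val:])

train, val, test = split_indices(35900)  # dataset size from the report
print(len(train), len(val), len(test))   # 28720 3590 3590
```

Shuffling before splitting matters here because the dataset is stored grouped by emotion class; an unshuffled split would leave some classes out of the training set.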

4.4. Model Performance Analysis


• Accuracy: The model achieved an accuracy of 62% on the test set.
• Confusion Matrix: Analysed misclassified emotions, showing that fear and surprise were
sometimes misinterpreted due to their similarities.
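A confusion matrix like the one analysed above can be computed directly from true and predicted labels. A small NumPy sketch (the label arrays below are toy data, not the project's actual predictions):

```python
import numpy as np

EMOTIONS = ["Angry", "Disgust", "Fear", "Happy", "Neutral", "Sad", "Surprise"]

def confusion_matrix(y_true, y_pred, n_classes: int = 7) -> np.ndarray:
    """Rows are true classes, columns are predicted classes;
    off-diagonal entries are misclassifications."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

# Toy example: one Fear (2) sample is misread as Surprise (6)
y_true = [2, 2, 3, 6]
y_pred = [2, 6, 3, 6]
cm = confusion_matrix(y_true, y_pred)
print(cm[2, 6])  # 1: one Fear sample predicted as Surprise
```

Reading down the Fear row and across the Surprise column is how the fear/surprise confusion reported above would show up in practice.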

4.5. Challenges and Limitations
• Class Imbalance: Some emotions (e.g., Disgust) had fewer samples, affecting prediction
accuracy.

• Dataset Limitations: The images were low-resolution (48x48), which could impact fine
detail recognition.

4.6. Challenges with the Dataset

• Expression Variability: Some emotions, such as fear and surprise, have subtle differences,
making them hard to distinguish.
• Cultural Bias: Facial expressions can vary across cultures, which may impact the model's
accuracy for diverse populations.
• Low Resolution: The small image size (48x48) may lead to loss of finer facial details that
could be useful for accurate classification.

CHAPTER:5

METHODOLOGY

METHODOLOGY

5.1. System Architecture


The proposed AI-based emotion recognition system follows a structured pipeline consisting of
the following stages:

1. Data Collection:

o Facial expression data is gathered from a publicly available Kaggle dataset

containing 35,900 images.

2. Preprocessing:

o Face Preprocessing: Images are resized, converted to grayscale, and normalized

for consistency.

3. Feature Extraction:

o CNN extracts spatial features from facial expressions.

4. Model Training:

o A deep learning model (CNN) is trained for facial emotion recognition using
convolutional layers.

o The models are trained on labelled datasets using TensorFlow.

5. Emotion Classification:

o The trained models predict emotions such as happy, sad, angry, surprised, and
neutral.

o The final emotion output is determined using classification techniques and
probability scores.

6. Performance Evaluation:

o Accuracy, precision, recall, and F1-score are used to evaluate model
performance.
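The metrics in the evaluation stage above can be computed per class from prediction counts. A minimal sketch using only NumPy (the label arrays are toy data for illustration):

```python
import numpy as np

def precision_recall_f1(y_true, y_pred, positive_class):
    """Per-class precision, recall, and F1 from label arrays,
    counting true positives, false positives, and false negatives."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    tp = np.sum((y_pred == positive_class) & (y_true == positive_class))
    fp = np.sum((y_pred == positive_class) & (y_true != positive_class))
    fn = np.sum((y_pred != positive_class) & (y_true == positive_class))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Toy labels: one false positive for class 1, no false negatives
y_true = [0, 0, 1, 1]
y_pred = [0, 1, 1, 1]
print(precision_recall_f1(y_true, y_pred, positive_class=1))
```

In practice these would be averaged across the seven emotion classes (macro-averaging) to summarize the model in one number per metric.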

5.2 Tools and Technologies


• Programming Language: Python

• Libraries: TensorFlow, Keras, OpenCV, NumPy, Pandas

• Frameworks: TensorFlow and Keras for deep learning

• Datasets: Kaggle Face Expression Recognition dataset (35,900 images across 7
emotion classes)

5.3 Algorithm Implementation

• Facial Emotion Recognition:

o Convolutional Neural Networks (CNN) extract key facial features from image
data.

CHAPTER:6

DESIGN AND IMPLEMENTATION

DETAILS OF DESIGN, WORKING PROCESS, AND
IMPLEMENTATION

1. Dataset Overview
For this project, we used the Face Expression Recognition Dataset from Kaggle, which consists
of 35,900 images depicting human facial expressions across various emotional states. This
dataset is widely used for training machine learning models in facial emotion detection.

2. Data Composition
The dataset contains grayscale images of human faces categorized into different emotional
labels. The primary emotions included in the dataset are:
[Angry, Disgust, Fear, Happy, Neutral, Sad, Surprise]
Each image is labelled accordingly, allowing supervised learning models to be trained for
emotion classification.

3. Data Collection and Preprocessing


• Image Format: The dataset consists of grayscale images with a resolution of 48x48
pixels, making it suitable for deep learning models while maintaining computational
efficiency.
• Data Cleaning: Since some images may contain noise, preprocessing techniques such
as histogram equalization, normalization, and augmentation (rotation, flipping, and
scaling) are applied to enhance model performance.
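The normalization and histogram-equalization steps above can be sketched in plain NumPy; for 8-bit images, OpenCV's cv2.equalizeHist performs the same equalization. The random image below is a stand-in for a dataset sample:

```python
import numpy as np

def normalize(img: np.ndarray) -> np.ndarray:
    """Scale 8-bit pixel values into [0, 1] for the network."""
    return img.astype("float32") / 255.0

def equalize_hist(img: np.ndarray) -> np.ndarray:
    """Histogram equalization for a grayscale uint8 image:
    map each pixel through the normalized cumulative histogram."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf = cdf / cdf[-1]                        # normalize CDF to [0, 1]
    lut = np.round(cdf * 255).astype("uint8")  # per-intensity lookup table
    return lut[img]

img = np.random.default_rng(0).integers(0, 256, (48, 48), dtype=np.uint8)
print(normalize(img).max() <= 1.0)   # True
print(equalize_hist(img).dtype)      # uint8
```

Equalization spreads the intensity histogram, which helps the model cope with under- or over-exposed faces in the dataset.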

4. Model Training Considerations
• Feature Extraction: A CNN-based architecture is used for feature extraction and
classification.

• Splitting the Data: The dataset is typically divided into training (80%), validation
(10%), and test (10%) sets for model evaluation.

• Data Augmentation: To improve generalization and prevent overfitting,
transformations like brightness adjustments, cropping, and Gaussian noise are applied.
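The augmentations in this step can be sketched with NumPy. In practice a library utility such as Keras's ImageDataGenerator applies equivalent transforms on the fly; this minimal version just shows the idea on a [0, 1] float grayscale image:

```python
import numpy as np

rng = np.random.default_rng(42)

def augment(img: np.ndarray) -> np.ndarray:
    """Apply a random horizontal flip, a brightness shift, and
    Gaussian noise to a [0, 1] float grayscale image."""
    out = img.copy()
    if rng.random() < 0.5:
        out = np.fliplr(out)                       # horizontal flip
    out = out + rng.uniform(-0.1, 0.1)             # brightness adjustment
    out = out + rng.normal(0.0, 0.02, out.shape)   # Gaussian noise
    return np.clip(out, 0.0, 1.0)                  # keep valid pixel range

img = rng.random((48, 48))
aug = augment(img)
print(aug.shape, float(aug.min()) >= 0.0, float(aug.max()) <= 1.0)
```

Each training epoch then sees slightly different versions of every face, which is what improves generalization.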

DEPLOYMENT AND REAL-TIME IMPLEMENTATION OF PROJECT
Once trained, the model can be deployed for real-time emotion recognition.
Real-Time Emotion Detection Workflow
1. Capture facial expressions from a webcam.
2. Capture speech audio via a microphone.
3. Process and classify emotions using the trained models.
4. Display detected emotions.

Using VS Code, we executed the project with OpenCV, an open-source library for
computer vision, image processing, and machine learning.
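The webcam workflow above can be sketched as follows. This is an illustrative sketch, not the project's exact script: the Haar-cascade detector and the callable `model` are stand-ins, and the capture loop needs a physical camera, so the classification step is factored into a helper that works on any 48x48 grayscale crop.

```python
import numpy as np

EMOTIONS = ["Angry", "Disgust", "Fear", "Happy", "Neutral", "Sad", "Surprise"]

def classify_face(face_gray: np.ndarray, model) -> str:
    """Normalize a 48x48 grayscale face crop and return the
    emotion label with the highest predicted probability."""
    x = face_gray.astype("float32") / 255.0
    x = x.reshape(1, 48, 48, 1)
    probs = model(x)  # model: any callable returning (1, 7) probabilities
    return EMOTIONS[int(np.argmax(probs))]

def run_webcam(model):
    """Capture frames, detect faces, and overlay the predicted emotion.
    Requires OpenCV and a camera."""
    import cv2  # imported lazily so classify_face stays usable without it
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
            face = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
            label = classify_face(face, model)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.putText(frame, label, (x, y - 10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 255, 0), 2)
        cv2.imshow("Emotion Recognition", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()
```

Separating detection (OpenCV) from classification (the trained CNN) keeps the loop simple: the cascade finds face rectangles, and each cropped face is resized to the 48x48 input the model expects.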

[Output screenshots]

We have also included record logs with the date and time at which the camera
captured each detected emotion.
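Such a log can be written with only the standard library; a minimal sketch (the CSV file name is an illustrative placeholder):

```python
import csv
from datetime import datetime
from pathlib import Path

LOG_FILE = Path("emotion_log.csv")  # placeholder path

def log_emotion(emotion: str, log_file: Path = LOG_FILE) -> None:
    """Append one detection record with its date and time."""
    new_file = not log_file.exists()
    with log_file.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["date", "time", "emotion"])  # header once
        now = datetime.now()
        writer.writerow([now.strftime("%Y-%m-%d"),
                         now.strftime("%H:%M:%S"),
                         emotion])

log_emotion("Happy")
print(LOG_FILE.read_text().splitlines()[0])  # date,time,emotion
```

Calling `log_emotion(label)` once per detection inside the webcam loop produces a CSV that can be reviewed or plotted later.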

CHAPTER:7

APPLICATION & FUTURE SCOPE

Applications
AI-based emotion recognition using facial images has various real-world applications:

1. Healthcare & Mental Health Monitoring


• AI analyzes facial expressions to detect signs of depression, anxiety, and stress.
• Used in telemedicine for remote diagnosis of mental health conditions.
2. Smart Surveillance & Security
• Facial emotion analysis helps in detecting suspicious behaviour in public places.
• Used in law enforcement for lie detection and criminal profiling.
3. Customer Experience & Marketing
• Retail stores analyze customer emotions to personalize advertisements.
• AI tracks facial responses to ads, improving marketing strategies.
4. Human-Computer Interaction (HCI)
• AI-powered smart assistants and robots adjust their responses based on facial
emotions.
• Used in video games to make NPCs (non-playable characters) react dynamically.
5. Education & E-Learning
• AI monitors students' facial expressions to assess engagement levels.
• Helps teachers in virtual classrooms understand students' emotions in real-time.

Future Scope
As technology advances, image-based emotion recognition will improve in several ways:

1. Multimodal Emotion Detection


• Integrating facial images with body language and gestures for higher accuracy.
• Combining with speech and physiological signals to enhance emotion recognition.
2. Real-Time Emotion Analysis in Wearable Devices
• Smart glasses and AR/VR headsets will detect user emotions for immersive
experiences.
• AI-powered assistive devices for visually impaired individuals to recognize others'
emotions.
3. Bias-Free & Ethical AI
• Improving fairness in emotion recognition models to avoid racial or cultural biases.
• Enhancing data privacy to protect personal emotional data from misuse.
4. Advanced AI Models for High Accuracy
• Use of transformer-based models (Vision Transformers - ViTs) for better emotion
detection.
• Integration of 3D facial expression analysis for a deeper understanding of emotions.
5. AI in Social Media & Entertainment

• Emotion-based photo filters and stickers in social media apps.
• AI-powered content recommendations based on user emotions in video streaming
platforms.

CHAPTER: 8

CONCLUSION

CONCLUSION
The AI-Based Emotion Recognition project successfully integrates artificial intelligence
techniques to analyze and classify human emotions based on facial expressions. With the
increasing role of AI in everyday life, emotion recognition has emerged as a crucial
technology that enhances human-computer interaction, mental health monitoring, security,
and customer engagement. By leveraging deep learning models such as Convolutional Neural
Networks (CNNs) for image-based emotion detection, the system identifies emotions
such as happiness, sadness, anger, surprise, and neutrality with reasonable accuracy
(62% on the test set). One of
the key strengths of this project is its ability to process and interpret complex emotional cues,
which traditional rule-based systems fail to capture. The integration of facial expression
analysis provides a more holistic approach to emotion detection, improving the overall
accuracy of the system. Additionally, the use of machine learning libraries such as
TensorFlow, Keras, and OpenCV ensures efficient data processing and model training, making
the system robust and scalable for real-world applications.
Additionally, transformer-based deep learning models, generative AI techniques, and
improved real-time processing capabilities will contribute to higher accuracy and better
performance of emotion recognition systems.
In conclusion, this project demonstrates the feasibility and impact of AI-based emotion
recognition in transforming human-computer interactions and decision-making processes.
While there are challenges to overcome, the continuous evolution of AI, deep learning, and
ethical AI frameworks will pave the way for more efficient, accurate, and responsible
emotion recognition technologies in the future.

CHAPTER: 9

REFERENCE

REFERENCE

1. Edureka video on emotion recognition using Python:
https://youtu.be/G1Uhs6NVi-M?si=txq5tEYZP4jKaxFn

2. Emotion detection using Python (live session):
https://www.youtube.com/live/m0fWjP3yIEo?si=k76mBDXVQVx59P16

3. IEEE article on emotion detection:
https://ieeexplore.ieee.org/document/9938357

4. Emotion detection through artificial intelligence (Frontiers in Computer Science):
https://www.frontiersin.org/journals/computerscience/articles/10.3389/fcomp.2024.1359471/full

5. Journal paper on facial emotion recognition (JETIR):
http://www.jetir.org/papers/JETIR1904P30.pdf

