
AI BASED PERSONALISED ELECTRONIC GADGETS

RECOMMENDATION SYSTEM

AKIL ADHARSH N, INDRAKUMAR R S,


HARISH B

ABSTRACT

KEYWORDS: recommendation system, similarity, personalization, content-based filtering, hybrid recommendation

In this era of rapid technological advancement, personalized electronic gadget recommendation systems powered by AI are gaining prominence. Such systems leverage machine learning algorithms to analyse user preferences, behaviour, and historical data in order to provide tailored recommendations for electronic gadgets. By considering factors like user demographics, past purchases, reviews, and gadget specifications, these systems aim to deliver accurate and relevant suggestions to individual users. The recommendation process typically involves several steps. First, user data is collected, including demographic information, browsing history, and previous purchases. Collaborative filtering techniques compare a user's preferences with those of other similar users, recommending gadgets that have been liked or purchased by users with comparable tastes. Additionally, explicit feedback such as ratings and reviews may be incorporated. Next, the system utilizes various AI techniques such as collaborative filtering, content-based filtering, or hybrid approaches to process and analyse this data. To enhance the personalization aspect, AI models can be trained to adapt to individual user behaviour over time. Privacy and data security are critical considerations in personalized recommendation systems. User consent and anonymization techniques are employed to protect personal data and ensure compliance with data protection regulations. The ultimate goal of an AI-based personalized electronic gadget recommendation system is to simplify the decision-making process for users and provide them with a curated list of options that best match their preferences and needs. By leveraging AI algorithms, these systems strive to enhance user satisfaction, increase customer engagement, and improve the overall shopping experience in the electronic gadget domain.
INTRODUCTION

In this era of rapid technological advancement, personalized electronic gadget recommendation systems powered by artificial intelligence are gaining significant prominence. These systems leverage machine learning algorithms to examine user preferences, behaviour, and historical data, enabling them to provide tailored suggestions for electronic devices. By taking into account factors such as user demographics, past purchases, reviews, and gadget specifications, these systems aim to deliver accurate and relevant recommendations to individual users.

The recommendation process typically involves several stages. First, user data is collected, including demographic information, browsing history, and past purchases. Collaborative filtering techniques compare a user's preferences with those of other similar users, suggesting gadgets that have been liked or purchased by users with comparable tastes. Furthermore, explicit feedback, such as ratings and reviews, may be incorporated. The system then uses various AI techniques, such as collaborative filtering, content-based filtering, or hybrid approaches, to process and analyse this information.

To enhance the personalization aspect, AI models can be trained to adapt to individual user behaviour over time. This continuous learning enables the system to give progressively more accurate recommendations as it gains insight into the user's preferences and requirements.

Privacy and data security are critical considerations in personalized recommendation systems. User consent and anonymization techniques are employed to safeguard personal information and ensure compliance with data protection regulations. It is essential for users to trust that their data is handled responsibly and with respect for their privacy.

The ultimate goal of an AI-based personalized electronic gadget recommendation system is to simplify the decision-making process for users and provide them with a curated list of options that best match their preferences and needs. By leveraging AI algorithms, these systems strive to enhance user satisfaction, increase customer engagement, and improve the overall shopping experience in the electronic gadget domain.

As technology continues to evolve, we can expect these recommendation systems to become even more sophisticated and valuable tools for both consumers and businesses in the electronic gadget industry.
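To make the collaborative filtering step described above concrete, the following minimal sketch scores a user's unrated gadgets from the ratings of the most similar users. The toy rating matrix, the choice of cosine similarity, and the recommend() helper are illustrative assumptions, not the system's actual implementation.

import pandas as pd
from sklearn.metrics.pairwise import cosine_similarity

# Toy user-gadget rating matrix (0 = not rated); purely illustrative data.
ratings = pd.DataFrame(
    [[5, 4, 0, 1],
     [4, 0, 4, 1],
     [1, 1, 0, 5],
     [0, 1, 5, 4]],
    index=['user1', 'user2', 'user3', 'user4'],
    columns=['phone_a', 'phone_b', 'laptop_a', 'headset_a'])

# Pairwise user similarity computed from the rating vectors.
sim = pd.DataFrame(cosine_similarity(ratings.values),
                   index=ratings.index, columns=ratings.index)

def recommend(user, k=2, top_n=2):
    """Score the user's unrated gadgets by the ratings of the k most similar users."""
    neighbours = sim[user].drop(user).nlargest(k)
    scores = ratings.loc[neighbours.index].T.dot(neighbours) / neighbours.sum()
    return scores[ratings.loc[user] == 0].nlargest(top_n)

print(recommend('user1'))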
RELATED WORK

● Saurav Anand, "Recommender System Using Amazon Reviews", Kaggle, shows how to build a recommender system using Amazon reviews. The notebook uses the Amazon Fine Food Reviews dataset, which contains over 500,000 reviews of food products. The notebook first imports the necessary libraries and then reads the dataset into a Pandas DataFrame. The DataFrame is then explored to get a better understanding of the data. The next step is to build the recommender system.

The notebook uses two different algorithms: popularity-based and content-based filtering. The popularity-based algorithm simply recommends the most popular products. The content-based filtering algorithm recommends products that are similar to the products that the user has already rated. The notebook then evaluates the performance of the two algorithms.

The popularity-based algorithm is shown to be more effective at recommending products to new users, while the content-based filtering algorithm is more effective at recommending products to existing users. The notebook concludes by discussing the limitations of the two algorithms and suggesting ways to improve them. Overall, the notebook provides a good introduction to the basics of recommender systems.

● Pathairush Seeda, "A Complete Guide To Recommender Systems", Towards Data Science, is a comprehensive guide to building recommender systems using popular machine learning libraries. The tutorial delves into the implementation of recommender systems using three widely used libraries: scikit-learn, Surprise, and Keras. It commences with an overview of recommender systems, elucidating the various types, including collaborative filtering, content-based filtering, and hybrid approaches. It then moves on to explain the concept of matrix factorization, a crucial technique in collaborative filtering models.

Throughout the tutorial, the author provides step-by-step code examples and explanations, making it easy for readers to follow along and understand the process of building recommender systems. The data used for the examples is not specified, but it likely involves user-item interactions or ratings that are common in recommendation scenarios.

● Machine Learning - Advanced courses, "Recommendation System", Google, provides a guide on how to build recommendation systems using machine learning. The guide covers the basics of recommendation systems, including the different types of recommendation systems, the algorithms that can be used to build them, and how to evaluate their performance.

The guide also includes a number of resources, such as code samples, tutorials, and datasets. The key takeaways from the guide are:
a. Recommendation systems are a powerful tool for increasing user engagement and sales.
b. There are three main types of recommendation systems: collaborative filtering, content-based filtering, and hybrid systems.
c. Collaborative filtering systems recommend items based on the ratings or preferences of other users.
d. Content-based filtering systems recommend items based on the content of the items themselves.
e. Hybrid systems combine collaborative filtering and content-based filtering.
f. The performance of a recommendation system can be evaluated using a variety of metrics, such as accuracy, precision, and recall.

● Qian Zhang, Jie Lu, Yaochu Jin, "Artificial Intelligence in recommender systems", Complex & Intelligent Systems, Springer, provides a comprehensive overview of the use of artificial intelligence (AI) in recommender systems.

The article begins by discussing the challenges of recommender systems, such as data sparsity and cold-start problems. It then reviews the different AI techniques that have been used to address these challenges, such as collaborative filtering, content-based filtering, and deep learning.

The article also discusses the future of AI in recommender systems. It argues that AI has the potential to revolutionize recommender systems by making them more accurate, personalized, and engaging. The article concludes by providing a number of recommendations for future research in AI-based recommender systems.

Dataset and Methodology

Dataset

The AliExpress Data dataset on Kaggle is a collection of product data from AliExpress. It contains information on over 900,000 products, including their title, price, rating, reviews, and other details. The dataset is in CSV format and is divided into two files:
* products.csv: This file contains information on all the products in the dataset.
* reviews.csv: This file contains information on the reviews for each product.

The products.csv file contains the following columns:
* Product ID: The unique identifier for the product.
* Title: The title of the product.
* Price: The price of the product in SAR.
* Rating: The average rating of the product.
* Review Count: The number of reviews for the product.
* Image URL: The URL of the product image.
* Category: The category of the product.
* Subcategory: The subcategory of the product.

The reviews.csv file contains the following columns:
* Product ID: The unique identifier for the product.
* Reviewer ID: The unique identifier for the reviewer.
* Review Title: The title of the review.
* Review Text: The text of the review.
* Rating: The rating given by the reviewer.
* Date: The date the review was posted.

This dataset can be used for a variety of tasks, such as:
* Analyzing the price trends of products on AliExpress.
* Identifying the most popular products on AliExpress.
* Determining the factors that influence product ratings.
* Analyzing the sentiment of reviews for products on AliExpress.
* Identifying product trends by country or region.
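As a sketch of how the two files can be loaded and combined, the snippet below follows the column listing above; the exact headers in the Kaggle CSV files may differ, so it should be read as an assumed schema rather than the definitive loading code.

import pandas as pd

# Column names follow the listing above; actual headers may differ.
products = pd.read_csv('products.csv')
reviews = pd.read_csv('reviews.csv')

# Attach product metadata to each review via the shared Product ID key.
merged = reviews.merge(products, on='Product ID', how='left',
                       suffixes=('_review', '_product'))

# Two of the tasks listed above: most popular products and rating drivers.
most_popular = products.sort_values('Review Count', ascending=False).head(10)
avg_rating_by_category = (products.groupby('Category')['Rating']
                                  .mean()
                                  .sort_values(ascending=False))

print(most_popular[['Title', 'Price', 'Rating', 'Review Count']])
print(avg_rating_by_category.head())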
2. Preprocessing Stage

In this step, we utilized the TF-IDF class from the NLTK toolkit, which allows the vectorization of the product description data to produce a fresh set of vectorized keywords that can be used by the model. Furthermore, we deleted unnecessary columns such as query, storeID, and store name. These fields do not contribute to the model, so it is essential to remove them. Products with a small number of sales have not been assigned any ratings; therefore, based on the sales of similar products from the same store and class name, we compute a weighted rating value that is assigned to the unrated products. These ratings play a major role in the filtering and clustering of the data for recommendation.
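A minimal sketch of this preprocessing stage is given below. It substitutes scikit-learn's TfidfVectorizer for the NLTK-based class mentioned above, uses a simple group mean as a stand-in for the weighted rating, and assumes hypothetical column names (description, rating, store_name, class_name, query, storeID).

import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer

df = pd.read_csv('products.csv')  # hypothetical file and column names

# Impute missing ratings from similar products: a group mean over store and
# class name stands in for the weighted rating described above.
df['rating'] = (df.groupby(['store_name', 'class_name'])['rating']
                  .transform(lambda r: r.fillna(r.mean())))

# Drop columns that do not contribute to the model.
df = df.drop(columns=['query', 'storeID', 'store_name'], errors='ignore')

# Vectorize the product descriptions with TF-IDF.
vectorizer = TfidfVectorizer(stop_words='english', max_features=5000)
tfidf_matrix = vectorizer.fit_transform(df['description'].fillna(''))
print(tfidf_matrix.shape)  # (n_products, n_keywords)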

3. Proposed Model

A high-level neural network API called Keras, which is based on TensorFlow, was used to develop the suggested deep learning model [37]. The model is intended for binary classification, with the objective of determining whether or not an input picture belongs to a certain class. The visualisation of our model for a binary classification challenge is shown in Figure 4. A 150 x 150 x 3 (height, width, and depth) picture, which depicts a colour image with three channels (red, green, and blue), is the model's first input layer. In order to extract features from the picture, a succession of convolutional layers (Conv2D) and pooling layers (MaxPooling2D) are applied to the input image. The pooling layers down-sample the feature maps that the convolutional layers output after applying filters to the input picture. The features are then flattened and sent through two dense layers (Dense), which transform the features in non-linear ways using the 'ReLU' activation function. The final prediction is then generated by the output layer using the sigmoid activation function, which converts the input to a probability-like output between 0 and 1. The model is built using the binary cross-entropy loss function and the 'Adam' optimizer, and it is trained using the fit method on the training set of data. A loss of 0.061 and an accuracy of 0.993 are shown in the training results, demonstrating the model's capacity to provide precise predictions based on the training data. The executive overview of the suggested binary classification model is given in Figure 4.

Figure 4. Model visualization for the binary classification task.
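A minimal Keras sketch of the binary model described above follows. The text fixes only the 150 x 150 x 3 input, the Conv2D/MaxPooling2D feature extractor, two ReLU dense layers, a sigmoid output, binary cross-entropy, and the 'Adam' optimizer; the number of convolutional blocks and the layer widths below are assumptions.

from tensorflow import keras
from tensorflow.keras import layers

binary_model = keras.Sequential([
    layers.Input(shape=(150, 150, 3)),           # colour image input
    layers.Conv2D(32, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),                            # flatten the feature maps
    layers.Dense(128, activation='relu'),        # two dense layers with ReLU
    layers.Dense(64, activation='relu'),
    layers.Dense(1, activation='sigmoid'),       # probability-like output
])
binary_model.compile(optimizer='adam',
                     loss='binary_crossentropy',
                     metrics=['accuracy'])
binary_model.summary()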
3.4. Proposed Deep Model for Multi-Classification

The model in this study accepts images of dimensions (150, 150, 3), indicating that each image is 150 x 150 pixels with three colour channels (red, green, blue). The model applies a sequence of Conv2D and MaxPooling2D layers to decrease the spatial dimensions of the image and extract significant features. These extracted features are then flattened and passed through two dense layers with 'ReLU' and 'SoftMax' activation functions. The 'SoftMax' activation function provides the final probability scores for each class in the classification task. The model is compiled with an 'Adam' optimizer and a categorical cross-entropy loss function. It is trained on the training data for 100 epochs and evaluated on the validation data, achieving an accuracy of 96%. The model's visualization for the multi-classification task is shown in Figure 6.

Figure 6. Model visualization for the multi-classification task.
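For the four-class task, only the classification head and the loss change relative to the binary sketch above; as before, the convolutional block count and layer widths are assumptions rather than the exact architecture.

from tensorflow import keras
from tensorflow.keras import layers

multi_model = keras.Sequential([
    layers.Input(shape=(150, 150, 3)),
    layers.Conv2D(32, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation='relu'),        # dense layer with ReLU
    layers.Dense(4, activation='softmax'),       # one probability per class
])
multi_model.compile(optimizer='adam',
                    loss='categorical_crossentropy',
                    metrics=['accuracy'])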
Algorithm:

Step 1: BEGIN

Step 2: INPUT: dataset_directory, training_percentage, image_augmentation_parameters, model_parameters, optimizer, loss_function, performance_metrics.

Step 3: Load the input dataset from dataset_directory.

Step 4: Split the dataset into a training set and a validation set according to training_percentage.

Step 5: Instantiate an ImageDataGenerator object with image_augmentation_parameters.

Step 6:
6.1 IF model_parameters is a pre-trained model THEN
6.2 Load the pre-trained model
6.3 ELSE
6.4 Define a deep learning model using Keras with model_parameters
6.5 ENDIF

Step 7: Compile the model using optimizer and loss_function.

Step 8: Train the model on the training set for a number of epochs with the compiled model and the ImageDataGenerator object.

Step 9:
9.1 FOR each epoch in the training process DO
9.2 Evaluate the model on the validation set using performance_metrics
9.3 IF the validation accuracy is not improving THEN
9.4 Reduce the learning rate
9.5 ENDIF
9.6 ENDFOR

Step 10: Test the final model on a separate test set to evaluate its generalization performance using performance_metrics.

Step 11: OUTPUT: the performance_metrics of the proposed method and existing methods for comparison.

Step 12: END
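The steps above map fairly directly onto the Keras training API. The sketch below is one possible realization under assumed inputs: the directory layout (dataset/<class_name>/*.jpg), the augmentation values, and the small placeholder model standing in for the Section 3 architecture are all assumptions.

import tensorflow as tf
from tensorflow.keras.preprocessing.image import ImageDataGenerator

dataset_directory = 'dataset/'   # hypothetical: dataset/<class_name>/*.jpg

# Steps 4-5: train/validation split and image augmentation parameters.
datagen = ImageDataGenerator(rescale=1.0 / 255,
                             rotation_range=15,
                             zoom_range=0.1,
                             horizontal_flip=True,
                             validation_split=0.2)
train_gen = datagen.flow_from_directory(dataset_directory, target_size=(150, 150),
                                        batch_size=32, class_mode='binary',
                                        subset='training')
val_gen = datagen.flow_from_directory(dataset_directory, target_size=(150, 150),
                                      batch_size=32, class_mode='binary',
                                      subset='validation')

# Step 6: define the model (placeholder for the Section 3 architecture).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(150, 150, 3)),
    tf.keras.layers.Conv2D(32, (3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])

# Step 7: compile with the chosen optimizer and loss function.
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

# Steps 8-9: train, reducing the learning rate when validation accuracy stalls.
reduce_lr = tf.keras.callbacks.ReduceLROnPlateau(monitor='val_accuracy',
                                                 factor=0.5, patience=2)
history = model.fit(train_gen, validation_data=val_gen, epochs=100,
                    callbacks=[reduce_lr])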


RESULTS AND DISCUSSION:

Results

The proposed deep model is trained on the Kaggle dataset through multiple experiments. A standard approach of cross-validation (10-CV) is used for training and testing to ensure a fair and reliable evaluation of the proposed AD detection model. The approach is implemented on a computer equipped with an NVIDIA Tesla T4 GPU and 14 GB DDR4 RAM, using Keras, a Python-based library. The 'Adam' optimizer is applied for training the neural network, with binary cross-entropy as the loss function for model 1 and categorical cross-entropy as the loss function for model 2. Four evaluation measures are used in this study: Accuracy, Precision, Recall, and F1-score.

Accuracy = (TP + TN) / (TP + TN + FP + FN)    (1)

Precision = TP / (TP + FP)    (2)

Recall = TP / (TP + FN)    (3)

F1-score = (2 × Precision × Recall) / (Precision + Recall)    (4)

where TP denotes true positives, FP denotes false positives, TN denotes true negatives, and FN denotes false negatives.

1. Experimental Analysis

In this paper, two experiments are evaluated using four metrics. The first experiment is based on the first model, which is used for a binary classification task. The second experiment is based on the second model, which is used for a multi-classification task. The paper provides details and analysis of each experiment.
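Equations (1)-(4) correspond to the standard scikit-learn metrics. The toy labels below are illustrative only and are not the study's results.

from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# Toy labels for the binary task (0 = Normal, 1 = AD); illustrative only.
y_true = [0, 0, 1, 1, 1, 0, 1, 0]
y_pred = [0, 1, 1, 1, 0, 0, 1, 0]

print('Accuracy :', accuracy_score(y_true, y_pred))    # Eq. (1)
print('Precision:', precision_score(y_true, y_pred))   # Eq. (2)
print('Recall   :', recall_score(y_true, y_pred))      # Eq. (3)
print('F1-score :', f1_score(y_true, y_pred))          # Eq. (4)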
A. The first experiment

The proposed deep model was used to classify input MRI images into two groups for the binary challenge (AD or Normal). The confusion matrix of the proposed approach for detecting AD is shown in Figure 9, where class 0 represents normal instances and class 1 represents AD patients.

Figure 9. Confusion matrix of the proposed method to detect AD for the binary classification task.

According to the confusion matrix shown in Figure 9, it can be seen that 1081 normal MRI images were correctly detected as normal, while 0.3% of normal cases were detected as AD cases. Additionally, 98% of AD cases were correctly detected as AD, while 13 MRI images were incorrectly detected as normal cases.

Figure 10. Loss curves (upper) and accuracy curves (lower) for the training and testing data for the proposed model for the binary classification task.

B. The second experiment

The proposed deep model was used for multi-classification to categorize input MRI images into four categories: Mild Demented, Moderately Demented, Non-Demented, and Very Mild Demented. The confusion matrix of the proposed method for detecting demented cases is shown in Figure 11. In this matrix, Class 0 refers to Non-Demented cases, Class 1 refers to Very Mild Demented cases, Class 2 refers to Mild Demented cases, and Class 3 refers to Moderate Demented cases.

Figure 11. Confusion matrix of the proposed model for the multi-classification task.

Figure 12. Loss curves (upper) and accuracy curves (lower) for the training and testing data for the proposed model for the multi-classification task.

According to the previous confusion matrix in Figure 11, 653 Non-Demented cases were correctly detected as Non-Demented; 2 MRI images of Non-Demented cases were incorrectly detected as Mild Demented cases, and 6 MRI images were incorrectly detected as Moderately Demented cases. We can also find that 100% of the Very Mild Demented cases are correctly detected as Very Mild Demented cases. In addition, we can observe that 93% of the Mild Demented cases are correctly detected as Mild Demented, 4.9% of the images are wrongly detected as Moderate Demented, and 2.1% are wrongly detected as Non-Demented cases. Finally, we can also observe from the confusion matrix that 91.7% of the Moderate Demented cases are correctly detected as Moderate Demented, 42 MRI images are wrongly detected as Mild Demented, 1.29% are wrongly detected as Non-Demented cases, and 0.16% of the images are wrongly detected as Very Mild Demented cases.
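The per-class percentages quoted above are row-normalized confusion-matrix entries (per-class recall). A short sketch with toy labels, not the study's data, shows the computation:

import numpy as np
from sklearn.metrics import confusion_matrix

# Toy labels for the four classes (0 = Non-Demented, 1 = Very Mild Demented,
# 2 = Mild Demented, 3 = Moderate Demented); real counts come from validation.
y_true = [0, 0, 1, 2, 3, 3, 2, 1, 0, 3]
y_pred = [0, 2, 1, 2, 3, 2, 2, 1, 0, 3]

cm = confusion_matrix(y_true, y_pred, labels=[0, 1, 2, 3])
per_class_recall = cm.diagonal() / cm.sum(axis=1)  # correct detections per row

print(cm)
print(per_class_recall)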
optimization techniques to help
computers learn from huge and
complicated data sets. To diagnose AD
in its early stages, researchers generally
use machine learning. The survey
provides a broad overview of current
research in this field and analyses the
classification methods used by
researchers working with ADNI data
sets. It discusses essential research
topics such as the data sets used, the
evaluation measures employed, and the
machine learning methods used. The
proposed
classification scheme can be used by has important implications for early
clinicians to make diagnoses of these diagnosis and treatment. However, there
diseases. It is highly beneficial to lower are some limitations to this study that
annual mortality rates of Alzheimer’s should be considered. The dataset used
disease in early diagnosis with these ML is relatively small and may not be
algorithms. The proposed work shows representative of the entire population.
better results with the best validation Additionally, only a single modality
average accuracy of 83% on the test data (MRI images) was considered, and
of AD. This test accuracy score is future studies could explore the use of
significantly higher in comparison with other imaging modalities in combination
existing works. with deep learning models.
Future work could focus on addressing
these limitations and exploring the use
CONCLUSIONS: of deep learning models in other areas of
medical imaging. The development of
The goal of this study is to assess the more explainable deep learning models
performance of deep learning models in that can provide insights into the
detecting and classifying Alzheimer's underlying biological mechanisms of
disease (AD) using MRI images. The AD could further advance our
results obtained in the binary understanding of this disease.
classification task, with an accuracy of
99.30%, and in the four-class
REFERENCES:
classification task, with an accuracy of
95.96%, demonstrate the potential of ● S. Pavalarajan, B. A. Kumar, S.
deep learning models for accurately S. Hammed, K. Haripriya, C.
detecting and differentiating between the Preethi and T. Mohanraj review
different stages of AD. The use of image of Detection of Alzheimer's
data with a shape of 150 x 150 x 3, as disease at Early Stage using
well as image augmentation techniques Machine Learning. 2022
and a SoftMax activation function with a International Conference on
dense four-output layer, were found to Advanced Computing
be critical factors in achieving these Technologies and Applications
results. This study contributes to the (ICACTA).
growing body of literature on the use of
deep learning models for AD detection
● K. Bhatt, N. Jayanthi and M.
and classification. Specifically, it
Kumar. Machine Learning Based
demonstrates the potential of using MRI
Optimal Feature Selection
images and deep learning models to
Technique for Early Stage
accurately detect and classify AD, which
Prediction of Alzheimer's Disease Communication Systems
,pp. 715-719. (ICACCS), pp. 101-104.

M. H. Memon, J. Li, A. U. Haq ● T. Subetha, R. Khilar and S. K.


and M. Hunain Memon, "Early Sahoo, "An Early Prediction and
Stage Alzheimer’s Disease Detection of Alzheimer's Disease:
Diagnosis Method," 2019 16th A Comparative Analysis on
International Computer Various Assistive Technologies,"
Conference on Wavelet Active 2020 International Conference on
Media Technology on pp. 222- Computational Intelligence for
225. Smart Power System and
Sustainable Energy (CISPSSE),
● M. Y. Marusina and A. D. pp. 1-5.
Bukhalov, "Convolutional
Neural Networks for Early ● P. Jadhao, P. Palsodkar, R. Raut,
Prediction of Alzheimer's K. Chaube, D. Rathod and P.
Diseases”, 2021 International Palsodkar, "Prediction of Early
Conference on Quality Stage Alzheimer’s using Machine
Management, Transport and Learning Algorithm," 2023 4th
Information Security, International Conference for
Information Technologies Emerging Technology (INCET),
(IT&QM&IS), pp. 394-397. pp. 1-5.

● M. Lavanya, R. R. Chandan, P. ● R. Sivakani and G. A. Ansari,


Rajasekar, P. R. Rham, M. "Machine Learning Framework
Deivakani and A. S. Mahesh for Implementing Alzheimer’s
Kumar, "Machine Learning- Disease," 2020 International
based Alzheimer’s Disease Conference on Communication
Prediction using Personalized and Signal Processing (ICCSP),
Methods," 2022 3rd International doi: 10.1109.
Conference on Smart Electronics
and Communication (ICOSEC),
pp. 1278-1283.

● J. Neelaveni and M. S. G.
Devasana, "Alzheimer Disease
Prediction using Machine
Learning Algorithms," 2020
6th International Conference on
Advanced Computing and
