FYP Thesis Template 2022
Project Title HERE
(Font: Times New Roman, Size: 24)
by
Name
AU-02-05-08491
Name
AU-02-05-08493
Supervised by
Name
This work, entitled
“Project Title” has been approved for the award of
Supervisor: Name
Intellectual Property Right Declaration
Anti-Plagiarism Declaration
This is to declare that the above publication, produced under the supervision of
Name and having the title Name, is the sole contribution of the author(s), and no
part hereof has been reproduced on an as-is basis (cut and paste) that could be
considered plagiarism. All referenced parts have been used to support the
argument and have been cited properly. I/We will be responsible and liable for
any consequence if a violation of this declaration is determined.
Author(s):
ACKNOWLEDGMENTS
In the name of Allah, the Most Gracious and the Most Merciful
Alhamdulillah, all praise to the Creator of the universe for the strength and His
blessings in completing this project. This study is nothing but an effort to
understand and articulate the principles of one of several hundred thousand
phenomena with a tool, the brain, a precious gift from the Almighty.
We would like to express our sincere gratitude to our advisor, Name, for the
continuous support of our BS project, and for her patience, motivation,
enthusiasm, and immense knowledge. Her guidance helped us throughout the
research and the writing of this thesis. We could not have imagined having a
better advisor and mentor for this project.
(Name)
Registration No
(Name)
Registration No
DEDICATION
We dedicate this project, for the sake of Allah, to our Creator and to our great
teacher and messenger, Muhammad (May Allah bless him and grant him peace), who
taught us the purpose of life; to our beloved supervisor, Name, who led us
through the valley of darkness with the light of hope and support; to our great
parents, who never stopped giving of themselves in countless ways; to our
friends, who encouraged and supported us; and to all the people in our lives who
helped us complete this project.
TABLE OF CONTENTS
ACKNOWLEDGMENTS v
DEDICATION vi
LIST OF FIGURES xi
ABSTRACT xiii
Chapter 1- INTRODUCTION 1
2.1.2.1 Clients, Customers and Users 10
3.2.1 Sensors 15
4.3.1 Use Case Diagram for User 24
Chapter 5 – IMPLEMENTATION 28
5.1 Introduction 28
5.1.1 28
5.1.2 28
5.2 Algorithm 30
5.3 Implementation 50
5.4 Output 52
5.4.1.3 Color Picker 54
6.1 Testing 60
6.4 Evaluation 64
7.1 Conclusion 65
APPENDICES 67
Flows in Node-Red 67
Arduino’s Code: 69
References 70
LIST OF FIGURES
Figure 2.1: Flow diagram 9
Figure 2.2: Illustration of Gestures 10
Figure 3.1: Manual Functioning 15
Figure 4.1: Architecture of Robo Arm 17
Figure 4.2: Working of Robo Arm 18
Figure 4.3: Program Flow 18
Figure 4.4: Arduino UNO 19
Figure 4.5: PCA9685 Arduino Servo Shield 20
Figure 4.6: HM10 Bluetooth 4.0 BLE 21
Figure 4.7: TCS3200 Colour Sensor module 22
Figure 4.8: 6DOF Mechanical Arm 23
Figure 4.9.1: Use Case Diagram for modes 24
Figure 4.10: Use Case Diagram for User 24
Figure 4.11: System Use Case Diagram 25
Figure 4.12: Class Diagram 26
Figure 4.13: Circuit Diagram 26
Figure 4.14.1: Sequence Diagram 27
Figure 4.15.2: System Sequence Diagram 27
Figure 5.1.: Node Red Interface 29
Figure 5.2: Node Red Interface Connectivity with Hardware 30
Figure 5.3: Node Red Interface 50
Figure 5.4: Node Red Interface 51
Figure 5.5: User Interface Control 51
Figure 5.6: Bluetooth Module Implementation 52
Figure 5.7: Sensor Based Implementation 52
Figure 5.8.1: Mode Selection Interface 53
Figure 5.9.2: Mode Selection Interface 53
Figure 5.10: Sensor Mode 54
Figure 5.11.1: Colour Picker 54
Figure 5.12.2: Colour Picker 55
Figure 5.13.1: Manual Mode 55
Figure 5.14.2: Manual Mode 56
Figure 5.15.3: Manual Mode 56
Figure 5.16: Components of the six DOF robotic arm kit 57
Figure 5.17: Robot arm screws and where to place them 57
Figure 5.18: Connecting the shoulder servo to the arm 58
Figure 5.19: Elbow connected to the arm 58
Figure 5.20: Inferior wrist servo (left) and superior wrist servo (right) 58
Figure 5.21: Gripper connected to the wrist 59
Figure 5.22: Assembled Robo Arm on a dashboard 59
Figure 5.23: Final Hardware 59
LIST OF TABLES
Table 6.3.1: Test Case 1 (Login) 61
Table 6.3.2: Test Case 2 (Manual Mode) 62
Table 6.3.3: Test Case 3 (Sensor Mode) 63
Table 6.3.4: Test Case 4 (Colour Picker) 64
ABSTRACT
The abstract should be about half a page long. Somebody unfamiliar with your
project should have a good idea of what it is about after reading the abstract
alone.
Keywords: (in alphabetical order)
Chapter 1 INTRODUCTION
Robotics technology deals with the design, development, and application of
robots. Robotics today focuses on developing systems that provide sensitivity,
modularity, flexibility, redundancy, and fault tolerance. Automation plays a
significant role in such systems by saving human effort, and it is particularly
useful for routine tasks that are carried out frequently. Vaishnav and Tiwari
(2015) noted that picking and placing objects from a source to a destination is
one of the most common tasks performed by robots, and that adding sensor-based
control to a robot arm in this rapidly developing industry reduces the manpower
required to complete the work. According to Khajone et al. (2015), automation
plays an important part in sparing human effort, and such systems are valuable
for routine, frequently repeated jobs.
Robo Arm is an articulated mechanical robotic arm that can be moved from a
mobile phone, either manually or through the phone's sensors. The user can
record a movement and play it back. The arm works in three modes:
Manual
Each of the six servos can be adjusted to a desired position.
Sensor Slaved
The arm follows the movement of the phone, using the phone's motion sensors.
Color Picker
The arm picks an object based on the selected color and operates automatically.
Gesture control is the core feature of the project. To operate the robotic arm,
the user grips the connected smartphone and rotates or moves the hand in any
direction. The accelerometer inside the phone is calibrated to produce minimal
values for hand movement in three-dimensional coordinates; this calibration
depends on the conditions of the external environment. The Android application
senses the accelerometer readings, derives the minimal values from them, and, on
the basis of the obtained values, sends the resulting control value to the
Arduino UNO over Bluetooth. The HM-10 Bluetooth 4.0 BLE module receives this
data and passes it to the Arduino UNO, which checks the control value and moves
the arm accordingly. The whole process runs in an infinite loop, so it keeps
running as long as power is supplied. Hence, the output depends on the computing
mechanism used to control the Robo Arm.
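To make this data flow concrete, a minimal Arduino-side sketch is given below. It assumes the HM-10 is wired to digital pins 2 and 3 through SoftwareSerial and that the app sends single-character direction commands; the wiring, baud rate, command letters, and servo pin are illustrative assumptions rather than the project's actual protocol.

// Minimal receive-and-move sketch: read direction commands from the HM-10 BLE
// module over SoftwareSerial and nudge the base servo accordingly.
// Assumptions (illustrative only): HM-10 TX -> D2, Arduino D3 -> HM-10 RX,
// 9600 baud, base servo signal on D9, single-character commands from the app.
#include <SoftwareSerial.h>
#include <Servo.h>

SoftwareSerial ble(2, 3);   // RX, TX (assumed wiring)
Servo baseServo;
int baseAngle = 90;         // start at the mid position

void setup() {
  ble.begin(9600);          // HM-10 default baud rate
  baseServo.attach(9);      // assumed servo pin
  baseServo.write(baseAngle);
}

void loop() {
  if (ble.available()) {
    char cmd = ble.read();
    switch (cmd) {
      case 'L': baseAngle = max(0, baseAngle - 5);   break;  // rotate left
      case 'R': baseAngle = min(180, baseAngle + 5); break;  // rotate right
      case 'S': break;                                       // stop: hold position
      default:  break;                                       // other letters -> other joints
    }
    baseServo.write(baseAngle);
  }
}

The same pattern extends to the remaining joints by mapping further command letters (or numeric values) to the other servos, which in the final hardware are driven through the PCA9685 servo shield described later.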
current work done. This document will describe all the processing, composition,
and working of the Robo Arm.
an accelerometer is additionally provided for the wrist and elbow movements. A
practical humanoid robotic arm with seven degrees of freedom has been designed;
however, the module used limits remote access to the arm to a few meters. In
another system, Brahmani et al. (2013) proposed and analyzed the performance of
a wireless robotic hand using flex sensors. The design and development of a
robotic arm with five fingers is used for leprosy patients, as it has a smaller
degree of dexterity. A robotic hand for tele-surgery using haptic technology
(Pawar and Lad, 2016) was also implemented, but the major limiting factor was
the time delay between instructions. Our project, however, deals with the
efficient working of the BLE Bluetooth module and with the proper operation of
the robotic arm through gesture, sensor, and color-picking modules.
drivers, motors, controlling elements, and sensors are the parameters that must
be considered. Recognizability, compatibility, implementation, and user
interfaces are the characteristics of software design for the robotic arm. In
this section of the documentation, previous work and research related to robots,
gesture control, and robotic arms is discussed.
Khajone et al. (2015) stated that Pedro Neo proposed a technique for controlling
industrial robots based on accelerometers. The accelerometers are attached to a
human operator's arms and the behavior is captured. This approach is easy to
work with, but the absence of a gyroscope in the system makes it inefficient.
Brahmani et al. (2013) used flex sensors to analyze the performance of a
wireless robotic hand. This solution was very precise and also cost effective,
and it aided in treating leprosy patients. The approach was also used in
tele-surgery through haptic techniques and technology, but its major issue was
that the time required for executing the instructions was high, and this delay
can lead to serious threats to a patient's life.
Wakode et al. (2019) focused on developing a robotic arm using flex sensors and
direct-current motors connected via an Arduino UNO. This kind of robotic arm is
easy to operate and cheap, and it is free from wired connections, but it still
requires enhancement and the addition of new ideas.
Vaishnav and Tiwari (2015) gave a detailed overview of robots that are operated
and controlled wirelessly by handicapped users through hand gloves via
Bluetooth. The input is supplied through sensors, an LCD display, and a
Bluetooth device, while the output is generated by an NXT microcontroller,
motors, and a camera. The Bluetooth used is the classic variant, which is
neither secure nor reliable.
Katal et al. (2013) discussed the process of designing and implementing a
synchronized robotic arm. This robotic arm performs the functions of picking and
placing objects, but unfortunately it does not contain any sensors.
Pawar and Lad (2016) discussed and proposed haptic technology as the basis for
designing a robotic arm. Potentiometers are responsible for providing the haptic
feedback, and the system is not sophisticated. However, this technology has the
issue that the feedback provided is not reliable.
In order to fill all of these gaps, we have proposed the Robo Arm. The Robo Arm
moves according to the movement or gestures of the hand, or by remotely
accessing it through the application. It is an articulated arm that can be
controlled via Bluetooth, hand gestures, and an Android application
simultaneously. The arm has three modes. First is the Manual mode, in which each
of the six servos can be adjusted to a desired position; you simply move the arm
over Bluetooth from your phone by moving the sliders in the app. The sliders
control rotation, clutching, lifting, the neck, the swing, and the elbow. At the
same time, you can save your positions and play them back later. Second is the
Motion mode, which uses your phone's orientation sensor and proximity sensor;
when they change, the app sends commands to the Arduino. Third is the
color-picking mode, in which the arm picks the object of the selected color
automatically.
The core module chosen is the HM-10 Bluetooth 4.0 BLE module. By using this
Bluetooth module, Bluetooth features can be added to the Arduino quickly. The
factory setting is slave mode, but the module can be set to master mode so that
other Bluetooth 4.0 devices can be connected. The Bluetooth 4.0 module receives
commands from the application on the Android device and sends them to the
controller.
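On the configuration side, a small serial-bridge sketch like the one below can be used to send the module's documented AT commands from the Arduino IDE Serial Monitor (for an HM-10: AT to ping, AT+NAME to rename, AT+ROLE0 for peripheral/slave and AT+ROLE1 for central/master). The pin wiring and baud rate are assumptions, and the exact AT command set should be checked against the module actually fitted.

// Serial bridge for configuring the BLE module with AT commands typed in the
// Arduino IDE Serial Monitor. Assumes the module is on pins 2/3 at 9600 baud.
#include <SoftwareSerial.h>

SoftwareSerial ble(2, 3);  // RX, TX (assumed wiring)

void setup() {
  Serial.begin(9600);      // PC <-> Arduino
  ble.begin(9600);         // Arduino <-> BLE module (default baud)
}

void loop() {
  // Forward keystrokes to the module and echo its responses back.
  if (Serial.available()) ble.write(Serial.read());
  if (ble.available())    Serial.write(ble.read());
}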
1.8 Scope of this Project
The Robo Arm is designed to help in surgery, in agricultural fields, in
industrial areas, and in bomb-squad rescue work. Doctors can carry out surgical
treatment of a patient even when they are far from the operation theater, simply
by controlling the robot through a hand-held application. In agricultural
fields, farmers can pluck flowers or fruits at a desired location. In industrial
areas, employees can pick and place things from one place to another through the
Robo Arm. In rescue work, the robotic arm can be used as a key element by a bomb
squad, carrying and placing a bomb at a safe location while the arm is handled
from far away. The main software tools used are:
• Node-RED
• Arduino
Chapter 2 SYSTEM ANALYSIS
2.1 Data Analysis
2.1.1 Flow Diagram
Figure 2.1: Flow diagram
The trainable Robo Arm is able to detect five gestures: right, left, forward,
backward, and stop. An illustration of these gestures is given below.
The arm should move over Bluetooth from a cell phone when the sliders in the
mobile app are moved.
Picking and Placement of Objects:
The robotic arm should pick and place objects based on a given color.
The robotic arm should be controllable from a long distance, such as from a
remote site; the communication range is 100 m.
The software user interface should be compatible with Android mobile phones and
should also be web based, so that users can control the robotic arm through it.
The software implementation can be done with software or programming tools such
as Node-RED, which supports Android and Windows application development in a
graphical development environment.
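As an illustration of how the colour-based requirement could be sensed, the sketch below reads the TCS3200 colour sensor (Figure 4.7) by selecting each colour filter in turn and timing the output pulse; the pin assignments, the 20% frequency scaling, and the simple smallest-reading comparison are assumptions for illustration, not the project's calibrated settings.

// Read the TCS3200 colour sensor: select each colour filter with S2/S3 and
// measure the output period with pulseIn(). A shorter pulse means more light
// of that colour. Pin numbers are assumed, not the project's actual wiring.
const int S0 = 4, S1 = 5, S2 = 6, S3 = 7, OUT_PIN = 8;

void setup() {
  Serial.begin(9600);
  pinMode(S0, OUTPUT); pinMode(S1, OUTPUT);
  pinMode(S2, OUTPUT); pinMode(S3, OUTPUT);
  pinMode(OUT_PIN, INPUT);
  digitalWrite(S0, HIGH);   // S0=HIGH, S1=LOW -> 20% output frequency scaling
  digitalWrite(S1, LOW);
}

unsigned long readColour(int s2, int s3) {
  digitalWrite(S2, s2);
  digitalWrite(S3, s3);
  delay(10);                       // let the output settle
  return pulseIn(OUT_PIN, LOW);    // pulse width in microseconds
}

void loop() {
  unsigned long red   = readColour(LOW,  LOW);
  unsigned long green = readColour(HIGH, HIGH);
  unsigned long blue  = readColour(LOW,  HIGH);
  // The smallest reading indicates the dominant colour of the object.
  if (red < green && red < blue)        Serial.println("RED object");
  else if (green < red && green < blue) Serial.println("GREEN object");
  else                                  Serial.println("BLUE object");
  delay(500);
}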
Hardware Interface
The Android mobile phone must have BLE (Bluetooth Low Energy) for lower power
consumption. The Arduino microcontroller is the interface used to control the
robotic arm. Node-RED lets the system run on the network, on low-cost hardware
such as a Raspberry Pi, and in the cloud as well.
Software Interface
The software interface requires a mobile application for the Manual and Gesture
modes. Node-RED is required as the browser-based flow editor for wiring the
flows together.
Communications Interfaces
● The user interface should be efficient and friendly.
● All the widgets of the mobile app should be user friendly.
● Both of the modes, i.e. Manual and Sensor, should be clearly visible for
better performance.
Node-RED
Node-RED is used because it gives an easy, browser-based way to set up the
wiring between the hardware components.
Sensors
Bluetooth
PCA9685 Arduino Servo Shield
The PCA9685 Arduino servo shield is used because it can control up to sixteen
servo motors at a time. Six servo motors are used, and every servo motor has
three wires, which altogether makes a large number of wires that would be
difficult to manage without the shield.
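A minimal sketch of how the six arm servos could be driven through the PCA9685 is shown below, using the Adafruit PWM Servo Driver library. The channel assignments and the 150–600 pulse-count limits are typical example values and would need tuning for the actual arm; the sweep in loop() is only a demonstration.

// Drive the arm servos through the PCA9685 16-channel PWM board over I2C.
// Requires the Adafruit PWM Servo Driver library.
#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

Adafruit_PWMServoDriver pwm = Adafruit_PWMServoDriver();  // default I2C address 0x40

const int SERVOMIN = 150;   // pulse count for ~0 degrees (of 4096 per 20 ms frame)
const int SERVOMAX = 600;   // pulse count for ~180 degrees

// Map an angle in degrees to a pulse count and send it to one channel.
void setServoAngle(uint8_t channel, int angle) {
  int pulse = map(angle, 0, 180, SERVOMIN, SERVOMAX);
  pwm.setPWM(channel, 0, pulse);
}

void setup() {
  pwm.begin();
  pwm.setPWMFreq(50);       // standard 50 Hz servo frame rate
  // Park all six joints (base, shoulder, elbow, wrist, wrist rotate, gripper).
  for (uint8_t ch = 0; ch < 6; ch++) {
    setServoAngle(ch, 90);
  }
}

void loop() {
  // Joint angles would normally come from the Bluetooth sliders; here the
  // base (channel 0) is simply swept back and forth as a demonstration.
  for (int a = 30; a <= 150; a += 5) { setServoAngle(0, a); delay(50); }
  for (int a = 150; a >= 30; a -= 5) { setServoAngle(0, a); delay(50); }
}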
Chapter 3 DESIGN CONSIDERATIONS
3.1 Design Constraints
3.1.1 Hardware and software environment
The Robo Arm uses an Android phone as well as a computing module such as
Node-RED for developing the application that operates the robotic arm. The
application connects to the Arduino UNO and provides connectivity via Bluetooth.
A mobile phone that supports an accelerometer and a gyroscope sensor supports
the application as well. When the user starts the application, Bluetooth is
connected, and through the Bluetooth module connection the movement of the arm
can be controlled smoothly.
Product
User
Organization
2. Saving manpower
3. Saving cost for the organization
4. Security organizations can use this robotic arm for bomb-squad work and save
human lives in case of danger
3.2.5 Future enhancements/plans
In the future, we will bring this wearable gadget to hospitals so that a doctor
can monitor a patient at home and save time. We also plan to add more
functionality to our product, such as human-movement and cough sensing, and to
build an Android application for the public.
Chapter 4 SYSTEM DESIGN
4.1.1 System Architecture and Program Flow
The architecture of the robotic arm is shown in the diagrams given below:
Figure 4.2: Working of Robo Arm
[Diagram: the user logs in through the interface, selects a mode (Manual,
Sensor, or Colour Picker), and performs actions; the system recognizes the
actions, does as directed, and returns the results.]
4.2 Detailed System Design
4.2.1 Detailed component description
(Explain all the modules or components of your project)
4.3.3 System Use Case Diagram
4.4 Class Diagram
Figure 4.14.1: Sequence Diagram
Chapter 5 IMPLEMENTATION
5.1 Introduction
Write how you have done the implementation of your project
5.2 Algorithm
Robo Arm is an articulated arm which can be controlled via Bluetooth, hand
gestures, and an Android or Windows application simultaneously. The algorithm
for this project is developed so that, first of all, the main application
performs the working of the Manual mode. In the Manual mode, each of the six
servos can be adjusted to a desired position, and the user can simply move the
arm over Bluetooth by moving the sliders in the mobile app. The sliders are
labelled rotate, clutch, lift, neck, swing, and elbow. At the same time, the
user can save positions and play them back in the future. Second is the Motion
mode, which uses the phone's orientation sensor and proximity sensor; when they
change, the app sends commands to the Arduino. Third is the color-picking mode,
in which the arm picks the object of the selected color automatically. By
utilizing the HM-10 Bluetooth 4.0 module, Bluetooth features can be added to the
Arduino quickly. The factory setting is slave mode, but the module can be set to
master mode so that other Bluetooth 4.0 devices can be connected. The Bluetooth
4.0 module receives commands from the application on the Android device and
sends them to the controller.
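A sketch of how the three modes could be dispatched on the Arduino side is given below. It assumes a simple text protocol in which the app sends a mode letter followed by two comma-separated values (for example M,3,120 to move servo 3 to 120 degrees in Manual mode); the message format, the helper runColorPick(), and the pulse limits are illustrative assumptions, not the project's actual implementation.

// Mode dispatcher: parse newline-terminated commands from the BLE link and act
// on them. The "<mode>,<arg1>,<arg2>" text protocol below is an assumption made
// for illustration, not the project's actual message format:
//   M,<servo>,<angle>  Manual mode: move one servo to an angle (app sliders)
//   S,<servo>,<angle>  Sensor mode: same action, values derived from phone sensors
//   C,<colour>,0       Colour Picker mode: run the automated pick sequence
#include <Wire.h>
#include <SoftwareSerial.h>
#include <Adafruit_PWMServoDriver.h>

SoftwareSerial ble(2, 3);                                  // RX, TX (assumed wiring)
Adafruit_PWMServoDriver pwm = Adafruit_PWMServoDriver();   // PCA9685 at default 0x40

void setServoAngle(uint8_t ch, int angle) {
  pwm.setPWM(ch, 0, map(angle, 0, 180, 150, 600));         // typical pulse limits
}

void runColorPick(int colourCode) {
  // Placeholder for the automated pick-and-place sequence keyed to colourCode.
  setServoAngle(5, 30);    // e.g. open the gripper
  delay(500);
  setServoAngle(5, 120);   // e.g. close the gripper on the object
}

void handleCommand(const String &line) {
  char mode = line.charAt(0);
  int c1 = line.indexOf(',');
  int c2 = line.indexOf(',', c1 + 1);
  int arg1 = line.substring(c1 + 1, c2).toInt();
  int arg2 = line.substring(c2 + 1).toInt();
  switch (mode) {
    case 'M':                                    // manual sliders
    case 'S': setServoAngle(arg1, arg2); break;  // sensor-slaved values
    case 'C': runColorPick(arg1);        break;  // automated colour pick
  }
}

void setup() {
  ble.begin(9600);
  pwm.begin();
  pwm.setPWMFreq(50);
}

void loop() {
  static String line;
  while (ble.available()) {
    char c = ble.read();
    if (c == '\n') { handleCommand(line); line = ""; }
    else           { line += c; }
  }
}

On the Node-RED side, the dashboard sliders and mode buttons would only need to emit strings in this same format over the serial or Bluetooth connection.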
5.3 Implementation
5.3.1 Main Screen of Node Red
Figure 5.5: User Interface Control
5.3.4 Sensor based Implementation
5.4 Output
5.4.1 Software Implementation:
5.4.1.1 Mode Selection
5.4.1.2 Sensor Mode
Figure 5.12.2: Colour Picker
Figure 5.14.2: Manual Mode
5.4.2 Final Hardware
Figure 5.18: Connecting the shoulder servo to the arm
Figure 5.20: Inferior wrist servo (left) and superior wrist servo (right)
Figure 5.21: Gripper connected to the wrist
Chapter 6 TESTING AND EVALUATION
6.1 Testing
After development and implementation, testing is one of the most demanding tasks
of the software development process. The testing phase involves evaluating all
the components of the software, making changes and improving the software where
required, and then finding errors and resolving them so that they cannot harm
the whole system. The basic objective of the testing phase is to find the
problems and gaps in the developed system that prevent the software from
fulfilling its requirements. There must be a planned testing strategy, followed
accurately, to ensure that the end product is efficient, stable, and reliable,
and that it is delivered on time. Hence, all the deadlines must be met for the
successful completion of the project.
6.3 Test Cases
6.3.1 Test Case 1 (Login)
Steps to Perform:
2. Go to login section
Test ID: 2
Objective: To test whether the system can be controlled by the user manually or
not.
Steps to Perform:
2. Go to Manual Mode
Steps to Perform:
Steps to Perform:
Expected Results: The robotic arm will pick the object of the chosen colour.
6.4 Evaluation
Chapter 7 CONCLUSION AND FUTURE WORK
7.1 Conclusion
Robotics technology research is about creating robots that provide quality,
adaptability, redundancy, and fault tolerance, while other researchers
concentrate on completely automating a manufacturing process or a task by giving
the robot sensor-based control. In this project, a robot is developed that works
according to the human hand in sensor-slaved mode. The robot shows an
appropriate response every time the user moves a hand. The hand gestures that
move the robotic arm in specific directions are left, right, forward, and
backward. We have used a pick-and-place technique, so the system picks objects
from a source location and places them at the anticipated location. A device can
be controlled in a more natural way when a hand-gesture-controlled robotic
system is used. The robotic arm follows commands based on the user's hand
gestures to navigate in the specified directions, so it can be controlled via
hand gestures instead of any other external input hardware. This project focuses
on implementing a robotic arm that is efficient, user friendly, and secure. For
this purpose, a gesture-controlled robotic arm has been designed and
implemented. All three modules of the project (manual, color picker, and app)
have been developed successfully. The software implementation is done through
Node-RED and Arduino to make the robotic arm work properly. The Robo Arm has
been built very carefully and in a detailed way so that the movement of the
robot can be controlled precisely. This arm-control strategy will be helpful in
many aspects of human life.
In the future, this robotic arm can be used for surveillance purposes. Instead
of using Bluetooth to access the robotic arm from a distant area, Wi-Fi can also
be used. An edge sensor can be incorporated into the robotic arm to keep it from
falling off a surface. A few cameras can be introduced into the system to record
information and send it to a nearby computer or mobile phone. Along with this,
another idea that can be implemented in the industrial field is the use of the
robotic arm with a conveyor belt for pick-and-place jobs in industrial areas.
APPENDICES
Flows in Node-Red:
Arduino’s Code:
References
Atre, P., Bhagat, S., Pooniwala, N., & Shah, P. (2018, June). Efficient and
feasible gesture controlled robotic arm. In 2018 Second
International Conference on Intelligent Computing and Control
Systems (ICICCS) (pp. 1-6). IEEE.
Brahmani, K., Roy, K. S., & Ali, M. (2013). Arm 7 based robotic arm control by
electronic gesture recognition unit using MEMS. International Journal of
Engineering Trends and Technology, 4(4), 50-63.
Chanda, P., Mukherjee, P., Modak, S., & Nath, A. (2016). Gesture controlled
robot using Arduino and Android. International Journal, 6(6).
Clabaugh, C., & Matarić, M. (2018). Robots for the people, by the people:
Personalizing human-machine interaction. Science Robotics, 3(21), eaat7451.
Guo, S., Diao, Q., & Xi, F. (2017). Vision based navigation for Omni-
directional mobile industrial robot. Procedia Computer
Science, 105, 20-26.
Ishak, M. K., Roslan, M. I., & Ishak, K. A. (2018). Design of robotic arm
controller based on Internet of Things (IoT). Journal of
Telecommunication, Electronic and Computer Engineering, 10(2-
3), 5-8.
Pascal, C., Raveica, L. O., & Panescu, D. (2018, October). Robotized application
based on deep learning and Internet of Things. In 2018 22nd International
Conference on System Theory, Control and Computing (ICSTCC) (pp. 646-651). IEEE.
Patil, C., Sharma, S., & Singh, S. (2019). Design and implementation of gesture
controlled robot with a robotic arm.
Tao, Z., Zhang, T., Qi, M., & Ji, J. (2017, May). Research and
Implementation of a new 6-DOF Light-weight Robot.
In Proceedings of the IOP Conference Series, Earth and
Environmental Science, Chengdu, China (pp. 26-28).
Yaseen, S., Fathima, S., Surendra, K. V., Jebran, P., & Sanober, S. (2017,
August). Gesture controlled touch less response using image
processing. In 2017 International Conference on Energy,
Communication, Data Analytics and Soft Computing (ICECDS)
(pp. 2610-2614). IEEE.