
ISSN XXXX XXXX © 2019 IJESC

Research Article Volume 9 Issue No. 4

Survey on Touchless Touch Screen Technology using Hand Gestures

Kavitha. G1, Veena. M2
Department of CSE
Sapthagiri College of Engineering, VTU, India
Abstract:
A touch screen is an input/output device layered on top of an electronic visual display. A user gives input or controls information processing through single- or multi-touch gestures by touching the screen. Frequent touching of a touchscreen display with a pointing device such as a finger, or scratches on the surface, can result in gradual de-sensitization of the screen. To avoid this, a simple user interface for touchless control of electrically operated equipment is being developed. Unlike other systems, the sensor detects hand or finger motions at a distance: a hand wave in a certain direction, a flick of the hand in one area, holding the hand in one area, or pointing with one finger.
1. INTRODUCTION

A touch screen is an input/output device layered on top of an electronic visual display. A user gives input or controls information processing through single- or multi-touch gestures by touching the screen. It enables the user to interact directly with what is displayed, rather than through any intermediate device. A touchless touch screen sounds as if it would be nice and easy, but on closer examination it turns out to be quite a workout. Such a screen has been built by TouchKo, White Electronics Designs and Group 3D. It works by detecting hand movements or hand waves in certain directions in front of it. To avoid malfunctions of the touch screen, a simple user interface for touchless control of electrically operated equipment is being developed. Elliptic Labs' innovative technology lets us control gadgets such as computers, MP3 players or mobile phones without touching them. Unlike other systems, sensor selection depends on the distance of hand or finger motions: a hand wave in a certain direction, a flick of the hand in one area, holding the hand in one area, or pointing with one finger, for example. The device is based on optical pattern recognition using a solid-state optical matrix sensor with a lens to detect hand motions and movements. The sensor matrix is connected to a digital image processor, which outputs the results as signals to control fixtures, appliances, machinery or any device controllable through electrical signals. Because the system depends on finger or hand motions and hand waves in a certain direction, the hand does not have to come into contact with the screen. It requires a sensor, which can be placed either near the screen or on a table. Elliptic Labs calls this the "Touchless Human or Machine User Interface for 3D Navigation". Touch screen technology uses an electronic visual display to detect the presence and location of a touch within the display area. The term generally refers to touching the screen of the device with a finger or hand; touch screens can also sense passive objects such as a stylus. Touch screens are common in devices such as computers, tablet computers and smartphones. Touch screen displays can be attached to computers or to networks as terminals, and they play a prominent role in the design of digital appliances such as personal digital assistants, satellite navigation devices, mobile phones and video games [1].

Human hand gestures provide a significant and effective means of non-verbal communication between human and computer. Hand gestures are meaningful, expressive body motions made with the fingers and hands. They range from simple static gestures used to point to surrounding objects, to more complex dynamic gestures that express a person's thoughts and allow communication with others. Several hand gesture recognition techniques already exist, and most of them are based on Hidden Markov Models [6]. A gesture is a physical movement, small or large, ranging from a clap of the hands to a roundhouse kick or a nod of the head; sometimes voice is also considered a gesture. Gesture recognition is the ability of a device to capture human body movements, compute the data or command given by the user and execute the output accordingly. Gestures are usually used as a form of input to a device, which makes human interaction with computers more natural. Gesture input is the most comfortable way to convey to the computer which command we want to execute. Gesture recognition is widely implemented in 3D gaming technology, virtual reality and simulation modelling environments [7].
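The HMM-based recognition mentioned above is not worked out in the paper; a minimal sketch of the idea, assuming the third-party hmmlearn library and made-up gesture feature sequences, could look like the following.

```python
# Minimal sketch of HMM-based gesture recognition (illustrative only).
# Assumes the hmmlearn library; gesture names and feature sequences are
# hypothetical and not taken from the paper.
import numpy as np
from hmmlearn import hmm

def train_gesture_models(training_data, n_states=4):
    """Fit one Gaussian HMM per gesture from lists of (T, d) feature arrays."""
    models = {}
    for gesture, sequences in training_data.items():
        X = np.vstack(sequences)                   # stack all frames
        lengths = [len(seq) for seq in sequences]  # per-sequence lengths
        model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag",
                                n_iter=50, random_state=0)
        model.fit(X, lengths)
        models[gesture] = model
    return models

def classify(models, sequence):
    """Return the gesture whose HMM gives the highest log-likelihood."""
    return max(models, key=lambda g: models[g].score(sequence))

# Hypothetical usage: 20-frame sequences of (x, y, z) fingertip positions.
rng = np.random.default_rng(0)
data = {
    "wave":  [rng.normal(0.0, 1.0, size=(20, 3)) for _ in range(5)],
    "flick": [rng.normal(2.0, 1.0, size=(20, 3)) for _ in range(5)],
}
models = train_gesture_models(data)
print(classify(models, rng.normal(2.0, 1.0, size=(20, 3))))  # likely "flick"
```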
2. LITERATURE SURVEY

User interfaces exist everywhere; a washing machine, for example, has various buttons on its top surface, and that is nothing but a UI. You can stop the machine or turn it on just by selecting the proper button on the UI. User interfaces are very handy where the user cannot actually go, for example in the chemical industry. A user interface should be easy to understand and speedy in operation, and no special training should be required just to understand it. If we consider smartphone UIs, they are easy to understand; even a small child can play with one and understand it. In this paper, different types of UI are explained and compared, why the touch screen is best is explained, different touch screen technologies are compared, and why the capacitive touch screen is preferred is explained. A UI is also designed and developed, and a few parameters such as threshold, response time and crosstalk are explained. The user interface, or UI, simply helps the user control the machine. UIs are classified as follows: button type, GUI type and touch type.

Touch-screen-based UIs are preferred because they are compact in size, speedy in operation, and the operation can be understood just from the symbol or label. Among touch screen technologies, infrared and capacitive are the most popular [2].

In recent years, mobile devices have become friendlier and more powerful. The trend in newly developed mobile devices is towards big screens and thinness, and intuitive operation input is gradually becoming an important topic. With the advance of PC and mobile device technology, more and more new human-machine interfaces are invented. The electronic mouse enables the user to control the position of the cursor on the screen and to give commands such as menu selection or editing a document. Touchscreen sensor technologies, including capacitive, resistive, magnetic or surface acoustic wave types, let the user directly point out a position on the screen and move objects across it. Some other devices, such as the air mouse, 3D mouse, or IR LED and proximity sensor, are applied to control the cursor by detecting movement in the air. The most intuitive way to interact with a device is to operate right at the screen. In that system, a proximity sensor is used to detect the infrared light from an IR LED. The operation area is limited to the space between the sensors, and the maximum operation distance from the device is affected by the emitting power of the IR LED. The proposed system allows a bare hand or finger to trace a screen position touchlessly at a certain distance from the device. In some situations this touchless control ability is useful; for example, when the hands are dirty after performing mechanical work, or greasy after handling food, users will still want to interact with a PC or mobile device touchlessly [4].
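Reference [4], like the conclusion of this survey, relies on combining the distance readings of several sensors to locate the hand. The paper does not give the positioning algorithm, but a simple least-squares sketch of that idea, with hypothetical sensor positions and readings, might be:

```python
# Illustrative least-squares positioning from per-sensor distance readings.
# Sensor layout and measurements are invented; the surveyed papers do not
# specify the actual positioning algorithm.
import numpy as np
from scipy.optimize import least_squares

sensors = np.array([[0.0, 0.0], [30.0, 0.0], [0.0, 20.0], [30.0, 20.0]])  # cm
measured = np.array([12.1, 21.5, 15.8, 23.9])  # distances to the hand, cm

def residuals(p):
    """Difference between predicted and measured sensor-to-hand distances."""
    return np.linalg.norm(sensors - p, axis=1) - measured

est = least_squares(residuals, x0=sensors.mean(axis=0)).x
print("estimated hand position (cm):", est)
```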
Computer information technology is increasingly making its way into the hospital domain. One hospital domain where information technology has become unavoidable is Interventional Radiology. Interventional Radiology (IR) is one of the rapidly growing areas of medicine that provides solutions to common problems affecting men and women of all ages. It is a minimally invasive treatment for vascular and non-vascular disease, using small catheters and catheter-based instruments guided by radiological imaging techniques such as x-rays, fluoroscopy, ultrasound, MRI and CT. These non-surgical techniques are advancing medicine and improving outcomes for a range of patients with life-threatening conditions. Interventional radiologists are physicians who specialize in minimally invasive, targeted treatments performed using imaging guidance. The interventional radiologist or surgeon performing such a procedure needs to interact frequently with an increasing number of computerized medical systems before and during surgeries in order to review medical images and records. However, computers and their peripherals are difficult to sterilize, so usually during surgery an assistant or nurse operates the mouse and keyboard for such interactions. This mouse-and-keyboard interaction suffers from communication problems and misunderstandings, which is one of the main reasons why, in recent years, touchless interaction has been considered for use in operating theatres [3].

Nowadays there is growing interest in creating interaction approaches for wearable devices. Technologies such as novel emerging user interfaces have the capacity to significantly affect market share in PCs, smartphones, tablets and the latest wearable devices such as head-worn devices (HWD), e.g. Google Glass, since the miniaturization of mobile computing devices permits anywhere access to information. Therefore, deploying these technologies in smart devices is becoming a hot topic. Google Glass has many impressive characteristics and does not suffer from the occlusion problem and the fat-finger problem that frequently occur in direct-touch control mode [5].

3. WORKING

As shown in the figure, IR sensors are mounted near the screen. When the emitted light strikes a 3D object such as the hand, it is reflected back. A solid-state optical matrix sensor with a lens recognizes the optical pattern of the hand motions from that reflected light. Each of these sensors contains a matrix of pixels, and each matrix pixel is coupled to photodiodes incorporating charge storage regions. The reflected IR light enters the sensor and hits the pixel matrix. When a photon of sufficient energy strikes a photodiode, it creates an electron-hole pair. If the absorption occurs in the depletion region, these carriers are swept from the junction by the built-in electric field of the depletion region: holes move towards the anode and electrons towards the cathode, and a current is produced from the accumulated charge. The current is given by I = Q/t, where I is the current, Q is the charge and t is the time. Thus the sensor generates electric signals, which are analog in form.
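As a rough illustration of the charge-to-signal step just described (the relation I = Q/t followed by analog-to-digital conversion), the following sketch uses made-up charge, integration-time and ADC values; only the relation itself comes from the text.

```python
# Toy illustration of I = Q/t and subsequent quantization by an ADC.
# All numerical values (charge, integration time, ADC range) are assumed
# for the example; they are not taken from the paper.
def pixel_current(charge_coulombs, integration_time_s):
    """Photocurrent from the charge accumulated by one pixel: I = Q / t."""
    return charge_coulombs / integration_time_s

def adc_code(current_a, full_scale_a=1e-6, bits=10):
    """Quantize the photocurrent into an n-bit digital code."""
    levels = 2 ** bits - 1
    return round(min(max(current_a / full_scale_a, 0.0), 1.0) * levels)

q = 2.4e-12   # charge collected in one frame (assumed), coulombs
t = 1e-3      # integration time (assumed), seconds
i = pixel_current(q, t)      # 2.4e-9 A
print(i, adc_code(i))        # photocurrent and its 10-bit ADC code
```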
These analog signals are converted into digital signals by an analog-to-digital converter (ADC) for further processing. The digital output of the ADC is given to the host controller (HC). The host controller controls the transmission of packets on the bus. Frames of 1 ms are used, and at the start of each frame the host controller generates a Start-of-Frame (SOF) packet; the SOF packet is used to synchronize the start of the frame and to keep track of the frame number. The host controller also handles the depth map, i.e. an image that contains information about the distance of the surfaces of scene objects from a viewpoint. The host controller gives its output to the sequence controller. The sequence controller manages the user actions and computer logic that initiate, interrupt or terminate a transaction; it allows users to take the initiative and control their interaction with the computer, and tries to anticipate user requirements and provide appropriate control options and responses in all cases. The output of the sequence controller is given to both the pixel matrix and the modulator for controlling the action. The digital modulator maps the input binary sequence of 1s and 0s to an analog signal waveform, modulating the digital output of the sequence controller. The 3D movements are thus detected and interpreted into electric signals, which are processed by the digital image processor to provide output to the devices, controlling navigation according to the user's hand gestures. In this way the touchless screen technology works.
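The paper describes this control flow only in words. A simplified sketch of the 1 ms frame handling with Start-of-Frame packets and a sequence controller, using invented class and method names, might look like this:

```python
# Simplified sketch of the 1 ms frame / SOF handling described in the text.
# Class and method names are invented for illustration; the paper only
# describes the behaviour, not an implementation.
from dataclasses import dataclass

FRAME_PERIOD_MS = 1  # the text states that 1 ms frames are used

@dataclass
class SOFPacket:
    frame_number: int  # used to synchronize and track frames

class HostController:
    def __init__(self):
        self.frame_number = 0

    def start_frame(self):
        """Emit a Start-of-Frame packet at the beginning of each frame."""
        packet = SOFPacket(self.frame_number)
        self.frame_number = (self.frame_number + 1) % 2048
        return packet

class SequenceController:
    def handle(self, adc_samples):
        """Decide which sensor region the digitized samples point to."""
        # Placeholder logic: treat the strongest sample as the active region.
        return max(range(len(adc_samples)), key=adc_samples.__getitem__)

host = HostController()
seq = SequenceController()
for adc_samples in ([3, 9, 1], [7, 2, 5]):   # two hypothetical frames
    sof = host.start_frame()
    active = seq.handle(adc_samples)
    print(f"frame {sof.frame_number}: active sensor index {active}")
```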
4. PROPOSED METHOD

(i) Gesture of Hand Movements:
The system depends on finger or hand motions and hand waves in a certain direction; the hand does not have to come into contact with the screen. A Leap Motion controller is used to translate hand movements into computer commands. Initial tests were conducted to establish how the controller worked and to understand its basic interaction, and the controller was used to test the recognition of sign language. The finger-spelling alphabet was chosen for the relative simplicity of its individual signs and for the diverse range of movements involved. The main focus of these tests is to evaluate the capabilities and accuracy of the controller in recognizing hand movements; there is a particular meaning for each different motion or gesture.
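How recognized movements are turned into commands is not shown in the paper; a minimal, hypothetical mapping in the spirit of the Leap Motion tests above could be:

```python
# Hypothetical mapping from recognized hand gestures to device commands.
# Gesture labels and commands are illustrative; the paper only states that
# the Leap Motion controller translates hand movements into commands.
GESTURE_COMMANDS = {
    "swipe_left":  "previous_page",
    "swipe_right": "next_page",
    "hold":        "select",
    "point":       "move_cursor",
}

def execute(gesture: str) -> str:
    """Look up the command for a recognized gesture; ignore unknown gestures."""
    return GESTURE_COMMANDS.get(gesture, "no_op")

for g in ("swipe_left", "hold", "unknown"):
    print(g, "->", execute(g))
```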
(ii) Detection of Movements using Sensors:
Sensors are placed around the display in use; by interacting in the line of sight of these sensors, the motion is detected and interpreted into on-screen movements. The device is based on optical pattern recognition using a solid-state optical matrix sensor with a lens to detect hand motions. This sensor is connected to a digital image processor, which interprets the patterns of motion and movement and outputs the result as signals. Each of these sensors contains a matrix of pixels, and each pixel in the matrix is coupled to photodiodes incorporating charge storage regions. The reflected IR light enters the sensor and hits the pixel matrix. When a photon of sufficient energy strikes a photodiode, it creates an electron-hole pair; if the absorption occurs in the depletion region, these carriers are swept from the junction by the built-in electric field of the depletion region. Holes move towards the anode and electrons towards the cathode, and a current is produced which results in the electric charge. Thus the sensor generates electric signals.

(iii) Convert Electric Signals using DIP:
The digital modulator maps the input binary sequence of 1s and 0s to an analog signal waveform, modulating the digital output of the sequence controller. The 3D movements are thus detected and interpreted into electric signals, which are processed by the digital image processor (DIP) to provide output to the devices, controlling navigation according to the user's hand gestures.

5. RESULTS

The digital modulator maps the input binary sequence of 1s and 0s to an analog signal waveform and modulates the digital output of the sequence controller. The 3D movements are detected and interpreted into electric signals, which are processed by the digital image processor to provide output to the devices and to output the results as signals controlling fixtures, appliances, machinery or any other devices controllable through electrical signals. The standard deviations of the x-axis and y-axis positions are 0.06 cm and 0.024 cm respectively at a fixed reference point; the positioning accuracy is affected by undesired objects.
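The reported spread would come from repeated position estimates at a fixed reference point; for example (the sample values below are invented, only the 0.06 cm and 0.024 cm figures come from the paper):

```python
# Example of how the reported positioning spread could be computed from
# repeated (x, y) estimates at a fixed reference point. Sample values are
# invented; only the 0.06 cm / 0.024 cm figures come from the paper.
import numpy as np

samples = np.array([
    [10.02, 5.01], [9.95, 4.99], [10.08, 5.02],
    [9.97, 5.00], [10.04, 4.98],
])  # repeated (x, y) position estimates in cm

std_x, std_y = samples.std(axis=0)
print(f"std dev x = {std_x:.3f} cm, std dev y = {std_y:.3f} cm")
```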
6. CONCLUSION

Today's thinking once again centres on the user interface, and efforts are being made to improve the technology day in and day out. The touchless screen user interface can be used effectively in computers, cell phones, webcams and laptops. A few years down the line, our body may be transformed into a virtual mouse or virtual keyboard; our body may be turned into an input device. Distance information can be obtained directly between each sensor and the hand, and since the positioning algorithm fully utilizes the distance information from all sensors, the positioning results for the hand are much more reliable.

7. REFERENCES

[1]. Anjul Jain, Diksha Bhargava, Anjani Rajput, "Touch-Screen Technology", International Journal of Advanced Research in Computer Science and Electronics Engineering (IJARCSEE), Volume 2, Issue 1, January 2013.

[2]. Nikita S. Ranadive, A. R. Suryawanshi, Ajinkya Gautame, "Effective Touch Screen Based User Interface", International Journal of Engineering Technology Science and Research (IJETSR), Volume 4, Issue 5, May 2017.

[3]. K. R. Sivaramakrishnan, Gattamaneni Kumar Raja, Chegu Girish Kumar, 2015 International Conference on Automation, Cognitive Science, Optics, Micro Electro-Mechanical System, and Information Technology (ICACOMIT), Bandung, Indonesia, October 29-30, 2015.

[4]. Cheng-Ta Chuang, Tom Chang, Pei-Hung Jau, Fan-Ren Chang, "Touchless Positioning System Using LED Sensors", 2014 IEEE International Conference on System Science and Engineering (ICSSE), July 11-13, 2014, Shanghai, China.

[5]. Zhihan Lv, Liangbing Feng, "Extending Touch-less Interaction on Vision Based Wearable Device", IEEE Virtual Reality Conference 2015.

[6]. S. Prakasam, M. Venkatachalam, M. Saroja, N. Pradheep, "Gesture Recognition Using a Touchless Sensor to Reduce Driver Distraction", International Research Journal of Engineering and Technology (IRJET), Volume 3, Issue 9, September 2016.

[7]. Minal Almas D., Sana L., Sunitha S., "Gesture Recognition Using a Proximity (Touchless) Sensor and Haptic Technology", International Journal of Engineering Research in Computer Science and Engineering (IJERCSE), Volume 5, Issue 4, April 2018.
