Technical Seminar Documentation (1) D8

The document summarizes the history of touchscreen technology. It details some of the early developments including the first capacitive touchscreen developed in 1965 and the first transparent touchscreen created by CERN engineers in the early 1970s. It also discusses the development of resistive touchscreens in the 1970s and optical touchscreens used in PLATO IV systems in the 1970s and 1980s. The summary provides an overview of the origins and progress of touchscreen technology.


TOUCHSCREEN TECHNOLOGY

REPORT
Seminar Report submitted in partial fulfilment of the requirements for the
award of the degree of B.Tech. at Vignan Institute of Technology & Science,
affiliated to

JNTU Hyderabad

By

DONURU SWETHA
20891A04D8

DEPARTMENT OF ELECTRONICS &


COMMUNICATION ENGINEERING
VIGNAN INSTITUTE OF TECHNOLOGY & SCIENCE

DECEMBER 2024

CERTIFICATE

This is to certify that the Seminar entitled TOUCHSCREEN TECHNOLOGY


presented by DONURU SWETHA bearing Registration No. 20891A04D8 of
Vignan Institute of Technology & Science in JNTUH has been completed
successfully.

This is in partial fulfilment of the requirements of the Bachelor degree in


Electronics & Communication Engineering, Vignan Institute of Technology &
Science under Jawaharlal Nehru Technological University Hyderabad,
Telangana.

MR MEENAIAH                                        DR P A HARSHA VARDHINI

Technical Seminar Coordinator                      Head of the Department

Acknowledgements

I would like to express my deep and sincere gratitude to my technical seminar
coordinator, Mr. Meenaiah, Professor in the ECE Department, Vignan Institute of
Technology & Science, for his unflagging support and continuous encouragement
throughout the seminar work. Without his guidance and persistent help, this report
would not have been possible.

I would like to thank our Head of the Department of Electronics and Communication
Engineering, Dr. P A Harsha Vardhini, a distinguished and eminent personality, whose
strong recommendation, immense support, and constant encouragement have been a great
help to me.

I convey my sincere thanks to Dr. G Durga Sukumar, Principal of our institution for
providing me with the required infrastructure and a very vibrant and supportive staff.
I thank our beloved Chairman, Dr. L Rathaiah, who gave us great encouragement to
work.

I thank our beloved CEO, Mr. Boyapati Shravan, for his valuable ideas and for the
facilities made available in the college during the development of this technical
seminar report.

D SWETHA
20891A04D8
ECE Department

Abstract

It was touch screens that initially created a great furore, but the days of fiddling
with a touch screen and ending up scratching it are numbered. Touch screen displays
are ubiquitous worldwide. Frequently touching a touchscreen display with a pointing
device such as a finger can result in the gradual de-sensitization of the touchscreen
to input and can ultimately lead to failure of the touchscreen. To avoid this, a simple
user interface for touchless control of electrically operated equipment is being
developed. Elliptic Labs' innovative technology lets you control gadgets such as
computers, MP3 players, or mobile phones without touching them.

It is a simple user interface for touchless control of electrically operated
equipment. Unlike other systems, which depend on distance to the sensor or on sensor
selection, this system depends on hand and/or finger motions: a hand wave in a certain
direction, a flick of the hand in one area, holding the hand in one area, or pointing
with one finger, for example. The device is based on optical pattern recognition using
a solid-state optical matrix sensor with a lens to detect hand motions. This sensor is
connected to a digital image processor, which interprets the patterns of motion and
outputs the results as signals to control fixtures, appliances, machinery, or any
device controllable through electrical signals.

TABLE OF CONTENTS
TITLE PAGE NO.

I. CERTIFICATE
II. ACKNOWLEDGEMENT

1. INTRODUCTION TO TOUCHSCREEN 6

2. HISTORY OF TOUCHSCREEN 7

3. WORKING OF TOUCHSCREEN 9

4. ADVANTAGES OF TOUCHSCREEN 12

5. DISADVANTAGES OF TOUCHSCREEN 13

6. INTRODUCTION TO TOUCHLESS TOUCHSCREEN 14

7. TOUCHLESS MONITOR 17

8. TOUCHWALL 20

9. WORKING OF TOUCHLESS TOUCHSCREEN 22

10. TOUCHLESS UI 26

11. MINORITY REPORT INSPIRED TOUCHLESS TECHNOLOGY 30

12. ADVANTAGES OF TOUCHLESS TOUCHSCREEN 35

13. APPLICATIONS 36

14. LIMITATIONS OF TOUCHLESS TOUCHSCREEN 37

15. CONCLUSION 38

16. FUTURE SCOPE 39

17. REFERENCES 40
1.INTRODUCTION TO TOUCHSCREEN

A touchscreen is an important input and output device, normally layered on top of the
electronic visual display of an information processing system. A user can give input
to, or control, the information processing system through simple or multi-touch
gestures by touching the screen with a special stylus and/or one or more fingers.
Some touchscreens work with ordinary or specially coated gloves, while others work
only with a special stylus or pen. The user can use the touchscreen to react to what
is displayed and to control how it is displayed; for example, zooming to increase the
text size. The touchscreen enables the user to interact directly with what is
displayed, rather than using a mouse, touchpad, or any other intermediate device
(other than a stylus, which is optional for most modern touchscreens).
Touchscreens are common in devices such as game consoles, personal
computers, tablet computers, electronic voting machines, point of sale systems, and
smartphones. They can also be attached to computers or, as terminals, to networks.
They also play a prominent role in the design of digital appliances such as personal
digital assistants (PDAs) and some e-readers.
The popularity of smartphones, tablets, and many types of information appliances is
driving the demand and acceptance of common touchscreens for portable and
functional electronics. Touchscreens are found in the medical field and in heavy
industry, as well as for automated teller machines (ATMs), and kiosks such as museum
displays or room automation, where keyboard and mouse systems do not allow a
suitably intuitive, rapid, or accurate interaction by the user with the display's content.
Historically, the touchscreen sensor and its accompanying controller-based
firmware have been made available by a wide array of after-market system integrators,
and not by display, chip, or motherboard manufacturers. Display manufacturers and
chip manufacturers worldwide have acknowledged the trend toward acceptance of
touchscreens as a highly desirable user interface component and have begun to
integrate touchscreens into the fundamental design of their products.
Optical touchscreens are a relatively modern development in touchscreen technology,
in which two or more image sensors are placed around the edges (mostly the corners)
of the screen. Infrared backlights are placed in the cameras' field of view on the
opposite side of the screen. A touch blocks some of the light from reaching the
cameras, and the location and size of the touching object can be calculated. This
technology is growing in popularity due to its scalability, versatility, and
affordability for larger touchscreens.

2.HISTORY OF TOUCHSCREEN

* E.A. Johnson of the Royal Radar Establishment, Malvern, described his work on
capacitive touchscreens in a short article published in 1965, and then more fully,
with photographs and diagrams, in an article published in 1967.

* The applicability of touch technology for air traffic control was described in an
article published in 1968.

* Frank Beck and Bent Stumpe, engineers from CERN, developed a transparent
touchscreen in the early 1970s, based on Stumpe's work at a television factory in the
early 1960s.

* Then manufactured by CERN, it was put to use in 1973.

* A resistive touchscreen was developed by American inventor George Samuel Hurst,
who received US patent #3,911,215 on October 7, 1975.

* The first version was produced in 1982.

* In 1972, a group at the University of Illinois filed for a patent on an optical


touchscreen that became a standard part of the Magnavox Plato IV Student Terminal.

* Thousands were built for the PLATO IV system. These touchscreens had a
crossed array of 16 by 16 infrared position sensors, each composed of an LED on one
edge of the screen and a matched phototransistor on the other edge, all mounted in
front of a monochrome plasma display panel.

* This arrangement can sense any fingertip-sized opaque object in close proximity
to the screen.

* A similar touchscreen was used on the HP-150 starting in 1983; this was one of
the world's earliest commercial touchscreen computers.

* HP mounted their infrared transmitters and receivers around the bezel of a 9"
Sony Cathode Ray Tube (CRT).

* In 1984, Fujitsu released a touch pad for the Micro 16, to deal with the complexity
of kanji characters, which were stored as tiled graphics.

* In 1985, Sega released the Terebi Oekaki, also known as the Sega Graphic
Board, for the SG-1000 video game console and SC-3000 home computer.

* It consisted of a plastic pen and a plastic board with a transparent window where
the pen presses are detected. It was used primarily for a drawing software application.

* A graphic touch tablet was released for the Sega AI Computer in 1986.

* Touch-sensitive Control-Display Units (CDUs) were evaluated for commercial aircraft
flight decks in the early 1980s.

* Initial research showed that a touch interface would reduce pilot workload as the
crew could then select waypoints, functions and actions, rather than be "head down"
typing in latitudes, longitudes, and waypoint codes on a keyboard.

* An effective integration of this technology was aimed at helping flight crews
maintain a high level of situational awareness of all major aspects of the vehicle's
operation, including its flight path, the functioning of various aircraft systems,
and moment-to-moment human interactions. Most user interface books of the time stated
that touchscreen selections were limited to targets larger than the average finger.

* As users touch the screen, feedback is provided as to what will be selected,


users can adjust the position of the finger, and the action takes place only when the
finger is lifted off the screen.

* This allowed the selection of small targets, down to a single pixel on a VGA
screen (the best standard of the time). Sears et al. (1990) gave a review of academic
research on single- and multi-touch human–computer interaction.
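The crossed infrared grid used by the PLATO IV terminals, described above, can be
sketched in a few lines. The following is an illustrative Python sketch, not PLATO IV
firmware: the function name and the convention that a beam reads False when an opaque
object blocks it are assumptions made for this example.

```python
# Illustrative sketch of a 16x16 crossed infrared touch grid: each beam is
# an LED/phototransistor pair, reading True when clear and False when an
# opaque object (a fingertip) blocks it.

def locate_touch(x_beams, y_beams):
    """Return the (x, y) cell of a touch, or None if no beam is blocked.

    x_beams / y_beams are 16-element lists of booleans. A finger blocks a
    run of adjacent beams, so we report the midpoint of each blocked run.
    """
    def centre(beams):
        blocked = [i for i, clear in enumerate(beams) if not clear]
        if not blocked:
            return None
        return sum(blocked) // len(blocked)  # midpoint of the blocked run

    x, y = centre(x_beams), centre(y_beams)
    if x is None or y is None:
        return None
    return (x, y)

# A fingertip over column 5 and rows 8-9:
x_beams = [i != 5 for i in range(16)]
y_beams = [i not in (8, 9) for i in range(16)]
print(locate_touch(x_beams, y_beams))  # (5, 8)
```

Because the grid only senses which beams are interrupted, any fingertip-sized opaque
object near the screen registers, which matches the behaviour described above.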

3.WORKING OF TOUCHSCREEN

Figure 3.1 Working of touchscreen

A resistive touchscreen panel comprises several layers, the most important of which
are two thin, transparent, electrically resistive layers facing each other and
separated by a thin gap. The top layer (the screen that is touched) has a resistive
coating on its underside; just beneath it is a similar resistive layer on top of its
substrate. One layer has conductive connections along its sides, the other along its
top and bottom.

A voltage is applied to one layer and sensed by the other. When an object, such as a
fingertip or stylus tip, presses down onto the outer surface, the two layers touch and
become connected at that point: the panel then behaves as a pair of voltage dividers,
one axis at a time. By rapidly switching between the layers, the position of the
pressure point on the screen can be read.
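The voltage-divider reading described above can be sketched as follows. This is a
hedged illustration: the drive voltage, ADC resolution, and function names are
assumed values for the example, not taken from any specific touch controller.

```python
# Hedged sketch of resistive-panel position sensing: each axis behaves as
# a voltage divider, so the touch coordinate is the fraction of the drive
# voltage sensed on the opposite layer. V_REF and ADC_MAX are illustrative.

V_REF = 3.3          # drive voltage applied across one resistive layer
ADC_MAX = 1023       # 10-bit ADC full-scale reading

def adc_to_fraction(adc_count):
    """Convert an ADC reading of the sensed layer to a 0..1 coordinate."""
    return adc_count / ADC_MAX

def read_touch(adc_x, adc_y, width_px, height_px):
    """Map two axis readings (taken by switching the drive between layers)
    to pixel coordinates on a width_px x height_px screen."""
    x = round(adc_to_fraction(adc_x) * (width_px - 1))
    y = round(adc_to_fraction(adc_y) * (height_px - 1))
    return (x, y)

# A touch at mid-screen horizontally, one quarter of the way down:
print(read_touch(512, 256, 640, 480))  # (320, 120)
```

The "one axis at a time" switching in the text corresponds to the two separate ADC
readings passed into read_touch.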

A capacitive touchscreen panel consists of an insulator such as glass,


coated with a transparent conductor such as indium tin oxide (ITO). As the human body
is also an electrical conductor, touching the surface of the screen results in a distortion
of the screen's electrostatic field, measurable as a change in capacitance. Different
technologies may be used to determine the location of the touch. The location is then
sent to the controller for processing.
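One common way a controller locates a capacitive touch, consistent with the
description above, is to scan a grid of electrodes and compare each reading against a
stored no-touch baseline. The sketch below is illustrative only; the grid size,
threshold, and function name are assumptions, not any vendor's firmware.

```python
# Illustrative sketch of baseline-subtraction touch detection on a grid of
# capacitance electrodes: the finger adds capacitance, so the electrode
# with the largest increase over the stored baseline is the touch point.

def find_touch(baseline, frame, threshold):
    """baseline / frame: 2D lists of capacitance counts per electrode.
    Return (row, col) of the strongest touch, or None if no electrode
    deviates from its baseline by more than threshold."""
    best, best_delta = None, threshold
    for r, (base_row, row) in enumerate(zip(baseline, frame)):
        for c, (b, f) in enumerate(zip(base_row, row)):
            delta = f - b          # a finger increases measured capacitance
            if delta > best_delta:
                best, best_delta = (r, c), delta
    return best

baseline = [[100] * 4 for _ in range(4)]
touched = [row[:] for row in baseline]
touched[2][1] += 40                  # finger over electrode (2, 1)
print(find_touch(baseline, touched, threshold=15))  # (2, 1)
```

The baseline subtraction is what makes the "distortion of the electrostatic field"
measurable as a change rather than an absolute value.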

Unlike a resistive touchscreen, one cannot use a capacitive touchscreen


through most types of electrically insulating material, such as gloves. This disadvantage
especially affects usability in consumer electronics, such as touch tablet PCs and
capacitive smartphones in cold weather. It can be overcome with a special capacitive
stylus, or a special-application glove with an embroidered patch of conductive thread
passing through it and contacting the user's fingertip.

3.2 TOUCH SENSOR

A touch screen sensor is a clear glass panel with a touch responsive surface.
The sensor generally has an electrical current or signal going through it and touching
the screen causes a voltage or signal change.

Figure 3.2 Touch Sensor

3.3 CONTROLLER

The controller is a small PC card that connects between the touch sensor and the
PC. The controller determines what type of interface/connection you will need on the

PC.

Figure 3.3 Controller

3.4 DRIVER

The driver is software that allows the touch screen and computer to work
together. Most touch screen drivers today are mouse-emulation drivers.

Figure 3.4 Diagrammatically working of touchscreen

4.ADVANTAGES OF TOUCHSCREEN

1. Direct pointing to the objects.

2. Fast.

3. Finger or pen is usable (No cable required).

4. No keyboard necessary.

5. Suited to novices and to applications such as information retrieval.

5.DISADVANTAGES OF TOUCHSCREEN

1. Low precision when using a finger.

2. The user has to sit or stand close to the screen.

3. The hand may cover part of the screen.

4. No direct activation of the selected function.

6.INTRODUCTION TO TOUCHLESS TOUCHSCREEN

Touchless control of electrically operated equipment is being developed by Elliptic
Labs. The system depends on hand or finger motions, such as a hand wave in a certain
direction, and the sensor can be placed either on the screen or near it. As with a
conventional touchscreen, the user interacts directly with what is displayed, rather
than using a mouse, touchpad, or any other intermediate device.


The touchless touch screen sounds like it would be nice and easy to use; however, on
closer examination it could be quite a workout. This unique screen is made by TouchKo,
White Electronic Designs, and Groupe 3D. The screen resembles the Nintendo Wii without
the Wii controller. With the touchless touch screen, your hand does not have to come
in contact with the screen at all; it works by detecting your hand movements in front
of it. This is a pretty unique and interesting invention, until you break out in a
sweat. This technology does not compare to the hologram-like IO2 Technologies
Heliodisplay M3, but that is for anyone who has $18,100 lying around.

Figure 6.1 Touchless touchscreen

You probably won't see this screen in stores any time soon. Everybody loves a touch
screen, and when you get a gadget with a touch screen the experience is really
exhilarating. When the iPhone was introduced, everyone felt the same. But gradually
the exhilaration started fading: using the phone with a fingertip or a stylus left the
screen covered in fingerprints and scratches. Even with a screen protector, dirty
marks over such a beautiful glossy screen are a strict no-no. The same thing happens
with the iPod Touch; most of the time we have to wipe the screen to get a better,
unobstructed view of it.

Thanks to Elliptic Labs' innovative technology, you can control gadgets such as
computers, MP3 players, or mobile phones without touching them. Simply point your
finger in the air towards the device and move it to control navigation on the device.
They term this a "touchless human/machine user interface for 3D navigation".

Figure 6.2 3D Navigation of Hand Movements in Touchless Screen

Touchless touch screen technology registers finger motions without any contact with
the screen: it needs only a hand wave in one direction or a flick of the hand in one
area. This also matters because if a conventional touchscreen display is cracked, the
device can no longer be operated by simply touching the display.

Modern electronic devices have been designed to allow many types of flexible
human–machine interfaces. One user-friendly form of navigation is interfacing with
the human hand, but implementations using conventional devices, such as additional
controllers or image sensors, require a larger size or a higher power. A small-size,
low-power 3D touchless hand navigation sensor (HNS) has recently been reported, and
its experimental results show it to be superior to other types of navigation sensors
in terms of size and power.

7.TOUCHLESS MONITOR

This monitor is made by TouchKo. With a touchless touchscreen, your hand does not
have to come in contact with the screen at all; the monitor works by detecting your
hand movements in front of it.

Sure, everybody is doing touchscreen interfaces these days, but this is the first
time I've seen a monitor that can respond to gestures without actually having to
touch the screen. The monitor, based on technology from TouchKo, was recently
demonstrated by White Electronic Designs and Tactyl Services at the CeBIT show.
Designed for applications where touch may be difficult, such as for doctors who might
be wearing surgical gloves, the display features capacitive sensors that can read
movements from up to 15 cm away from the screen. Software can then translate
gestures into screen commands.

Figure 7.1(a) Touch monitor

Touchscreen interfaces are great, but all that touching can be a little bit of a
drag. Enter the wonder kids from Elliptic Labs, who are hard at work on implementing
a touchless interface. The input method is, well, thin air. The technology detects
motion in 3D and requires no special worn sensors for operation. By simply pointing
at the screen, users can manipulate the object being displayed in 3D. Details are
light on how this actually functions, but what we do know is this:

It obviously requires a sensor, but the sensor is neither hand-mounted nor present
on the screen; it can be placed either on the table or near the screen. The hardware
setup is so compact that it can be fitted into a tiny device such as an MP3 player
or a mobile phone. It recognizes the position of an object from as far as 5 feet
away.

Figure 7.1(b) Touch monitor

Point your finger in the air towards the device and move it accordingly to control
navigation on the device. The monitor is designed for applications where touch may
be difficult, such as for doctors who might be wearing surgical gloves.

Figure 7.2 Doctor using Touchless Touchscreen

Touchless interaction with medical images lets surgeons maintain sterility during
surgical procedures. A gesture interface is developed for users, such as
doctors/surgeons, to browse medical images in a sterile medical environment.

8.TOUCHWALL

 Touch Wall is a multi-touch product from Microsoft.


 It refers to the touch screen hardware setup itself; the corresponding software
that runs Touch Wall, which is built on a standard version of Vista, is called Plex.
 Touch Wall and Plex are superficially similar to Microsoft Surface, a multi-touch
table computer that was introduced in 2007 and which recently became
commercially available in select AT&T stores.

Figure 8.1 Touch Wall

Touch Wall consists of three infrared lasers that scan a surface. By using a
projector, entire walls can easily be turned into a multi-touch user interface.

Figure 8.2 Kinect used for Touch Wall

To convert a table into a touch display, the Kinect sensor faces downwards from the
ceiling. On a wall, the Kinect sensor needs to be placed between one and two metres
from the surface. Ubi's "vision engine" then detects a person's finger or hand in
relation to the surface.
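The depth-camera idea behind such systems can be sketched simply: a touch is declared
wherever the live depth frame comes within a small band of a stored model of the bare
surface. The sketch below is an assumption-laden illustration, not Ubi's or
Microsoft's actual vision engine; the band limits and function name are invented for
the example.

```python
# Hedged sketch of depth-based touch detection on a wall or table: compare
# each depth pixel against a stored background model of the bare surface,
# and report pixels where something sits just in front of it.

TOUCH_BAND_MM = (5, 30)   # a fingertip hovering 5-30 mm off the surface counts

def detect_touches(surface_depth, frame):
    """surface_depth / frame: 2D lists of millimetre depths per pixel.
    Return the pixels where an object lies just in front of the surface."""
    lo, hi = TOUCH_BAND_MM
    hits = []
    for r, (srow, frow) in enumerate(zip(surface_depth, frame)):
        for c, (s, f) in enumerate(zip(srow, frow)):
            if lo <= s - f <= hi:   # closer to the camera than the wall
                hits.append((r, c))
    return hits

surface = [[2000] * 3 for _ in range(3)]   # wall 2 m from the sensor
frame = [row[:] for row in surface]
frame[1][2] = 1985                          # fingertip 15 mm off the wall
print(detect_touches(surface, frame))  # [(1, 2)]
```

This also explains the one-to-two-metre placement noted above: the sensor must see
the whole surface while keeping depth resolution fine enough for the touch band.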

9.WORKING OF TOUCHLESS TOUCHSCREEN

9.1 Block diagram

The system is capable of detecting movements in three dimensions without you ever
having to put your fingers on the screen. Sensors are mounted around the screen that
is being used; by interacting in the line of sight of these sensors, the motion is
detected and interpreted into on-screen movements. The device is based on optical
pattern recognition using a solid-state optical matrix sensor with a lens to detect
hand motions.

This sensor is then connected to a digital image processor, which interprets the
patterns of motion and outputs the results as signals to control fixtures,
appliances, machinery, or any device controllable through electrical signals. You
just point at the screen (from as far as 5 feet away), and you can manipulate objects
in 3D. The Touch Wall variant consists of three infrared lasers which scan a surface;
a camera notes when something breaks through a laser line and feeds that information
back to the Plex software.
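A minimal sketch of the "patterns of motion" step, under stated assumptions: this is
not Elliptic Labs' implementation, just an illustration of how a digital image
processor could turn two low-resolution optical-matrix frames into a hand-wave
direction by tracking the centroid of the changed pixels.

```python
# Illustrative motion interpretation for an optical matrix sensor: find the
# centroid of bright pixels in two successive frames and read the shift
# between centroids as a left/right hand wave.

def centroid(frame, threshold=50):
    """Centroid (row, col) of above-threshold pixels in a 2D list, or None."""
    pts = [(r, c) for r, row in enumerate(frame)
           for c, v in enumerate(row) if v > threshold]
    if not pts:
        return None
    return (sum(p[0] for p in pts) / len(pts),
            sum(p[1] for p in pts) / len(pts))

def wave_direction(prev_frame, next_frame):
    """Return 'left', 'right', or None from the centroid shift."""
    a, b = centroid(prev_frame), centroid(next_frame)
    if a is None or b is None:
        return None
    dc = b[1] - a[1]
    if dc > 0.5:
        return "right"
    if dc < -0.5:
        return "left"
    return None

blank = [[0] * 8 for _ in range(8)]
prev = [row[:] for row in blank]; prev[4][1] = 255   # hand at the left
nxt = [row[:] for row in blank]; nxt[4][5] = 255     # hand moved right
print(wave_direction(prev, nxt))  # right
```

The output of such a classifier is exactly the kind of signal the text describes
being routed to fixtures, appliances, or machinery.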

The Leap Motion controller is a sensor device that aims to translate hand movements
into computer commands. The controller itself is an eight-by-three-centimetre unit
that plugs into a USB port on a computer. Placed face up on a surface, the controller
senses the area above it and is sensitive to a range of approximately one metre. To
date it has been used primarily in conjunction with apps developed specifically for
the controller.

One factor contributing to the control issues is the lack of a given set of gestures,
or agreed meanings for different motion controls, when using the device. This means
that different motion controls may be used in different apps for the same action,
such as selecting an item on the screen.

Leap Motion is aware of some of the interaction issues with its controller and is
planning solutions, including the development of standardized motions for specific
actions and an improved skeletal model of the hand and fingers.

Figure 9.2 Touchless Touchscreen

9.3 Optical matrix sensor

This is based on optical pattern recognition using a solid state optical matrix sensor.
This sensor is then connected to a digital image processor, which interprets the patterns
of motion and outputs the results as signals. In each of these sensors there is a matrix
of pixels. Each pixel is coupled to photodiodes incorporating charge storage regions.

9.4 GBUI (Gesture-Based Graphical User Interface)

A gesture is a movement of part of the body, especially a hand or the head, to
express an idea or meaning; a gesture-based graphical user interface is driven by
such movements.

A Leap Motion controller was used by two members in conjunction with a laptop and the
Leap Motion software development kit. Initial tests were conducted to establish how
the controller worked and to understand basic interaction. The controller was then
tested for the recognition of sign language. The finger-spelling alphabet was used to
test the functionality of the controller; the alphabet was chosen for the relative
simplicity of individual signs and for the diverse range of movements involved.

Figure 9.4 GBUI symbols

The focus of these tests is to evaluate the capabilities and accuracy of the
controller in recognizing hand movements. This capability can now be discussed in
terms of the strengths and weaknesses of the controller.


10.TOUCHLESS UI

The basic idea described in the patent is that there would be sensors arrayed around
the perimeter of the device capable of sensing finger movements in 3-D space.

Figure 10.1 Touchless UI

Microsoft has demonstrated such a touchless UI at its Redmond headquarters, and it
involves lots of gestures which allow you to take applications and forward them on to
others with simple hand movements. The demos included the concept of software
understanding business processes and helping you work: after reading a document, you
could just push it off the side of your screen, and the system would know to post it
on an intranet and also send a link to a specific group of people.

10.2 Touchless SDK

The Touchless SDK is an open source SDK for .NET applications. It enables
developers to create multi-touch based applications using a webcam for input. Color
based markers defined by the user are tracked and their information is published
through events to clients of the SDK.
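The Touchless SDK itself is a .NET library, so the following is a language-neutral
Python sketch of its core idea only: finding the centre of a user-defined colour
marker in a webcam frame by thresholding on colour distance. The function name,
frame representation, and threshold are assumptions for illustration.

```python
# Illustrative colour-marker tracking, in the spirit of the Touchless SDK:
# pixels close (in RGB distance) to the user-defined marker colour are
# collected, and their centre becomes the tracked cursor position.

def track_marker(frame, marker_rgb, max_dist=60):
    """frame: 2D list of (r, g, b) pixels. Return the (row, col) centre of
    pixels close to marker_rgb, or None if the marker is not visible."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    pts = [(r, c) for r, row in enumerate(frame)
           for c, px in enumerate(row)
           if dist2(px, marker_rgb) <= max_dist ** 2]
    if not pts:
        return None
    return (sum(p[0] for p in pts) // len(pts),
            sum(p[1] for p in pts) // len(pts))

# A 4x6 frame that is grey except for a green marker near (1, 4):
grey, green = (128, 128, 128), (0, 200, 0)
frame = [[grey] * 6 for _ in range(4)]
frame[1][4] = green
print(track_marker(frame, marker_rgb=(0, 210, 10)))  # (1, 4)
```

Tracking one marker per colour is what lets a single webcam provide several
independent "touch" points at once.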

Figure 10.2(a) Touchless SDK

In a nutshell, the Touchless SDK enables touch without touching. Microsoft Office
Labs has released "Touchless," a webcam-driven multi-touch interface SDK that enables
"touch without touching." Using the SDK lets developers offer users "a new and cheap
way of experiencing multi-touch capabilities, without the need of expensive hardware
or software. All the user needs is a camera," which tracks the multi-colored objects
defined by the developer. Just about any webcam will work.

Figure 10.2(b) Digital matrix

Figure 10.2(c) Touch-less SDK

10.3 Touchless demo

The Touchless Demo is an open source application that anyone with a webcam can use
to experience multi-touch; no geekiness required.

The demo was created using the Touchless SDK and Windows Forms with C#. There are
four fun demos: Snake, where you control a snake with a marker; Defender, an
up-to-four-player version of a pong-like game; Map, where you can rotate, zoom, and
move a map using two markers; and Draw, where the marker is used to, you guessed it,
draw! Mike demonstrated Touchless at a recent Office Labs' Productivity Science Fair,
where attendees voted it the "most interesting project." If you wind up using the
SDK, the team would love to hear what use you make of it!

Figure 10.3 Touchless demo

11.MINORITY REPORT INSPIRED TOUCHLESS TECHNOLOGY

11.1 Tobii Rex

Tobii Rex is an eye-tracking device from Sweden which works with any computer
running Windows 8. The device has a pair of built-in infrared sensors that track the
user's eyes.

Figure 11.1 Tobii Rex

11.2 Elliptic Labs

Elliptic Labs allows you to operate your computer without touching it, via the
Windows 8 Gesture Suite.

Figure 11.2 Elliptic Labs

11.3 Airwriting

Airwriting is a technology that allows you to write text messages or compose
emails by writing in the air.

Figure 11.3 Airwriting

11.4 Eyesight

Eyesight is a gesture technology which allows you to navigate through your devices
by just pointing at them.

Figure 11.4 Eye Sight

11.5 MAUZ
MAUZ is a third-party device that turns your iPhone into a trackpad or mouse.

Figure 11.5 MAUZ

11.6 POINT GRAB


Point Grab is similar to Eyesight, in that it enables users to navigate on their
computer just by pointing at it.

Figure 11.6 Point Grab


11.7 LEAP MOTION
Leap Motion is a motion sensor device that recognizes the user's fingers with its
infrared LEDs and cameras.

Figure 11.7 Leap Motion

11.8 MYOELECTRIC ARMBAND


The Myoelectric armband, or MYO armband, is a gadget that allows you to control your
other Bluetooth-enabled devices using your fingers or hands.

Figure 11.8 Myoelectric armband


11.9 MICROSOFT KINECT

It detects and recognizes a user’s body movement and reproduces it within the
video game that is being played.

Figure 11.9 Microsoft Kinect

12.ADVANTAGES OF TOUCHLESS TOUCHSCREEN

 No de-sensitization of screen.

 Can be controlled from a distance.

 Useful for physically handicapped people.

 Easier and more satisfying experience.

 Gesturing and cursor positioning.

 No drivers required

13.APPLICATIONS
i. Touchless monitors

ii. Touchless user interfaces

iii. Touchless SDK

iv. Touchless demo

v. Mobile phones and tablets

vi. Games consoles

vii. Bank cash machines, station ticket machines

14.LIMITATIONS OF TOUCHLESS TOUCHSCREEN

 Proper ambience is required

 Public interaction has to be monitored

 Initial cost is very high

 Usable only in sophisticated environments

15.CONCLUSION
Touchless technology is still developing and has many future aspects: within a few
years, our body could become an input device. The touchless touch screen user
interface can be used effectively in computers, cell phones, webcams, and laptops. A
few years down the line, our bodies may be transformed into a virtual mouse or
virtual keyboard, turning the body itself into an input device.

It appears that while the device has potential, the API supporting the device is
not yet ready to interpret the full range of sign language. At present, the
controller can be used, with significant work, for the recognition of basic signs;
however, it is not appropriate for complex signs, especially those that require
significant face or body contact.

As a result of the significant rotation and line-of-sight obstruction of digits,
conversational signs become inaccurate and indistinguishable, making the controller
(at present) unusable for conversational signing. However, when addressing signs as
single entities, there is potential for them to be trained into artificial neural
networks.

16.FUTURE SCOPE
The future scope of touchless touchscreen technology holds potential for diverse
applications, including interactive displays, retail kiosks, healthcare interfaces, and
public spaces. Advancements may focus on improved gesture recognition, enhanced
user experience, and integration with emerging technologies like augmented reality.
Additionally, increased hygiene awareness may drive the adoption of touchless
interfaces in various industries.

17.REFERENCES

 K. O'Hara et al., "Touchless interaction in surgery", Communications of the ACM,
vol. 57, no. 1, pp. 70-77, 2014.

 J. Wachs, H. Stern, Y. Edan, M. Gillam, C. Feied, M. Smith, J. Handler, "Real-Time
Hand Gesture Interface for Browsing Medical Images", IC-MED, vol. 1, no. 3,
pp. 175-185.

 A. K. Jain, A. Ross, K. Nandakumar, Introduction to Biometrics, Springer, 2011.

 P. Peltonen, E. Kurvinen, A. Salovaara, G. Jacucci, T. Ilmonen, J. Evans,
A. Oulasvirta, P. Saarikko, "It's mine, don't touch!: interactions at a large
multi-touch display in a city centre", CHI '08: Proceedings of the twenty-sixth
annual SIGCHI conference on Human Factors in Computing Systems, ACM, pp. 1285-1294,
2008.

