Technical Seminar Documentation (1) D8
REPORT
Seminar Report submitted in partial fulfilment of the requirements for the
award of the degree of B.Tech. by Vignan Institute of Technology & Science
affiliated to
JNTU Hyderabad
By
DONURU SWETHA
20891A04D8
DECEMBER-2024.
CERTIFICATE
Acknowledgements
I would like to thank our Head of the Department of Electronics and Communication
Engineering, Dr. P A Harsha Vardhini, a distinguished and eminent personality, whose
strong recommendation, immense support, and constant encouragement have been of great
help to me.
I convey my sincere thanks to Dr. G Durga Sukumar, Principal of our institution for
providing me with the required infrastructure and a very vibrant and supportive staff.
I thank our beloved Chairman, Dr. L Rathaiah, who gave us great encouragement to
work.
I thank our beloved CEO, Mr. Boyapati Shravan, for his valuable ideas and for the
facilities made available in the college during the development of this technical seminar
report.
D SWETHA
20891A04D8
ECE Department
Abstract
It was touchscreens that initially created a great furore, and touchscreen displays
are now ubiquitous worldwide. But gone are the days when you have to fiddle with a
touchscreen and end up scratching it. Frequently touching a touchscreen display with a
pointing device such as a finger can result in the gradual desensitization of the
touchscreen to input and can ultimately lead to failure of the touchscreen. To avoid
this, a simple user interface for touchless control of electrically operated equipment
is being developed. Elliptic Labs' innovative technology lets you control gadgets such
as computers, MP3 players, or mobile phones without touching them.
TABLE OF CONTENTS
TITLE
I. CERTIFICATE
II. ACKNOWLEDGEMENT
1. INTRODUCTION TO TOUCHSCREEN
2. HISTORY OF TOUCHSCREEN
3. WORKING OF TOUCHSCREEN
4. ADVANTAGES OF TOUCHSCREEN
5. DISADVANTAGES OF TOUCHSCREEN
6. INTRODUCTION TO TOUCHLESS TOUCHSCREEN
7. TOUCHLESS MONITOR
8. TOUCHWALL
9. WORKING OF TOUCHLESS TOUCHSCREEN
10. TOUCHLESS UI
11. GESTURE RECOGNITION TECHNOLOGIES
12. ADVANTAGES OF TOUCHLESS TOUCHSCREEN
13. APPLICATIONS
14. LIMITATIONS OF TOUCHLESS TOUCHSCREEN
15. CONCLUSION
16. FUTURE SCOPE
17. REFERENCES
1.INTRODUCTION TO TOUCHSCREEN
A touchscreen is both an input and an output device, normally layered on top of the
electronic visual display of an information processing system. A user can give input to,
or control, the information processing system through simple or multi-touch gestures by
touching the screen with a special stylus and/or one or more fingers.
Some touchscreens use ordinary or specially coated gloves to work while others use a
special stylus/pen only. The user can use the touchscreen to react to what is displayed
and to control how it is displayed; for example, zooming to increase the text size. The
touchscreen enables the user to interact directly with what is displayed, rather than
using a mouse, touchpad, or any other intermediate device (other than a stylus, which is
optional for most modern touchscreens).
Touchscreens are common in devices such as game consoles, personal
computers, tablet computers, electronic voting machines, point of sale systems, and
smartphones. They can also be attached to computers or, as terminals, to networks.
They also play a prominent role in the design of digital appliances such as personal
digital assistants (PDAs) and some e-readers.
The popularity of smartphones, tablets, and many types of information appliances is
driving the demand and acceptance of common touchscreens for portable and
functional electronics. Touchscreens are found in the medical field and in heavy
industry, as well as for automated teller machines (ATMs), and kiosks such as museum
displays or room automation, where keyboard and mouse systems do not allow a
suitably intuitive, rapid, or accurate interaction by the user with the display's content.
Historically, the touchscreen sensor and its accompanying controller-based
firmware have been made available by a wide array of after-market system integrators,
and not by display, chip, or motherboard manufacturers. Display manufacturers and
chip manufacturers worldwide have acknowledged the trend toward acceptance of
touchscreens as a highly desirable user interface component and have begun to
integrate touchscreens into the fundamental design of their products.
Optical touchscreens are a relatively modern development in touchscreen
technology, in which two or more image sensors are placed around the edges (mostly
the corners) of the screen. Infrared backlights are placed in the camera's field of view on
the opposite side of the screen. A touch blocks some of the light reaching the cameras,
and the location and size of the touching object can then be calculated. This technology is growing in
popularity due to its scalability, versatility, and affordability for larger touchscreens.
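The location calculation mentioned above amounts to simple triangulation: if two cameras in the top corners each report the angle at which the touch blocks their view, the intersection of the two sight lines gives the touch point. The following is an illustrative sketch only; the function name and coordinate conventions are assumptions, not from any particular product.

```python
import math

def locate_touch(angle_left, angle_right, screen_width):
    """Triangulate a touch point from two corner cameras.

    angle_left  -- angle (radians) between the top edge and the sight line
                   from the camera at the top-left corner (0, 0)
    angle_right -- same, from the camera at the top-right corner (screen_width, 0)
    Returns (x, y), with y measured downward from the top edge.
    """
    ta, tb = math.tan(angle_left), math.tan(angle_right)
    # Sight lines: y = x * ta (left camera) and y = (screen_width - x) * tb
    # (right camera); solving the pair gives the intersection point.
    x = screen_width * tb / (ta + tb)
    y = x * ta
    return x, y
```

For example, a touch at (30, 40) on a 100-unit-wide screen subtends angles atan(40/30) and atan(40/70) at the two corners, and the function recovers that point.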
2.HISTORY OF TOUCHSCREEN
* E.A. Johnson of the Royal Radar Establishment, Malvern, described his work on
capacitive touchscreens in a short article published in 1965, and then more fully, with
photographs and diagrams, in an article published in 1967.
* The applicability of touch technology for air traffic control was described in an
article published in 1968.
* Frank Beck and Bent Stumpe, engineers from CERN, developed a transparent
touchscreen in the early 1970s, based on Stumpe's work at a television factory in the
early 1960s.
* Thousands were built for the PLATO IV system. These touchscreens had a
crossed array of 16 by 16 infrared position sensors, each composed of an LED on one
edge of the screen and a matched phototransistor on the other edge, all mounted in
front of a monochrome plasma display panel.
* This arrangement can sense any fingertip-sized opaque object in close proximity
to the screen.
* A similar touchscreen was used on the HP-150 starting in 1983; this was one of
the world's earliest commercial touchscreen computers.
* HP mounted their infrared transmitters and receivers around the bezel of a 9"
Sony Cathode Ray Tube (CRT).
* In 1984, Fujitsu released a touch pad for the Micro 16, to deal with the complexity
of kanji characters, which were stored as tiled graphics.
* In 1985, Sega released the Terebi Oekaki, also known as the Sega Graphic
Board, for the SG-1000 video game console and SC-3000 home computer.
* It consisted of a plastic pen and a plastic board with a transparent window where
the pen presses are detected. It was used primarily for a drawing software application.
* A graphic touch tablet was released for the Sega AI Computer in 1986. Touch-
sensitive Control-Display Units (CDUs) were evaluated for commercial aircraft flight
decks in the early 1980s.
* Initial research showed that a touch interface would reduce pilot workload as the
crew could then select waypoints, functions and actions, rather than be "head down"
typing in latitudes, longitudes, and waypoint codes on a keyboard.
* This allowed the selection of small targets, down to a single pixel on a VGA
screen (the best standard of the time). Sears et al. (1990) gave a review of academic
research on single- and multi-touch human–computer interaction.
3.WORKING OF TOUCHSCREEN
A voltage is applied to one layer and sensed by the other. When an object,
such as a fingertip or stylus tip, presses down onto the outer surface, the two layers
touch and become connected at that point. The panel then behaves as a pair of voltage
dividers, one axis at a time. By rapidly switching between the layers, the position of a
press on the screen can be read.
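The voltage-divider reading described above can be sketched in code as follows. The ADC resolution, calibration constants, and screen dimensions are made-up illustrative values, not taken from any specific controller.

```python
def adc_to_pixel(raw, raw_min, raw_max, screen_px):
    """Map one axis's voltage-divider ADC reading to a screen coordinate.

    raw      -- ADC sample taken while this axis's layer is being driven
    raw_min  -- ADC value observed when touching one edge (calibration)
    raw_max  -- ADC value observed at the opposite edge (calibration)
    """
    ratio = (raw - raw_min) / (raw_max - raw_min)   # position as a fraction
    ratio = min(max(ratio, 0.0), 1.0)               # clamp to the panel edges
    return round(ratio * (screen_px - 1))

def read_touch(sample_x, sample_y):
    """Read both axes in turn, as the controller alternates driven layers."""
    # Illustrative 10-bit ADC with calibration margins at 50 and 980 counts,
    # mapped onto an assumed 800x480 display.
    x = adc_to_pixel(sample_x, 50, 980, 800)
    y = adc_to_pixel(sample_y, 50, 980, 480)
    return x, y
```

Calibration is needed because the resistive layers never reach the ADC's full range at the panel edges; the clamp keeps slightly out-of-range samples on screen.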
3.2 TOUCH SENSOR
A touch screen sensor is a clear glass panel with a touch responsive surface.
The sensor generally has an electrical current or signal going through it and touching
the screen causes a voltage or signal change.
Figure 3.2 Touch Sensor
3.3 CONTROLLER
The controller is a small PC card that connects between the touch sensor and the
PC. The controller determines what type of interface/connection you will need on the
PC.
3.4 DRIVER
The driver is software that allows the touch screen and computer to work
together. Most touch screen drivers today are mouse-emulation type drivers.
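A mouse-emulation driver of the kind just described can be sketched as a small state machine that turns touch reports into mouse-style events. The class and event names below are illustrative inventions, not a real driver API.

```python
class MouseEmulationDriver:
    """Translate touch reports into mouse-like events (illustrative sketch)."""

    def __init__(self):
        self.down = False
        self.events = []   # stand-in for delivering events to the OS

    def report(self, touched, x=None, y=None):
        if touched and not self.down:
            # First contact: warp the cursor there, then press the button.
            self.down = True
            self.events.append(("move", x, y))
            self.events.append(("button_down",))
        elif touched and self.down:
            self.events.append(("move", x, y))   # drag while held down
        elif not touched and self.down:
            self.down = False
            self.events.append(("button_up",))
```

A tap at (10, 20) thus produces a move, a button press, and a release, which is why ordinary point-and-click software works unmodified behind such a driver.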
4.ADVANTAGES OF TOUCHSCREEN
1. Fast.
2. No keyboard necessary.
5.DISADVANTAGES OF TOUCHSCREEN
6.INTRODUCTION TO TOUCHLESS TOUCHSCREEN
The touchless touchscreen sounds like it would be nice and easy to use; however, on
closer examination it looks like it could be quite a workout. This unique screen is made
by TouchKo, White Electronics Designs, and Groupe 3D. The screen resembles the
Nintendo Wii without the Wii controller. With the touchless touchscreen, your hand
doesn't have to come in contact with the screen at all; it works by detecting your hand
movements in front of it. This is a pretty unique and interesting invention, until you
break out in a sweat. This technology doesn't compare to the hologram-like IO2
Technologies Heliodisplay M3, but that is for anyone who has $18,100 lying around.
Figure 6.1 Touchless touchscreen
You probably won't see this screen in stores any time soon. Everybody loves a
touch screen, and when you get a gadget with one the experience is really exhilarating.
When the iPhone was introduced, everyone felt the same. But gradually, the exhilaration
started fading: with use of a fingertip or stylus, the screen started collecting lots of
fingerprints and scratches. Even with a screen protector, dirty marks on such a
beautiful glossy screen are a strict no-no. The same thing happens with the iPod Touch.
Most of the time we have to wipe the screen to get a clearer, unobstructed view.
Thanks to Elliptic Labs' innovative technology, you can control gadgets such as
computers, MP3 players, or mobile phones without touching them. Simply point your
finger in the air towards the device and move it accordingly to control navigation on
the device. They term this a "Touchless human/machine user interface for 3D
navigation".
Figure 6.2 3D Navigation of Hand Movements in Touchless Screen
Touchless touchscreen technology responds to finger motions without any contact with
the screen: a simple hand wave in one direction, or a flick of the hand over one area,
is enough. Unlike a conventional touchscreen, which cannot be operated by touch once
the display is cracked, a touchless interface does not rely on physical contact.
Modern electronic devices have been designed to allow many types of flexible human–
machine interfaces. One user-friendly form of navigation is interfacing with a human
hand, but implementations using conventional devices, such as additional controllers
or image sensors, require a larger size or higher power. One published design presents
a small, low-power 3D touchless hand navigation sensor (HNS); its experimental results
show that the proposed HNS is superior to other types of navigation sensors in terms
of size and power.
7.TOUCHLESS MONITOR
This monitor is made by TouchKo. With a touchless touchscreen, your hand doesn't have
to come in contact with the screen at all; it works by detecting your hand movements in
front of it.
Sure, everybody is doing touchscreen interfaces these days, but this is the first time
I’ve seen a monitor that can respond to gestures without actually having to touch the
screen. The monitor, based on technology from TouchKo was recently demonstrated by
White Electronic Designs and Tactyl Services at the CeBIT show. Designed for
applications where touch may be difficult, such as for doctors who might be wearing
surgical gloves, the display features capacitive sensors that can read movements from
up to 15 cm away from the screen. Software can then translate gestures into screen
commands.
Figure 7.1 Touchless monitor
Touchscreen interfaces are great, but all that touching can be a little bit of a
drag. Enter the wonder kids from Elliptic Labs, who are hard at work on
implementing a touchless interface. The input method is, well, thin air. The technology
detects motion in 3D and requires no special worn sensors for operation. By simply
pointing at the screen, users can manipulate the object being displayed in 3D. Details
are light on how this actually functions, but what we do know is this:
It obviously requires a sensor, but the sensor is neither hand mounted nor
present on the screen. The sensor can be placed either on the table or near the screen,
and the hardware setup is so compact that it can be fitted into a tiny device such as
an MP3 player or a mobile phone. It recognizes the position of an object from as far as
5 feet away.
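The source does not detail how such a compact sensor judges the position of a hand several feet away, but one generic way is time-of-flight ranging: emit a pulse (acoustic or optical) and time the echo. The sketch below is a hedged illustration of that idea only, not Elliptic Labs' actual method.

```python
SPEED_OF_SOUND = 343.0  # metres per second in air at roughly 20 degrees C

def echo_distance(round_trip_seconds):
    """Distance to a reflecting hand from a pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is half the
    total path covered during the round trip.
    """
    return SPEED_OF_SOUND * round_trip_seconds / 2
```

At this speed, a hand 5 feet (about 1.5 m) away returns an echo in roughly 9 milliseconds, comfortably fast enough for interactive gesture tracking.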
Point your finger in the air towards the device and move it accordingly to control
navigation on the device. The system is designed for applications where touch may be
difficult, such as for doctors who might be wearing surgical gloves.
Touchless interaction with medical images lets surgeons maintain sterility during
surgical procedures. A gesture interface is developed for users, such as
doctors/surgeons, to browse medical images in a sterile medical environment.
8.TOUCHWALL
TouchWall consists of three infrared lasers that scan a surface. By using a projector,
entire walls can easily be turned into a multi-touch user interface.
To convert a table into a touch display, the Kinect sensor would face downwards from
the ceiling. On a wall, the Kinect sensor needs to be placed between one and two metres
from the surface. Ubi's "vision engine" detects a person's finger or hand in relation
to the surface.
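One way a "vision engine" of this kind can decide that a finger is touching a flat surface is to compare a depth map against the known surface depth: a pixel counts as a touch when it sits just in front of the surface, but not so far in front that it is an arm or hovering hand. The sketch and its millimetre thresholds are illustrative assumptions, not Ubi's actual algorithm.

```python
def find_touches(depth_mm, surface_mm, near=4, far=20):
    """Find touch points on a flat surface from a 2D depth map (in mm).

    A pixel is 'touching' when it lies between `near` and `far` millimetres
    in front of the known surface depth -- i.e. a fingertip pressing or
    hovering just above the surface, but not the surface itself or the arm.
    """
    touches = []
    for row, samples in enumerate(depth_mm):
        for col, d in enumerate(samples):
            if near <= surface_mm - d <= far:
                touches.append((row, col))
    return touches
```

In practice the touch pixels would then be clustered into fingertips, but the thresholding above is the core of surface-touch detection with a depth camera.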
9.WORKING OF TOUCHLESS TOUCHSCREEN
The system uses a sensor connected to a digital image processor, which interprets the
patterns of motion and outputs the results as signals to control fixtures, appliances,
machinery, or any device controllable through electrical signals. You just point at the
screen (from as far as 5 feet away), and you can manipulate objects in 3D. The setup
consists of three infrared lasers which scan a surface. A camera notes when something
breaks through a laser line and feeds that information back to the Plex software.
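The camera's job in the laser-scan arrangement above can be sketched as a one-dimensional search: along the imaged laser line, a finger shows up as a run of dark pixels where the laser light is blocked. The brightness threshold and function name are illustrative assumptions.

```python
def find_break(line_brightness, threshold=50):
    """Locate an object breaking a laser line.

    line_brightness -- brightness samples (0-255) along the imaged laser line
    Returns (start, end) indices of the first dark run, or None if unbroken.
    """
    start = None
    for i, b in enumerate(line_brightness):
        if b < threshold and start is None:
            start = i                      # entering the shadowed region
        elif b >= threshold and start is not None:
            return (start, i - 1)          # leaving the shadowed region
    # The dark run may extend to the end of the line.
    return (start, len(line_brightness) - 1) if start is not None else None
```

The width of the dark run also gives a rough size for the intruding object, which is how a fingertip can be distinguished from a whole hand.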
The Leap Motion controller is a sensor device that aims to translate hand
movements into computer commands. The controller itself is an eight-by-three-
centimetre unit that plugs into a USB port on a computer. Placed face up on a surface,
the controller senses the area above it and is sensitive to a range of approximately
one metre. To date it has been used primarily in conjunction with apps developed
specifically for the controller.
Leap Motion is aware of some of the interaction issues with its controller
and is planning solutions. These include the development of standardized motions for
specific actions, and an improved skeletal model of the hand and fingers.
This is based on optical pattern recognition using a solid state optical matrix sensor.
This sensor is then connected to a digital image processor, which interprets the patterns
of motion and outputs the results as signals. In each of these sensors there is a matrix
of pixels. Each pixel is coupled to photodiodes incorporating charge storage regions.
A gesture is a movement of part of the body, especially a hand or the head, used to
express an idea or meaning; a gesture-based graphical user interface is built on such
movements.
A Leap Motion controller was used by two members in conjunction with a laptop
and the Leap Motion software development kit. Initial tests were conducted to establish
how the controller worked and to understand basic interaction. The controller was then
tested for the recognition of sign language. The fingerspelling alphabet was used to
test the functionality of the controller; the alphabet was chosen for the relative
simplicity of individual signs, and for the diverse range of movements involved.
The focus of these tests was to evaluate the capabilities and accuracy of the
controller in recognizing hand movements. This capability can now be discussed in
terms of the strengths and weaknesses of the controller.
10.TOUCHLESS UI
The basic idea described in the patent is that there would be sensors arrayed around
the perimeter of the device capable of sensing finger movements in 3-D space.
Figure 10.1 Touchless UI
Microsoft has demonstrated a touchless UI at its Redmond headquarters, and it involves
lots of gestures which allow you to take applications and forward them on to others
with simple hand movements. The demos included the concept of software understanding
business processes and helping you work. So after reading a document, you could just
push it off the side of your screen, and the system would know to post it on an
intranet and also send a link to a specific group of people.
The Touchless SDK is an open source SDK for .NET applications. It enables
developers to create multi-touch based applications using a webcam for input. Color
based markers defined by the user are tracked and their information is published
through events to clients of the SDK.
In a nutshell, the Touchless SDK enables touch without touching. Microsoft
Office Labs has released "Touchless," a webcam-driven multi-touch interface SDK
that enables "touch without touching." Using the SDK lets developers offer users "a
new and cheap way of experiencing multi-touch capabilities, without the need of
expensive hardware or software. All the user needs is a camera," to track the multi-
colored objects as defined by the developer. Just about any webcam will work.
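The Touchless SDK itself is a .NET library, so as an illustrative analogue only, the kind of colour-marker tracking it performs can be sketched in Python: scan the webcam frame for pixels near the marker's colour and take their centroid. The per-channel tolerance is a made-up value, and the function name is an assumption, not the SDK's API.

```python
def track_marker(frame_rgb, marker_rgb, tolerance=40):
    """Find the centre of a colour marker in an RGB frame.

    frame_rgb  -- 2D list of (r, g, b) tuples, one per pixel
    marker_rgb -- the marker colour chosen by the user
    Returns (row, col) of the matching pixels' centroid, or None if absent.
    """
    mr, mg, mb = marker_rgb
    rows = cols = count = 0
    for r, line in enumerate(frame_rgb):
        for c, (pr, pg, pb) in enumerate(line):
            # A pixel matches when every channel is within the tolerance.
            if (abs(pr - mr) <= tolerance and abs(pg - mg) <= tolerance
                    and abs(pb - mb) <= tolerance):
                rows += r
                cols += c
                count += 1
    if count == 0:
        return None
    return rows / count, cols / count
```

Tracking one centroid per marker colour, frame after frame, is what lets each coloured object act as an independent "finger" in a multi-touch interface.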
Figure 10.2 Touchless SDK
The Touchless Demo is an open source application that anyone with a webcam can
use to experience multi-touch, no geekiness required.
The demo was created using the Touchless SDK and Windows Forms with C#.
There are four fun demos: Snake, where you control a snake with a marker; Defender, an
up-to-four-player version of a Pong-like game; Map, where you can rotate, zoom, and
move a map using two markers; and Draw, where the marker is used to, guess what, draw!
Mike demonstrated Touchless at a recent Office Labs' Productivity Science Fair, where
it was voted by attendees as the "most interesting project." If you wind up using the
SDK, the team would love to hear what use you make of it!
Figure 10.3 Touchless demo
11.GESTURE RECOGNITION TECHNOLOGIES
11.1 Tobii Rex
Tobii Rex is an eye-tracking device from Sweden which works with any computer
running Windows 8. The device has a pair of built-in infrared sensors that track the
user's eyes.
Figure 11.1 Tobii Rex
11.2 Elliptic Labs
Elliptic Labs allows you to operate your computer without touching it, using the
Windows 8 Gesture Suite.
Figure 11.2 Elliptic Labs
11.3 Airwriting
Airwriting is a technology that allows you to write text messages or compose
emails by writing in the air.
Figure 11.3 Airwriting
11.4 Eyesight
Eyesight is a gesture technology which allows you to navigate through your devices
by just pointing at them.
Figure 11.4 Eye Sight
11.5 MAUZ
MAUZ is a third-party device that turns your iPhone into a trackpad or mouse.
Figure 11.5 MAUZ
12.ADVANTAGES OF TOUCHLESS TOUCHSCREEN
* Uses infrared LEDs and cameras.
* Detects and recognizes a user's body movement and reproduces it within the video
game that is being played.
* No desensitization of the screen.
* No drivers required.
13.APPLICATIONS
i. Touchless monitor
14.LIMITATIONS OF TOUCHLESS TOUCHSCREEN
15.CONCLUSION
Touchless technology is still developing and has many future aspects. The touchless
touchscreen user interface can be used effectively in computers, cell phones, webcams,
and laptops. A few years down the line, our bodies may be transformed into input
devices, acting as virtual mice and virtual keyboards.
It appears that while the device has potential, the API supporting the device is
not yet ready to interpret the full range of sign language. At present, the controller
can be used, with significant work, for the recognition of basic signs; however, it is
not appropriate for complex signs, especially those that require significant face or
body contact.
As a result of the significant rotation and line-of-sight obstruction of the digits
during conversational signing, signs become inaccurate and indistinguishable, making
the controller (at present) unusable for conversational use. However, when addressing
signs as single entities, there is potential for them to be trained into artificial
neural networks.
16.FUTURE SCOPE
The future scope of touchless touchscreen technology holds potential for diverse
applications, including interactive displays, retail kiosks, healthcare interfaces, and
public spaces. Advancements may focus on improved gesture recognition, enhanced
user experience, and integration with emerging technologies like augmented reality.
Additionally, increased hygiene awareness may drive the adoption of touchless
interfaces in various industries.
17.REFERENCES