INDEX
1. INTRODUCTION TO TOUCHSCREEN
2. HISTORY OF TOUCHSCREEN
3. WORKING OF TOUCHSCREEN
   3.1 TOUCH SENSOR
   3.2 CONTROLLER
   3.3 DRIVER
4. ADVANTAGE OF TOUCHSCREEN
5. DISADVANTAGE OF TOUCHSCREEN
6. INTRODUCTION TO TOUCHLESS TOUCHSCREEN
7. TOUCHLESS MONITOR
8. TOUCHWALL
9. WORKING OF TOUCHLESS TOUCHSCREEN
10. GBUI (GESTURE-BASED GRAPHICAL USER INTERFACE)
11. TOUCHLESS UI
12. MINORITY REPORT INSPIRED TOUCHLESS TECHNOLOGY
    12.1 TOBII REX
    12.2 ELLIPTIC LABS
    12.3 AIRWRITING
    12.4 EYESIGHT
    12.5 MAUZ
    12.6 POINT GRAB
    12.7 LEAP MOTION
    12.8 MICROSOFT KINECT
13. CONCLUSION
14. BIBLIOGRAPHY
LIST OF FIGURES
Figure: 3.1- Touch Sensor
Figure: 3.2- Controller
Figure: 3.3- Diagrammatic working of a touchscreen
Figure: 6.1- Touchless Touchscreen
Figure: 7.1- Touchless Monitor
Figure: 7.2- Doctor using Touchless Touchscreen
Figure: 8.1- Touch Wall
Figure: 8.2- Kinect used for Touch Wall
Figure: 9.1- Working
Figure: 9.2- Use of Working
Figure: 9.3- 3D Object
Figure: 10.1- Hand Moving
Figure: 10.2- GBUI
Figure: 11.1- UI of Touchscreen
Figure: 12.1.1- Use of Tobii Rex
Figure: 12.2.1- Gesture Suite
Figure: 12.3.1- Airwriting
Figure: 12.4.1- Gesture of hand moving
Figure: 12.5.1- MAUZ Device
Figure: 12.6.1- Use of Point Grab
Figure: 12.7.1- Use of the Leap Motion device
Figure: 12.8.1- Third party device for Touch Wall
1. INTRODUCTION TO TOUCHSCREEN
A touchscreen is both an input device and an output device, normally layered on top of
the electronic visual display of an information processing system. A user can give
input or control the information processing system through simple or multi-touch gestures by
touching the screen with a special stylus and/or one or more fingers. Some touchscreens use
ordinary or specially coated gloves to work while others use a special stylus/pen only. The user
can use the touchscreen to react to what is displayed and to control how it is displayed; for
example, zooming to increase the text size. The touchscreen enables the user to interact directly
with what is displayed, rather than using a mouse, touchpad, or any other intermediate device
(other than a stylus, which is optional for most modern touchscreens). Touchscreens are
common in devices such as game consoles, personal computers, tablet computers, electronic
voting machines, point of sale systems, and smartphones. They can also be attached to
computers or, as terminals, to networks. They also play a prominent role in the design of digital
appliances such as personal digital assistants (PDAs) and some e-readers. The popularity of
smartphones, tablets, and many types of information appliances is driving the demand and
acceptance of common touchscreens for portable and functional electronics. Touchscreens are
found in the medical field and in heavy industry, as well as for automated teller machines
(ATMs), and kiosks such as museum displays or room automation, where keyboard and mouse
systems do not allow a suitably intuitive, rapid, or accurate interaction by the user with the
display's content. Historically, the touchscreen sensor and its accompanying controller-based
firmware have been made available by a wide array of after-market system integrators, and not
by display, chip, or motherboard manufacturers. Display manufacturers and chip
manufacturers worldwide have acknowledged the trend toward acceptance of touchscreens as
a highly desirable user interface component and have begun to integrate touchscreens into the
fundamental design of their products.
2. HISTORY OF TOUCHSCREEN
E.A. Johnson of the Royal Radar Establishment, Malvern, described his work on capacitive
touchscreens in a short article published in 1965 and then more fully, with photographs and
diagrams, in an article published in 1967. The applicability of touch technology for air traffic
control was described in an article published in 1968. Frank Beck and Bent Stumpe, engineers
from CERN, developed a transparent touchscreen in the early 1970s, based on Stumpe's work
at a television factory in the early 1960s. Then manufactured by CERN, it was put to use in
1973. A resistive touchscreen was developed by American inventor George Samuel Hurst, who
received US patent #3,911,215 on October 7, 1975. The first version was produced in 1982. In
1972, a group at the University of Illinois filed for a patent on an optical touchscreen that
became a standard part of the Magnavox Plato IV Student Terminal. Thousands were built for
the PLATO IV system. These touchscreens had a crossed array of 16 by 16 infrared position
sensors, each composed of an LED on one edge of the screen and a matched phototransistor on
the other edge, all mounted in front of a monochrome plasma display panel. This arrangement
can sense any fingertip-sized opaque object in close proximity to the screen. A similar
touchscreen was used on the HP-150 starting in 1983; this was one of the world's earliest
commercial touchscreen computers. HP mounted their infrared transmitters and receivers
around the bezel of a 9" Sony Cathode Ray Tube (CRT). In 1984, Fujitsu released a touch pad
for the Micro 16, to deal with the complexity of kanji characters, which were stored as tiled
graphics. In 1985, Sega released the Terebi Oekaki, also known as the Sega Graphic Board, for
the SG-1000 video game console and SC-3000 home computer. It consisted of a plastic pen
and a plastic board with a transparent window where the pen presses are detected. It was used
primarily for a drawing software application. A graphic touch tablet was released for the Sega
AI Computer in 1986. Touch-sensitive Control-Display Units (CDUs) were evaluated for
commercial aircraft flight decks in the early 1980s. Initial research showed that a touch
interface would reduce pilot workload as the crew could then select waypoints, functions and
actions, rather than be "head down" typing in latitudes, longitudes, and waypoint codes on a
keyboard. An effective integration of this technology was aimed at helping flight crews
maintain a high-level of situational awareness of all major aspects of the vehicle operations
including its flight path, the functioning of various aircraft systems, and moment-to-moment
human interactions. In the early 1980s, General Motors tasked its Delco Electronics division
with a project aimed at moving an automobile's non-essential functions (i.e. other than
throttle, transmission, braking and steering) from mechanical or electro-mechanical systems
to solid-state alternatives wherever possible. The finished device was dubbed the ECC for
"Electronic Control Centre", a digital computer and software control system hardwired to
various peripheral sensors, servos, solenoids, antennas and a monochrome CRT touchscreen
that functioned both as the display and the sole method of input. The ECC replaced the
traditional mechanical stereo, fan, heater and air conditioner controls and displays, and was
capable of providing very detailed and specific information about the vehicle's cumulative
and current operating status in real time. The ECC was standard equipment on the 1985–89
Buick Riviera and later the 1988–89 Buick Reatta, but was unpopular with consumers, partly
due to the technophobia of some traditional Buick customers, but mostly because of
costly-to-repair technical problems with the ECC's touchscreen, which, being the sole access
method, could render climate control or stereo operation impossible when it failed.
Multi-touch technology began in
1982, when the University of Toronto's Input Research Group developed the first human-input
multi-touch system, using a frosted-glass panel with a camera placed behind the glass. In 1985,
the University of Toronto group including Bill Buxton developed a multi-touch tablet that used
capacitance rather than bulky camera-based optical sensing systems (see History of multi-
touch). In 1986, the first graphical point of sale software was demonstrated on the 16-bit Atari
520ST colour computer. It featured a colour touchscreen widget-driven interface. The View
Touch point of sale software was first shown by its developer, Gene Mosher, at Fall Comdex,
1986, in Las Vegas, Nevada to visitors at the Atari Computer demonstration area and was the
first commercially available POS system with a widget-driven colour graphic touchscreen
interface. In 1987, Casio launched the Casio PB-1000 pocket computer with a touchscreen
consisting of a 4x4 matrix, resulting in 16 touch areas in its small LCD graphic screen. Until
1988 touchscreens had the bad reputation of being imprecise. Most user interface books would
state that touchscreen selections were limited to targets larger than the average finger. At the
time, selections were done in such a way that a target was selected as soon as the finger came
over it, and the corresponding action was performed immediately. Errors were common, due
to parallax or calibration problems, leading to frustration. A new strategy called "lift-off
strategy" was introduced by researchers at the University of Maryland Human – Computer
Interaction Lab and is still used today. As users touch the screen, feedback is provided as to
what will be selected, users can adjust the position of the finger, and the action takes place only
when the finger is lifted off the screen. This allowed the selection of small targets, down to a
single pixel on a VGA screen (the best standard of the time). Sears et al. (1990) gave a review of
academic research on single and multi-touch human–computer interaction of the time,
describing gestures such as rotating knobs, adjusting sliders, and swiping the screen to activate
a switch (or a U-shaped gesture for a toggle switch). The University of Maryland
Human–Computer Interaction Lab team developed and studied small touchscreen keyboards (including
a study that showed that users could type at 25 wpm for a touchscreen keyboard compared with
58 wpm for a standard keyboard), thereby paving the way for the touchscreen keyboards on
mobile devices. They also designed and implemented multitouch gestures such as selecting a
range of a line, connecting objects, and a "tap-click" gesture to select while maintaining
location with another finger.
3. WORKING OF TOUCHSCREEN
A resistive touchscreen panel comprises several layers, the most important of which are
two thin, transparent electrically resistive layers separated by a thin space. These layers
face each other with a thin gap between. The top screen (the screen that is touched) has a
coating on the underside surface of the screen. Just beneath it is a similar resistive layer on
top of its substrate. One layer has conductive connections along its sides, the other along
top and bottom. A voltage is applied to one layer, and sensed by the other. When an object,
such as a fingertip or stylus tip, presses down onto the outer surface, the two layers touch
to become connected at that point: The panel then behaves as a pair of voltage dividers, one
axis at a time. By rapidly switching between each layer, the position of a pressure on the
screen can be read. A capacitive touchscreen panel consists of an insulator such as glass,
coated with a transparent conductor such as indium tin oxide (ITO). As the human body
is also an electrical conductor, touching the surface of the screen results in a distortion of
the screen's electrostatic field, measurable as a change in capacitance. Different
technologies may be used to determine the location of the touch. The location is then sent
to the controller for processing. Unlike a resistive touchscreen, one cannot use a capacitive
touchscreen through most types of electrically insulating material, such as gloves. This
disadvantage especially affects usability in consumer electronics, such as touch tablet PCs
and capacitive smartphones in cold weather. It can be overcome with a special capacitive
stylus, or a special-application glove with an embroidered patch of conductive thread
passing through it and contacting the user's fingertip.
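To make the voltage-divider idea above concrete, here is a minimal Python sketch of how a
4-wire resistive controller might turn the two analogue readings into screen coordinates. It is
illustrative only: the read_adc() helper, the ADC range and the calibration values are assumed
for this example and are not the interface of any particular touch controller.

# Illustrative sketch: convert the two voltage-divider readings of a 4-wire
# resistive panel into pixel coordinates. The ADC interface and calibration
# numbers are assumed values, not a real controller's API.

SCREEN_W, SCREEN_H = 800, 480       # assumed display resolution

# Calibration: raw readings measured at the panel edges (assumed values).
X_MIN, X_MAX = 200, 3900
Y_MIN, Y_MAX = 250, 3850


def read_adc(axis: str) -> int:
    """Stand-in for the hardware step: drive one layer, sample the other.

    Returns fixed example readings so the sketch runs end to end; on real
    hardware this would be an analogue-to-digital conversion.
    """
    return {"x": 2048, "y": 1024}[axis]


def raw_to_pixel(raw: int, lo: int, hi: int, size: int) -> int:
    """Linearly map a raw voltage-divider reading onto a pixel coordinate."""
    raw = min(max(raw, lo), hi)                  # clamp to the calibrated range
    return (raw - lo) * (size - 1) // (hi - lo)


def read_touch() -> tuple:
    """Alternate between the layers, as the controller does, and map to pixels."""
    x_raw = read_adc("x")   # Y layer driven, X layer sensed
    y_raw = read_adc("y")   # X layer driven, Y layer sensed
    return (raw_to_pixel(x_raw, X_MIN, X_MAX, SCREEN_W),
            raw_to_pixel(y_raw, Y_MIN, Y_MAX, SCREEN_H))


if __name__ == "__main__":
    print(read_touch())     # prints an (x, y) pixel position for the example readings
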
3.1 TOUCH SENSOR
A touch screen sensor is a clear glass panel with a touch-responsive surface. The sensor
generally has an electrical current or signal going through it, and touching the screen
causes a voltage or signal change.
Figure: 3.1- Touch Sensor
3.2 CONTROLLER
The controller is a small PC card that connects between the touch sensor and the PC.
The controller determines what type of interface/connection you will need on the PC.
Figure: 3.2- Controller
3.3 DRIVER
The driver is software that allows the touch screen and computer to work together.
Most touch screen drivers today are mouse-emulation type drivers.
Figure: 3.3- Diagrammatic working of a touchscreen
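As a rough illustration of the mouse-emulation idea, the sketch below (an assumption for
illustration, not the code of any real driver) turns a stream of touch samples into abstract
pointer events of the kind a driver would hand to the operating system's input stack.

# Illustrative sketch of mouse emulation: touch samples in, pointer events out.
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class PointerEvent:
    kind: str   # "move", "down" or "up"
    x: int
    y: int


class MouseEmulation:
    """Translate touch samples into mouse-like events (illustrative only)."""

    def __init__(self) -> None:
        self.touching = False

    def feed(self, sample: Optional[Tuple[int, int]]) -> List[PointerEvent]:
        """sample is (x, y) while the screen is touched, or None on release."""
        events: List[PointerEvent] = []
        if sample is not None:
            x, y = sample
            events.append(PointerEvent("move", x, y))       # cursor follows the finger
            if not self.touching:
                events.append(PointerEvent("down", x, y))   # first contact = button press
                self.touching = True
        elif self.touching:
            events.append(PointerEvent("up", 0, 0))         # lift-off = button release
            self.touching = False
        return events


# Example: touch at (120, 80), drag to (130, 90), then lift the finger.
emu = MouseEmulation()
for s in [(120, 80), (130, 90), None]:
    print(emu.feed(s))

A real driver would typically report the last known position on lift-off rather than (0, 0);
the simplification here only keeps the sketch short.
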
4. ADVANTAGE OF TOUCHSCREEN
1. Direct pointing at objects.
2. Fast.
3. Finger or pen is usable (no cable required).
4. No keyboard necessary.
5. Well suited to novices and to applications such as information retrieval.
5. DISADVANTAGE OF TOUCHSCREEN
1. Low precision when using a finger.
2. The user has to sit or stand close to the screen.
3. The hand may cover part of the screen.
4. No direct activation of the selected function.
6. INTRODUCTION TO TOUCHLESS TOUCHSCREEN
Touchless control of electrically operated equipment is being developed by Elliptic Labs.
This system depends on hand or finger motions, such as a hand wave in a certain direction.
The sensor can be placed either on the screen or near it.
Figure: 6.1- Touchless Touchscreen
7. TOUCHLESS MONITOR
This monitor is made by TouchKo. With a touchless touch screen, your hand does not have
to come into contact with the screen at all; the monitor works by detecting your hand
movements in front of it.
Figure: 7.1- Touchless Monitor
Point your finger in the air towards the device and move it to control navigation on the
device. It is designed for applications where touch may be difficult, such as for doctors
who might be wearing surgical gloves.
Figure: 7.2- Doctor using Touchless Touchscreen
8. TOUCHWALL
Touch Wall is a multi-touch product from Microsoft. The name refers to the touch screen
hardware setup itself; the accompanying software is called Plex.
Figure: 8.1- Touch Wall
Touch Wall consists of three infrared lasers that scan a surface. By using a projector, entire
walls can easily be turned into a multi-touch user interface.
Figure: 8.2- Kinect used for Touch Wall
9. WORKING OF TOUCHLESS TOUCHSCREEN
The system is capable of detecting movements in three dimensions without ever having to
put your fingers on the screen. Sensors are mounted around the screen that is being
used; by interacting in the line of sight of these sensors, the motion is detected and
interpreted into on-screen movements. The device is based on optical pattern
recognition using a solid-state optical matrix sensor with a lens to detect hand
motions.
Figure: 9.1- Working Figure: 9.2- Use of Working
This sensor is then connected to a digital image processor, which interprets the patterns of
motion and outputs the results as signals to control fixtures, appliances, machinery, or any
device controllable through electrical signals. You just point at the screen (from as far as 5
feet away), and you can manipulate objects in 3D.
Figure: 9.3- 3D Object
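The report does not spell out the image-processing step, so the following Python sketch is only
one assumed way to approximate it: given two consecutive grey-scale frames from the optical
sensor (represented here as NumPy arrays), it finds the pixels that changed and maps the
centroid of that motion onto screen coordinates.

# Illustrative sketch: frame differencing as a stand-in for the "digital image
# processor" that interprets patterns of motion. Frame size, threshold and
# screen resolution are assumed values.
import numpy as np

SCREEN_W, SCREEN_H = 1920, 1080


def motion_to_screen(prev, curr, threshold=30):
    """Return the screen position of detected motion, or None if nothing moved."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    moved = diff > threshold                  # pixels that changed noticeably
    if not moved.any():
        return None
    ys, xs = np.nonzero(moved)                # coordinates of the moving pixels
    cx, cy = xs.mean(), ys.mean()             # centroid of the motion
    h, w = curr.shape
    return int(cx / w * SCREEN_W), int(cy / h * SCREEN_H)


# Tiny synthetic example: a bright "hand" blob appears in the second frame.
prev = np.zeros((120, 160), dtype=np.uint8)
curr = prev.copy()
curr[40:60, 100:120] = 255                    # blob towards the right of the frame
print(motion_to_screen(prev, curr))           # maps to the right-hand side of the screen
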
10. GBUI (Gesture-Based Graphical User Interface)
A gesture is a movement of part of the body, especially a hand or the head, to express an idea
or meaning; a gesture-based graphical user interface uses such movements to control the system.
Figure: 10.1- Hand Moving
Figure: 10.2- GBUI
We have seen the futuristic user interfaces of movies like Minority Report and The Matrix
Revolutions, where people wave their hands in three dimensions and the computer understands
what the user wants, shifting and sorting data with precision.
11. TOUCHLESS UI
The basic idea described in the patent is that there would be sensors arrayed around the
perimeter of the device capable of sensing finger movements in 3-D space.
Figure: 11.1- UI of Touchscreen
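The patent description above does not include an algorithm; as a purely illustrative
assumption, the sketch below shows one simple way readings from four proximity sensors on the
left, right, top and bottom edges could be combined into a rough on-screen position, with each
reading assumed to grow as the finger approaches that edge.

# Illustrative assumption: estimate a finger position from four perimeter
# proximity sensors whose readings increase as the finger nears that edge.

def estimate_position(left, right, top, bottom, width=1.0, height=1.0):
    """Combine edge-sensor intensities into an (x, y) estimate.

    A stronger right-hand reading pulls the estimate towards the right edge,
    and likewise for the other edges; equal readings give the centre.
    """
    eps = 1e-9                                   # avoid division by zero
    x = right / (left + right + eps) * width     # 0 = left edge, width = right edge
    y = bottom / (top + bottom + eps) * height   # 0 = top edge, height = bottom edge
    return x, y


# Example: the finger hovers near the top-right corner, so the right and top
# sensors report the strongest signals.
print(estimate_position(left=0.1, right=0.9, top=0.8, bottom=0.2))   # (0.9, 0.2)

A real touchless UI would also need the third (depth) axis and some filtering over time; this
only illustrates the planar part of the idea.
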
12. MINORITY REPORT INSPIRED TOUCHLESS
TECHNOLOGY
There are eight examples of Minority Report inspired touchless technology. These are as
follows:
12.1 Tobii Rex
Tobii Rex is an eye-tracking device from Sweden which works with any computer running
Windows 8. The device has a pair of built-in infrared sensors that track the user's eyes.
Figure: 12.1.1- Use of Tobii Rex
12.2 Elliptic Labs
Elliptic Labs' Windows 8 Gesture Suite allows you to operate your computer without
touching it.
Figure: 12.2.1- Gesture Suite
12.3 Airwriting
Airwriting is a technology that allows you to write text messages or compose emails by writing
in the air.
Figure: 12.3.1- Airwriting
12.4 Eyesight
EyeSight is a gesture technology which allows you to navigate through your devices just by
pointing at them.
Figure: 12.4.1- Gesture of hand moving
12.5 MAUZ
Mauz is a third party device that turns your iPhone into a trackpad or mouse.
Figure: 12.5.1- MAUZ Device
12.6 POINT GRAB
Point Grab is similar to eyeSight in that it enables users to navigate their computer just by
pointing at it.
Figure: 12.6.1- Use of Point Grab
12.7 LEAP MOTION
Leap Motion is a motion sensor device that recognizes the user’s fingers with its infrared LEDs
and cameras.
Figure: 12.7.1- Use of the Leap Motion device
12.8 MICROSOFT KINECT
Microsoft Kinect detects and recognizes a user's body movements and reproduces them within
the video game that is being played.
Figure: 12.8.1- Third party device for Touch Wall
13. CONCLUSION
o Touchless technology is still developing.
o It has many future prospects.
o Within a few years, our body could become an input device.
o The touchless touch screen user interface can be used effectively in computers,
cell phones, webcams and laptops.
o A few years down the line, our body may be transformed into a virtual mouse and
virtual keyboard, turning the body itself into an input device.
14. BIBLIOGRAPHY
http://www.comogy.com/concepts/170-universal-remote-concept.html
http://www.hitslot.com/?p=214
http://www.touchuserinterface.com/touchless-touch-screen-that-senses-your.html
http://www.etre.com/blog/elliptic_labs_touchless_user_interface/
http://www.lewisshepherd.wordpress.com/stop-being-so-touchy