HAPTIC TECHNOLOGY

A Report submitted
in Partial Fulfilment of the Requirements
for the degree of
BACHELOR OF TECHNOLOGY
in
Electronics and Communication Engineering
to the
Department of Electronics and Communication Engineering
BUNDELKHAND INSTITUTE OF ENGINEERING AND TECHNOLOGY, JHANSI
(An Autonomous Institute funded by the U.P. Government)
March 2019
BUNDELKHAND INSTITUTE OF ENGINEERING AND TECHNOLOGY, JHANSI
(An Autonomous Institute funded by the U.P. Government)
Kanpur Road, Jhansi
Date:
CERTIFICATE
Certified that Rohit Kumar Gupta (1604331041) has carried out the work presented in this report entitled "Haptic Technology" for the award of a subject of the 3rd year of BACHELOR OF TECHNOLOGY from BUNDELKHAND INSTITUTE OF ENGINEERING & TECHNOLOGY, JHANSI, under my supervision. The report embodies the results of work and studies carried out by the student himself, and the contents of the report do not form the basis for the award of any other degree to the candidate or to anybody else from this or any other University/Institution.
Date:
ABSTRACT

'Haptics' is a technology that adds the sense of touch to virtual environments, giving users the illusion that they are touching or manipulating a real physical object. This seminar discusses the important concepts in haptics and some of the most commonly used haptic systems, such as the 'Phantom', the 'Cyber glove' and the 'Novint Falcon'. Following this, a description of how sensors and actuators are used for tracking the position and movement of haptic systems is provided. The different types of force-rendering algorithms are discussed next, and the blocks involved in force rendering are explained. Finally, a few applications of haptic systems are taken up for discussion.
ACKNOWLEDGEMENT
I wish to extend my sincere gratitude to my seminar guide, Dr. Atul Kumar Dwivedi, Lecturer, Department of Electronics & Communication, for his valuable guidance and encouragement, which has been absolutely helpful in the successful completion of this seminar. I am indebted to Prof. D.K. Srivastava, Professor and Head, Department of Electronics & Communication, for his valuable support.
I am also grateful to my parents and friends for their timely aid, without which I wouldn't have finished my seminar successfully. I extend my thanks to all my well-wishers and all those who have contributed directly and indirectly to the completion of this work.
Last but not the least, I thank God Almighty for his blessings, without which the completion of this seminar would not have been possible.
TABLE OF CONTENTS
Certificate
Abstract
Acknowledgement
List of Figures
1. INTRODUCTION
2. WORKING OF HAPTICS
3. HAPTIC DEVICES
   3.3.1 Phantom
   3.3.2 Cyber glove
4. HAPTIC RENDERING
5. APPLICATIONS, LIMITATIONS & FUTURE VISION
   5.1 Application
   5.1.4 Telerobotics
Conclusion
References
LIST OF FIGURES
1. INTRODUCTION
Haptic technology, or haptics, is a feedback technology which takes advantage of a user's sense of touch by applying forces, vibrations, and/or motions upon the user. The word derives from the Greek haptikos, meaning "being able to come into contact with".

Figure 1.1 Haptic (sense of touch)

Users are given the illusion that they are touching or manipulating a real physical object. This technology has been evolving over the past decade. It communicates with the user via the sense of touch, i.e. heat, movement, pressure and pain. Haptic interfaces allow the user to feel as well as to see virtual objects on a computer, and so can give the illusion of touching surfaces, shaping virtual clay or moving objects around. Many everyday objects also communicate with us through movement (often, vibrations).
Haptics could also prove useful for people who, due to visual impairment, are unable to effectively operate computers as they are made today.[1]
Haptics is also used in the remote control of machines and devices (teleoperators). This emerging technology promises to have wide-reaching applications, as it already has in some fields. For example, haptic technology has made it possible to investigate in detail how the human sense of touch
works by allowing the creation of carefully controlled haptic virtual objects. These objects
are used to systematically probe human haptic capabilities, which would otherwise be dif-
ficult to achieve. These new research tools contribute to our understanding of how touch
and its underlying brain functions work. Although haptic devices are capable of measuring the bulk or reactive forces applied by the user, they should not be confused with touch or tactile sensors that measure the pressure or force exerted by the user on the interface.
The term haptic originated from the Greek word ἁπτικός (haptikos), meaning "pertaining to the sense of touch", and comes from the Greek verb ἅπτεσθαι (haptesthai), meaning to "contact" or "touch".
In the 1950s, with the birth of the nuclear industry, Goertz and his colleagues at Argonne National Lab in the US developed the first force-reflecting robotic manipulators, a master-slave telemanipulation system in which actuators, receiving feedback signals from slave
sensors, applied forces to a master arm controlled by the user. In the 1960s, haptic technology was used for applications such as military flight simulators, on which combat
pilots honed their skills. Motors and actuators pushed, pulled, and shook the flight yoke,
throttle, rudder pedals, and cockpit shell, reproducing many of the tactile and kinesthetic
cues of real flight. By the late 1970s and early 1980s, computing power reached the point
where rich color graphics and high-quality audio were possible. This multimedia revolution
spawned entirely new businesses in the late 1980s and early 1990s and opened new possi-
bilities for the consumer. Flight simulations moved from a professional pilot-only activity
to a PC-based hobby, with graphics and sound far superior to what the combat pilots in the
1960s had. The multimedia revolution also gave rise to the medical simulation industry. By
the 1990s, high-end workstations displayed highly realistic renderings of human anatomy. By the mid-1990s, shortcomings in simulation products were identified. Even though graphics
and animations looked incredibly realistic, they could not possibly convey what it actually
feels like to break through a vein wall with a needle or fight the flight yoke out of a steep
dive. New industrial and consumer products were in need of enhanced programmable haptic
technology that could provide sensations similar to an actual hands-on experience. Then,
in 1993, the Artificial Intelligence Laboratory at the Massachusetts Institute of Technology
(MIT)[2] constructed a device that delivered haptic stimulation, finally making it possible
to touch and feel a computer-generated object. The scientists working on the project began
to describe their research area as computer haptics to differentiate it from machine and hu-
man haptics. Today, computer haptics is defined as the systems required -- both hardware
and software -- to render the touch and feel of virtual objects. Computer haptics uses a dis-
play technology through which objects can be physically palpated. It is a rapidly growing
field that is yielding a number of promising haptic technologies.[11]
2. WORKING OF HAPTICS

Basically, a haptic system consists of two parts, namely the human part and the machine part. In the figure shown, the human part (left) senses and controls the position of the hand, while the machine part (right) exerts forces on the hand to simulate contact with a virtual object. Both systems are provided with the necessary sensors, processors and actuators. In the human system, nerve receptors perform sensing, the brain performs processing, and muscles perform actuation of the motion of the hand, while in the machine system these functions are performed by encoders, a computer and motors respectively.

Figure 2.1 Basic Configuration of Haptics
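The encoder-computer-motor chain on the machine side can be sketched in code. This is a toy illustration, not a real device driver: the encoder resolution, arm length, virtual-wall position and stiffness are all assumed values.

```python
import math

# Machine side of a haptic system (illustrative numbers throughout):
# an encoder senses, the computer processes, a motor actuates.

COUNTS_PER_REV = 4096   # assumed encoder resolution (counts per revolution)
ARM_LENGTH_M = 0.2      # assumed length of the jointed arm (metres)

def encoder_to_angle(counts: int) -> float:
    """Sensing: convert raw encoder counts to a joint angle in radians."""
    return 2.0 * math.pi * counts / COUNTS_PER_REV

def compute_force(angle: float, wall_angle: float, stiffness: float) -> float:
    """Processing: resist motion once the arm crosses a virtual wall."""
    penetration = angle - wall_angle
    return -stiffness * penetration if penetration > 0 else 0.0

def motor_torque(tip_force: float) -> float:
    """Actuation: convert the desired tip force into a joint torque."""
    return tip_force * ARM_LENGTH_M
```

In free space `compute_force` returns zero, so free motion feels free; past the wall angle, the motor pushes back in proportion to penetration.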
Basically, the haptic information provided by the system is a combination of tactile information and kinesthetic information.
Tactile information:
Tactile information refers to the information acquired by sensors that are actually connected to the skin of the human body, with particular reference to the spatial distribution of pressure, or more generally tractions, across the contact area.
For example when we handle flexible materials like fabric and paper, we sense the pressure
variation across the fingertip. This is actually a sort of tactile information. Tactile sensing is
also the basis of complex perceptual tasks like medical palpation, where physicians locate
hidden anatomical structures and evaluate tissue properties using their hands.[3]
Kinesthetic information:
Kinesthetic information refers to the information acquired through the sensors in the joints, tendons and muscles, such as the position and motion of the limbs and the forces applied to them.
Interaction forces are normally perceived through a combination of these two types of information.
Virtual reality (VR) applications strive to simulate real or imaginary scenes with which users
can interact and perceive the effects of their actions in real time. Ideally the user interacts
with the simulation via all five senses. However, today’s typical VR applications rely on a
smaller subset, typically vision, hearing, and more recently, touch.[6]
The figure below shows the structure of a VR application incorporating visual, auditory, and haptic feedback. Whereas the visual and auditory channels feature unidirectional information and energy flow (from the simulation engine toward the user), the haptic modality exchanges information and energy in two directions, from and toward the user. This bi-directionality is often referred to as the single most important feature of the haptic interaction modality.
3. HAPTIC DEVICES
A haptic device is one that provides a physical interface between the user and the virtual environment by means of a computer. This can be done through an input/output device that senses the body's movement, such as a joystick or data glove. By using haptic devices, the
user can not only feed information to the computer but can also receive information from
the computer in the form of a felt sensation on some part of the body. This is referred to as
a haptic interface.[8]
3.2 Feedback devices:
The term exoskeleton refers to the hard outer shell that exists on many creatures. In a tech-
nical sense, the word refers to a system that covers the user or the user has to wear. Current
haptic devices that are classified as exoskeletons are large and immobile systems that the
user must attach him or her to.
These devices are smaller exoskeleton-like devices that are often, but not always, supported by a large exoskeleton or other immobile device. Since the goal of building a haptic system is to immerse a user in the virtual or remote environment, it is important to leave as small a reminder of the user's actual environment as possible.
The drawback of wearable systems is that, since the weight and size of the devices are a concern, the systems have more limited sets of capabilities.
This is a class of devices that are very specialized for performing a particular given task.
Designing a device to perform a single type of task restricts the application of that device to
a much smaller number of functions. However, it allows the designer to focus on making the device perform its task extremely well. These task devices have two general forms: single point of interface devices and specific task devices.
An interesting application of haptic feedback is full-body force feedback, in the form of locomotion interfaces. Locomotion interfaces are movement or force restriction devices in a confined space, simulating unrestrained mobility such as walking and running for virtual reality. These interfaces overcome the limitations of joysticks for maneuvering, of whole-body motion platforms, in which the user is seated and does not expend energy, and of room environments, where only short distances can be traversed.
Force feedback input devices are usually, but not exclusively, connected to computer systems and are designed to apply forces that simulate the sensation of weight and resistance in order to provide information to the user. As such, force feedback hardware represents a more sophisticated form of input/output device, complementing others such as keyboards, mice or trackers. Input from the user is in the form of hand or other body-segment motion, whereas feedback from the computer or other device is in the form of force or position. These devices translate digital information into physical sensations.[9]
Simulation tasks involving active exploration or delicate manipulation of a virtual environment require the addition of feedback data that presents an object's surface geometry or texture. Such feedback is provided by tactile feedback systems or tactile display devices.
Tactile systems differ from haptic systems in the scale of the forces being generated.[12]
While haptic interfaces will present the shape, weight or compliance of an object, tactile
interfaces present the surface properties of an object such as the object’s surface texture.
Tactile feedback applies sensation to the skin.
3.3.1 PHANTOM
The PHANTOM is a haptic interfacing device developed by a company named SensAble Technologies. It is primarily used for providing a 3D touch to virtual objects.

Figure 3.3.1 PHANTOM

It is a very high-resolution 6-DOF device in which the user holds the end of a motor-controlled jointed arm. It provides a programmable sense of touch that allows the user to feel the texture and shape of a virtual object with a very high degree of realism. One of its key features is that it can model free-floating 3-dimensional objects.[10]
direction. These two conditions can be simplified by requiring the glove to apply a torque equal to that of the interphalangian joint.
The solution that we have chosen uses a mechanical structure with three passive joints which, with the interphalangian joint, make up a flat four-bar closed-link mechanism. This solution uses cables placed in the interior of the four-bar mechanism, following a trajectory identical to that used by the extensor tendons which, by nature, oppose the movement of the flexor tendons in order to harmonize the movement of the fingers. Among the advantages of this structure one can cite:
• Apply different forces on each phalanx (including the possibility of applying a lateral force on the fingertip by motorizing the abduction/adduction joint);
• Measure finger angular flexion (the measurements of the joint angles are independent and can have good resolution, given the long paths traveled by the cables as the finger closes).
The glove is made up of five fingers and has 19 degrees of freedom, 5 of which are passive.
Each finger is made up of a passive abduction joint which links it to the base (palm) and of 9 rotoid joints which, with the three interphalangian joints, make up 3 four-bar closed-link mechanisms with 1 degree of freedom each. The structure of the thumb is composed of only two closed links, for 3 dof of which one is passive. The segments of the glove are made of aluminum and can withstand high loads; their total weight does not surpass 350 grams. The length of the segments is proportional to the length of the phalanxes. All of the joints are mounted on miniature ball bearings in order to reduce friction.
The mechanical structure offers two essential advantages: the first is the facility of adapting
to different sizes of the human hand. We have also provided for lateral adjustment in order
to adapt the interval between the fingers at the palm. The second advantage is the presence
of physical stops in the structure which offer complete security to the operator. The force
sensor is placed on the inside of a fixed support on the upper part of the phalanx. The sensor
is made up of a steel strip on which a strain gauge is glued. The position sensors used to measure the cable displacements are incremental optical encoders, offering an average theoretical resolution of 0.1 deg for the finger joints.
The glove is controlled by 14 torque motors with continuous current which can develop a
maximal torque equal to 1.4 Nm and a continuous torque equal to 0.12 Nm. On each motor
we fix a pulley with an 8.5 mm radius onto which the cable is wound. The maximal force
that the motor can exert on the cable is thus equal to 14.0 N, a value sufficient to ensure
opposition to the movement of the finger. The electronic interface of the force feedback data glove consists of a PC with several acquisition cards. The global scheme of the control is given in the figure shown below. One can distinguish two control loops: an internal loop which corresponds to a classic force control with constant gains, and an external loop which integrates the model of distortion of the virtual object in contact with the fingers. In this scheme, the human's action on the positions of the finger joints is taken into consideration by the two control loops. The human is considered as a displacement generator, while the glove is considered as a force generator.
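The force figure quoted above can be checked directly: the cable force is the motor torque divided by the pulley radius, so the stated maximum of about 14.0 N on the cable follows from the 0.12 Nm continuous torque and the 8.5 mm pulley (a quick sanity check on the numbers in the text, not part of the original design documentation).

```python
# Cable force = motor torque / pulley radius (values quoted in the text).

def cable_force(torque_nm: float, pulley_radius_m: float) -> float:
    return torque_nm / pulley_radius_m

# Continuous torque of 0.12 Nm on an 8.5 mm radius pulley:
continuous_force = cable_force(0.12, 0.0085)  # ~14.1 N, matching the ~14.0 N quoted
```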
4. HAPTIC RENDERING
As illustrated in the figure above, haptic interaction occurs at an interaction tool of a haptic interface that mechanically couples two controlled dynamical systems: the haptic interface with a computer, and the human user with a central nervous system. The two systems are exactly symmetrical in structure and information: they sense the environment, make decisions about control actions, and provide mechanical energies to the interaction tool through motions.[14]
A good haptic device should provide:
• Minimal constraints on motion imposed by the device kinematics, so free motion feels free;
• Symmetric inertia, friction, stiffness, and resonant-frequency properties (thereby regularizing the device so users don't have to unconsciously compensate for parasitic forces);
• Balanced range, resolution, and bandwidth of position sensing and force reflection; and
• Proper ergonomics that let the human operator focus when wearing or manipulating the haptic interface, as pain, or even discomfort, can distract the user, reducing overall performance.
An avatar is the virtual representation of the haptic interface through which the user physically interacts with the virtual environment. Clearly, the choice of avatar depends on what's being simulated and on the haptic device's capabilities. The operator controls the avatar's position inside the virtual environment. Contact between the interface avatar and the virtual environment sets off action and reaction forces. The avatar's geometry and the type of contact it supports regulate these forces. Within a given application the user might choose among different avatars. For example, a surgical tool can be treated as a volumetric object exchanging forces and positions with the user in a 6D space, or as a pure point representing the tool's tip, exchanging forces and positions in a 3D space.
Haptic-rendering algorithms compute the correct interaction forces between the haptic in-
terface representation inside the virtual environment and the virtual objects populating the
environment. Moreover, haptic rendering algorithms ensure that the haptic device correctly
renders such forces on the human operator. Several components compose typical haptic-rendering algorithms. We identify three main blocks, illustrated in the figure shown below. Collision-detection algorithms detect collisions between objects and avatars in the virtual environment and yield information about where, when, and ideally to what extent collisions (penetrations, indentations, contact area, and so on) have occurred. Force-response algorithms compute the interaction force between avatars and virtual objects when a collision is
detected. This force approximates as closely as possible the contact forces that would nor-
mally arise during contact between real objects. Forceresponse algorithms typically operate
on the avatars’ positions, the positions of all objects in the virtual environment, and the
collision state between avatars and virtual objects. Their return values are normally force and torque vectors that are applied at the device-body interface. Hardware limitations prevent haptic devices from applying the exact force computed by the force-response algorithms to the user. Control algorithms command the haptic device in such a way that minimizes the error between the ideal and the applicable forces; the discrete-time nature of haptic-rendering algorithms often makes this difficult.
Desired force and torque vectors computed by force response algorithms feed the control
algorithms. The algorithms’ return values are the actual force and torque vectors that will
be commanded to the haptic device.
• Low-level control algorithms sample the position sensors at the haptic interface device joints.
• These control algorithms combine the information collected from each sensor to obtain the position of the device-body interface in Cartesian space, that is, the avatar's position inside the virtual environment.
• The force-response algorithm computes interaction forces between avatars and virtual objects involved in a collision.
• The force-response algorithm sends interaction forces to the control algorithms, which apply them on the operator through the haptic device while maintaining a stable overall behavior.
The simulation engine then uses the same interaction forces to compute their effect on objects in the virtual environment. Although there are no firm rules about how frequently the algorithms must repeat these computations, a 1-kHz servo rate is common. This rate seems to be a subjectively acceptable compromise, permitting presentation of reasonably complex objects with reasonable stiffness. Higher servo rates can provide crisper contact and texture sensations, but only at the expense of reduced scene complexity (or more capable computers).
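The collision-detection, force-response and control blocks described above can be sketched end to end. The sketch below is a toy version under strong assumptions: a point avatar, a flat virtual wall at y = 0, a pure spring force response, and a control block that only clamps to the device's force limit; the stiffness and limit values are illustrative.

```python
STIFFNESS = 1000.0   # N/m, illustrative; higher feels crisper but risks instability
MAX_FORCE = 10.0     # N, assumed hardware limit enforced by the control block

def collision_detect(avatar_y: float) -> float:
    """Collision detection: penetration depth into the wall at y = 0."""
    return max(0.0, -avatar_y)

def force_response(penetration: float) -> float:
    """Force response: spring model pushing back along the wall normal."""
    return STIFFNESS * penetration

def control(desired_force: float) -> float:
    """Control: clamp the ideal force to what the hardware can render."""
    return min(desired_force, MAX_FORCE)

def servo_tick(avatar_y: float) -> float:
    """One pass of the ~1-kHz loop: avatar position in, commanded force out."""
    return control(force_response(collision_detect(avatar_y)))
```

A 5 mm penetration yields a 5 N command, while a deep 50 mm penetration is clamped to the 10 N hardware limit, mirroring the point that devices cannot always apply the exact computed force.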
Humans perceive contact with real objects through sensors (mechanoreceptors) located in their skin, joints, tendons, and muscles. We make a simple distinction between the information these two types of sensors can acquire: tactile information and kinesthetic information. A tool-based interaction paradigm provides a convenient simplification, because the system need only render forces resulting from contact between the tool's avatar and objects in the environment. Thus, haptic interfaces frequently utilize a tool handle as the physical interface for the user.
To provide a haptic simulation experience, we’ve designed our systems to recreate the con-
tact forces a user would perceive when touching a real object. The haptic interfaces measure
the user’s position to recognize if and when contacts occur and to collect information needed
to determine the correct interaction force. Although determining user motion is easy, deter-
mining appropriate display forces is a complex process and a subject of much research.
Current haptic technology effectively simulates interaction forces for simple cases, but is
limited when tactile feedback is involved.
Compliant object response modeling adds a dimension of complexity because of non-negligible deformations, the potential for self-collision, and the general complexity of modeling
potentially large and varying areas of contact. We distinguish between two types of forces:
forces due to object geometry and forces due to object surface properties, such as texture
and friction.
The first type of force-rendering algorithm aspires to recreate the force interaction a user would feel when touching a frictionless and textureless object. Such interaction forces
depend on the geometry of the object being touched, its compliance, and the geometry of
the avatar representing the haptic interface inside the virtual environment.
Although exceptions exist, the number of DOF necessary to describe the interaction forces between an avatar and a virtual object typically matches the actuated DOF of the haptic device being used. Thus, for simpler devices, such as a 1-DOF force-reflecting gripper, the avatar
consists of a couple of points that can only move and exchange forces along the line con-
necting them. For this device type, the force-rendering algorithm computes a simple 1-DOF
squeeze force between the index finger and the thumb, similar to the force you would feel
when cutting an object with scissors. When using a 6-DOF haptic device, the avatar can be an object of any shape. In this case, the force-rendering algorithm computes all the interaction forces between the object and the virtual environment and applies the resultant force
and torque vectors to the user through the haptic device. We group current force-rendering
algorithms by the number of DOF necessary to describe the interaction force being rendered.
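For the 1-DOF gripper case just described, the whole force-rendering computation reduces to a single scalar. The sketch below assumes a linear-spring object model; the stiffness and widths are illustrative values, not from any real device.

```python
def squeeze_force(gap_m: float, object_width_m: float, stiffness: float) -> float:
    """1-DOF squeeze force between index finger and thumb.

    The avatar is two points on a line; force appears only once the
    finger gap closes below the virtual object's width.
    """
    compression = object_width_m - gap_m
    return stiffness * compression if compression > 0 else 0.0
```

With the fingers wider apart than the object there is no force; squeezing 10 mm into a 40 mm-wide object at 500 N/m yields a 5 N resistance, the scissors-like sensation described above.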
All real surfaces contain tiny irregularities or indentations. Obviously, it’s impossible to
distinguish each irregularity when sliding a finger over an object. However, tactile sensors
in the human skin can feel their combined effects when rubbed against a real surface.
Micro-irregularities act as obstructions when two surfaces slide against each other and gen-
erate forces tangential to the surface and opposite to motion. Friction, when viewed at the
microscopic level, is a complicated phenomenon. Nevertheless, simple empirical models
exist, such as the one Leonardo Da Vinci proposed and Charles Augustin de Coulomb later
developed in 1785. Such models served as a basis for the simpler frictional models in 3-DOF haptic rendering. Researchers outside the haptic community have developed many models to render friction with higher accuracy, for example, the Karnopp model for modeling stick-slip friction, the Bristle model, and the reset integrator model. Higher accuracy, however, sacrifices speed, a critical factor in real-time applications. Any choice of modeling technique must consider this trade-off. Keeping this trade-off in mind, researchers have developed more accurate haptic-rendering algorithms for friction.
A texture or pattern generally covers real surfaces. Researchers have proposed various techniques for rendering the forces that touching such textures generates.
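The Coulomb model mentioned above is simple enough to state in a few lines: the tangential friction force is proportional to the normal force and opposes the sliding direction. The friction coefficient below is an illustrative value.

```python
def coulomb_friction(normal_force: float, tangential_velocity: float,
                     mu: float = 0.3) -> float:
    """Tangential friction force opposing the sliding direction (Coulomb model)."""
    if tangential_velocity == 0.0:
        return 0.0  # no sliding: this simple kinetic model renders no force
    direction = 1.0 if tangential_velocity > 0 else -1.0
    return -mu * normal_force * direction
```

The more accurate models named above (Karnopp, Bristle, reset integrator) mainly improve on the stick-slip behavior near zero velocity, which this simple form ignores.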
In point-based rendering, collisions are detected between the end point of the haptic interface and the virtual object, and the depth of indentation is calculated between the current point and a point on the surface of the object. Forces are then generated according to physical models, such as spring stiffness or a spring-damper model.
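The two physical models just named can be written out directly for a penetration depth d and penetration velocity v; the stiffness k and damping b below are illustrative values, not from any particular device.

```python
def spring_force(depth_m: float, k: float = 800.0) -> float:
    """Pure spring model: F = k * d along the surface normal."""
    return k * depth_m

def spring_damper_force(depth_m: float, velocity_ms: float,
                        k: float = 800.0, b: float = 5.0) -> float:
    """Spring-damper model: F = k * d + b * v; damping suppresses oscillation."""
    return k * depth_m + b * velocity_ms
```

The damping term resists the speed of penetration as well as its depth, which helps keep the rendered contact stable at the discrete servo rates discussed earlier.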
In ray-based rendering, the user interface mechanism, for example a probe, is modeled in the virtual environment as a finite ray. Orientation is thus taken into account, and collisions are determined between the simulated probe and virtual objects. Collision detection algorithms return the intersection point between the ray and the surface of the simulated object.
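The collision-detection step of ray-based rendering can be sketched for the simplest case, a ray against a sphere standing in for the simulated object's surface (a geometric toy, not a full collision-detection library):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the nearest intersection point of a ray with a sphere, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for the smallest t >= 0.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    fx, fy, fz = ox - center[0], oy - center[1], oz - center[2]
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (fx * dx + fy * dy + fz * dz)
    c = fx * fx + fy * fy + fz * fz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None          # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    if t < 0.0:
        return None          # nearest intersection lies behind the ray's origin
    return (ox + t * dx, oy + t * dy, oz + t * dz)
```

A probe pointing along +z from (0, 0, -5) toward a unit sphere at the origin hits the surface at (0, 0, -1); the force-response stage would then compute a reaction along the surface normal at that point.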
5. APPLICATIONS, LIMITATIONS & FUTURE VISION
Simple haptic devices are common in the form of game controllers, joysticks, and steering
wheels. Early implementations were provided through optional components, such as
the Nintendo 64 controller's Rumble Pak in 1997. In the same year, the Microsoft SideWinder Force Feedback Pro, with built-in feedback from Immersion Corporation, was released.[17] Many newer-generation console controllers and joysticks feature built-in feedback devices too, including Sony's DualShock technology and Microsoft's Impulse Trigger technology. Some automobile steering wheel controllers, for example, are programmed
to provide a "feel" of the road. As the user makes a turn or accelerates, the steering wheel
responds by resisting turns or slipping out of control.
In 2013, Valve announced a line of Steam Machines microconsoles, including a new Steam
Controller unit that uses weighted electromagnets capable of delivering a wide range of
haptic feedback via the unit's trackpads.[23] These controllers' feedback systems are open to
the user, which allows the user to configure the feedback to occur in nearly limitless ways
and situations. Also, due to the community orientation of the controller, the possibilities to have games interact with the controller's feedback system are limited only by the game's design.
In 2014, researchers at LG Electronics, led by Youngjun Cho, showed a new technique to automatically generate haptic effects on a haptic cushion while interacting with multimedia content, at ACTUATOR 2014 in Bremen, Germany.[24]
In 2015, Valve released the Steam Controller with HD Haptics, whose haptic force actuators on both sides of the controller deliver precise, high-fidelity vibrations measured in microseconds and can emulate the spin of a virtual trackball, the click of a scroll wheel, or the shot of a rifle. Every input, from the triggers to the trackpads, can offer haptic feedback to fingertips, delivering vital, high-bandwidth, tactile feedback about speed, boundaries, thresholds, textures, or actions.[18]
In 2017, the Nintendo Switch's Joy-Con introduced the HD Rumble feature (similar to
Valve HD Haptics). It has a high level of precision, allowing it to simulate the feel of hold-
ing, moving and using objects. 1-2-Switch has a number of minigames demonstrating this
feature. It can even be as detailed as creating music or, as experienced in Super Mario Party, notifying a player of their turn.
Various haptic interfaces for medical simulation may prove especially useful for training of
minimally invasive procedures (laparoscopy/interventional radiology) and remote surgery using teleoperators. In the future, expert surgeons may work from a central workstation,
performing operations in various locations, with machine setup and patient preparation per-
formed by local nursing staff. Rather than traveling to an operating room, the surgeon in-
stead becomes a telepresence. A particular advantage of this type of work is that the surgeon
can perform many more operations of a similar type, and with less fatigue.[19] It is well documented that a surgeon who performs more procedures of a given kind will have statistically better outcomes for their patients. Haptic interfaces are also used in rehabilitation robotics.
In ophthalmology, "haptic" refers to a supporting spring, two of which hold an artificial lens
within the lens capsule (after surgical removal of cataracts).
A 'Virtual Haptic Back' (VHB) is being successfully integrated into the curriculum of students at the Ohio University College of Osteopathic Medicine. Research indicates that the VHB is a significant teaching aid in palpatory diagnosis (detection of medical problems via touch). The VHB simulates the contour and compliance (reciprocal of stiffness) properties of human backs, which are palpated with two haptic interfaces (SensAble Technologies' PHANTOM).
Reality-based modeling for surgical simulation consists of a continuous cycle. In the figure
given above, the surgeon receives visual and haptic (force and tactile) feedback and interacts
with the haptic interface to control the surgical robot and instrument. The robot with instru-
ment then operates on the patient at the surgical site per the commands given by the surgeon.
Visual and force feedback is then obtained through endoscopic cameras and force sensors
that are located on the surgical tools and are displayed back to the surgeon[20].
Barrow, who has a PhD in cybernetics, co-founded Generic in 2013. He has more than a
decade of experience in VR, haptics and robotics, and has collaborated on a number of med-
ical training simulators for different branches of surgery, as well as procedures such as cath-
eterisation and hernia repair.
“It’s no surprise that haptics in healthcare is a huge topic,” he said. “However, the number
of applications where haptics is being beneficially used in healthcare right now is very, very
small.”
Figure 5.1.2(b): Haptics can simulate the feel of different tissue types
As Barrow’s body of work suggests, surgery simulation is one of those areas. Currently, surgeons learn predominantly from theory, observation, cadavers, and closely monitored patient trials, in which senior colleagues oversee their work. A refined sense of touch and hand-eye coordination is obviously vital.
“When you’re practising to be a surgeon you need to develop an incredible array of abilities,” said Barrow. “You need to be able to have great academic knowledge, decision making, you need to look good in blue! One really important aspect is this close association of feedback between the sense of touch and dexterous motion…so we can use haptic devices to simulate doing procedures.”
5.1.3 Military Training
In a warehouse looking much like a laser-tag game room, nine soldiers gear up with flip-down goggle mounts and sensors strapped to their arms and legs, and carry a computer-enhanced weapon system.
Just five years ago, this scenario might only have been seen in a video game. Today, virtual training environments are a reality [20]. The Dismounted Soldier Training System and Engagement Skills Trainer 2000 are two virtual training tools that are quickly becoming the norm for soldiers of the 157th Infantry Brigade, First Army Division East, in training deploying units at Camp Atterbury Joint Maneuver Training Center, Ind. [18]
“One of the best parts of the DSTS is that we can create any operational environment for our training in a virtual environment. It does not replace training, but it can add to it. We can bring the terrain of Afghanistan to the soldier. It’s hard to imagine a mountainous terrain in Indiana, but the DSTS can create it,” said Sgt. 1st Class Aaron Hammond, Operations, 157th Infantry Brigade, First Army Division East.
Hammond and his team recently participated in a DSTS session to learn the capabilities offered at the CAJMTC virtual simulation centers. Geared up and ready to engage in a building-entry exercise, the nine-man squad immediately encountered and reacted to enemy fire. With one member quickly disabled, the team had to rapidly adjust its tactics, techniques, and procedures, and continue the mission [20].
“Providing the most realistic and relevant training is the benchmark for success in First Army Division East when training soldiers for worldwide deployments. Our job is to replicate situations that the soldier will face and to create an environment in which to rehearse repetitively at the squad or team level,” said Capt. Marcus Long, 157th Infantry Brigade Training Officer. [21]
5.1.4 Telerobotics
In a telerobotic system, a human operator controls the movements of a robot that is located some distance away. Some teleoperated robots are limited to very simple tasks, such as aiming a camera and sending back visual images. Haptics now makes it possible to include touch cues in addition to audio and visual cues in telepresence models. It won't be long before astronomers and planetary scientists actually hold and manipulate a Martian rock through an advanced haptics-enabled telerobot, a high-touch version of the Mars Exploration Rover.
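As a toy illustration of adding touch cues to such a telepresence loop, the sketch below uses position-error force reflection, a common simple teleoperation scheme in which the operator feels a force proportional to how far the delayed remote robot lags behind the hand. The delay length, gain, and trajectory are made-up values.

```python
from collections import deque

def simulate_telerobot(master_trajectory, delay_steps=3, kp=50.0):
    """Position-error force reflection across a fixed communication delay.

    Commands reach the remote robot `delay_steps` ticks late; the
    operator feels a force proportional to the gap between the hand
    and the remote robot's (delayed) position.
    """
    channel = deque([0.0] * delay_steps)   # commands still in transit
    felt_forces = []
    for hand_pos in master_trajectory:
        channel.append(hand_pos)           # send this tick's command
        remote_pos = channel.popleft()     # delayed command arrives
        felt_forces.append(kp * (hand_pos - remote_pos))
    return felt_forces

# Moving the hand steadily makes the lag (and the felt force) settle:
forces = simulate_telerobot([0.01 * i for i in range(8)])
print(forces[-1])   # steady-state force near kp * delay * step = 1.5 N
```

Even this toy model shows why communication delay is the central difficulty of haptic telerobotics: the felt force grows with the lag, not with any real contact.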
5.2 ADVANTAGES AND DISADVANTAGES
Advantages:
A. Communication is centered through touch, and the digital world can behave like the real world.
B. Reduction of working time, as objects can be captured, manipulated, modified, and rescaled digitally.
C. Haptic touch technology can be applied to small and large surfaces and a wide range of
form factors.
D. With haptic hardware and software, the designer can maneuver the part and feel the
result, as if he/she were handling the physical object.
E. Medical-field simulators allow would-be surgeons to practice digitally, gaining confidence in the procedure before working on living patients.
Disadvantages:
A. Haptic applications can be extremely complex, requiring highly specialized hardware and considerable processing power.
D. Debugging issues are complicated, since they involve real-time data analysis.
5.3 FUTURE VISION
As haptics moves beyond the buzzes and thumps of today’s video games, technology will enable increasingly believable and complex physical interaction with virtual or remote objects. Already, haptically enabled commercial products let designers sculpt digital clay figures to rapidly produce new product geometry, museum-goers feel previously inaccessible artifacts, and doctors train for simple procedures without endangering patients.
Past technological advances that permitted recording, encoding, storage, transmission, editing, and ultimately synthesis of images and sound profoundly affected society. A wide range of human activities, including science, was forever changed when we learned to capture, manipulate, and create sensory stimuli nearly indistinguishable from reality. It’s not unreasonable to expect that future advances in haptics will have equally deep effects. Though the field is still in its infancy, hints of vast, unexplored intellectual and commercial territory add excitement and energy to the field.
For the field to move beyond today’s state of the art, researchers must surmount a number of commercial and technological barriers. Device- and software-tool-oriented corporate efforts have provided the tools we need to step out of the laboratory, yet we need new business models. For example, can we create haptic content and authoring tools that will make the technology worthwhile? Can the interface devices be made practical and inexpensive enough to make them widely accessible? Once we move beyond single-point, force-only interactions with rigid objects,
we should explore several technical and scientific avenues [26]. Multipoint, multi-hand, and multi-person interaction scenarios all offer enticingly rich interactivity. Adding sub-modality stimulation, such as tactile (pressure-distribution) display and vibration, could add subtle and important richness to the experience. Modeling compliant objects, such as for surgical
simulation and training, presents many challenging problems in enabling realistic deformations, arbitrary collisions, and the topological changes caused by cutting and joining actions. Improved accuracy and richness in object modeling and haptic rendering will require advances in our understanding of how to represent and render psychophysically and cognitively germane attributes of objects, as well as algorithms and perhaps specialty hardware to perform the necessary computations in real time.
Development of multimodal workstations that provide haptic, visual, and auditory engagement will offer opportunities for more integrated interactions. We are only beginning to understand the psychophysical and cognitive details needed to enable successful multimodality interactions. For example, how do we encode and render an object so that there is a seamless consistency and congruence across sensory modalities; that is, does it look like it feels? Are the object’s density, compliance, motion, and appearance familiar and unconsciously consistent with context? Are sensory events predictable enough that we consider objects to be real? Hopefully, such questions will find answers in the near future.
CONCLUSIONS
Finally, we shouldn’t forget that touch and physical interaction are among the fundamental ways in which we come to understand our world and to effect changes in it. This is true on a developmental level as well: as Frank Wilson suggested, “a new physics would eventually have to come into this brain, a new way of registering and representing the behavior of objects moving and changing under the control of the hand. [27] It is precisely such a representational system, a syntax of cause and effect, of stories, and of experiments, each having a beginning, a middle, and an end, that one finds at the deepest levels of the organization of human language.” Our efforts to communicate information by rendering how objects feel through haptic technology, and the excitement in our pursuit, might reflect a deeper desire to speak with an inner, physically based language that has yet to be given a true voice.
REFERENCES
1. Mudit Ratana Bhalla, Harsh Vardhan Bhalla, Anand Vardhan Bhalla, “Haptic Technology: An Evolution towards Naturality in Communication” (introduction), International Journal of Advanced Research in Computer Science, ISSN 0976-5697, Vol. 1, No. 4, Nov.-Dec. 2010.
2. Mudit Ratana Bhalla, Harsh Vardhan Bhalla, Anand Vardhan Bhalla, “Haptic Technology: An Evolution towards Naturality in Communication” (history of haptic technology), International Journal of Advanced Research in Computer Science, ISSN 0976-5697, Vol. 1, No. 4, Nov.-Dec. 2010.
3. B. Divya Jyothi, R. V. Krishnaiah, “Haptic Technology - A Sense of Touch”, International Journal of Science and Research (IJSR), India, Online ISSN 2319-7064.
4. Kiran K., Abhijit M., V. K. Parvati, “Virtually Controlled Robotic Arm using Haptics”, IEEE, 2017.
5. Megha Goyal, Dimple Saproo, Asha Bagashra, Kumar Rahul Dev, “Haptics: Technology Based on Touch”, IJSRET, Vol. 2, Issue 8, pp. 468-471, November 2013.
6. Hannaford, B. Kinesthetic Feedback Techniques in Teleoperated Systems. Advances in
Control and Dynamic Systems, C. Leondes, ed., Academic Press, 1991.
7. Snehal N. Meshram, Amit M. Sahu, “Haptic Science and Technology in Surgical Simulation, Medical Training and Military Application” (haptic feedback), International Journal of Computer Science and Mobile Computing (IJCSMC), Vol. 3, Issue 4, April 2014, pp. 156-165, ISSN 2320-088X.
8. Amit M. Sahu, “Haptic Science and Technology in Surgical Simulation, Medical Training and Military Application” (haptic devices), IJCSMC, Vol. 3, Issue 4, April 2014, pp. 156-165.
9. Snehal N. Meshram et al., “feedback device”, IJCSMC, Vol. 3, Issue 4, April 2014, pp. 156-165.
10. Snehal N. Meshram et al., “haptic device Phantom”, IJCSMC, Vol. 3, Issue 4, April 2014, pp. 156-165.
11. J. Linjama and T. Kaaresoja, “Novel, minimalist haptic gesture interaction for mobile devices”, in Proceedings of the Third Nordic Conference on Human-Computer Interaction, ACM Press, 2004.
12. B. Gillespie and M. Cutkosky, “Interactive Dynamics with Haptic Display”, presented at the 2nd Ann. Symp. on Haptic Interfaces for Virtual Environment and Teleoperator Systems, ASME/WAM, New Orleans, LA, DSC:55-1, pp. 65-72, 1993.
13. C. B. Zilles and J. K. Salisbury, “A Constraint-based God-Object Method for Haptic Display”, presented at the 3rd Ann. Symp. on Haptic Interfaces for Virtual Environment and Teleoperator Systems, ASME/IMECE, Chicago, IL, DSC:55-1, pp. 146-150, 1994.
14. Kenneth Salisbury and Francois Conti (Stanford University), Federico Barbagli (Stanford University and University of Siena, Italy), “Haptic Rendering: Introductory Concepts”.
15. Laboratory for Human and Machine Haptics: The Touch Lab - Dr. Mandayam A. Srinivasan, Dr. S. James Biggs, Dr. Manivannan Muniyandi, Dr. David W. Schloerb, Dr. Lihua Zhou.
16. Fonz at the Killer List of Videogames; “Haptic technology” and “Video game”, Wikipedia.
17. Mark J. P. Wolf (2008), The Video Game Explosion: A History from PONG to PlayStation and Beyond, ABC-CLIO, p. 39, ISBN 0-313-33868-X.
18. “Microsoft and Immersion Continue Joint Efforts To Advance Future Development of Force Feedback Technology”, Stories, 3 February 1998.
19. Snehal N. Meshram et al., IJCSMC, Vol. 3, Issue 4, April 2014, pp. 156-165.
20. Liu, Bhasin and Bower, “A Haptic-Enabled Simulator for Cricothyroidotomy”, Studies in Health Technology and Informatics, Vol. 111, pp. 308-313, IOS Press, 2005.
21. Pettitt, Redden and Carstens, “Comparison of Army Hand and Arm Signals to a Covert Tactile Communication System in a Dynamic Environment”, U.S. Army Research Laboratory, Technical Reports, 2006.
22. Andrew Webster, “Valve unveils the Steam Controller”, The Verge, September 27, 2013.
23. Y. J. Cho, “Haptic Cushion: Automatic Generation of Vibro-tactile Feedback Based on Audio Signal for Immersive Interaction with Multimedia”, ResearchGate, LG Electronics; archived from the original (PDF) on May 17, 2017.
24. “Nintendo's HD Rumble will be the best unused Switch feature of 2017”, Engadget, retrieved 2017-05-17.
25. Andrew Wade, “Midas touch - the technologies driving the haptics revolution”, 18 September 2017.
26. Mudit Ratana Bhalla, “Future Vision”, International Journal of Advanced Research in Computer Science, ISSN 0976-5697, Nov.-Dec. 2010.