Unit 5
Frameworks:
Augmented Reality and Virtual Reality
Introduction: The Eye, Ear, and
Somatic Senses in AR/VR
• Augmented Reality (AR) and Virtual Reality (VR) technologies are
designed to simulate and enhance sensory experiences by interacting
with our natural sensory systems—particularly the eyes, ears, and
somatic senses.
• These technologies aim to create immersive environments that can
fool the senses into perceiving digital content as part of the real world
(AR) or as an entirely new world (VR).
The Eye in AR/VR
• The eye is central to AR and VR experiences. VR headsets use
stereoscopic displays to deliver slightly different images to each eye,
creating a 3D effect and a sense of depth. High-resolution displays
and advanced optics are crucial for creating realistic visuals, reducing
eye strain, and preventing motion sickness.
• AR devices, like smart glasses, overlay digital information onto the
real world, requiring precise eye-tracking to align virtual objects with
the user's line of sight. Innovations in foveated rendering (which
prioritizes high detail where the eye is focused) and adaptive optics
are enhancing visual experiences in AR/VR.
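The stereoscopic principle above can be sketched as a small computation: each eye gets its own virtual camera, offset from the head position by half the interpupillary distance (IPD). This is a minimal illustration, not a headset API; the 63 mm default IPD is a commonly cited average, and real systems read it from a hardware adjustment or eye tracking.

```python
# Hypothetical sketch: per-eye virtual camera placement for a
# stereoscopic display. The 63 mm default IPD is an assumed average.

def eye_positions(head_pos, right_dir, ipd_m=0.063):
    """Offset the head position by half the IPD along the head's
    unit right vector to get each eye's camera position."""
    half = ipd_m / 2.0
    left = tuple(h - half * r for h, r in zip(head_pos, right_dir))
    right = tuple(h + half * r for h, r in zip(head_pos, right_dir))
    return left, right

# Head at eye height 1.6 m, facing forward, right vector along +x.
left_eye, right_eye = eye_positions((0.0, 1.6, 0.0), (1.0, 0.0, 0.0))
```

Rendering the scene once from each of these two positions is what produces the slightly different images that the brain fuses into depth.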
Ear in AR/VR
• Sound plays a critical role in creating immersive AR/VR experiences.
Spatial audio technology simulates how sound behaves in the real
world, allowing users to perceive the direction and distance of sounds
in a virtual environment.
• In VR, this can create a more realistic and engaging experience,
making it feel as if sounds are coming from specific locations within
the virtual space. AR experiences often use audio cues to enhance
interaction with virtual objects or guide users through tasks, relying
on precise synchronization with visual elements.
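One cue that lets listeners perceive sound direction is the interaural time difference (ITD): sound reaches the far ear slightly later than the near ear. A hedged sketch using the classic Woodworth ITD model is below; the head radius and speed of sound are typical textbook values, and real spatial-audio engines combine this cue with level differences and HRTF filtering.

```python
import math

# Illustrative sketch of the Woodworth interaural-time-difference (ITD)
# model, one of the cues spatial-audio systems use to convey direction.
# HEAD_RADIUS and SPEED_OF_SOUND are typical textbook values.

HEAD_RADIUS = 0.0875    # metres, average adult head
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C

def itd_seconds(azimuth_rad):
    """Arrival-time delay between the ears for a distant source at the
    given azimuth (0 = straight ahead, pi/2 = directly to one side)."""
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (azimuth_rad + math.sin(azimuth_rad))

# A source 90 degrees to the side arrives roughly 0.66 ms later
# at the far ear -- a delay the brain reliably detects.
delay = itd_seconds(math.pi / 2)
```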
Somatic Senses in AR/VR
• Somatic senses, which include touch, temperature, pain, and
proprioception, are increasingly being integrated into AR/VR
experiences to deepen immersion.
• Haptic feedback devices, such as gloves or suits, simulate the sense of
touch by providing vibrations, pressure, or even temperature changes
in response to interactions with virtual objects. Proprioception, the
sense of body position, is critical for full-body VR experiences, where
body tracking allows users to move and interact naturally within a
virtual space.
• Advanced systems can simulate weight, texture, and resistance,
making virtual interactions feel more lifelike.
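The touch simulation described above often reduces to a simple control rule: how deeply the user's hand penetrates a virtual surface determines how strongly the haptic actuator fires. The sketch below is an assumed illustration of such a mapping; the 5 mm saturation depth and the linear ramp are arbitrary choices, not values from any real glove driver.

```python
# Hedged sketch: mapping virtual-contact penetration depth to a haptic
# vibration amplitude. The 5 mm saturation depth is an assumption.

def vibration_amplitude(penetration_m, max_depth_m=0.005):
    """Clamp penetration into [0, max_depth_m] and scale linearly
    to an amplitude in [0, 1]."""
    clamped = max(0.0, min(penetration_m, max_depth_m))
    return clamped / max_depth_m
```

Real systems layer texture and stiffness models on top of a base rule like this, but the clamp-and-scale shape is a common starting point.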
Introduction to Sensor Hardware, Head-Coupled
Displays, and Acoustic Hardware in AR/VR
• Augmented Reality (AR) and Virtual Reality (VR) systems rely on a
combination of advanced hardware components to create immersive
and interactive experiences.
• Key to these systems are sensor hardware, head-coupled displays,
and acoustic hardware, each playing a vital role in delivering a
seamless and responsive virtual environment.
Sensor Hardware
• Sensor hardware is fundamental to AR/VR, providing the necessary data
to track the user’s movements, position, and interactions with the virtual
environment. The primary types of sensors used in AR/VR include:
• Motion Sensors: Accelerometers, gyroscopes, and magnetometers are
commonly used to detect head and body movements. These sensors
provide data on orientation and acceleration, allowing the system to
adjust the virtual environment in real-time as the user moves.
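A common way to combine these sensor streams is a complementary filter: integrate the gyroscope for fast, smooth updates, then nudge the result toward the accelerometer's gravity-based estimate to cancel gyro drift. The sketch below is a minimal one-axis (pitch) version; the 0.98 blend factor is an illustrative tuning value, not a standard.

```python
import math

# Sketch of a one-axis complementary filter, a common technique for
# fusing gyroscope and accelerometer data into a stable pitch estimate.
# ALPHA (how much to trust the gyro) is an illustrative tuning value.

ALPHA = 0.98

def fuse_pitch(prev_pitch, gyro_rate, accel_y, accel_z, dt):
    """Integrate the gyro rate over dt, then blend toward the
    accelerometer's gravity-derived pitch to cancel drift."""
    gyro_pitch = prev_pitch + gyro_rate * dt      # fast but drifts
    accel_pitch = math.atan2(accel_y, accel_z)    # noisy but drift-free
    return ALPHA * gyro_pitch + (1.0 - ALPHA) * accel_pitch
```

Production headsets use richer estimators (e.g. Kalman filters over full orientation), but the gyro-plus-correction structure is the same idea.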
• Positional Tracking Sensors: These sensors, such as infrared
cameras or external base stations, track the exact position of the
user within a physical space. This tracking enables the system to
map the user’s movements onto their virtual avatar, providing a
more immersive experience.
• Eye-Tracking Sensors: Cameras or infrared sensors track the user’s eye
movements, enabling foveated rendering (where only the area being
looked at is rendered in high detail) and improving the realism of gaze
interactions with virtual objects.
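The core decision in foveated rendering is picking a shading level from how far a screen region is from the gaze point (its eccentricity). A hedged sketch is below; the degree thresholds and level names are assumptions for illustration, not values from any particular headset.

```python
# Illustrative sketch: choosing a shading level from gaze eccentricity,
# the central decision in foveated rendering. The 5- and 20-degree
# thresholds are assumptions, not values from a real device.

def shading_level(eccentricity_deg):
    """Full detail near the gaze point, coarser shading further out."""
    if eccentricity_deg < 5.0:
        return "full"     # fovea: shade every pixel
    elif eccentricity_deg < 20.0:
        return "half"     # mid-periphery: e.g. shade at half rate
    else:
        return "quarter"  # far periphery: coarsest shading
```

Because visual acuity falls off steeply outside the fovea, even aggressive peripheral coarsening is hard to notice while saving substantial GPU work.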
• Haptic Sensors: In some advanced systems, haptic sensors in gloves
or suits can detect fine motor movements and provide tactile
feedback, simulating the feel of virtual objects.
Head-Coupled Displays
• Head-coupled displays are central to VR systems, providing users with
a visual interface to the virtual world.
• These displays are mounted on a headset and are designed to follow
the user's head movements, ensuring that the virtual environment
moves in sync with their gaze.
• Key features of head-coupled displays include:
• Stereoscopic Displays: These provide slightly different images to each
eye, creating a 3D effect and a sense of depth. High-resolution
displays are essential for creating realistic and detailed virtual
environments.
• Field of View (FOV): A wide field of view is crucial for
immersion, allowing users to see a broad area of the virtual
environment without needing to move their heads excessively.
• Refresh Rate and Latency: High refresh rates (often 90Hz or more)
and low latency are critical to prevent motion sickness and ensure
smooth visuals that respond quickly to head movements.
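The refresh-rate figure above implies a hard per-frame deadline: all tracking, rendering, and display work must fit inside it. The arithmetic is simple enough to show directly.

```python
# Per-frame time budget implied by a display's refresh rate: every
# frame's tracking, rendering, and scan-out must fit in this window.

def frame_budget_ms(refresh_hz):
    return 1000.0 / refresh_hz

# At 90 Hz the whole pipeline has roughly 11.1 ms per frame;
# at 120 Hz the budget shrinks to about 8.3 ms.
budget_90 = frame_budget_ms(90)
budget_120 = frame_budget_ms(120)
```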
• Lens Technology: Advanced lenses reduce distortion and improve
clarity, making virtual objects appear more realistic and reducing eye
strain.
Acoustic Hardware
• Acoustic hardware in AR/VR is designed to replicate how we experience
sound in the real world, creating a convincing auditory environment that
complements the visual experience.
• The main components of acoustic hardware include:
• Spatial Audio Systems: These systems simulate the way sound interacts
with the environment, allowing users to perceive the direction, distance,
and movement of sounds within the virtual space. Spatial audio enhances
immersion by making it feel as if sounds are coming from specific locations.
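Two basic building blocks of such systems can be sketched in a few lines: inverse-distance attenuation (sounds get quieter with distance) and constant-power stereo panning (direction mapped to left/right gains). This is a simplified illustration under assumed conventions; real engines add HRTF filtering, reverberation, and occlusion, and the 1 m reference distance is an arbitrary choice.

```python
import math

# Hedged sketch of two basic spatial-audio operations: inverse-distance
# attenuation and constant-power stereo panning. The 1 m reference
# distance is an assumption; real engines layer HRTF filtering on top.

def attenuation(distance_m, ref_m=1.0):
    """Inverse-distance gain, clamped so sources inside ref_m
    don't produce gains above 1."""
    return ref_m / max(distance_m, ref_m)

def pan_gains(pan):
    """Constant-power left/right gains for pan in [-1 (left), +1 (right)].
    Power l**2 + r**2 stays constant across the pan range."""
    angle = (pan + 1.0) * math.pi / 4.0  # map [-1, 1] to [0, pi/2]
    return math.cos(angle), math.sin(angle)
```

A centred source (`pan = 0`) gets equal gains of about 0.707 in each channel, so perceived loudness stays constant as a sound moves across the stereo field.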
• Microphones: In AR systems, microphones capture the user’s voice
and environmental sounds, which can then be processed and
integrated into the virtual experience. In VR, microphones can
enable voice commands or communication with other users in a
shared virtual space.
• Haptic Feedback for Sound: Some systems use haptic feedback to
simulate the physical sensation of sound, such as the vibration of a
loud bass note, adding another layer of realism to the audio
experience.