AR&VR Unit 1 Notes

The document outlines the fundamental aspects of Virtual Reality (VR), including its three key components: Immersion, Interaction, and Imagination, along with their real-world applications. It discusses the benefits of VR, such as enhanced learning, improved engagement, and remote collaboration, as well as the components of a VR system, including input and output devices, tracking systems, and software. Additionally, it covers various types of trackers and gesture input devices, emphasizing their significance in creating immersive experiences.


UNIT I INTRODUCTION

1. Sketch the three I's of virtual reality with appropriate real-time examples.
Virtual Reality (VR):
A computer-generated environment that completely
replaces the real world, making you feel like you are
somewhere else using a headset.
The Three I’s of Virtual Reality (VR) are essential pillars that define a rich
and immersive VR experience. They are:
1. Immersion - "Feeling Present in the Virtual World"
Definition: The sense of being physically present in a non-physical world. It
makes users feel they are inside the virtual environment.
 Feeling present in a virtual world.
 Achieved using head-mounted displays (HMDs), 3D audio, and
spatial tracking.
 Increases realism and presence.
 Deepens emotional connection to virtual scenarios.
Examples: VR Gaming, Google Earth VR
2. Interaction – "Engaging with the Virtual World"
Definition: The ability of users to interact with the virtual environment and
receive appropriate feedback.
 Involves motion tracking, voice commands, gesture control, and haptic
feedback.
 Makes the VR experience dynamic and engaging.
 Enhances learning and task performance.
 Can be two-way: user affects environment and vice versa.
Examples: Surgical Training Simulators, VR Shopping
3. Imagination – "Creating and Exploring Beyond Reality"
Definition: The ability to experience things that are impossible or do not exist in the real world.

 Unlocks creativity and innovation.
 Enables fantasy, sci-fi, or abstract environments.
 Great for storytelling and role-playing.
 Enhances entertainment, education, and therapy.

Examples: VR Storytelling, Sci-fi Simulators

2. Benefits of Virtual Reality?

1. Enhanced Learning & Training
 Provides realistic simulations for hands-on practice.
 Increases retention and understanding through visual experiences.
 Safe and repeatable practice in fields like medicine, aviation, engineering, and military.
Example: Medical students use VR to perform virtual surgeries without harming real patients.

2. Improved Engagement
 Makes learning or gaming more fun, interactive, and immersive.
 Encourages active participation, especially among students or trainees.
Example: Students learn history by “walking” through ancient cities in VR.

3. Remote Collaboration
 Allows people from different places to meet and work together in a shared virtual space.
 Boosts communication and teamwork in businesses and classrooms.
Example: Teams use VR meeting apps like Horizon Workrooms or Spatial for virtual office meetings.

4. Safe Testing Environment
 Users can practice risky or complex tasks in a virtual space without real-world consequences.
Example: Firefighters train in simulated burning buildings to learn rescue techniques safely.

5. Cost-Effective Solutions
 Reduces the need for physical materials or travel.
 Saves money on training setups, field trips, or equipment.
Example: VR lab simulations save schools money on chemicals or lab tools.

6. Enhanced Creativity & Imagination
 Lets users explore impossible scenarios, design 3D models, or build virtual worlds.
Example: Artists use tools like Tilt Brush to create 3D digital art in virtual space.

7. Rehabilitation & Therapy
 Used in mental health therapy, pain management, and physical rehab.
 Helps patients with phobias, PTSD, or stroke recovery.
Example: A VR app helps PTSD patients gradually face their fears in a safe space.

8. Realistic Simulations for Skill Development
 Useful in driving, flying, machinery operation, and sports.
Example: Pilots use VR flight simulators for training without using actual planes.

9. Entertainment & Gaming
 Offers fully immersive games with high realism and interaction.
 Creates a new level of user experience compared to traditional games.
Example: VR games like Half-Life: Alyx or Beat Saber give a lifelike gaming experience.

10. Improved Accessibility
 People with disabilities can explore inaccessible places or learn through customized experiences.
Example: Wheelchair users can experience mountain climbing or museum tours in VR.
3. Components of a VR System?
1. Input Devices
Allow users to interact with the virtual world.
Examples:
 VR Controllers (Oculus Touch, Vive controllers)
 Motion Trackers (Leap Motion, Kinect)
 Haptic Gloves (e.g., Manus VR)
 Voice Commands (via microphones or AI assistants)
 Eye Tracking Sensors (detect gaze direction)
Functions:
 Control movement, gestures, and object manipulation.
 Capture hand/finger motion.
 Enhance realism through natural interactions.

2. Output Devices
Deliver sensory information to the user.
Types:
 Visual Output: Head-Mounted Displays (HMDs) like
Oculus Rift, HTC Vive.
 Auditory Output: 3D spatial sound using headphones or
built-in speakers.
 Tactile Output: Haptic feedback through gloves, suits, or
controllers.
 Olfactory Devices (optional): Emit smells to enhance
immersion (e.g., Feelreal VR mask).

3. Head-Mounted Display (HMD)


The main visual device worn on the head.

Features:
 Stereoscopic 3D Display: Separate image for each eye for
depth perception.
 Wide Field of View (FOV): 100–120° to cover most of the
human visual field.
 Lens Adjustment: For focus and interpupillary distance
(IPD).
 Inside-Out Tracking: Built-in cameras to track the
environment (no external sensors needed).
4. Tracking System
Tracks movement of head, eyes, hands, or the entire body.
Types:
 Outside-In Tracking: External sensors track headset and
controllers (e.g., HTC Vive base stations).
 Inside-Out Tracking: Cameras on the headset track the
environment (e.g., Oculus Quest).
 6 Degrees of Freedom (6DoF): Tracks rotation (pitch, yaw,
roll) and position (X, Y, Z).
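
To make 6DoF concrete, here is a minimal sketch (Python, with illustrative names and values) of the pose record a tracking system might report every frame: three position coordinates plus three rotation angles.

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    # Position in metres, relative to the tracking origin.
    x: float
    y: float
    z: float
    # Orientation in degrees (rotation about each axis).
    pitch: float  # nod up/down
    yaw: float    # turn left/right
    roll: float   # tilt sideways

# A 3DoF tracker would report only pitch/yaw/roll; a 6DoF tracker
# reports all six values, so leaning and walking also move the user.
head = Pose6DoF(x=0.0, y=1.7, z=0.0, pitch=0.0, yaw=90.0, roll=0.0)
print(head)
```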

5. Computing Device / Processing Unit


Runs the VR software, processes sensor data, and renders
graphics.
Can Be:
 High-End PCs: Needed for tethered VR like Oculus Rift or
HTC Vive.
 Mobile Devices: Used in smartphone VR (e.g., Google
Cardboard).
 Standalone Headsets: Built-in processors (e.g., Meta
Quest 2).
Important Specs:
 High GPU performance (for rendering 3D graphics).
 High refresh rate (90–120Hz) to reduce motion sickness.
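
As a quick worked example of why refresh rate matters (simple arithmetic, not tied to any particular headset), the per-frame rendering budget follows directly from the refresh rate:

```python
# Per-frame render budget at common VR refresh rates.
for hz in (90, 120):
    budget_ms = 1000 / hz
    print(f"{hz} Hz -> {budget_ms:.1f} ms per frame")

# 90 Hz  -> 11.1 ms per frame
# 120 Hz -> 8.3 ms per frame
# If rendering misses this budget, frames drop and apparent latency
# rises, which is a common trigger for motion sickness.
```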

6. Graphics Rendering Engine


Software that creates the virtual environment in real-time.
Popular Engines:
 Unity – widely used for both games and simulations.
 Unreal Engine – high-end visuals and cinematic quality.

Features:
 Real-time rendering.
 Physics simulation.
 Lighting, shading, and animation support.

7. Audio System
Adds immersion through 3D spatial sound.
Components:
 Binaural Audio: Makes sounds appear to come from
specific directions (see the sketch at the end of this section).
 Environmental Effects: Echo, muffling, and distance sound
simulation.
Examples:
 VR headsets with built-in spatial audio (e.g., Valve Index).
 Surround sound headphones with head-tracking.
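
To give a flavour of the binaural idea above: even simple constant-power panning makes a sound seem to come from a direction by weighting the two channels. Real binaural rendering uses head-related transfer functions (HRTFs) and per-ear delays; this toy sketch shows only the amplitude part.

```python
import math

def pan_gains(azimuth_deg: float) -> tuple[float, float]:
    """Constant-power pan: azimuth -90 (full left) .. +90 (full right)."""
    # Map azimuth to 0..90 degrees, then use cosine/sine so that
    # left^2 + right^2 == 1 (perceived loudness stays constant).
    angle = math.radians((azimuth_deg + 90) / 2)
    return math.cos(angle), math.sin(angle)

left, right = pan_gains(45)   # sound source to the front-right
print(f"left gain {left:.2f}, right gain {right:.2f}")
```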

8. Haptic Feedback Devices (Optional)


Provides a sense of touch in the virtual environment.
Examples:
 Vibration-enabled controllers.
 Haptic suits or vests (e.g., Teslasuit).
 Haptic gloves for feeling textures, pressure, or shape.

9. VR Software/Applications
The VR content or programs used.
Examples:
 VR Games (e.g., Beat Saber)
 Training Simulators (e.g., VR surgery)
 Virtual Tours (e.g., museums, real estate)
 Education (e.g., VR labs)

10. Environment & Scene Data


The 3D models, textures, sound files, and interactive elements
that form the virtual world.
Formats Used:
 .FBX, .OBJ for 3D models
 .WAV, .MP3 for audio
 Scripts and animations for interactivity
4. Types of Trackers?
1. Optical Trackers (Camera-Based)
Use cameras to track markers, LEDs, or natural features.
Types:
 Outside-in Tracking: External cameras track the
headset/controllers.
👉 Example: HTC Vive with external base stations
(Lighthouse system).
 Inside-out Tracking: Cameras on the headset track the
environment.
👉 Example: Meta Quest 2, HoloLens.
✅ Pros: Accurate, supports large areas.
❌ Cons: Sensitive to lighting, may lose tracking if view is
blocked.

2. Inertial Trackers (IMU - Inertial Measurement Unit)


Use accelerometers, gyroscopes, and magnetometers to track
orientation and movement.
✅ Pros: Fast response, lightweight, no external setup.
❌ Cons: Can drift over time (not always accurate without
correction).
👉 Example: Smartphones, Meta Quest, Oculus Rift use IMUs.
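
The drift weakness noted above is easy to see in a sketch: orientation comes from integrating the gyroscope's angular velocity over time, so even a tiny constant bias accumulates into a growing error (illustrative numbers, not from a real sensor):

```python
# Dead-reckoning yaw from a gyroscope: integrate angular velocity.
dt = 0.01            # 100 Hz sample rate
true_rate = 0.0      # the headset is actually not rotating
bias = 0.05          # deg/s of gyro bias (a typical small error)

yaw = 0.0
for _ in range(60 * 100):              # simulate 60 seconds
    measured_rate = true_rate + bias   # sensor reading includes bias
    yaw += measured_rate * dt          # integration step

print(f"yaw error after 60 s: {yaw:.1f} degrees")  # -> 3.0 degrees
```
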
3. Magnetic Trackers
Track movement using a magnetic field generated by a base
station and measured by sensors.
✅ Pros: Works without a direct line of sight.
❌ Cons: Can be affected by metal or electromagnetic
interference.
👉 Example: Polhemus Liberty tracker (used in some medical
VR applications).

4. Acoustic (Ultrasonic) Trackers


Use ultrasonic sound waves to detect distance and position
based on time-of-flight.
✅ Pros: Inexpensive, simple for short-range tracking.
❌ Cons: Limited range, sensitive to environmental noise.
👉 Example: Early VR systems and some mobile AR sensors.

5. Mechanical Trackers
Use physical linkages (arms/joints) to track user movement.
✅ Pros: Highly accurate.
❌ Cons: Limited mobility and range, bulky.
👉 Example: Early VR gloves and mechanical positioning arms
used in simulation setups.

6. Hybrid Trackers
Combine two or more tracking methods (e.g., optical +
inertial) for better accuracy.
✅ Pros: Compensates for weaknesses of individual methods.
❌ Cons: More complex and costly.
👉 Example: Oculus Rift CV1 (optical + inertial), Meta Quest
(inside-out + IMU).
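
A common way hybrid systems combine the two sources is a complementary filter: trust the fast but drifting inertial path for high-frequency motion, and let the slower but absolute optical measurement pull the estimate back. A minimal sketch with illustrative values:

```python
def fuse(imu_estimate, optical_measurement, alpha=0.98):
    """Blend one orientation axis (degrees): mostly IMU, nudged by optics."""
    return alpha * imu_estimate + (1 - alpha) * optical_measurement

true_yaw = 30.0      # headset held still at 30 degrees
gyro_bias = 0.05     # deg/s drift in the inertial path
dt = 0.01            # 100 Hz update rate

imu_only = fused = true_yaw
for _ in range(60 * 100):               # 60 simulated seconds
    imu_only += gyro_bias * dt          # pure IMU drifts away
    fused = fuse(fused + gyro_bias * dt, true_yaw)  # optics corrects it

print(f"IMU only: {imu_only:.2f} deg, fused: {fused:.2f} deg")
# IMU only drifts to ~33 deg; the fused estimate stays near 30 deg.
```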

5. Input Devices for AR?


Input devices allow users to interact with or control AR
content that’s overlaid on the real world.
1. Touchscreens
Used in smartphones and tablets to control AR apps.
 How it works: Taps, swipes, pinches to manipulate virtual
objects.
 Example: Pinching to zoom on a furniture item in IKEA AR
app.

2. Voice Input (Microphones + Voice Assistants)


AR systems use voice commands for hands-free control.
 How it works: Users speak commands interpreted by AI.
 Example: “Open map” or “Scan this object” on HoloLens
or Google Lens.

3. Gesture Recognition Devices / Hand Tracking


Detect hand and body movements to control AR content.
 How it works: Cameras or sensors track gestures.
 Examples:
o Microsoft HoloLens hand tracking
o Magic Leap using spatial gesture input
o Leap Motion sensor for finger tracking
4. Cameras and Depth Sensors
These are both input and sensing devices that scan the real-
world environment.
 How it works: Recognize surfaces, objects, or QR codes to
place virtual content.
 Example: AR measuring tools use the camera to detect
surface lengths.

5. Eye Tracking Sensors


Track eye movement to adjust focus or control interface.
 How it works: Infrared sensors detect gaze direction.
 Used in: Advanced AR headsets for faster navigation or
attention tracking.

6. Haptic Input Devices


Devices that detect physical touch, pressure, or movement.
 How it works: Gloves or wearable sensors detect finger
movements.
 Example: AR surgery simulators using haptic gloves.
7. Wearable Sensors
Track user motion and position.
 Types: Accelerometers, gyroscopes, magnetometers (IMU
sensors)
 Used in: Smart glasses, AR helmets, smartphones

8. External Controllers (Optional)


Some AR systems use joysticks or handheld remotes for input.
 Example: Magic Leap controller, handheld remotes for AR
games

6. Types of Gesture Input Devices?

1. Camera-Based Devices
Use computer vision and image processing to detect gestures.
 How it works: Cameras track hand/body/facial movements using real-
time image analysis (see the sketch at the end of this subsection).
 Examples:
o Microsoft Kinect – full-body tracking using depth + RGB camera.
o Leap Motion – precise finger tracking using infrared cameras.
o Intel RealSense – tracks face, hands, and depth for gesture input.
✅ Used in: VR games, AR apps, smart TVs.
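
Once the camera pipeline above has reduced each frame to a hand position, recognising a simple gesture can be as little as watching how that position changes over time. A toy sketch, assuming normalised hand x-coordinates have already been extracted per frame:

```python
def classify_swipe(x_positions: list[float],
                   min_travel: float = 0.3) -> str:
    """Classify a horizontal swipe from per-frame hand x-positions
    (normalised 0..1 across the camera image)."""
    travel = x_positions[-1] - x_positions[0]
    if travel > min_travel:
        return "swipe right"
    if travel < -min_travel:
        return "swipe left"
    return "no gesture"

# Hand tracked across 6 frames, moving left to right.
frames = [0.20, 0.28, 0.40, 0.55, 0.68, 0.80]
print(classify_swipe(frames))   # -> swipe right
```

Devices like Kinect or Leap Motion do the hard part (extracting reliable positions); the classification on top can start this simply.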

2. Wearable Gesture Devices


Worn on the body (like gloves or wristbands) to detect motion and position.
 How it works: Use sensors (accelerometers, gyroscopes, flex sensors) to
track gestures.
 Examples:
o Data Gloves (e.g., Manus VR, SenseGlove) – track finger bending
and motion.
o Myo Armband – detects muscle activity (EMG) and arm motion.
o VR Gloves – provide both gesture input and haptic feedback.
✅ Used in: Robotics, medical simulations, AR/VR environments.

3. Touchless Sensors (Infrared or Ultrasonic)


Detect motion in space using waves or sensors—no wearables needed.
 How it works: Sense hand movements using reflected IR or sound waves.
 Examples:
o Ultraleap (formerly Leap Motion) – IR tracking of hands and fingers.
o Gesture sensors in smart appliances (TVs, washing machines).
✅ Used in: Public kiosks, contactless control systems.

4. Capacitive/Proximity Gesture Sensors


Detect presence and movement near a surface without physical contact.
 How it works: Measure changes in electric fields.
 Examples:
o Google Soli chip (used in Pixel 4) – radar-based gesture sensing.
o Proximity sensors in smartphones for swipe gestures.
✅ Used in: Mobile AR, compact consumer devices.

5. Vision-Based Mobile Devices


Use built-in cameras and AI to recognize gestures via apps.
 How it works: Smartphone/tablet cameras + AR software detect gestures.
 Examples:
o AR emoji control with hand gestures.
o TikTok/Instagram filters reacting to face or hand movements.
✅ Used in: AR apps, social media filters.

7. Output Devices in a Gesture Interface?


Main Output Devices Used in Gesture Interfaces

1. Display Screens (Visual Output)


Show visual responses to gesture commands.
 Types:
o Monitor / LED screen – used in desktops, smart TVs.
o AR Headsets / Glasses – overlay digital content on
real world.
o VR Headsets – immersive response in virtual
environments.
✅ Example: A wave gesture opens a menu on a smart TV.

2. Projectors
Display interactive visuals on flat or 3D surfaces.
 Used in:
o Interactive walls, smart classrooms, virtual museum
displays.
o Motion-controlled presentations or public kiosks.
✅ Example: Pointing at the wall moves a projected object.

3. Speakers / Audio Output Devices


Provide auditory feedback in response to gestures.
 Used in:
o Virtual assistants
o Smart home devices
o AR/VR voice response systems
✅ Example: A clap gesture triggers a voice response from a
robot assistant.

4. Haptic Devices (Tactile Feedback)


Give physical feedback (vibration, force) when gestures are
recognized.
 Devices:
o Haptic gloves
o Haptic suits
o Vibrating controllers
✅ Example: In VR, when you grab a virtual object with a
gesture, you feel vibration in your glove.

5. LEDs or Light Indicators


Simple visual signals showing gesture recognition or system
state.
 Used in:
o Smart appliances
o Touchless sanitizers or devices
✅ Example: Waving your hand triggers a light to turn green,
indicating success.

6. Augmented Reality Devices (AR Headsets)


Display gesture-driven information in real-world view.
 Examples:
o Microsoft HoloLens
o Magic Leap
o Google Glass (basic gesture interaction)
✅ Example: Pinching air displays a virtual object in your room.

7. VR Systems (Immersive Output)


Provide 3D visual and spatial audio feedback in virtual
environments.
 Output types: Visuals, 3D audio, environmental
simulation.
✅ Example: Moving your hand teleports you within the VR
world, with full visual and audio cues.
8. Describe the role of graphics displays in immersive
technologies like virtual reality (VR) and augmented reality
(AR).
A graphics display is an electronic visual output device
that presents images, text, and graphical content generated by
a computer or digital system. It converts digital signals into
visual representations that users can see and interact with.
Key Roles of Graphics Displays in VR and AR

1. Realistic Visual Experience


 Displays render 3D virtual environments or digital
overlays in high resolution.
 In VR: They fully immerse the user by replacing the real-
world view.
 In AR: They superimpose graphics on the real-world
background.
✅ Example: A VR headset showing a 3D mountain landscape
that responds as you move.

2. Depth and Spatial Perception


 Graphics displays in immersive systems support
stereoscopic vision (two slightly different images for each
eye).
 Helps simulate depth and distance, making objects
appear closer or farther.
✅ Example: In AR, a floating 3D model of a chair looks like it’s
actually in your room.
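
Stereoscopic rendering boils down to drawing the scene twice from two camera positions separated by the interpupillary distance (IPD). A minimal sketch of the per-eye offset, assuming the user faces the -Z axis so +X is "right" (real engines fold this into the view matrix):

```python
# Two virtual cameras separated by the interpupillary distance (IPD).
IPD = 0.064                      # metres; a typical adult average
head = (0.0, 1.7, 0.0)           # head position in the scene
x, y, z = head

left_eye = (x - IPD / 2, y, z)   # render pass 1 uses this camera
right_eye = (x + IPD / 2, y, z)  # render pass 2 uses this camera

# The two slightly different images, shown one per eye by the HMD,
# are what the brain fuses into a sense of depth.
print(left_eye, right_eye)
```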

3. Real-Time Rendering
 Graphics displays work with GPUs to update visuals based
on user movement and interaction.
 Enables low-latency, smooth experience crucial for
immersion.
✅ Example: When you turn your head in VR, the scene updates
instantly to reflect the new view.
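
The update cycle described above can be summarised as a pose-then-render loop that must finish within the frame budget. A minimal sketch with stand-in functions (poll_head_pose and render are placeholders, not a real API):

```python
import time

def poll_head_pose():
    """Stand-in for reading the latest tracker sample (yaw, degrees)."""
    return 0.0

def render(yaw_deg):
    """Stand-in for drawing the scene for the given head pose."""
    pass

TARGET_HZ = 90
FRAME_BUDGET = 1.0 / TARGET_HZ        # ~11.1 ms per frame

for _ in range(3):                    # three illustrative frames
    start = time.perf_counter()
    render(poll_head_pose())          # scene always uses the newest pose
    elapsed = time.perf_counter() - start
    time.sleep(max(0.0, FRAME_BUDGET - elapsed))  # hold the frame rate
```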

4. Field of View (FoV)


 Wider FoV in head-mounted displays enhances realism by
covering more of the user’s vision.
 Helps eliminate the feeling of looking through a screen or
window.
✅ Example: 110°–120° FoV in VR headsets makes users feel
truly "inside" the virtual world.

5. Overlay of Information
 In AR, displays overlay data, directions, labels, and
interactive elements on top of real-world views.
 Helps users navigate, interact, and learn without
distraction.
✅ Example: AR glasses showing directions and road names
while walking.

6. Enhancing Interaction Feedback


 High-resolution and color-accurate displays provide visual
confirmation of user actions.
 Important for applications like remote surgery, design,
training, or gaming.
✅ Example: In AR surgery training, a doctor sees a highlighted
area respond to hand gestures.
7. Reducing Motion Sickness
 Good graphics displays with high refresh rate (≥90 Hz)
and low latency help avoid dizziness and discomfort.
 Ensures longer and more comfortable usage.

9. Discuss the various types of large volume displays.


1. CAVE (Cave Automatic Virtual Environment)
 Description: A room-sized cube with walls (and
sometimes floors/ceilings) acting as projection screens.
 Features: Uses stereoscopic projectors and tracking
systems to give a highly immersive experience.
 Use Case: Engineering design, scientific visualization,
virtual walkthroughs.

2. Powerwalls
 Description: High-resolution, wall-sized display screens
that allow users to view complex data or 3D models.
 Features: Often flat or slightly curved; may be used with
tracking systems.
 Use Case: Collaborative design, big data visualization,
simulations.
3. Dome Displays
 Description: Hemispherical or dome-shaped screens
where images are projected to create a surround-view
experience.
 Features: Offers a 360-degree field of view.
 Use Case: Planetariums, flight simulation, military
training.

4. Holographic Displays
 Description: Volumetric displays that create 3D images
visible from multiple angles without needing headsets.
 Features: Can be freestanding or integrated into a large
stage/display.
 Use Case: Museum exhibits, advertising, collaborative AR
experiences.

5. Immersive Room Displays


 Description: Entire rooms covered with projection-
capable surfaces or display panels.
 Features: Full environmental immersion; supports multi-
user interaction.
 Use Case: VR collaboration, medical simulation, defense.

6. Curved/360° LED Walls


 Description: Seamless LED walls curved around the user
to provide immersive visuals.
 Features: High brightness, wide field of view.
 Use Case: Entertainment, architectural visualization,
command centers.

7. Light Field Displays


 Description: Display systems that replicate how light rays
are emitted from a scene, enabling real depth perception.
 Features: No need for special glasses; allows multiple
viewers from different angles.
 Use Case: Medical imaging, 3D mapping, AR showcases.
PART A

1. Define AR (Augmented Reality):


 AR overlays digital content onto the physical world.
 It enhances real-world experiences with virtual objects.
 It works in real time using a camera, sensor, and display.
 It allows user interaction with both real and virtual
elements.
 Examples include mobile AR apps, smart glasses, and AR
games.

2. Example & Functionality of AR:


Example: IKEA Place app – lets users place virtual furniture in
their real room.
Functionality:
 Detects flat surfaces.
 Aligns 3D furniture models.
 Allows users to scale, rotate, and move furniture virtually.
 Enhances shopping and planning experience.

3. How AR integrates with digital environment:


 Uses sensors and cameras to map the real world.
 Employs computer vision to recognize objects and
locations.
 Utilizes graphics rendering engines to place digital
elements.
 Synchronizes real and virtual elements through
registration and tracking.
 Integrates networking to fetch or send AR content/data.

4. AR Technology Classification:
1. Marker-Based AR
2. Markerless AR
3. Location-Based AR
4. Projection-Based AR
5. Superimposition-Based AR
6. SLAM-Based AR
7. Wearable AR (like HoloLens)
8. WebAR (AR via web browsers without an app)

5. Components of Interactive VR System:


 Display System (HMD, curved screen)
 Audio System (3D sound)
 Haptic Devices (gloves, suits)
 Tracking System (motion sensors, cameras)
 Input Devices (joystick, controllers, gesture sensors)
 Computer System (GPU, processor)
 Software (VR simulation engine)

6. Types of Rendering:
 Rasterization: Fast, real-time graphics.
 Ray Tracing: Realistic lighting, slower.
 Volume Rendering: Used in medical AR.
 Hybrid Rendering: Mix of raster and ray tracing.
 Non-photorealistic Rendering (NPR): Artistic effects.
 Stereoscopic Rendering: Dual images for 3D depth in VR.

7. Ways of 3D Registration:
 Fiducial Markers (QR codes, AR tags)
 Natural Feature Tracking (corners, edges)
 Inertial Measurement Units (IMUs)
 Depth Cameras (e.g., Kinect)
 GPS + Compass (for outdoor AR)

8. Role of Tracking Devices:


 Tracks user position and orientation.
 Maintains correct alignment of AR/VR elements.
 Supports navigation and interaction.
 Enables multi-user shared AR.
 Used in head tracking, hand tracking, eye tracking.

9. Example of Multimodal Interface:


VR Surgical Training:
 Voice commands to switch tools.
 Hand gestures to interact with virtual patient.
 Eye gaze tracking to control focus.
Multimodal interfaces improve natural user interaction and
reduce dependency on single input methods.

10. Define Tracker Jitter:


 Small fluctuations in sensor readings.
 Causes unstable or flickering digital content.
 Can reduce user immersion.
 Often caused by low-quality sensors or interference.
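
A standard first remedy for jitter is a smoothing filter. Below is a minimal sketch using an exponential moving average on one position axis; note the trade-off that heavier smoothing adds lag (latency, covered next):

```python
def ema_filter(samples, alpha=0.3):
    """Exponentially smooth noisy tracker samples.
    Lower alpha = smoother output but more lag (added latency)."""
    smoothed = samples[0]
    out = []
    for s in samples:
        smoothed = alpha * s + (1 - alpha) * smoothed
        out.append(round(smoothed, 3))
    return out

# Jittery x-position readings around a true value of 1.0 m.
raw = [1.02, 0.97, 1.04, 0.98, 1.01, 0.96, 1.03]
print(ema_filter(raw))   # values cluster much closer to 1.0
```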

11. Define Latency:


 Delay between action and response.
 Measured in milliseconds (ms).
 High latency causes motion sickness.
 Ideal latency for VR: < 20 ms.
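
Conceptually, motion-to-photon latency is just the time from sampling an input to displaying the resulting frame. A toy measurement sketch (the sleep stands in for the real tracking-render-display pipeline):

```python
import time

t_input = time.perf_counter()    # moment a head movement is sampled
time.sleep(0.012)                # stand-in for a 12 ms pipeline
t_photon = time.perf_counter()   # moment the new frame is visible

latency_ms = (t_photon - t_input) * 1000
print(f"motion-to-photon latency: {latency_ms:.0f} ms "
      f"({'OK' if latency_ms < 20 else 'too high'} for VR)")
```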

12. What is a 3D Graphic in AR?


 Virtual 3D objects placed in real space.
 Respond to lighting, shadows, and physics.
 Allow rotation, zoom, and interaction.
 Used in education, games, product demos.

13. Two Disadvantages of Magnetic Trackers:


 Metal interference distorts tracking data.
 Limited operational range (short distance).
 Affected by electromagnetic fields.
 Calibration required often.

14. What is a Personal Graphics Display?


 Display used by one person.
 Examples: HMDs, AR glasses, smartphones.
 Offers immersive visual feedback.
 Lightweight and portable.
 Used in gaming, remote work, fieldwork.

15. Sound Display?


 Provides 3D positional audio.
 Enhances immersion and realism.
 Guides user attention or actions.
 Includes binaural audio, surround sound.
 Example: Footstep sound in a VR game behind the user.
THE END
