AR&VR Unit 1 Notes
Example: VR Storytelling
Sci-fi Simulators
3. Remote Collaboration
Example: Artists use tools like Tilt Brush to create 3D digital art
in virtual space.
2. Output Devices
Deliver sensory information to the user.
Types:
Visual Output: Head-Mounted Displays (HMDs) like
Oculus Rift, HTC Vive.
Auditory Output: 3D spatial sound using headphones or
built-in speakers.
Tactile Output: Haptic feedback through gloves, suits, or
controllers.
Olfactory Devices (optional): Emit smells to enhance
immersion (e.g., Feelreal VR mask).
Features:
Stereoscopic 3D Display: Separate image for each eye for
depth perception.
Wide Field of View (FOV): 100–120° to cover most of the human visual field.
Lens Adjustment: For focus and interpupillary distance
(IPD).
Inside-Out Tracking: Built-in cameras to track the
environment (no external sensors needed).
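The stereoscopic display and IPD adjustment above can be illustrated with a small geometry sketch (a toy calculation, not from any headset SDK; the 63 mm IPD is just a typical adult value):

```python
import math

def angular_disparity_deg(ipd_m: float, depth_m: float) -> float:
    """Angle (degrees) between the two eyes' lines of sight to a point
    straight ahead at depth_m; larger disparity = stronger depth cue."""
    return math.degrees(2 * math.atan(ipd_m / (2 * depth_m)))

# Disparity shrinks with distance, which is why stereoscopic
# depth cues fade for far-away objects.
near = angular_disparity_deg(0.063, 0.5)   # object 0.5 m away
far = angular_disparity_deg(0.063, 10.0)   # object 10 m away
```

This is why the HMD renders a separate image per eye: the per-eye viewpoints differ by exactly this angle, and the brain fuses them into depth.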
4. Tracking System
Tracks movement of head, eyes, hands, or the entire body.
Types:
Outside-In Tracking: External sensors track headset and
controllers (e.g., HTC Vive base stations).
Inside-Out Tracking: Cameras on the headset track the
environment (e.g., Oculus Quest).
6 Degrees of Freedom (6DoF): Tracks rotation (pitch, yaw,
roll) and position (X, Y, Z).
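A minimal sketch of what 6DoF means in code, assuming a simple pose record (field and function names are illustrative): a 3DoF headset stores only the three angles, so a movement like leaning forward could not be represented at all.

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    # Position (metres) — the 3 translational degrees of freedom
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    # Orientation (degrees) — the 3 rotational degrees of freedom
    pitch: float = 0.0
    yaw: float = 0.0
    roll: float = 0.0

def lean_forward(pose: Pose6DoF, metres: float) -> Pose6DoF:
    """A 6DoF tracker registers body translation; a 3DoF (rotation-only)
    headset has no x/y/z fields to record this movement in."""
    return Pose6DoF(pose.x, pose.y, pose.z + metres,
                    pose.pitch, pose.yaw, pose.roll)

p = lean_forward(Pose6DoF(), 0.2)  # lean 0.2 m forward, orientation unchanged
```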
Features (of the rendering/graphics engine):
Real-time rendering.
Physics simulation.
Lighting, shading, and animation support.
7. Audio System
Adds immersion through 3D spatial sound.
Components:
Binaural Audio: Makes sounds appear to come from
specific directions.
Environmental Effects: Echo, muffling, and distance sound
simulation.
Examples:
VR headsets with built-in spatial audio (e.g., Valve Index).
Surround sound headphones with head-tracking.
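As a rough illustration of the directional and distance cues above (constant-power panning is a crude stand-in for true binaural rendering, which uses head-related transfer functions; names and the clamp distance are assumptions):

```python
import math

def pan_gains(azimuth_deg: float) -> tuple[float, float]:
    """Constant-power stereo panning: azimuth 0° = straight ahead,
    -90° = fully left, +90° = fully right. Returns (left, right) gains."""
    angle = math.radians((azimuth_deg + 90) / 2)  # map [-90, 90] -> [0°, 90°]
    return math.cos(angle), math.sin(angle)

def distance_gain(d_m: float) -> float:
    """Inverse-distance attenuation, clamped inside 1 m — a simple
    'environmental effect' that makes far sources quieter."""
    return 1.0 / max(d_m, 1.0)
```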
9. VR Software/Applications
The content and applications that run on the VR system.
Examples:
VR Games (e.g., Beat Saber)
Training Simulators (e.g., VR surgery)
Virtual Tours (e.g., museums, real estate)
Education (e.g., VR labs)
5. Mechanical Trackers
Use physical linkages (arms/joints) to track user movement.
✅ Pros: Highly accurate.
❌ Cons: Limited mobility and range, bulky.
👉 Example: Early VR gloves and mechanical positioning arms used in simulation setups.
6. Hybrid Trackers
Combine two or more tracking methods (e.g., optical +
inertial) for better accuracy.
✅ Pros: Compensates for weaknesses of individual methods.
❌ Cons: More complex and costly.
👉 Example: Oculus Rift CV1 (optical + inertial), Meta Quest
(inside-out + IMU).
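A hybrid tracker's sensor fusion is often sketched as a complementary filter; this toy version (parameter names and the alpha value are illustrative) blends fast-but-drifting gyro integration with slow-but-stable optical fixes:

```python
def complementary_filter(optical_angle: float, gyro_rate: float,
                         prev_angle: float, dt: float,
                         alpha: float = 0.98) -> float:
    """Fuse an optical measurement with integrated gyro rate.
    alpha close to 1 trusts the gyro short-term (smooth, low latency)
    while the optical term slowly corrects the gyro's drift."""
    gyro_angle = prev_angle + gyro_rate * dt  # dead-reckoning step
    return alpha * gyro_angle + (1 - alpha) * optical_angle
```

With a stationary gyro and an optical reading of 10°, repeated updates converge on the optical value — the drift-correction behaviour that motivates hybrid tracking.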
1. Camera-Based Devices
Use computer vision and image processing to detect gestures.
How it works: Cameras track hand/body/facial movements using real-
time image analysis.
Examples:
o Microsoft Kinect – full-body tracking using depth + RGB camera.
o Leap Motion – precise finger tracking using infrared cameras.
o Intel RealSense – tracks face, hands, and depth for gesture input.
✅ Used in: VR games, AR apps, smart TVs.
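Downstream of the computer vision, gesture classification can be as simple as this sketch (thresholds and the normalised coordinate convention are assumptions; real systems add smoothing and timing windows):

```python
def detect_swipe(xs: list[float], min_travel: float = 0.3) -> str:
    """Classify a tracked hand's horizontal path (normalised 0..1 image
    x-coordinates over successive frames) as a swipe gesture."""
    travel = xs[-1] - xs[0]
    if travel > min_travel:
        return "swipe_right"
    if travel < -min_travel:
        return "swipe_left"
    return "none"
```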
2. Projectors
Display interactive visuals on flat or 3D surfaces.
Used in:
o Interactive walls, smart classrooms, virtual museum
displays.
o Motion-controlled presentations or public kiosks.
✅ Example: Pointing at the wall moves a projected object.
3. Real-Time Rendering
Graphics displays work with GPUs to update visuals based
on user movement and interaction.
Enables the low-latency, smooth experience that is crucial for immersion.
✅ Example: When you turn your head in VR, the scene updates
instantly to reflect the new view.
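Conceptually, every frame the renderer recomputes the view from the newest tracked pose; a minimal yaw-only sketch (an OpenGL-style convention is assumed, where 0° looks down −z):

```python
import math

def view_direction(yaw_deg: float) -> tuple[float, float]:
    """Forward vector (x, z) after the head yaws; each frame the
    renderer rebuilds the camera from the latest tracked pose."""
    yaw = math.radians(yaw_deg)
    return (math.sin(yaw), -math.cos(yaw))  # 0° looks down -z
```

Turning the head 90° to the right swings the forward vector from −z to +x, and the GPU redraws the scene from that new direction within the same frame.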
5. Overlay of Information
In AR, displays overlay data, directions, labels, and
interactive elements on top of real-world views.
Helps users navigate, interact, and learn without
distraction.
✅ Example: AR glasses showing directions and road names
while walking.
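Placing an overlay label amounts to projecting a 3D point into the camera image; a pinhole-camera sketch (the focal length and image centre are made-up example intrinsics):

```python
def project_label(x: float, y: float, z: float,
                  f_px: float = 800.0, cx: float = 640.0, cy: float = 360.0):
    """Pinhole projection: the screen position (u, v) at which to draw an
    overlay label for a point z metres in front of the camera."""
    if z <= 0:
        return None  # behind the camera — don't draw the label
    return (cx + f_px * x / z, cy - f_px * y / z)
```

A point straight ahead lands at the image centre; points off to the side land proportionally farther from it, which is how AR labels stay pinned to real-world objects as the camera moves.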
2. Powerwalls
Description: High-resolution, wall-sized display screens
that allow users to view complex data or 3D models.
Features: Often flat or slightly curved; may be used with
tracking systems.
Use Case: Collaborative design, big data visualization,
simulations.
3. Dome Displays
Description: Hemispherical or dome-shaped screens
where images are projected to create a surround-view
experience.
Features: Offers a wide, wrap-around field of view (approaching 360° in full domes).
Use Case: Planetariums, flight simulation, military
training.
4. Holographic Displays
Description: Volumetric displays that create 3D images
visible from multiple angles without needing headsets.
Features: Can be freestanding or integrated into a large
stage/display.
Use Case: Museum exhibits, advertising, collaborative AR
experiences.
4. AR Technology Classification:
1. Marker-Based AR
2. Markerless AR
3. Location-Based AR
4. Projection-Based AR
5. Superimposition-Based AR
6. SLAM-Based AR
7. Wearable AR (like HoloLens)
8. WebAR (AR via web browsers without an app)
6. Types of Rendering:
Rasterization: Fast, real-time graphics.
Ray Tracing: Realistic lighting, slower.
Volume Rendering: Used in medical AR.
Hybrid Rendering: Mix of raster and ray tracing.
Non-photorealistic Rendering (NPR): Artistic effects.
Stereoscopic Rendering: Dual images for 3D depth in VR.
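Stereoscopic rendering can be illustrated by the horizontal pixel shift between the two eye images (a simplified sketch; real engines instead render with two full off-axis projection matrices):

```python
def stereo_offsets(f_px: float, ipd_m: float, depth_m: float):
    """Horizontal pixel shift of a point in the left vs right eye image.
    Nearby points shift more, which the brain reads as depth."""
    shift = f_px * (ipd_m / 2) / depth_m
    return -shift, +shift  # (left-image offset, right-image offset)
```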
7. Ways of 3D Registration:
Fiducial Markers (QR codes, AR tags)
Natural Feature Tracking (corners, edges)
Inertial Measurement Units (IMUs)
Depth Cameras (e.g., Kinect)
GPS + Compass (for outdoor AR)
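For GPS + compass registration, the core computation is the bearing from the user to a point of interest, compared against the compass heading (a standard great-circle bearing formula; the 60° FOV is an assumed camera value):

```python
import math

def bearing_deg(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Initial great-circle bearing (degrees, 0° = north, 90° = east)
    from the user's GPS fix to a point of interest."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def in_view(bearing: float, heading_deg: float, fov_deg: float = 60.0) -> bool:
    """Should the POI's overlay be drawn? True when its bearing lies
    inside the camera's horizontal field of view around the heading."""
    diff = (bearing - heading_deg + 180) % 360 - 180  # wrap to [-180, 180)
    return abs(diff) <= fov_deg / 2
```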