3- Sensors in ADAS
Radar Sensors: Audi Q3 typically features radar sensors positioned at the front, rear, and
sometimes sides of the vehicle. These radar sensors enable functionalities such as adaptive
cruise control, which automatically adjusts the vehicle's speed to maintain a safe distance
from the vehicle ahead, and rear cross-traffic alert, which warns the driver of approaching
vehicles when reversing out of parking spaces.
Camera Systems: High-resolution cameras are used for various ADAS functionalities,
including lane-keeping assistance, traffic sign recognition, and pedestrian detection. These
cameras provide visual data to the vehicle's onboard computer, which processes the
information to identify lane markings, traffic signs, and pedestrians, and assists the driver
accordingly.
Ultrasonic Sensors: Ultrasonic sensors are commonly used for parking assistance and
proximity detection. They emit high-frequency sound waves to detect nearby objects and
provide feedback to the driver through visual or audible alerts. In the Audi Q3, ultrasonic
sensors are often located on the front and rear bumpers to assist with parking maneuvers.
Lidar Systems: Some high-end Audi models may also incorporate lidar (Light Detection and
Ranging) sensors, which use laser beams to create detailed 3D maps of the vehicle's
surroundings. Lidar sensors are particularly effective for object detection and localization,
especially in complex urban environments.
GPS and Inertial Measurement Units (IMUs): While primarily used for navigation
purposes, GPS and IMU sensors can also contribute to ADAS functionalities by providing
information about the vehicle's position, velocity, and orientation. This data can be used in
conjunction with other sensor inputs for functions like adaptive cruise control and lane-
keeping assistance.
These sensors work together to provide a comprehensive understanding of the vehicle's
environment, enabling ADAS functionalities that enhance safety, convenience, and comfort
for the driver and passengers. Audi, like other automakers, continues to innovate and refine its
sensor technologies to improve the effectiveness and reliability of its ADAS systems.
2- AUTOMOTIVE RADAR:
Adaptive Cruise Control (ACC): Radar sensors are used in ACC systems to maintain a safe
distance from the vehicle ahead. They measure the distance and relative speed of the lead
vehicle and adjust the vehicle's speed accordingly.
Blind Spot Detection: Radar sensors can detect vehicles in the driver's blind spots and alert
the driver to prevent unsafe lane changes.
Cross-Traffic Alert: Radar sensors can detect vehicles approaching from the side, such as
when backing out of a parking space, and provide warnings to the driver to avoid collisions.
Basic working of radar:
Transmitter: The radar system starts by emitting radio waves, typically in the form of short
pulses, from a specialized antenna called the transmitter. These radio waves travel through the
air at the speed of light.
Target Reflection: When the emitted radio waves encounter an object in their path, some of
the waves are reflected back towards the radar system due to the property of reflection. The
amount of reflection depends on the object's size, shape, material, and distance from the radar.
Receiver: The radar system includes a receiver antenna that captures the reflected radio
waves after they bounce off objects. The receiver then processes the received signals to
extract information about the detected objects.
Doppler Effect: In addition to detecting the presence of objects, radar can also measure their
speed and direction of movement using the Doppler effect. This effect causes a change in the
frequency of the reflected radio waves when an object is in motion relative to the radar
system. By analyzing the frequency shift of the reflected waves, the radar can determine the
speed and direction of the moving object.
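The two measurements described above, range from time-of-flight and speed from the Doppler shift, are each one line of arithmetic. A minimal sketch in Python (77 GHz is the common automotive radar band; the echo time and frequency shift below are illustrative values, not from a real sensor):

```python
C = 3.0e8  # speed of light in m/s

def radar_range(round_trip_s: float) -> float:
    """Range from round-trip pulse time: the pulse covers 2*R."""
    return C * round_trip_s / 2.0

def doppler_velocity(freq_shift_hz: float, carrier_hz: float) -> float:
    """Radial speed from the Doppler shift: f_d = 2 * v * f0 / c."""
    return freq_shift_hz * C / (2.0 * carrier_hz)

# An echo arriving 1 microsecond after transmission:
print(radar_range(1e-6))              # 150.0 m
# A 10.264 kHz shift on a 77 GHz carrier:
print(doppler_velocity(10264, 77e9))  # ~20 m/s closing speed
```

Note that a single radar measures only the radial (line-of-sight) component of the target's velocity; the Doppler shift is zero for a target moving purely across the beam.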
Signal Processing: The received signals are processed by sophisticated signal processing
algorithms to filter out noise, identify relevant targets, and calculate their distance, speed, and
other characteristics. This processed information is then used to generate visual or audible
alerts to the radar operator or integrated into autonomous systems for decision-making.
Display: The radar system typically includes a display unit where the detected objects and
their characteristics are presented to the radar operator or vehicle driver. This display may
show the location, distance, speed, and direction of movement of detected objects, allowing
the operator to make informed decisions.
3- Camera:
A camera works on the principle of passive light sensing to produce a digital image of
the covered region.
Cameras are capable of detecting both moving and static objects within their
surrounding environment.
They see colors and textures and can identify road signs, traffic lights, lane markings,
and object type (class).
Low cost and high availability.
High processing power is required to analyse the data (camera images).
Cameras are highly preferred in many ADAS applications and are widely accepted for
AD (autonomous driving).
Types of cameras for ADAS:
1. Monocular Cameras: Capture 2D images for lane departure warning, traffic sign
recognition, and pedestrian detection.
Advantages:
Cost-effective and lightweight.
Widely available and easily integrated into vehicles.
Suitable for functions like lane departure warning, traffic sign
recognition, and pedestrian detection.
Disadvantages:
Limited depth perception compared to stereo cameras.
Less accurate distance estimation.
2. Stereo Cameras: Provide depth perception by comparing images from two lenses,
enhancing accuracy in features like collision warning and adaptive cruise control.
Advantages:
Provide depth perception, enhancing accuracy in features like forward
collision warning and adaptive cruise control.
Better distance estimation compared to monocular cameras.
Disadvantages:
More complex and expensive than monocular cameras.
Require careful calibration to ensure accurate depth measurements.
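The depth trade-off between the two camera types can be made concrete with the pinhole model: a monocular camera must assume a known real-world object size H to estimate distance (Z = f·H/h), whereas a rectified stereo pair recovers depth directly from disparity (Z = f·B/d). A minimal sketch; the focal length, baseline, and pixel values are illustrative, not from any specific ADAS camera:

```python
def mono_distance(focal_px: float, real_height_m: float, pixel_height: float) -> float:
    """Monocular estimate: needs the object's true size H, hence less accurate."""
    return focal_px * real_height_m / pixel_height

def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Stereo estimate: depth comes from the disparity between the two views."""
    return focal_px * baseline_m / disparity_px

# A 1.5 m-tall pedestrian spanning 100 px, seen with an 800 px focal length:
print(mono_distance(800, 1.5, 100))  # 12.0 m
# A feature with 16 px disparity on an 800 px / 12 cm-baseline stereo rig:
print(stereo_depth(800, 0.12, 16))   # 6.0 m
```

The stereo formula also shows why calibration matters: errors in the assumed baseline B or focal length f scale the depth estimate directly.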
Application of camera in ADAS:
1. Lane Departure Warning and Lane Keeping Assist: Detects lane markings and assists in
staying within the lane.
2. Forward Collision Warning and Automatic Emergency Braking: Identifies obstacles
and alerts or brakes to prevent collisions.
3. Pedestrian Detection: Recognizes pedestrians and helps avoid accidents.
4. Traffic Sign Recognition: Reads and displays traffic signs for driver awareness.
5. Driver Monitoring: Tracks driver attentiveness and alerts if signs of distraction or fatigue
are detected.
6. Highway Autopilot: Supports semi-autonomous driving on highways.
7. Parking Assistance: Provides live views for safer parking maneuvers.
4- Ultrasonic sensors:
Used for short-range applications like automated parking.
Ultrasonic Cleaning: Ultrasonic sensors are used in cleaning equipment for removing
contaminants from surfaces through high-frequency sound waves. This technology is
commonly employed in jewelry cleaning, electronics manufacturing, and medical instrument
sterilization.
Flow Measurement: Ultrasonic flow meters measure the flow rate of liquids and gases in
pipelines and channels. By emitting ultrasonic pulses through the flowing medium and
analyzing the time it takes for echoes to return, these sensors accurately determine velocity
and volume flow rates.
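The transit-time principle behind these flow meters is worth spelling out: a pulse travels faster with the flow (t_down = L/(c+v)) than against it (t_up = L/(c−v)), and the difference of the two reciprocal times yields the flow speed independently of the sound speed c. A sketch with illustrative numbers (water, c ≈ 1480 m/s):

```python
def transit_time_velocity(path_m: float, t_down_s: float, t_up_s: float) -> float:
    """Flow speed from ultrasonic transit times along and against the flow.
    From t_down = L/(c+v) and t_up = L/(c-v): v = (L/2) * (1/t_down - 1/t_up)."""
    return (path_m / 2.0) * (1.0 / t_down_s - 1.0 / t_up_s)

# A 10 cm path in water flowing at 2 m/s:
t_down = 0.1 / (1480 + 2)  # pulse aided by the flow
t_up = 0.1 / (1480 - 2)    # pulse opposed by the flow
print(transit_time_velocity(0.1, t_down, t_up))  # ~2.0 m/s
```

Because c cancels out, the measurement is insensitive to temperature-driven changes in the speed of sound, one reason this method is popular for pipelines.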
Distance Sensing in Consumer Electronics: Ultrasonic sensors are integrated into consumer
electronic devices such as smartphones and wearable gadgets for gesture recognition,
touchless interfaces, and proximity detection. They enable features like automatic screen
brightness adjustment and hands-free operation.
Medical Imaging: Ultrasonic sensors play a critical role in medical imaging techniques like
ultrasound scans (sonography). They emit high-frequency sound waves into the body and
detect echoes reflected by internal tissues and organs, allowing healthcare professionals to
visualize and diagnose medical conditions non-invasively.
Liquid Level Monitoring in Agriculture: In agriculture, ultrasonic sensors are used to
monitor liquid levels in irrigation systems, water reservoirs, and livestock watering troughs.
They help optimize water usage, prevent overflows or shortages, and ensure proper hydration
for crops and animals.
Navigation and Mapping: Ultrasonic sensors are employed in unmanned aerial vehicles
(UAVs), autonomous robots, and drones for navigation and mapping. They provide distance
measurements to obstacles and help in collision avoidance and path planning.
Ultrasonic sensors in ADAS (mostly used for parking assistance)
Ultrasonic sensors play a significant role in Advanced Driver Assistance Systems (ADAS),
primarily in parking assistance applications.
Here's how they are utilized in parking assistance systems:
Parking Obstacle Detection: Ultrasonic sensors are strategically placed around the vehicle,
typically in the front and rear bumpers. They emit ultrasonic waves and measure the time it
takes for the waves to bounce back after hitting an obstacle. This information is then used to
calculate the distance between the vehicle and the obstacle.
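The distance calculation in that last step is one line of arithmetic: the sound covers twice the sensor-to-obstacle distance, at roughly 343 m/s in air at room temperature. A minimal sketch:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C

def echo_distance(round_trip_s: float) -> float:
    """Obstacle distance from an ultrasonic echo's round-trip time."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

# An echo returning after ~5.83 ms puts the obstacle about 1 m from the bumper:
print(round(echo_distance(0.00583), 2))  # 1.0
```

In practice the sound speed varies with air temperature (about +0.6 m/s per degree C), so precise systems compensate for it.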
Distance Warning: As the vehicle approaches an obstacle while parking, the ultrasonic
sensors detect the decreasing distance and trigger visual and/or audible warnings inside the
vehicle. These warnings alert the driver to the proximity of the obstacle and help prevent
collisions during parking maneuvers.
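The warning logic is typically a simple threshold ladder over the measured distance. The cutoff values below are invented for illustration; production systems tune them per vehicle and bumper position:

```python
def parking_alert(distance_m: float) -> str:
    """Map a measured obstacle distance to a warning level
    (hypothetical thresholds, for illustration only)."""
    if distance_m < 0.3:
        return "continuous tone"
    if distance_m < 0.6:
        return "fast beeping"
    if distance_m < 1.0:
        return "slow beeping"
    return "silent"

print(parking_alert(0.25))  # continuous tone
print(parking_alert(1.5))   # silent
```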
Parking Aid Display: Many vehicles equipped with ultrasonic parking assistance systems
feature a parking aid display on the infotainment screen or dashboard. This display provides a
visual representation of the vehicle's surroundings, indicating the distance to nearby obstacles
using colored bars or other graphics. The information from the ultrasonic sensors helps create
this real-time visualization, aiding the driver in parking safely and accurately.
Automatic Parking Assistance: In more advanced systems, ultrasonic sensors can be
integrated with other sensor technologies, such as cameras and radar, to provide semi-
autonomous or fully autonomous parking assistance. These systems can automatically steer
the vehicle into a parking space while the driver controls the throttle and brake pedals, using
the data from ultrasonic sensors to navigate and avoid obstacles.
5- LiDAR:
Automotive LiDAR (Light Detection and Ranging) is a sensing technology used in
autonomous vehicles and advanced driver assistance systems (ADAS) for detecting objects,
mapping the surrounding environment, and enabling safe navigation.
Here's an explanation of how automotive LiDAR works:
1. Principle of Operation: LiDAR systems emit laser pulses in various directions and
measure the time it takes for these pulses to reflect off objects in the environment and return
to the sensor. By analyzing the time-of-flight of the laser pulses, the LiDAR system can
calculate the distance to objects and create a detailed 3D map of the surroundings.
2. Applications: In automotive use, the ranging data supports the following functions:
- Object Detection: LiDAR can detect vehicles, pedestrians, cyclists, and other obstacles in
the vehicle's path.
- Mapping and Localization: Creates detailed 3D maps of the surrounding environment to
enable accurate localization and navigation.
- Adaptive Cruise Control (ACC): Helps maintain a safe distance from other vehicles by
measuring their relative speed and distance.
- Lane Keeping Assistance: Detects lane markings and provides feedback to the vehicle's
steering system to keep the vehicle within its lane.
- Collision Avoidance: Alerts the driver or triggers autonomous emergency braking to avoid
collisions with detected obstacles.
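The time-of-flight principle in step 1 can be sketched directly: each return yields a range r = c·t/2, and combining r with the beam's azimuth and elevation angles at firing time gives one (x, y, z) point of the 3D map. A minimal sketch (angle conventions vary between sensor makers):

```python
import math

C = 3.0e8  # speed of light in m/s

def lidar_point(round_trip_s: float, azimuth_deg: float, elevation_deg: float):
    """One LiDAR return -> one 3D point: range from time-of-flight,
    direction from the beam's azimuth/elevation angles."""
    r = C * round_trip_s / 2.0
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    return (r * math.cos(el) * math.cos(az),
            r * math.cos(el) * math.sin(az),
            r * math.sin(el))

# A return after 200 ns, fired straight ahead, lies 30 m away on the x-axis:
print(lidar_point(2e-7, 0.0, 0.0))  # (30.0, 0.0, 0.0)
```

A spinning sensor like the HDL-64E repeats this conversion for every laser channel at every azimuth step, producing hundreds of thousands of such points per second.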
Velodyne is a leading manufacturer of LiDAR (Light Detection and Ranging) sensors, which
are key components in various applications, including autonomous vehicles, robotics,
mapping, and surveillance. LiDAR works by emitting laser pulses and measuring the time it
takes for the pulses to reflect off objects and return to the sensor.
The sensor assembles these measurements into a detailed 3D point-cloud map of the
environment. This map provides crucial information for autonomous navigation, obstacle
avoidance, and situational awareness.
Integration with Other Sensors: Velodyne's LiDAR sensors are often integrated with other
sensor modalities, such as cameras, radar, and GPS, to provide a comprehensive perception
system for autonomous vehicles and other applications. This sensor fusion enables robust and
reliable operation in various environmental conditions.
Automotive LiDAR - Examples
HDL-64E
One prominent example of automotive LiDAR technology is the Velodyne HDL-64E LiDAR
sensor. This sensor features 64 laser channels and provides high-definition 3D scanning
capabilities with a range of up to 120 meters. It offers a 360-degree horizontal field of view
and a vertical field of view of 26.8 degrees, enabling comprehensive coverage of the vehicle's
surroundings. The HDL-64E LiDAR sensor is widely used in autonomous vehicle
development, mapping, and advanced driver assistance systems (ADAS) applications. It
offers high accuracy and reliability, making it a key component in the perception systems of
autonomous vehicles.
VLP-16 Puck
Another notable example of automotive LiDAR technology is the Velodyne VLP-16 Puck
LiDAR sensor. This sensor is compact and lightweight, making it suitable for integration into
various automotive platforms. It features 16 laser channels and offers a 360-degree horizontal
field of view with a vertical field of view of 30 degrees. The VLP-16 Puck LiDAR sensor
provides reliable 3D scanning capabilities with a range of up to 100 meters, making it ideal
for applications such as autonomous driving, mapping, and obstacle detection. Its small form
factor and robust performance make it a popular choice among automotive manufacturers,
technology companies, and researchers working on autonomous vehicle development.
Velodyne:
360-Degree LiDAR: Velodyne is renowned for its 360-degree LiDAR sensors, which
provide a full panoramic view around the sensor. Examples include the Velodyne HDL-64E
and Velodyne VLS-128, which offer a complete 360-degree horizontal FoV, enabling
comprehensive scanning of the environment.
Forward-Facing LiDAR: Velodyne also offers LiDAR sensors with narrower FoV options
suitable for forward-facing applications. For example, the Velodyne VLP-16 Puck has a
compact form factor and offers a 360-degree horizontal FoV with a narrower vertical FoV,
making it ideal for forward-facing applications such as obstacle detection and navigation.
Custom FoV LiDAR: Velodyne provides customizable LiDAR solutions that allow users to
adjust the FoV based on their specific application requirements. These sensors offer flexibility
in tailoring the scanning angle and range to optimize performance for different use cases.
Blickfeld:
Solid-State LiDAR: Blickfeld specializes in solid-state LiDAR sensors that use MEMS
(Micro-Electro-Mechanical Systems) technology to scan the environment. Blickfeld's solid-
state LiDAR sensors are available in various FoV options, including 90-degree, 120-degree,
and 360-degree horizontal FoV, catering to different application needs.
Forward-Facing LiDAR: Blickfeld offers solid-state LiDAR sensors optimized for forward-
facing applications, such as autonomous driving and advanced driver assistance systems
(ADAS). These sensors provide a wide FoV to detect objects and obstacles in the vehicle's
path, enhancing safety and situational awareness.
Custom FoV LiDAR: Blickfeld's LiDAR sensors can be customized to adapt their FoV
based on specific customer requirements. This flexibility allows users to tailor the scanning
angle and range to suit different use cases, from automotive applications to industrial
automation and smart city solutions.
In summary, both Velodyne and Blickfeld offer a range of LiDAR sensor options with
different FoV configurations to meet the diverse needs of various industries and applications.
Whether it's a full 360-degree panoramic view or a narrower FoV optimized for specific use
cases, these companies provide LiDAR solutions designed to deliver accurate and reliable
perception capabilities.
6- GNSS, GPS, IMU:
GNSS (Global Navigation Satellite System):
GNSS stands for Global Navigation Satellite System, which is a satellite-based positioning
system that provides accurate location and time information to users anywhere on or near the
Earth's surface. GNSS systems work by utilizing a network of satellites orbiting the Earth to
transmit signals that are received and processed by GNSS receivers.
1. GPS (Global Positioning System): Operated by the United States government, GPS is the
most widely used GNSS system. It consists of a constellation of approximately 30 satellites
orbiting the Earth, providing global coverage. GPS satellites transmit precise timing and
positioning signals that are received by GPS receivers to determine the receiver's location,
velocity, and time.
2. GLONASS (Global Navigation Satellite System): Developed by Russia, GLONASS is
another major GNSS system. It consists of a constellation of satellites that provide global
positioning and timing coverage. GLONASS satellites transmit signals similar to GPS, and
GLONASS receivers use these signals to calculate position and time information.
3. Galileo: Developed by the European Union and the European Space Agency, Galileo is a
GNSS system designed to provide independent global positioning and timing services. It
consists of a constellation of satellites that transmit signals for use by Galileo receivers.
Galileo aims to provide more accurate and reliable positioning information than existing
GNSS systems.
4. BeiDou: Developed by China, BeiDou is a regional GNSS system that initially provided
coverage over the Asia-Pacific region but has been expanded to provide global coverage.
BeiDou satellites transmit signals similar to GPS and GLONASS, and BeiDou receivers use
these signals to determine position, velocity, and time.
Here's how GPS works in six steps:
1. Satellite Constellation: GPS consists of a constellation of at least 24 satellites orbiting the
Earth at approximately 20,000 kilometers above the surface.
2. Signal Transmission: Each GPS satellite continuously broadcasts signals that contain
precise timing information and ephemeris data (orbit information).
3. Receiver Reception: GPS receivers on the ground or in devices receive these signals from
multiple satellites.
4. Signal Processing: The GPS receiver analyzes the signals received from at least four
satellites to calculate its position, velocity, and time.
5. Trilateration: The GPS receiver determines its position by measuring the time it takes for
signals to travel from each satellite to the receiver. By trilaterating the signals from at least
four satellites, the receiver can calculate its position in three dimensions (latitude, longitude,
and altitude).
6. Calculation: Using the timing information and ephemeris data provided by the satellites,
along with the receiver's measurements, the GPS receiver calculates its position, velocity, and
time (PVT).
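The trilateration of step 5 can be illustrated in two dimensions (real GPS solves in three dimensions plus a receiver clock-bias term, which is why a fourth satellite is needed). Subtracting the first range equation from the others cancels the quadratic terms and leaves a small linear system; the anchor positions and ranges below are made up for the example:

```python
def trilaterate_2d(p1, r1, p2, r2, p3, r3):
    """2-D analogue of GPS trilateration: intersect three range circles.
    Subtracting the first circle equation from the other two linearizes
    the problem; the 2x2 system is then solved with Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# A receiver at (1, 2) measured from three known anchor points:
print(trilaterate_2d((0, 0), 5**0.5, (4, 0), 13**0.5, (0, 4), 5**0.5))
```

With noisy real-world pseudoranges the system is overdetermined and is solved by least squares rather than exactly, but the geometry is the same.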
IMU (Inertial Measurement Unit):
An inertial measurement unit (IMU) is an electronic component in the sensor family. It uses a
combination of accelerometers, gyroscopes and magnetometers to measure sensor
acceleration, angular velocity and orientation. A type I IMU consists of accelerometers and
gyroscopes. A type II IMU also includes magnetometers.
Each accelerometer, gyroscope and magnetometer measures along a single axis (X: roll,
Y: pitch, Z: yaw). To get data for all three axes, three of each component
(accelerometers, gyroscopes and magnetometers) are combined.
Some IMUs offer more degrees of freedom by adding a temperature sensor, GPS sensor,
pressure sensor, etc. They can perform attitude, velocity and position calculations based
on the measured acceleration.
In ADAS and automotive applications, IMUs are useful for:
Increasing the accuracy of GNSS/GPS vehicle positioning using sensor fusion
techniques.
Working independently to measure vehicle motion such as velocity, position and
acceleration.
On-the-run calibration of other sensors (camera, lidar, radar) if they deviate from
their coordinate frame.
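The independent motion estimate mentioned above is plain numerical integration of the accelerometer signal; small sensor biases accumulate into drift, which is exactly why fusion with GNSS/GPS matters. A one-dimensional Euler-integration sketch:

```python
def dead_reckon(accels, dt, v0=0.0, x0=0.0):
    """Integrate 1-D acceleration samples into velocity and position
    (simple Euler integration; real IMU dead reckoning drifts over time)."""
    v, x = v0, x0
    for a in accels:
        v += a * dt   # velocity is the integral of acceleration
        x += v * dt   # position is the integral of velocity
    return v, x

# Constant 2 m/s^2 sampled at 10 Hz for 1 s, starting from rest:
print(dead_reckon([2.0] * 10, 0.1))  # roughly (2.0, 1.1)
```

A constant accelerometer bias of b produces a position error growing like b·t²/2, so even a tiny bias becomes meters of error within minutes, hence the GNSS correction.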
7- Outro:
The use of multiple sensors in autonomous driving involves fusing information from
cameras, LiDAR, radar, and GNSS/GPS & IMU so that they complement each other and
enhance perception capabilities. Each sensor has its own strengths and limitations in terms
of distance measurement, angular and spatial resolution, velocity measurement, obstacle
detection, and classification. This fusion of sensor data increasingly incorporates artificial
intelligence techniques like machine learning and deep learning for improved performance.
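A basic primitive behind such fusion is inverse-variance weighting, the static core of a Kalman filter update: the more certain sensor gets the larger weight, and the fused estimate is more certain than either input. The measurement values below are invented for illustration:

```python
def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    """Inverse-variance weighted fusion of two independent measurements
    of the same quantity. Returns (fused estimate, fused variance)."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)  # always smaller than either input variance
    return fused, fused_var

# Radar reports 30.0 m (variance 0.25), camera reports 30.8 m (variance 1.0):
print(fuse(30.0, 0.25, 30.8, 1.0))  # ~ (30.16, 0.2)
```

The fused result sits closer to the radar estimate because the radar is more certain, and the fused variance (0.2) is lower than either sensor alone, illustrating why fusion improves on any single sensor.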