
Advanced Sensory Systems

and Sensor Data Processing

Exercise 10 — Image Processing and Stereo Distance Measurement


Michael Bleier
Computer Science XVII: Robotics

Summer 2024

Submission: Please submit your solutions for Part 1 of the exercise sheet by Thursday, 11
July 2024, 10:15! Submit your solutions to Part 2 with the next exercise sheet.
You may work in groups of up to three. Please note the full names of all group members on the
submission. Only one submission per group is necessary. Create a single PDF document
containing your answers/plots/screenshots. Add your source code to a zip file and upload both files
to WueCampus.
WueCampus course: https://wuecampus.uni-wuerzburg.de/moodle/course/view.php?id=66976

Goals: In this assignment you will perform basic image processing tasks on satellite images.

Part 1: Image Processing (15 points)

Task 1: Image Filters (5 points)


Given the following grayscale image:

9 8 8 9 9 8 8
8 9 8 9 8 9 8
9 8 9 8 9 8 9
1 1 1 1 8 9 9
1 1 1 1 9 9 9
1 1 1 1 9 9 8
1 1 1 1 9 9 9

• Filter the image (by hand on paper) using these kernels. What effects do the different kernels
have on the image?
    1 1 1

1 0 −1 0 −1 0 9 9 9
0 0 0 −1 5 −1 1 1 1
9 9 9
1 1 1
−1 0 1 0 −1 0 9 9 9

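The hand computation can be cross-checked with a short NumPy sketch (the helper `correlate_valid` and the descriptive kernel names are my own, not part of the sheet). All three kernels are point-symmetric, so correlation and convolution give the same result here, and no padding is used ("valid" output):

```python
import numpy as np

# The 7x7 grayscale image from the task.
image = np.array([
    [9, 8, 8, 9, 9, 8, 8],
    [8, 9, 8, 9, 8, 9, 8],
    [9, 8, 9, 8, 9, 8, 9],
    [1, 1, 1, 1, 8, 9, 9],
    [1, 1, 1, 1, 9, 9, 9],
    [1, 1, 1, 1, 9, 9, 8],
    [1, 1, 1, 1, 9, 9, 9],
], dtype=float)

# The three kernels from the task; the names are descriptive guesses.
kernels = {
    "diagonal edge": np.array([[1, 0, -1], [0, 0, 0], [-1, 0, 1]], dtype=float),
    "sharpen":       np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], dtype=float),
    "box blur":      np.full((3, 3), 1.0 / 9.0),
}

def correlate_valid(img, k):
    """Slide the kernel over the image without padding and sum elementwise."""
    kh, kw = k.shape
    h = img.shape[0] - kh + 1
    w = img.shape[1] - kw + 1
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out

for name, k in kernels.items():
    print(name)
    print(correlate_valid(image, k).round(2))
```

The blur averages intensities and smooths the step between the bright and dark regions, the sharpen kernel amplifies the centre pixel relative to its neighbours (leaving uniform regions unchanged), and the diagonal-edge kernel is zero in uniform regions and responds around the corner of the dark block.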
Task 2: Difference Images and Image Filters (10 points)

1. Download two satellite images of the Earth that show the same area at different times. Images
from the Landsat and Sentinel missions are available for free. For sources of Landsat data you
may refer to https://landsat.gsfc.nasa.gov/data/data-access/. Most tools listed there
also provide access to Sentinel data, or you can access it via the Copernicus Data Space Ecosystem
(https://dataspace.copernicus.eu/). Alternatively, you may find suitable images in the Landsat
Image Gallery (https://landsat.visibleearth.nasa.gov/).

2. Write a C++ or Python program that loads both images as grayscale images and creates a
difference image using the OpenCV functions:

• Mat image = imread("filename.jpg", IMREAD_GRAYSCALE);

• absdiff(Mat image1, Mat image2, Mat diff);

3. Generate the following kernels and filter one of the images using these kernels.

(a)  0 0 0
     0 1 0
     0 0 0

(b)   1 0 −1
      0 0  0
     −1 0  1

(c)  0  1 0
     1 −4 1
     0  1 0

(d)  −1 −1 −1
     −1  8 −1
     −1 −1 −1

(e)   0 −1  0
     −1  5 −1
      0 −1  0

(f)  1/9 1/9 1/9
     1/9 1/9 1/9
     1/9 1/9 1/9

4. Create a difference image between the original image and the image convolved with kernel (f).

5. What effects do the different kernels have on the image?

Part 2: Stereo Distance Measurement (15 points)

In this task we will use the stereo camera of the Sensor Cube to measure the 3D position of a circular
feature ("blob"). You will find some example code for blob detection on WueCampus to help you with
this task.

Figure 1: Circular blob approx. 30 cm in front of the stereo camera.

1. Perform a stereo calibration of the stereo camera of the Sensor Cube! You can use your result
from exercise 09 (calibration.json) or recalibrate the stereo sensor using the tool on GitHub:
https://github.com/JMUWRobotics/sensorcube/blob/main/tools/calibrateCamera.py

2. Stereo rectify the left and right images (see example code).

3. Detect the circle feature in both images using OpenCV’s SimpleBlobDetector (see example code).

4. Compute the disparity.

5. Compute the 3D position of the feature. In a stereo-rectified image pair you can do this easily
using the Q matrix from the stereo calibration. For the detected blob pixel position (x, y) and the
corresponding disparity d, compute

    [X]       [       x        ]
    [Y] = Q · [       y        ]
    [Z]       [ disparity(x,y) ]
    [W]       [       1        ]

and obtain the Euclidean 3D position as (X/W, Y/W, Z/W).

Note: The calibration tool performs the calibration with respect to the left camera. So you have
to plug in the pixel position (x,y) in the left image. The measurements will be relative to the
coordinate system of the left (rectified) camera.

6. Compare the measurements with a measuring tape. Characterize the accuracy of the stereo
distance measurement. What measurement range can be achieved?

7. Submit images of your measurements and your code, and comment on the comparison with the
measuring tape or the ToF sensor.
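The reprojection in step 5 can be sketched with NumPy alone. The calibration values (f, cx, cy, B) and the blob pixel coordinates below are made up for illustration; in your solution, use the Q matrix from your calibration.json and the blob centres returned by the detector:

```python
import numpy as np

# Hypothetical rectified-stereo parameters (replace with your calibration):
f = 700.0          # focal length in pixels
cx, cy = 320.0, 240.0   # principal point
B = 0.06           # baseline in metres

# Reprojection matrix in the standard OpenCV layout, assuming identical
# principal points in both rectified images.
Q = np.array([
    [1, 0, 0,    -cx],
    [0, 1, 0,    -cy],
    [0, 0, 0,      f],
    [0, 0, 1 / B,  0],
])

# Blob centre detected in the left and right rectified images
# (e.g. via cv2.SimpleBlobDetector, as in the example code).
xl, yl = 400.0, 250.0
xr = 386.0
d = xl - xr        # disparity: left x minus right x

# Step 5: homogeneous reprojection [X Y Z W]^T = Q [x y d 1]^T,
# then divide by W. Depth follows the usual relation Z = f*B/d.
X, Y, Z, W = Q @ np.array([xl, yl, d, 1.0])
point = np.array([X, Y, Z]) / W   # 3D position in the left camera frame

print(point)
```

Because the measurement is expressed in the left rectified camera frame, remember to use the left-image pixel position, as noted above. The disparity also bounds the usable range: once d drops below about one pixel, Z = f·B/d becomes very sensitive to detection noise, which is worth discussing in step 6.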
