
PFG
https://doi.org/10.1007/s41064-020-00113-0

NEWS ITEM

Augmented Reality and Virtual Reality Applications Based on Satellite-Borne and ISS-Borne Remote Sensing Data for School Lessons

Andreas Rienow · Claudia Lindner · Torben Dedring · Henryk Hodam · Annette Ortwein · Johannes Schultz · Fabian Selg · Kilian Staar · Carsten Jürgens

© Deutsche Gesellschaft für Photogrammetrie, Fernerkundung und Geoinformation (DGPF) e.V. 2020

Abstract
Facing global challenges, a qualified education in remote sensing technologies needs to start in school to sensitise teachers, and thus young people, to ecological issues and develop their technological skills. Remote sensing is part of the STEM (Science, Technology, Engineering and Mathematics) curricular topics, all of which are either a requirement for, or benefit from, remote sensing. However, implementing its data and methods into regular curricula poses technological and educational challenges, given the lack of appropriate school IT infrastructure and of remote sensing education for aspiring STEM teachers. Immersive media can overcome these structural issues, using the teachers' and pupils' smartphones to display and interact with, e.g., video data, astronaut photographs, or complex hyperspectral data. The data needs to be pre-processed for Augmented and Virtual Reality applications (AR and VR, resp.) to fit the topic, the available devices, and the software. Game development software allows for easy integration in apps for AR and VR, but does not contain functions for remote sensing methods. These are either implemented directly in scripts, worked around using less computing-intensive methods, or replaced by methods unique to augmented and virtual reality. Following these premises, five AR apps are presented in this report, teaching about tropical cyclones, anthropogenic desertification, energy consumption, gravitation, and algal blooms, all of which use remote sensing data from different sensors aboard the ISS. One VR app teaching about Mount Fuji was developed using a DEM derived from astronaut photographs. All have different levels of interactivity and are embedded in worksheets that fill a double period.

Keywords  Education · Digital experiments · 3D modelling · Remote sensing · International Space Station

Zusammenfassung
Augmented and virtual reality applications for school lessons based on satellite-borne and ISS-borne remote sensing data. In the face of current global challenges, qualified training in remote sensing technologies has to begin in school in order to sensitise teachers, and thereby also pupils, to ecological issues and to train technological competences. Remote sensing is part of the MINT subjects at school (the German equivalent of STEM: mathematics, informatics, natural sciences, technology); it can address their physical and mathematical foundations as well as their biological and geographical applications. However, implementing the data and methods in regular curricula poses technical and didactic challenges: on the one hand, schools lack the necessary hardware and software; on the other hand, only few student teachers receive a sufficiently comprehensive education in remote sensing. Immersive media can at least partially overcome the technical hurdles by using the pupils' and teachers' smartphones, for example to display and interactively use videos, astronaut photographs, or even complex hyperspectral data. These data have to be prepared for use in augmented reality (AR) and virtual reality (VR) so that they fit the curricular topic, the available mobile devices, and the available software. Game development environments allow AR and VR apps to be implemented easily but contain no functions for remote sensing methods. Against this background, five apps were developed, whose implementation is reported here. They cover tropical cyclones, desertification, energy consumption, gravitation, and algal blooms; all of them use remote sensing data from different sensors aboard the ISS. In addition, a VR app was developed with a DEM derived from astronaut photographs. The apps have different levels of interactivity and are integrated in worksheets that fill roughly a double period at school.

* Claudia Lindner
[email protected]

Geomatics Research Group, Department of Geography, Ruhr-University Bochum, Universitätsstr. 150, 44801 Bochum, Germany

1 Introduction

Space technology and Earth observation are key activities not only for our society in terms of research, innovation, economic wealth, and growing job opportunities but also for climate protection, conservation of ecosystem services, as well as security and international collaboration (Martinis et al. 2017). Facing global challenges, a qualified education in remote sensing technologies needs to start in school to sensitise teachers, and thus young people, to ecological issues and educate their technological skills at the same time. Images showing our Earth from the bird's-eye view fascinate people of all ages. While the methods and techniques of remote sensing and digital image processing are used intensively in our society, profound background knowledge is not widespread (Siegmund 2011). Voß et al. (2007a, b) show that many teachers are interested in remote sensing and at the same time motivated to integrate it into their teaching. However, the transfer of Earth observation knowledge in schools is still very limited due to technical, motivational, or informational barriers. Working with remote sensing data is postulated in the curricula of several German states (e.g. SBJF 2015; MKJS BW 2016; MSB NRW 2019). Moreover, the application of aerial and satellite imagery, or even imagery from the International Space Station (ISS), can be seen in the light of problem-based learning fostering spatial orientation skills, methodological skills, and evaluation as well as practice skills (Goetzke et al. 2013). In a nutshell, these are the main goals of the scientific projects "KEPLER ISS" and "Remote Sensing in School Lessons", which are carried out at the Ruhr-University Bochum (Germany). Comprehensive and well-structured learning platforms on the topic of space-borne remote sensing and Earth observation from the ISS have been developed and published free of charge (www.fis.rub.de/en and www.columbuseye.rub.de). They allow a structured introduction for teachers and students (Voß et al. 2007a, b) and cover fundamental aspects of remote sensing (Rienow et al. 2016). Built on the basis of intermediality, interactivity, and interdisciplinarity, learners are introduced to the world of data behind fancy-coloured satellite images. Geographic information applications build the canvas for explaining physical and mathematical curricular knowledge. Hence, on-syllabus topics of everyday school lessons are taught in an inspiring and remarkable manner (Lindner et al. 2018). Starting with mediation theory rooted in problem-based learning and moderate constructivism (Pagán 2006), the projects moved towards the design of semi-digital teaching units, subsequently implementing entirely digital, interactive learning modules on remote sensing (Schultz et al. 2017a, b). Besides, easy-to-use tools for image processing enable even novices of remote sensing to analyse current air mass movements, to calculate the NDVI, or to classify images. These tools only contain the data and methods required for the tasks at hand, so the pupils can focus on these instead of getting lost among the plethora of functions in a real GIS. En passant, students learn the causalities of coupled human–environment systems.

Currently, the didactical paradigm is shifting from computer-aided e-learning to smartphone-supported ubiquitous m-learning (Liu and Hwang 2010). Regular topographic maps are augmented by fascinating views from above. Hence, the tangible dimensions of pens and papers are virtually lifted into the fascinating environment of space. Several examples show how geographic information is integrated in immersive simulations to analyse photogrammetric and cartographic problems (Keil et al. 2019; Kersten et al. 2018). Hence, the main objective of the paper is to analyse and explain how immersion techniques based on remote sensing data can be used to mediate STEM curricular topics. It will be shown how regular school lessons can benefit from the application of remote sensing methods integrated in immersive media in terms of (a) fostering media skills of teachers, (b) bridging and connecting STEM subjects, and (c) broadening teachers' horizons and acting as an access point to the digitisation of their own profession and operational environment.

Therefore, the subsequent research questions will be answered and discussed:

1. How can satellite-borne and ISS-borne remote sensing data and methods be prepared to augment reality and virtuality in educational praxis?
2. How can immersive applications mediate interdisciplinary STEM knowledge in the context of problem-based learning?


3. Which limitations are faced when working with augmented reality (AR) and virtual reality (VR) applications on Earth observation in school lessons?

The paper is structured as follows: Sect. 2 introduces the reality–virtuality continuum. Section 3 demonstrates the possibilities of augmenting everyday school resources with the help of remote sensing data to foster media and methodological skills in STEM lessons. Section 4 gives an example of a success story dealing with the derivation of a DEM based on astronaut photography, integrated in a VR environment to explore Mount Fuji. A conclusion and outlook on future research is given in Sect. 5.

2 The Reality–Virtuality Continuum

The reality one experiences every day is unique to us all, but, nevertheless, real and hands-on; and even though virtual worlds can (but are not limited to) depict real-world scenarios, they are generated artificially. These two states are the extremes of the reality–virtuality continuum as described by Milgram et al. (1995). All applications dealing with computer- or smartphone-aided technology employ this principle. AR, for example, is a chimeric state of mixed reality, combining virtual elements with real objects or surroundings and thus appending otherwise unseen knowledge to reality (Milgram and Kishino 1994). Benefits of the virtual environment include accessibility in terms of a reduction of travelling costs, time restraints or physical condition—observing phenomena that are invisible to the eye or imply danger to life is facilitated with an increasing level of virtuality. Experiments can be simulated and interactively manipulated in dynamic and responsive virtual environments. Authentic visualisation allows first-hand experiences and involvement while creating sustainable knowledge. Correspondingly, advantages of real environments include immediate response and direct communication, and first of all ubiquity and, therefore, availability and access at all times and everywhere without technological restraints (Jensen and Konradsen 2018; Ott and Freina 2015; Mikropoulos and Natsis 2011). The key issues of employing applications on the virtuality–reality continuum successfully in school lessons regard availability of technology and media literacy on the one hand, and motivation and interactivity on the other hand (Reiners et al. 2014) (Fig. 1).

Fig. 1  The reality–virtuality continuum

3 Augmented Reality in School Lessons

The remote sensing scientists of the Columbus Eye team did not have prior experience with AR app development, and hence designing the apps has been a learning experience for the team, even more than it is for the teachers and pupils who learn to use them. Since 2015, when the first app was created in Columbus Eye, steady progress has been made in using built-in functions of the chosen development environment as well as in the implementation of original scripts (see Fig. 2). The details of these developments are laid out in this chapter.


Fig. 2  Stages of the developers' experience, resulting in progressively more powerful apps over the years

For the development of the apps, Unity was chosen due to its relative simplicity as a game development platform, and Vuforia for the same reason regarding image recognition for AR.

AR depends on an automatic matching between an object in reality and a dataset that represents the same object in the form of graphic data. The Vuforia Developer Library offers functions that take an image of the real object, extract its edges, and compare them with the edges of the graphic dataset. Such a dataset is called an "Image Target". "Image Targets represent images that Vuforia Engine can detect and track. … The Engine detects and tracks the features that are naturally found in the image itself by comparing these natural features against a known target resource database. Once the Image Target is detected, Vuforia Engine will track the image as long as it is at least partially in the camera's field of view." (PTC (n.d.), https://library.vuforia.com/content/vuforia-library/en/articles/Training/Image-Target-Guide.html). Colour is irrelevant, as the Image Targets are processed in greyscales. Low image quality is not necessarily bad—fragments can serve as edges as well. However, when printed, the images need to be of high quality.
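As an illustration of how an overlay is typically tied to target detection, the following minimal C# sketch uses the trackable-event interface of the Vuforia Unity SDK of that generation (6.x). The class and field names are illustrative assumptions, not code from the apps described here.

    using UnityEngine;
    using Vuforia;

    // Sketch: toggle a video overlay whenever the Image Target is
    // detected or lost (Vuforia 6.x-style event handler).
    public class OverlayToggle : MonoBehaviour, ITrackableEventHandler
    {
        public GameObject videoOverlay;   // plane carrying the video texture

        void Start()
        {
            var trackable = GetComponent<TrackableBehaviour>();
            if (trackable != null)
                trackable.RegisterTrackableEventHandler(this);
        }

        public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                            TrackableBehaviour.Status newStatus)
        {
            // DETECTED/TRACKED means the target is at least partially
            // in the camera's field of view.
            bool visible = newStatus == TrackableBehaviour.Status.DETECTED
                        || newStatus == TrackableBehaviour.Status.TRACKED
                        || newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;
            videoOverlay.SetActive(visible);
        }
    }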
Unity includes a lot of objects and attached scripts that make scripting unnecessary for basic game development. Scripts can be added using C# for all functions that are not included. This is done in Visual Studio, which has a Unity extension so that the scripts can be applied immediately. Unity also comes with a wide range of shaders—essentially scripts that tell the device how to render each screen pixel—that are also specialised on game functions and thus represent different surfaces and the way they deal with lighting in a 3D scene. These shaders are written in High-Level Shading Language (HLSL), for which there are no development environments, which means no autofill or compilation aids. Compiling errors are, however, quite noticeable, as Visual Studio will shut down on occasion when trying to autocompile incomplete or faulty HLSL code. Due to these complexities, shader development is recommended only for experienced developers.
naturally found in the image itself by comparing these natu-
ral features against a known target resource database. Once 3.1 Exploring Interactive Elements
the Image Target is detected, Vuforia Engine will track the
image as long as it is at least partially in the camera’s field of All apps work in conjunction with analogue work sheets.
view.” (PTC (n.d.), https:​ //librar​ y.vufori​ a.com/conten​ t/vufor​ These are printed material, mostly A4 in size, and prefer-
ia-libra​ry/en/artic​les/Train​ing/Image​-Targe​t-Guide​.html). ably in colour for comprehensibility. However, black and
Colour is irrelevant, as the Image Targets are processed in white works just as well. The work sheets contain tasks for
greyscales. Low image quality is not necessarily bad—frag- the pupils and all the information that is needed for a solu-
ments can serve as edges as well. However, when printed, tion: Texts, maps, and graphs, some of which serve as Image
the images need to be of high quality. Unity includes a lot of Targets for the app.
objects and attached scripts that make scripting unnecessary The first AR app, named “The Eye of the Cyclone”, was
for basic game development. Scripts can be added using C# developed as a test to find out whether pupils and teach-
for all functions that are not included. This is done in Visual ers would accept the technology as well as to let the pro-
Studio that has a Unity extension and thus the scripts can be ject team experiment with the development environment.


Furthermore, the app shall add value to the worksheet, which teaches the emergence of tropical cyclones by the example of Maysak, a category 5 typhoon that hit Micronesia in March 2015.

As a first test, this app stays close to the tutorials for using Vuforia 6.2 in Unity 5.6 and displays a video overlay over static images that are used as Image Targets. The first Image Target is a weather map depicting the air pressure at sea level and the 500 hPa geopotential altitude. In the app, this image is superimposed with an animation using consecutive maps of the same type, showing the typhoon's path across Micronesia towards the Philippines.

The second Image Target is a cross-sectional diagram of a hurricane. It is superimposed with a video taken by the HDEV (High-Definition Earth Viewing) experiment that shows Maysak in front of the curvature of the Earth (Runco 2015). The video is annotated with wind speeds and air pressure, but the main feature is the perspective, giving the pupils a grasp of the dimensions of tropical cyclones. Using the additional information from the two videos, the pupils are to answer questions in the worksheet (Ortwein et al. 2016).
The second app, "Aralkum", enhances a worksheet about the decline of the Aral Sea. Again, videos overlaying static images were used to convey additional information and give the pupils a perspective. An image of the already diminished Aral Sea in 1964 receives an overlay of the development of the lake between 2000 and 2016 using Landsat images. A photograph taken from the ground, depicting a rusted anchor in what was formerly the lake bed and is now a desert, is enhanced to a video from above, showing the blue western basin and the sparsely filled eastern basin of the Aral Sea. The video was recorded by the HDEV experiment in September 2016.

This app features the first truly interactive function: a Sentinel-2 image of the Aral Sea from 2017 is superimposed with an animation of the shorelines from 2000 to 2016. Everything except the shorelines is transparent, and the animation can be halted. Pupils can then mark the lake's extent on the Sentinel-2 image of 2017, answer questions in the worksheet and discuss the future of the lake using both the information given in the worksheet and in the animation (Schultz et al. 2017a, b).
Since mp4 does not support an alpha channel for transparency but was the easiest format to overlay for the Vuforia software, the way the video is rendered on the screen by Unity had to be adjusted instead. To achieve this, an existing shader was used that turns one colour in the video transparent. The shader assigns a high alpha value to all pixels with the colour defined as mask. The further a pixel's colour differs from the mask colour, the more that alpha value is reduced, which achieves a smoothing effect.
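The per-pixel masking rule can be illustrated on the CPU in C#. The apps implement it as an HLSL shader on the GPU; this sketch uses the common alpha-as-opacity convention (mask-coloured pixels end up fully transparent), and the method and parameter names are illustrative assumptions.

    using UnityEngine;

    // CPU-side illustration of the chroma-key rule the shader applies
    // per pixel; not the original shader code.
    public static class ChromaKey
    {
        public static Texture2D Apply(Texture2D frame, Color maskColor, float falloff = 4f)
        {
            Color[] pixels = frame.GetPixels();   // texture must be readable
            for (int i = 0; i < pixels.Length; i++)
            {
                // Distance of this pixel's colour from the mask colour
                float d = Mathf.Abs(pixels[i].r - maskColor.r)
                        + Mathf.Abs(pixels[i].g - maskColor.g)
                        + Mathf.Abs(pixels[i].b - maskColor.b);
                // Mask-coloured pixels become fully transparent; opacity rises
                // smoothly the further the colour differs, smoothing the edges.
                pixels[i].a = Mathf.Clamp01(d * falloff);
            }
            var result = new Texture2D(frame.width, frame.height, TextureFormat.RGBA32, false);
            result.SetPixels(pixels);
            result.Apply();
            return result;
        }
    }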
Fig. 3  Demonstration of the app "Earth by Night" when using the Virtual Button, adding a layer with city names for orientation to the night-time video of Central Europe

In the next app, "Earth by Night", teaching about light pollution in Central Europe, a first attempt at scripting was made. The accompanying worksheet includes a Sentinel-2 infrared, green, blue composite of the Rhine–Ruhr area for the pupils to discuss false colour images. The app superimposes a video taken by the ISS-based Japanese METEOR project (PERC 2018), which takes nighttime videos of Earth with a standard HDTV camera. The video shows a flyover from London via Belgium and the Rhine–Ruhr area to Frankfurt. Since orientation in nighttime images of Earth is complicated for pupils, who are used to annotated maps and not unaltered satellite images in their school material, annotation of selected features was necessary, but would have been disruptive due to the size and nature of the video. A solution was found with a virtual button: a button that is pressed on paper although it only exists in the programme (see Fig. 3).

A plane object that includes names of large cities in the correct places was superimposed on the superimposed video. The standard Vuforia VideoPlayerHelper script had to be extended to access the frame position of the played video to define the position of the cities plane object, ensuring the names always stay in the right places. The cities plane object is activated on the "push" of the Virtual Button and deactivated on release, allowing the pupils to see the nighttime view of Europe unobstructed or annotated for orientation.
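A minimal sketch of this virtual-button handling, based on the event-handler interface of the Vuforia Unity SDK (exact type names changed between SDK versions, e.g. VirtualButtonAbstractBehaviour in 6.x); the object names are illustrative.

    using UnityEngine;
    using Vuforia;

    // Sketch: pressing the paper button shows the plane with city labels,
    // releasing it hides the labels again.
    public class CityLabelButton : MonoBehaviour, IVirtualButtonEventHandler
    {
        public GameObject cityLabelPlane;  // annotated plane over the video

        void Start()
        {
            // The VirtualButtonBehaviour sits on a child of the Image Target
            var button = GetComponentInChildren<VirtualButtonBehaviour>();
            button.RegisterEventHandler(this);
            cityLabelPlane.SetActive(false);
        }

        public void OnButtonPressed(VirtualButtonBehaviour vb)
        {
            cityLabelPlane.SetActive(true);   // show annotations while pressed
        }

        public void OnButtonReleased(VirtualButtonBehaviour vb)
        {
            cityLabelPlane.SetActive(false);  // unobstructed night-time view
        }
    }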

3.2 Experimenting Digitally

So far, AR was used for 2D animation overlays. While the next app, the "Earth–Moon system" (Lindner et al. 2019), also includes an HDEV experiment video, it is the first app in this project to utilise the third dimension and extensive C# scripts. Earth and Moon form a multi-body system. The barycentre is its centre of mass, around which all participating bodies orbit. To help the pupils understand the concept, a 2D image in the worksheet corresponds to the Image Target for a 3D animation that displays two celestial bodies of different sizes revolving around each other with their orbit lines (see Fig. 4).

The implementation in Unity uses an invisible sphere in which two differently coloured smaller spheres represent Earth and Moon. An attached script lets the invisible large sphere rotate together with both celestial bodies, which leads to the impression that Earth and Moon revolve around their common barycentre.

The Unity LineRenderer imports an array of coordinate triples (x, y, z), forms a smoothed polygon through these points, and returns it to the application. The smoothness of the line is programmer-controlled: for better visual impact, more line segments and thus more computation power are required. Fewer line segments result in faster processing, but also in an orbit with visible angles.
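The following C# sketch illustrates both techniques, the rotating invisible parent and the orbit polygon fed into a LineRenderer; names and values are illustrative, not the original app code.

    using UnityEngine;

    // Sketch: an invisible parent sphere carries Earth and Moon as children
    // and rotates, so both appear to revolve around the common barycentre;
    // a LineRenderer draws each orbit as a closed polygon.
    public class BarycentreSpin : MonoBehaviour
    {
        public Transform earth;            // child, offset by Earth's orbit radius
        public Transform moon;             // child, offset by Moon's orbit radius
        public float degreesPerSecond = 20f;

        void Update()
        {
            // Rotating the (invisible) parent moves both children on circles
            transform.Rotate(Vector3.up, degreesPerSecond * Time.deltaTime);
        }

        // Fill a LineRenderer with a circle of 'segments' points; more segments
        // cost more computation but avoid a visibly angular orbit.
        public static void DrawOrbit(LineRenderer line, float radius, int segments = 64)
        {
            line.positionCount = segments + 1;
            for (int i = 0; i <= segments; i++)
            {
                float a = 2f * Mathf.PI * i / segments;
                line.SetPosition(i, new Vector3(Mathf.Cos(a) * radius, 0f, Mathf.Sin(a) * radius));
            }
        }
    }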
The first digital experiment was created in the same app and simulates the influence of the Moon on the Earth and vice versa, while the distance between both varies. Pupils shall observe the consequences of a much larger or a much smaller distance than today. The teaching unit consists of a prediction by the pupils, e.g. regarding tidal effects (hypothesis), and a subsequent test using the app. They shall also discuss the results, especially with regard to the history of the Earth–Moon system, from its formation by a crash of Proto-Earth and Theia until the far future when it will be swallowed by the expanding sun shortly before its death.

The setup of the experiment consists of a printed image of the Earth and a digital image of the Moon (Fig. 5). The Earth's image is a satellite composite. It is surrounded by the picture of a frame of a round porthole. The porthole and its screws support the tracking process by stabilising the computations. The Moon's image appears on the smartphone display. Further, the smartphone app holds an Image Target of the Earth (and the porthole), which allows tracking the printed Earth's image at the wall (Fig. 6).
The app displays the Earth in 3D as a sphere with an equirectangular map as texture and a slow spinning script attached. While the Earth slowly rotates, it stays in its position on the wall. The Moon, however, again a sphere with an equirectangular map as texture, is affixed to the smartphone camera position, staying stationary when viewed on the screen. In Unity, this is achieved by attaching the Moon object to the camera object.

Fig. 4  Two celestial bodies seem to revolve around a common barycentre

Fig. 5  Digital Moon on the smartphone display linked to the printed picture of the Earth, pinned on the wall

Fig. 6  Picture of the Earth and its surrounding porthole, overlaid with the detected points for referencing the Image Target


Using the Image Target, Unity constantly computes the camera's position in relation to the printed picture of the Earth on the wall to calculate where the superimposed objects must be rendered, and where not. The determination of the positions can also be exploited for further calculations. The geometric resolution applied to the calculation depends on the size of the Image Target. Hence, if the Earth in the Image Target has a start radius of 6.37 cm when loading the Image Target into Unity, all objects are defined in cm and all calculations are scaled to 1:100,000,000. This results in a distance to the Moon of 3.6–4.1 m. The pupils thus see the accurate relative distance between the Earth and the Moon to scale, i.e. the Earth with a diameter of 12.7 cm on the wall, the Moon with a diameter of 3.4 cm on the screen in their hands, and a distance of about 4 m in between.

After the program's start, the screen only displays the Earth and the Moon, as well as the distance in between them in proper relation to their size, and the orbital period, shown as a textual annotation. By pressing a button on the screen, the pupils can activate the tide simulation. This simulation is based on a simplified calculation of the tidal force

a_g ≈ ±2RGM/r³,  (1)

with R being the Earth's radius, G the gravitational constant, M the Moon's mass, and r the distance between the Earth and the Moon (Franc 2012). From a_g, an elevation range is constantly derived and applied to the high tide and low tide, calculating two elevation thresholds at which the tides would occur if the Moon orbited at the selected distance.
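As a worked example of Eq. (1), the following C# sketch simply re-computes the simplified model with rounded physical constants; it is an illustration, not the app's code.

    public static class TidalForce
    {
        // Eq. (1): a_g ≈ 2RGM/r^3 (sign dropped), in m/s^2
        public static double Acceleration(double r)
        {
            const double R = 6.371e6;    // Earth's radius [m]
            const double G = 6.674e-11;  // gravitational constant [m^3 kg^-1 s^-2]
            const double M = 7.35e22;    // Moon's mass [kg]
            return 2 * R * G * M / (r * r * r);
        }
    }

At today's mean distance (r ≈ 3.844 × 10⁸ m) this yields roughly 1.1 × 10⁻⁶ m/s²; halving the distance increases it eightfold because of the 1/r³ scaling, which the app translates into correspondingly larger high- and low-tide thresholds.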
Fig. 7  Display of the distance calculation and tide simulation in the app "Earth–Moon system"

To visualise the effect, two maps of the German Bight are provided in this tide simulation, one displaying the potential high tide and one the low tide. Again, the simulation is simplified, both to reduce computing power and to reduce complexity, improving speed and comprehension.

The data basis for the tide maps is a Digital Elevation Model (DEM), combined from the ASTER GDEM (METI and NASA 2009) and bathymetry data of the North Sea (GPDN 2013) and Baltic Sea (BHG 2013). The DEM is given as a greyscale image with one grey-level step being equivalent to 1 m elevation difference. Using these maps and the tide thresholds, two Unity Raw Images, i.e. components whose textures can be generated at runtime, are redrawn with a semi-transparent blue colour below the respective tide threshold and transparent above it. These "water images" are superimposed on the DEM in a green–yellow–red scale or on a Sentinel-2 image that is radiometrically modified with the goal of displaying all low-tide dry areas as sand (see Fig. 7).
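A CPU-side C# sketch of such a redraw, assuming the DEM convention described above (one grey level per metre); the names and colours are illustrative.

    using UnityEngine;

    // Sketch: below the current tide threshold the "water image" gets
    // semi-transparent blue, above it stays fully transparent.
    public static class TideMap
    {
        static readonly Color Water = new Color(0f, 0.2f, 0.8f, 0.5f); // semi-transparent blue
        static readonly Color Clear = new Color(0f, 0f, 0f, 0f);

        public static Texture2D Redraw(Texture2D dem, float thresholdMetres)
        {
            var tex = new Texture2D(dem.width, dem.height, TextureFormat.RGBA32, false);
            Color[] src = dem.GetPixels();
            Color[] dst = new Color[src.Length];
            for (int i = 0; i < src.Length; i++)
            {
                float elevation = src[i].r * 255f;          // grey level -> metres
                dst[i] = elevation < thresholdMetres ? Water : Clear;
            }
            tex.SetPixels(dst);
            tex.Apply();                                     // upload for the RawImage
            return tex;
        }
    }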
The simulation is turned off when the "smartphone moon" comes too close to Earth, reaching the Roche limit at which the Moon would be ripped apart by differential gravitation. The Moon object is then distorted, shaken by a random position modifier, and its texture replaced with an array of textures that has an increasing amount and width of additional orange markings, giving the impression of lava channels opening up due to the internal friction. The closer the Moon comes to the Earth, the more orange is added, until the whole screen turns orange to symbolise the Moon crashing into the Earth. The smartphone's vibration is also turned on.

The app was tested in an alpha test for functionality in two high school classes. One of the main takeaways from the test is that pupils will start the app in less than 20 cm distance from the Earth Image Target, which is where they think the Moon should be. This, however, puts them within the Roche limit simulation (which was very popular during the test). A hint was added to the worksheet asking for a start-point further away from the Earth. Following the many other suggestions of the pupils, modifications were made to improve visibility and comprehension. The app will be tested for educational value in summer 2020.
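A minimal Unity-style C# sketch of such a proximity guard; the threshold value and component hooks are assumptions for illustration (the paper does not state the exact figures).

    using UnityEngine;

    // Sketch: when the handheld "Moon" gets too close, the tide simulation
    // stops and the destruction effect plus vibration start.
    public class RocheLimitGuard : MonoBehaviour
    {
        public Transform earth;
        public Transform moon;
        public float rocheLimit = 0.18f;        // illustrative, in scaled scene units
        public MonoBehaviour tideSimulation;    // component to disable

        void Update()
        {
            if (Vector3.Distance(earth.position, moon.position) < rocheLimit)
            {
                tideSimulation.enabled = false; // turn the tide simulation off
                Handheld.Vibrate();             // smartphone vibration feedback
                // ...swap in the distorted "lava" textures / shake the Moon here
            }
        }
    }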
3.3 Visualising Complex Data: Hyperspectral Image Stack

The most recent app in the project features a video from the HDEV experiment and hyperspectral data from the concluded International Space Station (ISS)-based Hyperspectral Imager for the Coastal Ocean (HICO) experiment. The area of interest is Lake Erie with its massive algal bloom in fall 2011, which was observed with the help of HICO data. In the accompanying worksheet, the pupils learn about the emergence and dangers of algal blooms as well as the application of hyperspectral data.

Hyperspectral datasets are rather large as a result of their many bands. HICO has 128 bands with 2 MB per band. Each band image is about 90 by 190 km (varying with the exact altitude and attitude of the ISS) or 2000 by 512 pixels (fixed) in size. The total file has a size of 591 MB. This exceeds the capacity of a smartphone. Therefore, the bands of the HICO scene in question were reduced to the 87 bands between 400 and 900 nm, due to the bad signal-to-noise ratio of the bands below 400 nm and beyond 900 nm (Korwan et al. 2009). The selected bands were also subset, resampled, and transformed into 24-bit images, reducing the total file size to 19 MB, which is just about processable for most current smartphones.

RGB images are usually calculated from three greyscale images, which, due to C# not being specialised in image processing like, e.g., IDL or MATLAB, would take up a lot of time and processing power. A different approach had to


be developed for the app. Hence, a red, a green and a blue image object with different transparency settings are added on top of each other and receive the band data as Main Texture. However, Unity does not offer a predefined shader that can display the three images as required for the task, i.e. with only colour and transparency. Thus, a shader had to be written specifically for this purpose. The shader needs access to the band image's pixel colours as well as the colour and the transparency assigned to the image object. The greyscale value from each pixel of the band image is multiplied by the image object colour, and the new value, together with the image object's transparency, is set as the rendered texture. Three sliders for the three colours can then be adjusted by the user to define which band is red, which green, and which blue. The user can slide through the colour combinations in real time (see Fig. 8).

Fig. 8  Display of the instant RGB image viewer. Sliders at the bottom adjust which HICO band is displayed in which colour. Invisible buttons are included in the satellite image to activate distinct spectral signatures
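A minimal sketch of how the slider wiring for such a viewer could look in Unity C#; the per-pixel multiplication itself happens in the custom shader described above, and all names here are illustrative.

    using UnityEngine;
    using UnityEngine.UI;

    // Sketch: three stacked image objects are tinted red, green and blue;
    // each slider swaps in the HICO band chosen by the user as that
    // object's main texture (sliders set to whole numbers, 0..86).
    public class BandMixer : MonoBehaviour
    {
        public Texture2D[] bands;        // the 87 greyscale HICO bands
        public RawImage redLayer, greenLayer, blueLayer;
        public Slider redSlider, greenSlider, blueSlider;

        void Start()
        {
            redSlider.onValueChanged.AddListener(i => redLayer.texture = bands[(int)i]);
            greenSlider.onValueChanged.AddListener(i => greenLayer.texture = bands[(int)i]);
            blueSlider.onValueChanged.AddListener(i => blueLayer.texture = bands[(int)i]);
            // The layers' materials multiply the greyscale band by the layer's
            // colour and blend the three layers, which the custom shader provides.
        }
    }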
The image cube that shows an RGB composite of a scene and the border pixels' spectra in a rainbow colour scale is a popular way of displaying hyperspectral data. The main drawback is the limited visibility, because the cube can only be viewed from the outside. For the app, a different approach was developed. Based on the 87 bands being available in greyscale, a newly developed shader colours them with the "Rainbow-Black" colour gradient set by ENVI as the default for image cubes. The resulting coloured images are then stacked atop of each other a few millimetres apart, with the lowest band number at the top. Users can use a slider to remove them band by band, beginning at the top. The RGB viewer only allows them to see individual bands in greyscale if they move all sliders to the same band. However, humans are only able to differentiate between about 30 shades of grey (Kreit et al. 2013) instead of the 256 shades available in the image. The coloured cube allows them to see individual bands with a much finer colour gradient. This gradient shader uses a gradient map, an image that has the desired colour gradient on the horizontal axis. The gradient map works like a colour palette in any GIS; but as Unity does not have a way to multiply the greyscale value with a colour palette, it has to be implemented in this shader.
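The lookup itself is simple; as a CPU-side C# illustration of the gradient-map sampling the shader performs (assuming a horizontal gradient strip as described above):

    using UnityEngine;

    // Sketch: the greyscale value picks a colour from the horizontal axis
    // of a gradient image ("Rainbow-Black"), emulating a GIS colour palette.
    public static class GradientMap
    {
        public static Color Lookup(Texture2D gradientMap, float grey01)
        {
            // Sample the gradient strip at x = grey value, any fixed y
            int x = Mathf.Clamp((int)(grey01 * (gradientMap.width - 1)),
                                0, gradientMap.width - 1);
            return gradientMap.GetPixel(x, 0);
        }
    }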
The shader was adjusted to not render pixels outside user-defined borders. Two double sliders for North/South and East/West can be used to adjust the borders of the images, and due to the stacked, gradient-coloured band images, profiles of the entire scene from all four directions can be viewed (see Fig. 9).

The app will be tested for functionality and educational value in spring 2020.

Fig. 9  Display of the Image Stack

4 Virtual Reality in School Lessons

Taking virtuality a step further, virtual environments allow users to meet imaginative as well as realistic landscapes and scenarios that are not easily accessible, e.g. because they no longer exist in the required way (Tschirschwitz et al. 2019).


In contrast to AR, the usage of VR equipment encapsulates the user in an immersive environment. Therefore, interactive school lessons in virtual environments have to compensate for a lack of shared experience. To overcome this issue, a collaborative mixed virtuality–reality approach was designed to ensure a learning effect for all participants. The example application presented in this paper conveys knowledge about Mount Fuji's geology. Directed at an audience of younger pupils, the participants work in teams to solve the accompanying worksheet (see Fig. 13). While one participant moves in VR, the other participants supply additional information needed to solve questions in the virtual environment. To make the application as realistic as possible, an astronaut's imagery of Mount Fuji has been integrated into our VR framework. The workflow is described in the following subchapters.

4.1 Photogrammetry to Derive 3D Models for VR Applications

Photogrammetry is the science of deriving measurements from photographs. Commonly, the field of photogrammetry can be split, based on camera location during image acquisition, into aerial photogrammetry and terrestrial photogrammetry. It is used to obtain information about objects or even a land surface through measuring and interpreting photographic images. Photogrammetry is a remote (non-contact) measurement technique and can be applied to one image or to a series of images to detect, measure and record complex 2D and 3D fields (Baqersad et al. 2017). These techniques are used in several applications and scientific disciplines, e.g. the analysis and restoration of cultural heritage, topographic mapping, architecture, engineering, and medicine.

Here, stereophotogrammetry, which is applied in aerial and terrestrial photogrammetry, is used to derive high-resolution georeferenced 3D models for VR applications. These 3D models form the basis for virtual animations and VR applications, which are used in schools in conjunction with working sheets to foster the STEM skills of pupils.

Especially high-resolution astronauts' images of the Earth, taken from the ISS, are particularly suitable for modelling individual structures, such as volcanoes or mountain ranges. Most of the current satellite systems from NASA or ESA, like Landsat or Sentinel, are not able to record the Earth's surface stereoscopically. The ISS itself is an Earth observation platform but has several unique characteristics which distinguish it from satellite- or airborne remote sensing: it is manned, sun-asynchronous in a low Earth orbit (400 km) and faces varying dynamics regarding roll, pitch and yaw angles (Stefanov and Evans 2014). Astronauts' images profit from these unique advantages and can provide a high-resolution view of the Earth's surface with a focal length of up to 1600 mm.

On 8 February 2016, 32 high-resolution images of the 3776 m above-sea-level Mt. Fuji were taken by ISS astronauts. The images are available in the "Gateway to Astronaut Photography of Earth" (Earth Science and Remote Sensing Unit, NASA Johnson Space Center). The great advantages of this image series are the short time interval between the images, the different shooting angles, the high resolution (4928 × 3280 pixels) and the small Ground Sampling Distance (GSD) of ≈ 4 m using a 1150-mm lens. Nevertheless, the problems for low-resolution 3D models (GSD of 500 m) derived from NASA's HDEV experiment on the ISS (Muri et al. 2017) were already described by Schultz et al. (2017a, b) and are similar to those observed for the high-resolution Fuji model presented here. As a previous study has shown for Mt. Fuji, the high relief energy (large altitude differences of over 2600 m), clouds in the images, relief shadows, as well as the lack of information regarding shooting angle and ISS position make it difficult to create a georeferenced 3D model without considerable errors (Schultz et al. 2018). The minimisation of artefacts in the 3D model is particularly important to derive visually attractive and functional VR applications which do not cause nausea. In a first attempt, it turned out that at least seven Mt. Fuji images were required to create a "visually appealing" 3D model (Schultz et al. 2018). However, a relatively large number of errors occurred, which makes such a model insufficient for VR. These errors could be significantly reduced using 10 manually chosen ground control points for all 32 images. These points are also used to georeference the complete model.

All calculations were performed with Agisoft Metashape© (Agisoft 2018). The first step is to align the images and to calculate the exact camera positions based on the common overlap between all images. Based on these data, a 3D point cloud and a mesh are calculated. This process is computationally intensive but can be accelerated with the GPU (graphics processing unit). In total, the Mt. Fuji model has 3202 so-called tie points, which represent the number of points detected on two (or more) different images, and a total number of 610,526 points for the dense point cloud.

The individual images are then projected onto the mesh (see Fig. 10) and are the basis to generate the texture. To enhance the quality of the model, gaps were filled and artefacts corrected manually. Especially in regions with clouds, the mesh was insufficient and large gaps occurred (see Fig. 10).

The complete orthomosaic has 6804 × 6628 pixels with a GSD of 2.68 m. The DEM shown in Fig. 11 has a lower resolution with a GSD of 5.8 m and 3173 × 3111 pixels. Consequently, the coloured 3D model shown in Fig. 12 covers an area of 15.1 km by 10.4 km. The results can be displayed as a simple, freely rotatable 3D model. In addition, anaglyph images (Michel 2013), which convey a three-dimensional visual impression, can be generated. The 3D


model can be exported as an animated PDF or a kmz file for Google Earth. However, the OBJ format was used to import the Fuji 3D model into Blender and afterwards into the VR application.

Fig. 10  Mesh generated based on 32 images of Mt. Fuji taken by astronauts (ISS046-E-35796 to ISS046-E-35825)

Fig. 11  Digital elevation model of Mount Fuji with contour lines and colour shading

4.2 Implementation of 3D Models in VR

The 3D model was imported as a landscape into Unreal Engine for better and easier customisation options. To import the landscape, a monochrome height map was derived from the 3D model. To change the landscape from a monotonous grey look, a complex "material" was created. A material (in Unreal Engine) is an asset that is applied to a mesh or a landscape to define several properties like colour, transparency and roughness (Epic Games 2019). For the material, the aforementioned orthomosaic was used and augmented with several snow and rock textures (see Fig. 12). A blend function that activates while the user flies close to the surface was introduced to increase visual fidelity. To increase the immersive experience, the landscape of Mt. Fuji was embedded into a realistic-looking skybox with an accelerated day–night cycle and a surrounding snowy mountain range as a backdrop.

Fig. 12  3D model of Mt. Fuji

Regarding the hardware, it was decided to use the HTC Vive (Gen 1 from 2016) with an Xbox One controller as the primary method of input. Keyboard and mouse controls as well as the HTC Vive motion controllers are supported to give users the possibility to choose their preferred input device. To approach the VR environment, the application was split into two elements, more specifically two separate levels. The first part consists of a free-flying portion in which the user can fly around and explore Mt. Fuji. This gives the user a bit of an overview of the general area and the possibility to familiarise themselves with the VR HMD (head-mounted display) and the controls. If the user approaches the summit with the flyer, the user is teleported into a small room and the VR application switches to a first-person-controlled environment. For this, a simple spherical trigger that acts as a level-switch was used. During this session, the user has to solve a riddle. The only two points of interest in the room are a laptop and a keypad on the wall next to it. By solving the associated worksheet (see Fig. 13), the user will obtain a code to input into the aforementioned keypad. The correct code triggers a small reward that will be shown on the laptop.
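The Fuji application implements this level-switch in Unreal Engine; purely as an engine-agnostic illustration, the same idea is sketched below as a Unity-style C# trigger (the scene name and tag are assumptions).

    using UnityEngine;
    using UnityEngine.SceneManagement;

    // Sketch: a spherical trigger collider placed at the summit switches
    // to the riddle-room level when the player flies into it.
    // Requires a SphereCollider with "Is Trigger" enabled on this object.
    public class SummitLevelSwitch : MonoBehaviour
    {
        public string riddleRoomScene = "RiddleRoom"; // assumed scene name

        void OnTriggerEnter(Collider other)
        {
            if (other.CompareTag("Player"))
                SceneManager.LoadScene(riddleRoomScene);
        }
    }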
Fig. 13  Worksheet Mt. Fuji

5 Conclusions and Outlook

The paper presents a broad spectrum of remote sensing data and their preparation for easy-to-use immersive applications for school lessons. A variety of Earth observation data, differing in terms of spatial, spectral, and temporal resolutions as well as platforms (satellites, ISS), have been integrated in augmented and virtual reality applications focusing on problem-based learning to mediate school curricular topics. Even products of complex Earth observation technology like hyperspectral remote sensing data presented in a virtual data cube could be adjusted for curricular needs. Moreover, the popular visualisation technique of a hyperspectral data cube has been made accessible in such a way that the whole information content can be interpreted, and not only one row.

It has been shown how AR apps enhance static, printable worksheets with animated content and inexpensive, effective digital experiments in Earth observation—all independent of school IT infrastructure. Much content can be developed using Unity and Vuforia without additional scripts; however, since Unity is a game development environment and has no specialisation for image processing, even basic GIS functions and the digital experiments have to be scripted in C#, using methods that consume as little memory as possible. The use of specialised shaders written in HLSL is necessary for such methods as well.

Additionally, a VR environment has been augmented with a real DEM derived from astronauts' photography to enable students to go on an excursion to Mount Fuji and experience its geology. Since the expensive VR hardware is not widely available, the application should be converted from a desktop application into a mobile one to reach more interested teachers and students. For the mobile version, the landscape has to be simplified heavily to work with the limited resources available on mobile devices. This has to be done without compromising quality for a smooth frame rate. Regarding the input devices, different control schemes, i.e. Bluetooth controllers and eye tracking, will be tested in the near future.



Funding  Columbus Eye and KEPLER ISS are supported by the German Aerospace Center (DLR) with funds of the Federal Ministry for Economic Affairs and Energy [Grant numbers 50JR1307 and 50JR1701] based on a resolution of the German Bundestag.

Compliance with Ethical Standards

Conflict of interest  The authors declare that they have no conflict of interest.

References

Agisoft (2018) Agisoft Metashape User Manual, Standard Edition, Version 1.5. https://www.agisoft.com/pdf/metashape_1_5_en.pdf. Accessed 12 Sep 2019

Baqersad J, Poozesh P, Niezrecki C, Avitabile P (2017) Photogrammetry and optical methods in structural dynamics—a review. Mech Syst Signal Process 86:17–34. https://doi.org/10.1016/j.ymssp.2016.02.011. Accessed 20 Mar 2020

BHG: Baltic Hydrographic Commission (2013) Baltic Sea Bathymetry Database. https://data.bshc.pro. Accessed 20 Mar 2020

Epic Games (2019) Unreal Engine 4 documentation: Materials. Controlling the appearance of surfaces in the world using shaders. https://docs.unrealengine.com/en-US/Engine/Rendering/Materials/index.html. Accessed 20 Mar 2020

Franc T (2012) Tides in the Earth–Moon system. In: Proc 21st Annual Conference of Doctoral Students, WDS 2012. Charles University, Prague. https://www.mff.cuni.cz/veda/konference/wds/proc/pdf12/WDS12_318_f12_Franc.pdf. Accessed 10 Aug 2019

Goetzke R, Hodam H, Rienow A, Voß K (2013) Tools and learning management functions for a competence-oriented integration of remote sensing in classrooms. In: Jekel T, Car A, Strobl J, Griesebner G (eds) GI_Forum 2013. Wichmann, Offenbach, pp 458–463. https://doi.org/10.1553/giscience2013s458. Accessed 20 Mar 2020

GPDN: Geopotential Deutsche Nordsee (2013) Bathymetrie. www.gpdn.de. Accessed 20 Mar 2020

Jensen L, Konradsen F (2018) A review of the use of virtual reality head-mounted displays in education and training. Educ Inf Technol 23(4):1515–1529. https://doi.org/10.1007/s10639-017-9676-0. Accessed 20 Mar 2020

Keil J, Edler D, Dickmann F (2019) Preparing the HoloLens for user studies: an augmented reality interface for the spatial adjustment of holographic objects in 3D indoor environments. Kartogr Nachr 69(3):205–215. https://doi.org/10.1007/s42489-019-00025-z. Accessed 20 Mar 2020

Kersten T, Deggim S, Tschirschwitz F, Lindstaedt M, Hinrichsen N (2018) Segeberg 1600—eine Stadtrekonstruktion in Virtual Reality. Kartogr Nachr 68(4):183–191

Korwan DR, Lucke RL, McGlothlin NR, Butcher SD, Wood DL, Bowles JH, Corson M, Snyder WA, Davis CO, Chen DT (2009) Laboratory characterization of the Hyperspectral Imager for the Coastal Ocean (HICO). In: Proc IEEE IGARSS

Kreit E, Mäthger LM, Hanlon RT, Dennis PB, Naik RR, Forsythe E, Heikenfeld J (2013) Biological versus electronic adaptive coloration: how can one inform the other? J R Soc Interface. https://doi.org/10.1098/rsif.2012.0601. Accessed 20 Mar 2020

Lindner C, Hodam H, Ortwein A, Schultz J, Selg F, Jürgens C, Rienow A (2018) Towards a new horizon for planetary observation in education. In: Proc 2nd Symposium on Space Educational Activities. https://www.hit.bme.hu/~bacsardi/SSEA/SSEA2018_proceedings.pdf. Accessed 17 Sep 2019

Lindner C, Rienow A, Jürgens C (2019) Augmented reality applications as digital experiments for education—an example in the Earth–Moon system. Acta Astronaut 161:66–74. https://doi.org/10.1016/j.actaastro.2019.05.025. Accessed 20 Mar 2020

Liu GZ, Hwang GJ (2010) A key step to understanding paradigm shifts in e-learning: towards context-aware ubiquitous learning. BJET 41(2):E1–E9. https://doi.org/10.1111/j.1467-8535.2009.00976.x. Accessed 20 Mar 2020

Martinis S, Twele A, Plank S, Zwenzner H, Danzeglocke J, Strunz G, Lüttenberg HP, Dech S (2017) The International Charter 'Space and Major Disasters': DLR's contributions to emergency response worldwide. PFG 85(5):317–325. https://doi.org/10.1007/s41064-017-0032-1. Accessed 20 Mar 2020

METI and NASA (2009) ASTER Global Digital Elevation Model [Data set]. NASA EOSDIS Land Processes DAAC. https://doi.org/10.5067/ASTER/ASTGTM.002

Michel B (2013) Digital stereoscopy: scene to screen 3D production workflows. Stereoscopy News

Mikropoulos TA, Natsis A (2011) Educational virtual environments: a ten-year review of empirical research (1999–2009). Comput Educ 56(3):769–780. https://doi.org/10.1016/j.compedu.2010.10.020. Accessed 20 Mar 2020

Milgram P, Kishino F (1994) A taxonomy of mixed reality visual displays. IEICE Trans Inf Syst E77-D(12):1321–1329

Milgram P, Takemura H, Utsumi A, Kishino F (1995) Augmented reality: a class of displays on the reality–virtuality continuum. In: Proc SPIE Telemanipulator and Telepresence Technologies. https://doi.org/10.1117/12.197321. Accessed 20 Mar 2020

MKJS BW: Ministerium für Kultus, Jugend und Sport Baden-Württemberg (2016) Bildungsplan des Gymnasiums—Bildungsplan 2016—Geographie. https://www.bildungsplaene-bw.de/site/bildungsplan/get/documents/lsbw/export-pdf/depot-pdf/ALLG/BP2016BW_ALLG_GYM_GEO.pdf. Accessed 31 Jan 2020

MSB NRW: Ministerium für Schule und Bildung des Landes Nordrhein-Westfalen (2019) Kernlehrplan für die Sekundarstufe I—Gymnasium in Nordrhein-Westfalen. Erdkunde. https://www.schulentwicklung.nrw.de/lehrplaene/lehrplan/200/g9_ek_klp_3408_2019_06_23.pdf. Accessed 31 Jan 2020

Muri P, Runco S, Fontanot C, Getteau C (2017) The High Definition Earth Viewing (HDEV) payload. In: Proc IEEE Aerospace Conference, Big Sky, MT, pp 1–7

Ortwein A, Graw V, Heinemann S, Menz G, Schultz J, Selg F, Rienow A (2016) Pushed beyond the pixel—interdisciplinary Earth observation education from the ISS in schools. In: Proc IAC

Ott M, Freina L (2015) A literature review on immersive virtual reality in education: state of the art and perspectives. In: Proc eLearning and Software for Education (eLSE)

Pagán B (2006) Positive contributions of constructivism to educational design. EJOP 2(1):23–32. https://doi.org/10.5964/ejop.v2i1.318. Accessed 20 Mar 2020

PERC: Planetary Exploration Research Center (2018) ISS Meteor Observation Project "METEOR". www.perc.it-chiba.ac.jp/project/meteor/. Accessed 20 Aug 2019

PTC (n.d.) Vuforia Developer Library—optimizing target detection and tracking stability. https://library.vuforia.com/articles/Solution/Optimizing-Target-Detection-and-Tracking-Stability.html. Accessed 31 Jan 2020

Reiners T, Wood L, Gregory S (2014) Experimental study on consumer-technology supported authentic immersion in virtual environments for education and vocational training. In: Hegarty B, McDonald J, Loke S (eds) Proc 31st Annual Ascilite Conference, pp 171–181

Rienow A, Graw V, Heinemann S, Schultz J, Selg F, Menz G (2016) Earth observation from the ISS Columbus Laboratory—an open education approach to foster geographical competences of pupils in secondary schools. In: Proc ESA LPS 2016

Runco S (2015) International Space Station—High Definition Earth Viewing (HDEV). https://www.nasa.gov/mission_pages/station/research/experiments/explorer/Investigation.html#id=892. Accessed 20 Aug 2019

SBJF: Senatsverwaltung für Bildung, Jugend und Familie (2015) Rahmenlehrplan Teil C—Geografie. https://bildungsserver.berlin-brandenburg.de/fileadmin/bbb/unterricht/rahmenlehrplaene/Rahmenlehrplanprojekt/amtliche_Fassung/Teil_C_Geografie_2015_11_10_WEB.pdf. Accessed 31 Jan 2020

Schultz J, Lindner C, Hodam H, Ortwein A, Selg F, Weppler J, Rienow A (2017a) Augmenting pupils' reality from space—digital learning media based on Earth observation data from the ISS. In: Proc IAC

Schultz J, Ortwein A, Rienow A (2017b) Technical note: using ISS videos in Earth observation—implementations for science and education. Eur J Remote Sens 51(1):28–32. https://doi.org/10.1080/22797254.2017.1396880

Schultz J, Hodam H, Lindner C, Ortwein A, Selg F, Rienow A (2018) Ableitung von 3D-Modellen aus Daten des High Definition Earth Viewing-Experiments (ISS)—Anwendungen für den Schulunterricht. Pub DGPF 27:129–140

Siegmund A (2011) Satellitenbilder im Unterricht—eine Ländervergleichsstudie zur Ableitung fernerkundungsdidaktischer Grundsätze. Dissertation, Heidelberg University of Education

Stefanov W, Evans CA (2014) The International Space Station: a unique platform for remote sensing of natural disasters. JSC Biennial Research Report, pp 108–110

Tschirschwitz F, Richerzhagen C, Przybilla HJ, Kersten TP (2019) Duisburg 1566: transferring a historic 3D city model from Google Earth into a virtual reality application. PFG 87(1–2):47–56. https://doi.org/10.1007/s41064-019-00065-0. Accessed 20 Mar 2020

Voß K, Goetzke R, Thierfeldt F (2007a) Integration von Fernerkundung im Schulunterricht. Pub DGPF 16:41–50

Voß K, Goetzke R, Thierfeldt F, Menz G (2007b) Integrating applied remote sensing methodology in secondary education. In: Proc IEEE IGARSS, pp 2167–2169
