Augmented and Virtual Reality Applications for School Lessons Based on Satellite- and ISS-Based Remote Sensing Data
https://doi.org/10.1007/s41064-020-00113-0
NEWS ITEM
© Deutsche Gesellschaft für Photogrammetrie, Fernerkundung und Geoinformation (DGPF) e.V. 2020
Abstract
Facing global challenges, a qualified education in remote sensing technologies needs to start in school to sensitise teachers, and thus young people, to ecological issues and to develop their technological skills. Remote sensing is part of the STEM (Science,
Technology, Engineering and Mathematics) curricular topics, all of which are either a requirement for or benefit from remote
sensing. However, implementing its data and methods into regular curricula poses technological and educational challenges,
given the lack of appropriate school IT infrastructure and remote sensing education for aspiring STEM teachers. Immersive
media can overcome these structural issues, using the teachers’ and pupils’ smartphones to display and interact, e.g. with
video data, astronaut photographs, or complex hyperspectral data. The data needs to be pre-processed for Augmented and
Virtual Reality applications (AR and VR, resp.) to fit topic, available devices and software. Game development software
allows for easy integration in apps for AR and VR, but does not contain functions for remote sensing methods. These are
either implemented directly in scripts, worked around using less computing-intensive methods or replaced by methods unique
to augmented and virtual reality. Following these premises, five AR apps are presented in this report, teaching about tropical
cyclones, anthropogenic desertification, energy consumption, gravitation, and algal blooms, all of which use remote sensing
data from different sensors aboard the ISS. One VR app teaching about Mount Fuji was developed using a DEM derived from
astronaut photographs. All have different levels of interactivity and are embedded in worksheets that fill a double period.
Keywords Education · Digital experiments · 3D modelling · Remote sensing · International space station
Zusammenfassung
Augmented and Virtual Reality Applications for School Lessons Based on Satellite- and ISS-Based Remote Sensing Data. In the face of current global challenges, qualified training in remote sensing technologies must begin in school in order to sensitise teachers, and thereby pupils, to ecological issues and to train technological skills. Remote sensing is part of the STEM subjects at school; it can address the physical and mathematical fundamentals as well as the biological and geographical applications. However, implementing its data and methods in the regular curricula poses technical and didactic challenges: on the one hand, schools lack the necessary hardware and software; on the other hand, only few student teachers receive a sufficiently comprehensive remote sensing education. Immersive media can at least partially overcome the technical hurdles by using the pupils' and teachers' smartphones, for example to display videos, astronaut photographs or even complex hyperspectral data and to interact with them. These data must be prepared for use in Augmented Reality (AR) and Virtual Reality (VR) in order to fit the curriculum topic, the available mobile devices and the available software. Game development environments allow a simple implementation of AR and VR apps, but contain no functions for remote sensing methods. Against this background, five apps were developed, whose implementation is reported here. They cover the topics of tropical cyclones, desertification, energy consumption, gravitation, and algal blooms. All apps use remote sensing data from different sensors aboard the ISS. In addition, a VR app was developed with a DEM derived from astronaut photographs. The apps have different levels of interactivity and are integrated into worksheets that cover about a double period at school.

* Claudia Lindner
[email protected]
1 Geomatics Research Group, Department of Geography, Ruhr-University Bochum, Universitätsstr. 150, 44801 Bochum, Germany
3. Which limitations are faced when working with augmented reality (AR) and virtual reality (VR) applications on Earth observation in school lessons?

The paper is structured as follows: Sect. 2 introduces the reality–virtuality continuum. Section 3 demonstrates the possibilities of augmenting everyday school resources with the help of remote sensing data to foster media and methodological skills in STEM lessons. Section 4 gives an example of a success story dealing with the derivation of a DEM based on astronaut photography, integrated in a VR environment to explore Mount Fuji. A conclusion and outlook on future research is given in Sect. 5.

2 The Reality–Virtuality Continuum

Learning about phenomena that are invisible to the eye or imply danger to life is facilitated with an increasing level of virtuality. Experiments can be simulated and interactively manipulated in dynamic and responsive virtual environments. Authentic visualisation allows first-hand experiences and involvement while creating sustainable knowledge. Correspondingly, advantages of real environments include immediate response and direct communication, and first of all ubiquity and, therefore, availability and access at all times and everywhere without technological restraints (Jensen and Konradsen 2018; Ott and Freina 2015; Mikropoulos and Natsis 2011). The key issues of employing applications on the virtuality–reality continuum successfully in school lessons regard the availability of technology and media literacy on the one hand, and motivation and interactivity on the other (Reiners et al. 2014) (Fig. 1).
Fig. 2 Stages of developer's experience, resulting in progressively more powerful apps over the years
For the development of the apps, Unity was chosen due to its relative simplicity as a game development platform, and Vuforia for the same reason regarding image recognition for AR.

AR depends on an automatic matching between an object in reality and a dataset that represents the same object in the form of graphic data. The Vuforia Developer Library offers functions that take an image of the real object, extract its edges, and compare them with the edges of the graphic dataset. Such a dataset is called an "Image Target". "Image Targets represent images that Vuforia Engine can detect and track. … The Engine detects and tracks the features that are naturally found in the image itself by comparing these natural features against a known target resource database. Once the Image Target is detected, Vuforia Engine will track the image as long as it is at least partially in the camera's field of view." (PTC (n.d.), https://library.vuforia.com/content/vuforia-library/en/articles/Training/Image-Target-Guide.html). Colour is irrelevant, as the Image Targets are processed in greyscales. Low image quality is not necessarily bad: fragments can serve as edges as well. However, when printed, the images need to be of high quality.

Unity includes a lot of objects and attached scripts that make scripting unnecessary for basic game development. Scripts can be added using C# for all functions that are not included. This is done in Visual Studio, which has a Unity extension, so that the scripts can be applied immediately. Unity also comes with a wide range of shaders, essentially scripts that tell the device how to render each screen pixel. These are likewise specialised for game development and thus represent different surfaces and the way they deal with lighting in a 3D scene. These shaders are written in the High-Level Shading Language (HLSL), for which there are no development environments, which means no autocompletion or compilation support. Compiling errors are, however, quite noticeable, as Visual Studio will shut down on occasion when trying to autocompile incomplete or faulty HLSL code. Due to these complexities, shader development is recommended only for experienced developers.
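How an app reacts to an Image Target being found or lost is handled by an event handler script on the target object. The following is a minimal sketch modelled on the DefaultTrackableEventHandler from the Vuforia 6.x Unity samples; the class and the overlayContent field are illustrative, not the project's actual code:

```csharp
using UnityEngine;
using Vuforia;

// Minimal sketch in the style of the Vuforia 6.x sample handler: shows or
// hides the overlay content whenever the Image Target is found or lost.
public class OverlayOnTarget : MonoBehaviour, ITrackableEventHandler
{
    public GameObject overlayContent;   // e.g. the video plane to superimpose

    void Start()
    {
        // The ImageTarget GameObject carries a TrackableBehaviour component.
        var trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        // DETECTED/TRACKED means the printed image is (partially) in view.
        bool visible = newStatus == TrackableBehaviour.Status.DETECTED ||
                       newStatus == TrackableBehaviour.Status.TRACKED ||
                       newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;
        overlayContent.SetActive(visible);
    }
}
```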
3.1 Exploring Interactive Elements

All apps work in conjunction with analogue worksheets. These are printed material, mostly A4 in size, and preferably in colour for comprehensibility; however, black and white works just as well. The worksheets contain tasks for the pupils and all the information that is needed for a solution: texts, maps, and graphs, some of which serve as Image Targets for the app.

The first AR app, named "The Eye of the Cyclone", was developed as a test to find out whether pupils and teachers would accept the technology, as well as to let the project team experiment with the development environment.
Furthermore, the app shall add value to the worksheet that teaches the emergence of tropical cyclones by the example of Maysak, a category 5 typhoon that hit Micronesia in March 2015.

As a first test, this app stays close to the tutorials for using Vuforia 6.2 in Unity 5.6 and displays a video overlay over static images that are used as Image Targets. The first Image Target is a weather map depicting the air pressure at sea level and the 500 hPa geopotential altitude. In the app, this image is superimposed with an animation using consecutive maps of the same type, showing the typhoon's path across Micronesia towards the Philippines.

The second Image Target is a cross-sectional diagram of a hurricane. It is superimposed with a video taken by the HDEV (High-Definition Earth Viewing) experiment that shows Maysak in front of the curvature of the Earth (Runco 2015). The video is annotated with wind speeds and air pressure, but the main feature is the perspective, giving the pupils a grasp of the dimensions of tropical cyclones. Using the additional information from the two videos, the pupils are to answer questions in the worksheet (Ortwein et al. 2016).

The second app, "Aralkum", enhances a worksheet about the decline of the Aral Sea. Again, videos overlaying static images were used to convey additional information and give the pupils a perspective. An image of the already diminished Aral Sea in 1964 receives an overlay of the development of the lake between 2000 and 2016 using Landsat images. A photograph taken from the ground, depicting a rusted anchor in what was formerly the lake bed and is now a desert, is enhanced with a video from above, showing the blue western basin and the sparsely filled eastern basin of the Aral Sea. The video was recorded by the HDEV experiment in September 2016.

This app features the first truly interactive function: a Sentinel-2 image of the Aral Sea from 2017 is superimposed with an animation of the shorelines from 2000 to 2016. Everything except the shorelines is transparent, and the animation can be halted. Pupils can then mark the lake's extent on the Sentinel-2 image of 2017, answer questions in the worksheet and discuss the future of the lake using both the information given in the worksheet and in the animation (Schultz et al. 2017a, b).
Since mp4 does not support an alpha channel for transparency but was the easiest format to overlay for the Vuforia software, the way the video is rendered on the screen by Unity had to be adjusted instead. To achieve this, an existing shader was used that turns one colour in the video transparent: pixels matching the colour defined as the mask are rendered fully transparent, and the transparency is reduced the further a pixel's colour differs from the mask colour, which yields a smoothing effect.
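The per-pixel rule of such a colour-key shader can be sketched in C# on a readable Texture2D for illustration; the actual implementation runs per frame in HLSL, and the names and the simple distance measure here are illustrative:

```csharp
using UnityEngine;

// Illustration only: pixels matching "maskColour" become fully transparent
// (alpha treated as opacity); the alpha rises smoothly the further a pixel's
// colour is from the mask, giving soft edges.
public static class ColourKey
{
    public static void ApplyMask(Texture2D frame, Color maskColour, float softness)
    {
        Color[] pixels = frame.GetPixels();
        for (int i = 0; i < pixels.Length; i++)
        {
            // Euclidean distance in RGB as a simple colour difference measure.
            float dr = pixels[i].r - maskColour.r;
            float dg = pixels[i].g - maskColour.g;
            float db = pixels[i].b - maskColour.b;
            float diff = Mathf.Sqrt(dr * dr + dg * dg + db * db);

            // 0 at the mask colour, fading to fully opaque beyond "softness".
            pixels[i].a = Mathf.Clamp01(diff / softness);
        }
        frame.SetPixels(pixels);
        frame.Apply();
    }
}
```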
In the next app, "Earth by Night", teaching about light pollution in Central Europe, a first attempt at scripting was made. The accompanying worksheet includes a Sentinel-2 infrared, green, blue composite of the Rhine–Ruhr area for the pupils to discuss false colour images. The app superimposes a video taken by the ISS-based Japanese METEOR project (PERC 2018), which takes nighttime videos of Earth with a standard HDTV camera. The video shows a flyover from London via Belgium and the Rhine–Ruhr area to Frankfurt. Since orientation in nighttime images of Earth is complicated for pupils, who are used to annotated maps and not unaltered satellite images in their school material, annotation of selected features was necessary, but would have been disruptive due to the size and nature of the video. A solution was found with a virtual button: a button that is pressed on paper although it only exists in the programme (see Fig. 3).

A plane object containing the names of large cities in the correct places was superimposed on the video overlay. The standard Vuforia VideoPlayerHelper script had to be extended to access the frame position of the played video to define the position of the cities plane object, ensuring the labels always stay in the right places. The cities plane object is activated on the "push" of the Virtual Button and deactivated on release, allowing the pupils to see the nighttime view of Europe unobstructed or annotated for orientation.

Fig. 3 Demonstration of the app "Earth by Night" when using the Virtual Button, adding a layer with city names for orientation to the night-time video of Central Europe
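A minimal sketch of the virtual-button handling described above, following the IVirtualButtonEventHandler pattern of the Vuforia 6.x samples (class and field names are illustrative, not the project's actual code):

```csharp
using UnityEngine;
using Vuforia;

// Toggles the annotation plane while the on-paper button is "pressed".
public class CityNamesButton : MonoBehaviour, IVirtualButtonEventHandler
{
    public GameObject cityNamesPlane;   // plane with the city labels

    void Start()
    {
        // The virtual button is a child object of the Image Target.
        var button = GetComponentInChildren<VirtualButtonBehaviour>();
        button.RegisterEventHandler(this);
        cityNamesPlane.SetActive(false); // start with the unobstructed view
    }

    public void OnButtonPressed(VirtualButtonBehaviour vb)
    {
        cityNamesPlane.SetActive(true);  // annotated view while "pressed"
    }

    public void OnButtonReleased(VirtualButtonBehaviour vb)
    {
        cityNamesPlane.SetActive(false); // back to the plain nighttime video
    }
}
```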
3.2 Experimenting Digitally

So far, AR was used for 2D animation overlays. While the next app, the "Earth–Moon system" (Lindner et al. 2019), also includes an HDEV experiment video, it is the first app in this project to utilise the third dimension and extensive C# scripts. Earth and Moon form a multi-body system. The barycentre is its centre of mass, around which all participating bodies orbit. To help the pupils understand the concept, a 2D image in the worksheet corresponds to the Image Target for a 3D animation that displays two celestial bodies of different sizes revolving around each other with their orbit lines (see Fig. 4).

Fig. 4 Two celestial bodies seem to revolve around a common barycentre

The implementation in Unity uses an invisible sphere in which two differently coloured smaller spheres represent Earth and Moon. An attached script lets the invisible large sphere rotate together with both celestial bodies, which leads to the impression that Earth and Moon revolve around their common barycentre.
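A sketch of such a rotation script (the parameter name is illustrative); attached to the invisible parent sphere, it carries the two child spheres around the common centre:

```csharp
using UnityEngine;

// Sits on the invisible parent sphere; Earth and Moon are child spheres
// offset from its centre, so rotating the parent carries both around the
// common barycentre.
public class BarycentreSpin : MonoBehaviour
{
    public float degreesPerSecond = 20f;

    void Update()
    {
        // Frame-rate independent rotation around the local up-axis.
        transform.Rotate(Vector3.up, degreesPerSecond * Time.deltaTime);
    }
}
```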
The Unity LineRenderer imports an array of coordinate triples (x, y, z), forms a smoothed polygon through these points, and returns it to the application. The smoothness of the line is programmer-controlled: for better visual impact, more line segments and thus more computing power are required. Fewer line segments result in faster processing, but also in an orbit with visible angles.
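A sketch of how such an orbit polygon can be fed to the LineRenderer; the segment count makes the smoothness/processing trade-off explicit (values illustrative):

```csharp
using UnityEngine;

// Samples a circle into an array of coordinate triples and hands it to a
// LineRenderer. More segments give a smoother orbit at the cost of more
// computation; too few produce visible angles.
[RequireComponent(typeof(LineRenderer))]
public class OrbitLine : MonoBehaviour
{
    public float radius = 1f;
    public int segments = 64;   // smoothness vs. processing trade-off

    void Start()
    {
        var line = GetComponent<LineRenderer>();
        line.loop = true;                 // close the polygon into an orbit
        line.positionCount = segments;

        var points = new Vector3[segments];
        for (int i = 0; i < segments; i++)
        {
            float angle = 2f * Mathf.PI * i / segments;
            points[i] = new Vector3(Mathf.Cos(angle) * radius, 0f,
                                    Mathf.Sin(angle) * radius);
        }
        line.SetPositions(points);
    }
}
```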
The first digital experiment was created in the same app and simulates the influence of the Moon on the Earth and vice versa, while the distance between both varies. Pupils shall observe the consequences of a much larger or a much smaller distance than today. The teaching unit consists of a prediction by the pupils, e.g. regarding tidal effects (hypothesis), and a following test using the app. They shall also discuss the results, especially with regard to the history of the Earth–Moon system, from its formation by a crash of Proto-Earth and Theia until the far future when it will be swallowed by the expanding sun shortly before its death.

The setup of the experiment consists of a printed image of the Earth and a digital image of the Moon (Fig. 5). The Earth's image is a satellite composite. It is surrounded by the picture of a frame of a round porthole. The porthole and its screws support the tracking process by stabilising the computations. The Moon's image appears on the smartphone display. Further, the smartphone holds an Image Target of the Earth (and the porthole), which allows tracking the printed Earth's image at the wall (Fig. 6).

Fig. 5 Digital Moon on the smartphone display linked to the printed picture of the Earth, pinned on the wall

Fig. 6 Picture of the Earth and its surrounding porthole, overlaid with the detected points for referencing the Image Target

The app displays the Earth in 3D as a sphere with an equirectangular map as texture and a slow spinning script attached. While the Earth slowly rotates, it stays in its position on the wall. The Moon, however, again a sphere with an equirectangular map as texture, is affixed to the smartphone camera position, staying stationary when viewed on the screen. In Unity, this is achieved by attaching the Moon object to the camera object.

Using the Image Target, Unity constantly computes the camera's position in relation to the printed picture of the Earth at the wall to calculate where the superimposed objects must be rendered, and where not. The determination of the positions can also be exploited for further calculations. The geometric resolution applied to the calculation depends on the size of the Image Target. Hence, if the Earth in the Image Target has a start radius of 6.37 cm when loading the Image Target into Unity, all objects are defined in centimetres and all calculations are scaled to 1:100,000,000. This results in a distance to the Moon of 3.6–4.1 m. The pupils thus see the accurate relative distance between the Earth and the Moon to scale, i.e. the Earth with a diameter of 12.7 cm on the wall and the Moon, at this scale about 3.5 cm across, on the display.
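A quick check of the scale arithmetic (rounded real-world values; runnable as a C# top-level program):

```csharp
using System;

// Standalone check of the 1:100,000,000 scale used in the app.
const double scale = 1e8;
const double earthDiameterM = 12_742_000;   // Earth diameter in metres
const double moonDistanceM  = 384_400_000;  // mean Earth-Moon distance in metres

Console.WriteLine(earthDiameterM / scale * 100); // 12.742 -> ~12.7 cm on the wall
Console.WriteLine(moonDistanceM / scale);        // 3.844  -> ~3.84 m to the Moon
```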
$a_{g} \approx \pm\,\frac{2RGM}{r^{3}} \qquad (1)$

with $R$ the Earth's radius, $G$ the gravitational constant, $M$ the mass of the Moon, and $r$ the Earth–Moon distance.
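Evaluating Eq. (1) for a varying Earth–Moon distance, as in the distance experiment described above, can be sketched as follows (constants are standard values; the loop range and step width are illustrative):

```csharp
using System;

// Evaluates Eq. (1), a_g ≈ ±2RGM/r³, for a varying Earth-Moon distance r.
class TidalAcceleration
{
    const double G = 6.674e-11;  // gravitational constant (m³ kg⁻¹ s⁻²)
    const double M = 7.35e22;    // mass of the Moon (kg)
    const double R = 6.371e6;    // radius of the Earth (m)

    static void Main()
    {
        // From about half to about double the mean distance of 384,400 km.
        for (double r = 1.9e8; r <= 7.7e8; r += 1.0e8)
        {
            double ag = 2 * R * G * M / (r * r * r);
            Console.WriteLine($"r = {r / 1000:N0} km -> a_g = {ag:E2} m/s^2");
        }
        // At the real mean distance this yields roughly 1.1e-6 m/s².
    }
}
```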
… be developed for the app. Hence, a red, a green and a blue image object with different transparency settings are added on top of each other and receive the band data as Main Texture. However, Unity does not offer a predefined shader that can display the three images as required for the task, i.e. with only colour and transparency. Thus, a shader had to be written specifically for this purpose. The shader needs access to the band image's pixel colours as well as to the colour and the transparency assigned to the image object. The greyscale value from each pixel of the band image is multiplied by the image object's colour, and the new value, together with the image object's transparency, is set as the rendered texture. Three sliders for the three colours can then be adjusted by the user to define which band is red, which green, and which blue. The user can slide through the colour combinations in real time (see Fig. 8).
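The shader's per-pixel combination can be illustrated in C# (the real computation runs on the GPU in HLSL; names here are illustrative):

```csharp
using UnityEngine;

// Illustration of the custom shader's rule: the greyscale band value is
// multiplied by the image object's colour, and the object's transparency
// is written to the alpha channel.
public static class BandTint
{
    public static Color Shade(float greyscale, Color tint, float transparency)
    {
        // e.g. greyscale 0.8 with a pure red tint -> (0.8, 0, 0)
        return new Color(greyscale * tint.r,
                         greyscale * tint.g,
                         greyscale * tint.b,
                         transparency);
    }
}
```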
The image cube that shows an RGB composite of a scene and the border pixels' spectra in a rainbow colour scale is a popular way of displaying hyperspectral data. The main drawback is the limited visibility, because the cube can only be viewed from the outside. For the app, a different approach was developed. Based on the 87 bands being available in greyscale, a newly developed shader colours them in the "Rainbow-Black" colour gradient set by ENVI as the default for image cubes. The resulting coloured images are then stacked atop each other a few millimetres apart, with the lowest band number at the top. Users can use a slider to remove them band by band, beginning at the top. The RGB viewer only allows them to see individual bands in greyscale if they move all sliders to the same band. However, humans are only able to differentiate between about 30 shades of grey (Kreit et al. 2013) instead of the 256 shades available in the image. The coloured cube allows them to see individual bands with a much finer colour gradient. This gradient shader uses a gradient map, an image that has the desired colour gradient on the horizontal axis. The gradient map works like a colour palette in any GIS; but as Unity does not have a way to multiply the greyscale value with a colour palette, it has to be implemented in this shader.
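The gradient-map lookup amounts to sampling a 1D texture with the greyscale value; a C# illustration (in the app this happens inside the HLSL shader):

```csharp
using UnityEngine;

// A greyscale band value selects a colour from a gradient texture whose
// desired gradient runs along the horizontal axis, exactly like applying a
// colour palette in a GIS. GetPixelBilinear needs a readable texture.
public static class GradientPalette
{
    public static Color Lookup(Texture2D gradientMap, float greyscale)
    {
        // greyscale in [0,1] maps to the horizontal position in the gradient.
        return gradientMap.GetPixelBilinear(Mathf.Clamp01(greyscale), 0.5f);
    }
}
```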
The shader was adjusted to not render pixels outside user-defined borders. Two double sliders for North/South and East/West can be used to adjust the borders of the images, and due to the stacked, gradient-coloured band images, profiles of the entire scene from all four directions can be viewed (see Fig. 9).
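The border test itself is a simple predicate on the normalised image coordinates; sketched in C# with illustrative names (the shader discards the pixels that fail it):

```csharp
using UnityEngine;

// west/east and south/north are the double-slider values in [0,1].
public static class BorderClip
{
    public static bool IsVisible(Vector2 uv, float west, float east,
                                 float south, float north)
    {
        return uv.x >= west && uv.x <= east &&
               uv.y >= south && uv.y <= north;
    }
}
```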
The app will be tested for functionality and educational
value in spring 2020.
In contrast to AR, the usage of VR equipment encapsulates the user in an immersive environment. Therefore, interactive school lessons in virtual environments have to compensate for a lack of shared experience. To overcome this issue, a collaborative mixed virtuality–reality approach was designed to ensure a learning effect for all participants. The example application presented in this paper conveys knowledge about Mount Fuji's geology. Directed at an audience of younger pupils, the participants work in teams to solve the accompanying worksheet (see Fig. 13). While one participant moves in VR, the other participants supply additional information needed to solve questions in the virtual environment. To make the application as realistic as possible, an astronaut's imagery of Mount Fuji has been integrated into our VR framework. The workflow is described in the following subchapters.

Fig. 13 Worksheet Mt. Fuji

4.1 Photogrammetry to Derive 3D Models for VR Applications

Photogrammetry is the science of deriving measurements from photographs. Commonly, the field of photogrammetry can be split, based on the camera location during image acquisition, into aerial photogrammetry and terrestrial photogrammetry. It is used to obtain information about objects or even a land surface through measuring and interpreting photographic images. Photogrammetry is a remote (non-contact) measurement technique and can be applied to one image or to a series of images to detect, measure and record complex 2D and 3D fields (Baqersad et al. 2017). These techniques are used in several applications and scientific disciplines, e.g. the analysis and restoration of cultural heritage, topographic mapping, architecture, engineering, and medicine.

Here, stereophotogrammetry, which is applied in aerial and terrestrial photogrammetry, is used to derive high-resolution georeferenced 3D models for VR applications. These 3D models form the basis for virtual animations and VR applications, which are used in schools in conjunction with worksheets to foster the STEM skills of pupils.

Especially high-resolution astronauts' images of the Earth, taken from the ISS, are particularly suitable for modelling individual structures, such as volcanoes or mountain ranges. Most of the current satellite systems from NASA or ESA, like Landsat or Sentinel, are not able to record the Earth's surface stereoscopically. The ISS itself is an Earth observation platform but has several unique characteristics which distinguish it from satellite- or airborne remote sensing: it is manned, sun-asynchronous in a low Earth orbit (400 km) and faces varying dynamics regarding roll, pitch and yaw angles (Stefanov and Evans 2014). Astronauts' images profit from these unique advantages and can provide a high-resolution view of the Earth's surface with a focal length of up to 1600 mm.

On 8 February 2016, 32 high-resolution images of Mt. Fuji (3776 m above sea level) were taken by ISS astronauts. The images are available in the "Gateway to Astronaut Photography of Earth" (Earth Science and Remote Sensing Unit, NASA Johnson Space Center). The great advantages of this image series are the short time interval between the images, the different shooting angles, the high resolution (4928 × 3280 pixels) and the small Ground Sampling Distance (GSD) of ≈ 4 m using a 1150-mm lens.
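As a plausibility check (not from the original text), the GSD can be related to pixel pitch $p$, slant range $H$ and focal length $f$; assuming a pixel pitch of roughly $7\,\mu\mathrm{m}$ for the camera used:

$$\mathrm{GSD} = \frac{p\,H}{f} \approx \frac{7\,\mu\mathrm{m} \times 400\,\mathrm{km}}{1150\,\mathrm{mm}} \approx 2.4\,\mathrm{m}$$

at nadir; the slant range of an oblique shot exceeds the 400 km orbit altitude, which moves the estimate towards the reported ≈ 4 m.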
Nevertheless, the problems for low-resolution 3D models (GSD of 500 m) derived from NASA's HDEV experiment on the ISS (Muri et al. 2017) were already described by Schultz et al. (2017a, b) and are similar to those observed for the high-resolution Fuji model presented here. As a previous study has shown for Mt. Fuji, the high relief energy (large altitude differences of over 2600 m), clouds in the images, relief shadows, as well as the lack of information regarding shooting angle and ISS position, make it difficult to create a georeferenced 3D model without considerable errors (Schultz et al. 2018). The minimisation of artefacts in the 3D model is particularly important to derive visually attractive and functional VR applications which do not cause nausea. In a first attempt, it turned out that at least seven Mt. Fuji images were required to create a "visually appealing" 3D model (Schultz et al. 2018). However, a relatively large number of errors occurred, which makes the model insufficient for VR. These errors could be significantly reduced using 10 manually chosen ground control points for all 32 images. These points are also used to georeference the complete model. All calculations were performed with Agisoft Metashape© (Agisoft 2018). The first step is to align the images and to calculate the exact camera positions based on the common overlap between all images. Based on these data, a 3D point cloud and a mesh are calculated. This process is computationally intensive but can be accelerated with a GPU (graphics processing unit). In total, the Mt. Fuji model has 3202 so-called tie points, which represent the number of points detected on two (or more) different images, and a total number of 610,526 points for the dense point cloud.

The individual images are then projected onto the mesh (see Fig. 10) and are the basis to generate the texture. To enhance the quality of the model, gaps were filled and artefacts corrected manually. Especially in regions with clouds, the mesh was insufficient and large gaps occurred (see Fig. 10).

The complete orthomosaic has 6804 × 6628 pixels with a GSD of 2.68 m. The DEM shown in Fig. 10 has a lower resolution with a GSD of 5.8 m and 3173 × 3111 pixels. Consequently, the coloured 3D model shown in Fig. 11 covers an area of 15.1 km by 10.4 km. The results can be displayed as a simple, freely rotatable 3D model. In addition, anaglyph images (Michel 2013), which convey a three-dimensional visual impression, can be generated. The 3D
cube has been made accessible so that the whole information can be interpreted and not only one row.

5 Conclusion and Outlook

It has been shown how AR apps enhance static, printable worksheets with animated content and inexpensive, effective digital experiments in Earth observation, all independent of school IT infrastructure. Much content can be developed using Unity and Vuforia without additional scripts; however, since Unity is a game development environment and has no specialisation for image processing, even basic GIS functions and the digital experiments have to be scripted in C#, using methods that consume as little memory as possible. The use of specialised shaders written in HLSL is necessary for such methods as well.

Additionally, a VR environment has been augmented with a real DEM derived from astronauts' photography to enable students to do an excursion to Mount Fuji and experience its geology. Since the expensive VR hardware is not widely available, the application should be converted from a desktop application into a mobile one to reach more interested teachers and students. For the mobile version, the landscape has to be simplified heavily to work with the limited resources available on mobile devices.

References

Agisoft (2018) Agisoft Metashape user manual, standard edition, version 1.5. https://www.agisoft.com/pdf/metashape_1_5_en.pdf. Accessed 12 Sep 2019
Baqersad J, Poozesh P, Niezrecki C, Avitabile P (2017) Photogrammetry and optical methods in structural dynamics—a review. Mech Syst Signal Process 86:17–34. https://doi.org/10.1016/j.ymssp.2016.02.011. Accessed 20 Mar 2020
BHG: Baltic Hydrographic Commission (2013) Baltic Sea Bathymetry Database. https://data.bshc.pro. Accessed 20 Mar 2020
Epic Games (2019) Unreal Engine 4 documentation. Materials: controlling the appearance of surfaces in the world using shaders. https://docs.unrealengine.com/en-US/Engine/Rendering/Materials/index.html. Accessed 20 Mar 2020
Franc T (2012) Tides in the Earth–Moon system. In: Proc 21st Ann Conf Doct Stud (WDS 2012). Charles University, Prague. https://www.mff.cuni.cz/veda/konference/wds/proc/pdf12/WDS12_318_f12_Franc.pdf. Accessed 10 Aug 2019
Goetzke R, Hodam H, Rienow A, Voß K (2013) Tools and learning management functions for a competence-oriented integration of remote sensing in classrooms. In: Jekel T, Car A, Strob J, Griesebner G (eds) GI_Forum 2013. Wichmann, Offenbach, pp 458–463. https://doi.org/10.1553/giscience2013s458. Accessed 20 Mar 2020
GPDN: Geopotential Deutsche Nordsee (2013) Bathymetrie. www.gpdn.de/. Accessed 20 Mar 2020
Mikropoulos TA, Natsis A (2011) Educational virtual environments: a ten-year review of empirical research (1999–2009). Comput Educ 56(3):769–780. https://doi.org/10.1016/j.compedu.2010.10.020. Accessed 20 Mar 2020
Jensen L, Konradsen F (2018) A review of the use of virtual reality head-mounted displays in education and training. Educ Inf Technol 23(4):1515–1529. https://doi.org/10.1007/s10639-017-9676-0. Accessed 20 Mar 2020
Keil J, Edler D, Dickmann F (2019) Preparing the HoloLens for user studies: an augmented reality interface for the spatial adjustment of holographic objects in 3D indoor environments. Kartogr Nachr 69(3):205–215. https://doi.org/10.1007/s42489-019-00025-z. Accessed 20 Mar 2020
Kersten T, Deggim S, Tschirschwitz F, Lindstaedt M, Hinrichsen N (2018) Segeberg 1600—eine Stadtrekonstruktion in Virtual Reality. Kartogr Nachr 68(4):183–191
Korwan DR, Lucke RL, McGlothlin NR, Butcher SD, Wood DL, Bowles JH, Corson M, Snyder WA, Davis CO, Chen DT (2009) Laboratory characterization of the Hyperspectral Imager for the Coastal Ocean (HICO). In: Proc IEEE IGARSS
Kreit E, Mäthger LM, Hanlon RT, Dennis PB, Naik RR, Forsythe E, Heikenfeld J (2013) Biological versus electronic adaptive coloration: how can one inform the other? J R Soc Interface. https://doi.org/10.1098/rsif.2012.0601. Accessed 20 Mar 2020
Lindner C, Hodam H, Ortwein A, Schultz J, Selg F, Jürgens C, Rienow A (2018) Towards a new horizon for planetary observation in education. In: Proc 2nd Symp Space Educ Activ. https://www.hit.bme.hu/~bacsardi/SSEA/SSEA2018_proceedings.pdf. Accessed 17 Sep 2019
Lindner C, Rienow A, Jürgens C (2019) Augmented reality applications as digital experiments for education—an example in the Earth–Moon system. Acta Astronaut 161:66–74. https://doi.org/10.1016/j.actaastro.2019.05.025. Accessed 20 Mar 2020
Liu GZ, Hwang GJ (2010) A key step to understanding paradigm shifts in e-learning: towards context-aware ubiquitous learning. BJET 41(2):E1–E9. https://doi.org/10.1111/j.1467-8535.2009.00976.x. Accessed 20 Mar 2020
Martinis S, Twele A, Plank S, Zwenzner H, Danzeglocke J, Strunz G, Lüttenberg HP, Dech S (2017) The International Charter 'Space and Major Disasters': DLR's contributions to emergency response worldwide. PFG 85(5):317–325. https://doi.org/10.1007/s41064-017-0032-1. Accessed 20 Mar 2020
METI and NASA (2009) ASTER global digital elevation model [data set]. NASA EOSDIS Land Processes DAAC. https://doi.org/10.5067/ASTER/ASTGTM.002
Michel B (2013) Digital stereoscopy: scene to screen 3D production workflows. Stereoscopy News
Milgram P, Kishino F (1994) A taxonomy of mixed reality visual displays. IEICE Trans Inf Syst E77-D(12):1321–1329
Milgram P, Takemura H, Utsumi A, Kishino F (1995) Augmented reality: a class of displays on the reality–virtuality continuum. In: Proc Telemanipulator and Telepresence Technologies. https://doi.org/10.1117/12.197321. Accessed 20 Mar 2020
Muri P, Runco S, Fontanot C, Getteau C (2017) The High Definition Earth Viewing (HDEV) payload. In: Proc IEEE Aerospace Conf, Big Sky, MT, pp 1–7
Ott M, Freina L (2015) A literature review on immersive virtual reality in education: state of the art and perspectives. In: Proc eLearning and Software for Education (eLSE)
Ortwein A, Graw V, Heinemann S, Menz G, Schultz J, Selg F, Rienow A (2016) Pushed beyond the pixel—interdisciplinary Earth observation education from the ISS in schools. In: Proc IAC
MKJS BW: Ministerium für Kultus, Jugend und Sport Baden-Württemberg (2016) Bildungsplan des Gymnasiums – Bildungsplan 2016 – Geographie. https://www.bildungsplaene-bw.de/site/bildungsplan/get/documents/lsbw/export-pdf/depot-pdf/ALLG/BP2016BW_ALLG_GYM_GEO.pdf. Accessed 31 Jan 2020
MSB NRW: Ministerium für Schule und Bildung des Landes Nordrhein-Westfalen (2019) Kernlehrplan für die Sekundarstufe I – Gymnasium in Nordrhein-Westfalen. Erdkunde. https://www.schulentwicklung.nrw.de/lehrplaene/lehrplan/200/g9_ek_klp_%203408_2019_06_23.pdf. Accessed 31 Jan 2020
Pagán B (2006) Positive contributions of constructivism to educational design. EJOP 2(1):23–32. https://doi.org/10.5964/ejop.v2i1.318. Accessed 20 Mar 2020
PERC: Planetary Exploration Research Center (2018) ISS Meteor Observation Project "METEOR". www.perc.it-chiba.ac.jp/project/meteor/. Accessed 20 Aug 2019
PTC (n.d.) Vuforia Developer Library—optimizing target detection and tracking stability. https://library.vuforia.com/articles/Solution/Optimizing-Target-Detection-and-Tracking-Stability.html. Accessed 31 Jan 2020
Reiners T, Wood L, Gregory S (2014) Experimental study on consumer-technology supported authentic immersion in virtual environments for education and vocational training. In: Hegarty B, McDonald J, Loke S (eds) Proc 31st Ann ascilite Conf, pp 171–181
Rienow A, Graw V, Heinemann S, Schultz J, Selg F, Menz G (2016) Earth observation from the ISS Columbus Laboratory—an open education approach to foster geographical competences of pupils in secondary schools. In: Proc ESA LPS 2016
Runco S (2015) International Space Station—High Definition Earth Viewing (HDEV). https://www.nasa.gov/mission_pages/station/research/experiments/explorer/Investigation.html?#id=892. Accessed 20 Aug 2019
SBJF: Senatsverwaltung für Bildung, Jugend und Familie (2015) Rahmenlehrplan Teil C—Geografie. https://bildungsserver.berlin-brandenburg.de/fileadmin/bbb/unterricht/rahmenlehrplaene/Rahmenlehrplanprojekt/amtliche_Fassung/Teil_C_Geografie_2015_11_10_WEB.pdf. Accessed 31 Jan 2020
Siegmund A (2011) Satellitenbilder im Unterricht – eine Ländervergleichsstudie zur Ableitung fernerkundungsdidaktischer Grundsätze. Dissertation, Heidelberg Univ of Educ
Schultz J, Lindner C, Hodam H, Ortwein A, Selg F, Weppler J, Rienow A (2017a) Augmenting pupils' reality from space—digital learning media based on Earth observation data from the ISS. In: Proc IAC
Schultz J, Ortwein A, Rienow A (2017b) Technical note: using ISS videos in Earth observation—implementations for science and education. Eur J Remote Sens 51(1):28–32. https://doi.org/10.1080/22797254.2017.1396880
Schultz J, Hodam H, Lindner C, Ortwein A, Selg F, Rienow A (2018) Ableitung von 3D-Modellen aus Daten des High Definition Earth Viewing-Experiments (ISS)—Anwendungen für den Schulunterricht. Pub DGPF 27:129–140
Stefanov W, Evans CA (2014) The International Space Station: a unique platform for remote sensing of natural disasters. JSC Biennial Res Rep, pp 108–110
Tschirschwitz F, Richerzhagen C, Przybilla HJ, Kersten TP (2019) Duisburg 1566: transferring a historic 3D city model from Google Earth into a virtual reality application. PFG 87(1–2):47–56. https://doi.org/10.1007/s41064-019-00065-0. Accessed 20 Mar 2020
Voß K, Goetzke R, Thierfeldt F (2007) Integration von Fernerkundung im Schulunterricht. Pub DGPF 16:41–50
Voß K, Goetzke R, Thierfeldt F, Menz G (2007) Integrating applied remote sensing methodology in secondary education. In: Proc IEEE IGARSS, pp 2167–2169