
A Modular, Reconfigurable and Portable Framework for On-Board Data Processing: Architecture and Applications

Murray Ireland, Charlotte Crawshaw, Mikulas Cebecauer, Lucy Donnell, Craig Hay
Craft Prospect, Glasgow, Scotland, UK

Abstract—A balance between meeting customer data requirements and delivering timely, tailored solutions is critical. The Astral Intelligence Toolbox is a framework that provides a modular, reconfigurable architecture for developing and deploying software applications for on-board processing. Different use cases, processing targets and data sources can be rapidly targeted using a combination of reusable software components and development tools. This allows customer requirements to be quickly mapped to component selection and configuration. Applications can then be iteratively developed from demonstration through to flight. Case studies for configured applications are presented and results shown, based on engagements with end users during an ESA InCubed activity.

Index Terms—on-board processing, machine learning, artificial intelligence, Earth observation, architecture

I. INTRODUCTION

The ever-growing interest in on-board data processing (OBDP) and information extraction (IE) capabilities in satellite missions is well known, arising from a need to address challenges in data bottlenecks, the latency of actionable data, big data management and space data service provision. On-board processing tasks can be used to reduce data and generate in-orbit products and metadata, while machine learning or other information extraction methods can add further value to these outputs and enable unique products such as feature masks, content tags and other "actionable" metadata.

Use cases for OBDP and IE range from the generic and instrument-focussed (e.g. reducing/compressing hyperspectral data) to highly application-specific (e.g. generating specific data products and meeting requirements on information latency and quality). Even within a given use case, there can be huge variety in the components of the data pipeline – operational behaviour, instrument, on-board computation resources, ground station costs, bandwidths and revisit rates, and service segment resources can all impact the requirements and constraints on any OBDP and IE activities. An OBDP solution developed for one use case may therefore be unsuitable for a similar use case with a different instrument, platform, ground or service segment. This makes solution development bespoke, costly and, until flight-tested for a given configuration, high-risk.

One solution to these issues is to employ a modular, component-based approach to developing on-board software applications for data processing and information extraction. This approach utilises standard building blocks which can be configured, integrated and deployed together to target variations on use cases or entirely new ones.

This paper presents a brief overview of the requirements, architecture and sample applications for a modular on-board processing and information extraction framework, the Astral Intelligence Toolbox (AITB). The approach taken ensures that developed components can be re-used across multiple use cases and applications, instruments and computing platforms, ensuring a balance between meeting customer requirements and efficient, timely development of solutions. The sample applications have been developed in tandem with Surrey Satellite Technology for the InCubed activity "Demonstrating an Innovative, Flexible and Intelligent Payload Chain for High Data Throughput on Small EO Satellites" [2].

II. REQUIREMENTS

The framework has the following driving requirements:
• Employ a component-based approach to aid re-use, unit testing, continuous development and portability.
• Allow simple, run-time configuration of key component parameters.
• Allow targeting of new instruments and target platforms with minimal impact on existing component source code.
• Facilitate the generation of customer-focussed mission metrics – cost, quality, timeliness and readability.

This work was funded under the ESA InCubed activity "Demonstrating an Innovative, Flexible and Intelligent Payload Chain for High Data Throughput on Small EO Satellites", led by Surrey Satellite Technology Ltd.
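The first two requirements can be illustrated with a short sketch. The following is a hypothetical example of a runtime-configurable component, with invented names (`Component`, `RadiometricCorrection`, `gain`, `offset`); it is not the actual AITB API, only an illustration of how parameters and tasks can be bundled so that retargeting an instrument becomes a configuration change rather than a code change.

```python
from dataclasses import dataclass, field

# Hypothetical sketch (not the actual AITB API): a component bundles
# configurable parameters with the tasks that operate on them.
@dataclass
class Component:
    name: str
    params: dict = field(default_factory=dict)

    def configure(self, **overrides):
        # Run-time configuration of key component parameters.
        self.params.update(overrides)
        return self

class RadiometricCorrection(Component):
    def calibrate(self, raw_pixels):
        # Task: apply a simple gain/offset calibration to raw counts.
        gain = self.params.get("gain", 1.0)
        offset = self.params.get("offset", 0.0)
        return [gain * p + offset for p in raw_pixels]

# Targeting a new instrument is then a configuration change only.
cal = RadiometricCorrection("radcal", {"gain": 0.5, "offset": 1.0})
cal.configure(gain=2.0)          # runtime override
print(cal.calibrate([1, 2, 3]))  # [3.0, 5.0, 7.0]
```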

Authorized licensed use limited to: J.R.D. Tata Memorial Library Indian Institute of Science Bengaluru. Downloaded on June 04,2024 at 14:44:52 UTC from IEEE Xplore. Restrictions apply.
• Enable a balance between feasible, in-orbit solutions and high performance.

These requirements are defined in order to favour re-use and reconfiguration of on-board processing solutions over highly problem-specific solutions. This re-use also provides benefits in terms of software assurance, rapid deployment and testing.

III. ARCHITECTURE

The high-level framework is presented in Fig. 1. Each component category (pre-processing, CDH, etc.) contains several modular software components which are designed to perform specific related tasks. For example, a radiometric correction component records current calibration parameters and provides tasks to radiometrically calibrate raw data. A classification component records prediction thresholds, data scaling values and model architectures and provides tasks to perform chip classification on an input image.

Fig. 1: High-level framework architecture, showing components for handling, processing and visualising data.

Components can be tailored to the use case. For example, a data loader component can use local OS file system calls or utilise the API of a host framework. A visualisation component can be enabled for demonstration and verification purposes during development.

Any given software application then comprises a combination of core components (data manager, data loader) and use case-specific components, which aim to generate from raw data a suite of data and/or information products and metadata which meet the requirements of end users.

Components are selected based on their functionality and then configured either at build-time or runtime as required. For example, a Masking component offers inference and pre- and post-processing tasks (e.g. normalisation, image chipping/stitching), ingesting an image as a number array and returning a mask as a number array of identical dimensions with one or more channels. The parameters of the component define key semantic segmentation properties such as confidence threshold, chipping grid dimensions, pre- and post-inference number scaling ranges and more. The ML model itself is stored as a unique component object which provides a common I/O interface for different ML libraries (e.g. TensorFlow, OpenVINO, PyTorch).

The Data Manager component provides a comprehensive set of tasks for creating, retrieving, editing and saving data products within the application. Data products themselves are stored in a component array parameter and interacted with exclusively via these tasks. Other components are structured similarly.

Critical parameter values are hard-coded at build-time and can be modified by the application itself or dedicated interfaces during execution. Non-critical or runtime parameters such as visualisation settings or filesystem paths can be optionally set at runtime by users.

IV. TESTING

A combination of built-in tests such as time and throughput profiling and external tools for others such as accuracy and mission-contextual metrics are used to ensure compliance with functional and performance requirements, including those derived from end user needs.

In addition to testing within a specialised test harness, applications are also deployed within a third-party OBDP runtime system, which is being developed in parallel by project partners under InCubed funding. This validates the portability claim of the AITB applications and ensures compatibility with near-term flight systems for in-orbit demonstration opportunities.

V. CASE STUDY: HYPERSPECTRAL DATA REDUCTION

In the first case study, a software application is developed to enable intelligent reduction of hyperspectral data, overcoming data bottlenecks in the downlink and reducing the low-value content in the data (clouds, in this case). The test data in this case are raw hyperspectral images at resolutions of 17 m/px or 34 m/px, depending on the operating mode.

A. Requirements

The driving requirements for the hyperspectral data reduction application are:
• Process raw data into a "science-ready" product (top-of-atmosphere reflectance).
• Reduce compressed file size relative to a traditional baseline (CCSDS 122 compression of raw data) by optimising compression parameters for cloudy regions.
• Increase the number of useful products per downlink.

B. Application Configuration

The application is configured from the AITB framework by first configuring the parameters of each relevant component and then scripting the sequence order and interactions of key
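The common model I/O interface described above can be sketched with an adapter pattern. The names here (`ModelBackend`, `ThresholdBackend`, `Masking`) are invented for illustration, and a trivial brightness threshold stands in for a trained TensorFlow/OpenVINO/PyTorch model; the real AITB component object is not public, so this is only one plausible shape for such an interface.

```python
# Illustrative sketch of a "common I/O interface" over ML libraries;
# class and method names are assumptions, not the AITB's own API.
class ModelBackend:
    """Uniform predict() contract hiding the underlying ML library."""
    def predict(self, image):
        raise NotImplementedError

class ThresholdBackend(ModelBackend):
    # Stand-in for a library-specific inference session: a brightness
    # threshold plays the role of the trained segmentation model.
    def __init__(self, threshold):
        self.threshold = threshold

    def predict(self, image):
        return [[1 if px >= self.threshold else 0 for px in row]
                for row in image]

class Masking:
    """Ingests a 2-D number array, returns a mask of identical dimensions."""
    def __init__(self, backend, scale=1.0):
        self.backend = backend
        self.scale = scale  # pre-inference number scaling parameter

    def run(self, image):
        scaled = [[px * self.scale for px in row] for row in image]
        return self.backend.predict(scaled)

mask = Masking(ThresholdBackend(0.5), scale=0.01).run([[10, 60], [90, 20]])
print(mask)  # [[0, 1], [1, 0]]
```

Swapping ML libraries then means swapping the backend object, while the Masking component and its parameters stay unchanged.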
processing tasks within each component. The data flow and output products are shown in Fig. 2.

Fig. 2: Processing stages for the hyperspectral data reduction application (raw data → radiometric calibration → L1 data → semantic segmentation → cloud mask → feature removal → compression → compressed L1 data).

C. Results

Visual results are presented in Fig. 3. These show the progression from raw image data to radiometrically calibrated product to a cloud mask (a high-level "information" product). The cloud mask can then enable improved compression gains by either zeroing cloudy pixels or performing cloud-sensitive hybrid (lossy + lossless) compression.

Using this method, compression gains vs a purely-lossless baseline (CCSDS 122.2) are calculated based on the region of interest. In one example, non-cloudy pixels comprise 79.5% of the image, yielding a relative increase in compression ratio of 1.35. For average cloud cover (50%), the relative compression ratio gain is 2.

Fig. 3: Data products generated by the hyperspectral data reduction application from raw hyperspectral data. (a) Before calibration. (b) After calibration. (c) Calibrated image showing cloud mask contours.

VI. CASE STUDY: WILDFIRE ALERTING

In the second case study, a software application is developed to enable generation of human-readable, lightweight alerts indicating the presence and location of wildfires. This acts as an alternative or supplement to services such as Copernicus or FIRMS, which rely on large raw data files and extensive ground service infrastructure. The test data here are Landsat 8 multispectral images of Oregon, USA.

A. Requirements

The driving requirements for the wildfire alerting application are:
• Process multispectral image data into actionable information (a human-readable alert message).
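A first-order sense of these gains can be had from a simple model, which is an assumption on our part rather than the paper's own calculation: if cloudy pixels compress to a fraction of their lossless cost, the compressed size scales with the effective pixel cost, and the gain over a purely-lossless baseline is the reciprocal. With zero residual cost this reproduces the factor-of-2 gain at 50% average cloud cover; the reported 1.35 at 79.5% clear pixels exceeds the idealised estimate (≈1.26), so the actual calculation evidently captures further savings (e.g. region-of-interest handling) that this sketch does not.

```python
def relative_gain(clear_fraction, cloudy_cost=0.0):
    # Idealised model: compressed size ∝ clear pixels at full cost plus
    # cloudy pixels at a reduced relative cost; gain is the reciprocal.
    return 1.0 / (clear_fraction + (1.0 - clear_fraction) * cloudy_cost)

print(relative_gain(0.5))               # 2.0 — matches the 50% cloud cover figure
print(round(relative_gain(0.795), 2))   # 1.26 — below the reported 1.35
```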

• Deliver verification products to supplement alerts, such as annotated, lightweight image files.
• Meet end user-defined targets for classification accuracy.

B. Application Configuration

The application is configured as shown in Fig. 4.

Fig. 4: Processing stages for the wildfire alerting application (L1B data → georeferencing → L1C data → semantic segmentation → fire mask → product creation → compression → alerts and thumbnails).

C. Results

Performance metrics for the wildfire alerting application are presented in Table I. These meet the requirements for trustworthy utilisation of alert and thumbnail data on the ground, having both low false positives (a falsely detected fire → wasted time and resources) and low false negatives (a missed fire → environmental damage, loss of life).

TABLE I: Wildfire alerting application metrics.

Type            | Metric                          | Value
Model accuracy  | MeanIoU                         | 93%
                | True positive                   | 99.6%
                | False positive                  | 0.4%
                | True negative                   | 97.2%
                | False negative                  | 2.8%
Latency         | Input image to alert generation | 4.84 s
Geolocation     | Precision                       | 10 m

Visual results are presented in Fig. 5. These show the progression from L1 image data to an annotated image indicating fire boundaries and a plain-text alert message. The multispectral, high-quality image data is not the primary product for the end user in this case and can be deprioritised in favour of small text and image files, which are on the order of kilobytes and can be used to provide a high level of confidence in fire detections and locations.

VII. CASE STUDY: LAND COVER THEMATIC MAPPING

In the third case study, a software application is developed to create thematic land cover maps on-board the satellite using RGB data from a video sensor. An additional cloud detection stage allows the land cover inference to be skipped if cloud coverage is high enough. Here, the test data are Bayer pattern images of various land scenes at a resolution of 1 m/px.

A. Requirements

The driving requirements for the land cover thematic mapping application are:
• Process Bayer pattern image data into pseudo-reflectance products.
• Generate composite land cover and cloud map products from the image data.
• Meet end user-defined targets for classification accuracy.

B. Application Configuration

The application is configured as shown in Fig. 6.

The land cover model is trained on a custom dataset created using ESA WorldCover labels [1]. The label set includes: tree cover, shrubland, grassland, cropland, built up, bare/sparse vegetation, snow and ice, permanent water bodies, herbaceous wetland, mangroves, and moss and lichen.

C. Results

Performance testing results from the land cover mapping application are forthcoming at the time of writing. Preliminary results are shown in Fig. 7.

VIII. ROADMAP AND FUTURE

Further applications and capabilities are being developed by Craft Prospect under R&D and commercial funding. These include capabilities extending beyond payload data processing into mission-critical autonomy, such as real-time decision-making, reactive mission planning and distributed tasking. As with the existing framework, these applications comprise components and standardised interfaces to facilitate re-use, flexibility, portability and compliance with mission requirements.

Flight heritage for elements of the AITB is anticipated in the short term. CPL's cloud detection model is expected to
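The kilobyte-scale plain-text alert described above can be sketched as follows. The field names, message format and example values are invented for illustration; only the general idea (a self-contained, human-readable product orders of magnitude smaller than the source imagery) comes from the text.

```python
# Hypothetical sketch of a lightweight, human-readable wildfire alert
# product; field names and layout are assumptions, not the AITB format.
def make_alert(detections, scene_id, timestamp):
    lines = [f"WILDFIRE ALERT {scene_id} {timestamp}"]
    for i, d in enumerate(detections, start=1):
        lines.append(
            f"  fire {i}: lat={d['lat']:.4f} lon={d['lon']:.4f} "
            f"area_ha={d['area_ha']:.1f} confidence={d['confidence']:.0%}"
        )
    return "\n".join(lines)

alert = make_alert(
    [{"lat": 43.8041, "lon": -120.5542, "area_ha": 12.5, "confidence": 0.97}],
    scene_id="SCENE-001",          # placeholder identifier
    timestamp="2021-08-01T18:32:00Z",
)
print(alert)
print(f"{len(alert.encode())} bytes")  # well under a kilobyte
```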

be demonstrated in-orbit imminently. The Forwards Looking Imager, its own models and its geolocation functionality are expected to gain flight heritage in 2024. Further in-orbit demonstrations of full AITB applications are expected from 2024. Further collaboration with SSTL is also planned.

Fig. 5: Data products generated by the wildfire alerting application from Landsat 8 test data. (a) False colour image. (b) False colour image showing wildfire contours. (c) Alert product for detected fires.

Fig. 6: Processing stages for the land cover mapping application (raw data → debayering → radiometric calibration → L1B data → semantic segmentation → cloud mask → semantic segmentation → land cover map → mask stacking and product creation → feature map).

REFERENCES
[1] ESA, "ESA WorldCover," https://esa-worldcover.org
[2] SSTL, "FIPC InCubed," https://incubed.esa.int/portfolio/fipc
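The cloud-gating step in the land cover pipeline (Fig. 6) can be sketched in a few lines. The threshold value and function names here are assumptions; the sketch only illustrates the stated behaviour of skipping land cover inference when cloud coverage is high enough.

```python
# Sketch of the cloud-gating behaviour described for the land cover
# application; the 80% threshold and return values are assumptions.
def cloud_fraction(cloud_mask):
    # Fraction of pixels flagged as cloud in a binary 2-D mask.
    cells = [px for row in cloud_mask for px in row]
    return sum(cells) / len(cells)

def run_land_cover(cloud_mask, infer, max_cloud=0.8):
    if cloud_fraction(cloud_mask) > max_cloud:
        return None  # scene too cloudy: skip the expensive inference
    return infer()

clear_scene = [[0, 0], [1, 0]]   # 25% cloud
cloudy_scene = [[1, 1], [1, 1]]  # 100% cloud
print(run_land_cover(clear_scene, lambda: "land-cover-map"))
print(run_land_cover(cloudy_scene, lambda: "land-cover-map"))
```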

Fig. 7: Data products generated by land cover mapping application from raw RGB test data. (a) Raw Bayer pattern image. (b) Debayered image. (c) Calibrated image. (d) Land cover mask. The model accurately predicts high coverage for built up areas and water bodies, but also provides several spurious outputs, including a lack of confidence (< 50%) in some areas, for which "no data" is returned.
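The "no data" behaviour in Fig. 7 amounts to per-pixel confidence thresholding, which can be sketched as below. The function and the example class scores are illustrative; the labels follow the WorldCover label set quoted earlier.

```python
# Sketch of the confidence-thresholded classification behind the
# "no data" outputs in Fig. 7; the function itself is illustrative.
NO_DATA = "no data"

def classify(scores, labels, threshold=0.5):
    # Pick the highest-scoring class; fall back to "no data" when the
    # model's best score is below the confidence threshold.
    best = max(range(len(scores)), key=scores.__getitem__)
    return labels[best] if scores[best] >= threshold else NO_DATA

labels = ["tree cover", "grassland", "built up", "permanent water bodies"]
print(classify([0.1, 0.2, 0.6, 0.1], labels))  # built up
print(classify([0.3, 0.3, 0.2, 0.2], labels))  # no data
```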
