
HardwareX 14 (2023) e00426


ROMR: A ROS-based open-source mobile robot


Linus Nwankwo *, Clemens Fritze, Konrad Bartsch, Elmar Rueckert
Chair of Cyber-Physical Systems, Montanuniversität, 8700 Leoben, Austria

ARTICLE INFO

Keywords: Mobile robot; ROS; open-source robot; differential-drive robot; autonomous robot

ABSTRACT

Currently, commercially available intelligent transport robots that are capable of carrying up to 90 kg of load can cost $5,000 or even more. This makes real-world experimentation prohibitively expensive and limits the applicability of such systems to everyday home or industrial tasks. Aside from their high cost, the majority of commercially available platforms are either closed-source, platform-specific or use difficult-to-customize hardware and firmware. In this work, we present a low-cost, open-source and modular alternative, referred to herein as "ROS-based Open-source Mobile Robot (ROMR)". ROMR utilizes off-the-shelf (OTS) components, additive manufacturing technologies, aluminium profiles, and a consumer hoverboard with high-torque brushless direct current (BLDC) motors. ROMR is fully compatible with the robot operating system (ROS), has a maximum payload of 90 kg, and costs less than $1500. Furthermore, ROMR offers a simple yet robust framework for contextualizing simultaneous localization and mapping (SLAM) algorithms, an essential prerequisite for autonomous robot navigation. The robustness and performance of the ROMR were validated through real-world and simulation experiments. All the design, construction and software files are freely available online under the GNU GPL v3 license at https://doi.org/10.17605/OSF.IO/K83X7. A descriptive video of ROMR can be found at https://osf.io/ku8ag.

© 2023 The Author(s). Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

Specification table

Hardware name A ROS-based open-source mobile robot (ROMR)


Subject area Robotics; Sensor fusion; Simultaneous localisation and mapping (SLAM); Navigation; Teleoperation; Research and development in robotics; General
Hardware type Mechatronic; Robotic
Open source license GNU GPL v3
Cost of hardware < $1500
Source file repository https://doi.org/10.17605/OSF.IO/K83X7

* Corresponding author.
E-mail address: [email protected] (L. Nwankwo).

https://doi.org/10.1016/j.ohx.2023.e00426
2468-0672/© 2023 The Author(s). Published by Elsevier Ltd.
This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

1. Hardware in context

Intelligent transport robots (ITRs) have become an integral part of our daily activities in recent years [1–3]. Their application to day-to-day activities, especially in industrial logistics [4], warehousing [5], household tasks [6,7], etc., provides not only a cleaner and safer work environment but also helps to reduce the high costs of production. These robots offer the potential for significant improvements in industrial safety [8], productivity [9] and general operational efficiency [10]. However, several challenges still remain, such as reduced capacity [11,10], affordability [12], and the difficulty of modifying the inbuilt hardware and firmware [13].
Although significant effort has been undertaken by the scientific community in recent years to develop a standardised low-cost mobile platform, there is no open-source system that fulfils high industrial requirements. Most available open-source, low-cost platforms are still limited in their functions and features, i.e., they are not commercially available, do not match the payloads required for logistics tasks, or cannot be adapted. For example, while the platforms in [14–16] are inexpensive, they cannot be used in day-to-day tasks that require transporting loads of 5 kg or more.
On the other hand, many commercially available industrial platforms exist that feature high payloads, see Table 1 for an
overview. Unfortunately, they are expensive, closed-source, and platform-specific with inbuilt hardware, and firmware that
may be difficult to modify [16]. This restrains many users’ ability to explore multiple customisation or reconfiguration
options to accelerate the development of intelligent systems.
Consequently, there is a crucial need to have low-cost robots with comparable features that can easily be scaled to adapt
to any useful purpose. To this end, we propose ROMR, a modular, open-source and low-cost alternative for general-purpose
applications including research, navigation [17], and logistics.
ROMR is fully compatible with ROS, has a maximum payload of 90 kg and costs less than $1500. It features lidar and RGB-D camera sensing technologies for potential applications in perception [18], simultaneous localisation and mapping [19,20], deep learning tasks [21], and many more. Figs. 1 and 8 present the pictorial view and the cyber-physical anatomy of ROMR, respectively.

1.1. Related hardware platforms

In Table 1, we present a comparative evaluation of ROMR against similar hardware developed in recent years for research, navigation and logistics applications. Our comparison focuses on key features that define the openness, robustness and versatility of the robot design, e.g., ease of reconfiguration or modification, cost, load-carrying capacity, and full compatibility with ROS [22].
In Table 1, the ROS feature indicates whether the robot is fully compatible with ROS or not. Custom determines whether
the platform satisfies easy modification of its design and integration of additional components. OpenS determines whether
the hardware (electronic circuits, design files, etc.) and software (source codes, ROS packages, etc.) are fully open-source and
maintained by the open-source community such that external hobbyists can replicate the same design without the need to
contact the developer. Finally, the cost feature determines the affordability of the system. As shown in Table 1, the majority
of the robots are not open-source and are expensive. This limits wide application, e.g., in research, navigation and logistics.
Our ROMR has been developed as a low-cost, open-source alternative. The approximate cost to redevelop it currently stands at less than $1,500. Recently, ROMR has been used for both B.Sc. and M.Sc. projects in our laboratory.

2. The ROMR description

ROMR is compactly and robustly designed to ensure stability, ease of integration of additional components, and low cost
of reproducing the system. In this section, we describe the robot’s hardware and software platforms. Afterwards, we describe
the technical specifications and tools, as well as the detailed architecture of the control unit.

Table 1
Comparison of existing mobile platforms to our ROMR development.

Robot Name              ROS   Payload (kg)   Custom   Cost (k$)   OpenS

RMP Lite 220            ✓     50             ✓        2.99        ✗
Ackerman Pro Smart      ✓     22             ✓        3.99        ✗
Nvidia Carter           ✓     50             ✓        10.00       ✓
Panther UGV             ✓     80             ✓        15.90       ✗
Tiago base              ✓     100            ✓        11.50       ✗
Clearpath TurtleBot 4   ✓     9              ✓        1.90        ✓
Summit XL               ✓     65             ✓        11.50       ✗
MIR100                  ✓     100            ✓        24.00       ✗
AgileX Scout 2.0        ✓     50             ✓        12.96       ✗
Jackal J100             ✓     20             ✓        18.21       ✗
4WD eXplorer            ✓     90             ✓        15.70       ✗
ROSbot 2.0              ✓     10             ✓        2.34        ✗
ROMR                    ✓     90             ✓        1.50        ✓


Fig. 1. ROMR is built from consumer hoverboard wheels with high-torque brushless direct current (BLDC) motors. It utilises an Arduino Mega Rev3, an Nvidia Jetson Nano, and the components described in Tables 4 and 5. (a) Front view, (b) side view, (c) back view.

2.1. Hardware description

We leveraged off-the-shelf (OTS) electronics that are commercially available online, additive manufacturing technologies
(3D printing), and aluminium profiles for the robot’s structural design. The main reason for using aluminium profiles is to
achieve a lightweight structure that can hold the hardware and associated electronics of the robot without increasing its
overall weight. At the same time, the profiles could resist load stress and damage during everyday use. The profile bars with
slots were also used to enclose all the electronics and power subsystems at the base of the robot for proper weight distribution. This increases the flexibility to connect any additional hardware component to it.
ROMR is equipped with an Arduino Mega Rev 3, an Nvidia Jetson Nano board, an ODrive 56 V V3.6 brushless direct current (DC) motor controller, and two 350 W hoverboard brushless motors with five inbuilt hall sensors. The Arduino board is responsible for low-level tasks, such as gesture-based control of the robot using an inertial measurement unit (IMU) sensor and teleoperation from remote-controlled (RC) devices. In contrast, the Jetson Nano board handles high-level processing tasks such as deep learning, SLAM, and ROS navigation tasks with the RGB-D cameras, LiDARs, etc. The two boards communicate with each other through a serial UART interface.
Furthermore, ROMR is powered by rechargeable lithium-ion batteries (36 V, 4400 mAh), which are affordable, inexpensive to maintain, and eco-friendly (i.e., they do not contain heavy metals such as lead or cadmium, which are harmful to the environment and human health). Additionally, the robot is endowed with an Intel Realsense D435i RGB-D camera for visual perception and depth sensing, and an Intel Realsense T265 for localization and tracking. ROMR is also equipped with a 9-axis MPU-9250 IMU sensor for tracking and localisation. The IMU sensor is also used for gesture-based teleoperation, which allows a non-robotics expert to intuitively control the robot using hand gestures (see SubSection 6.2.3 for more details). For 2D mapping, we used an RPlidar A2 M8 with a 360-degree field of view (FOV). This lidar has a maximum range of 16 m and operates at a frequency of 10 Hz. The lidar sensor allowed us to create a 2D occupancy-grid map of the environment, which was then used to support ROMR's navigation, localization, and obstacle detection within the environment.
We supported ROMR with a small caster wheel in addition to its drive wheels to increase stability, manoeuvrability, and proper weight distribution. Although it is possible to use a bigger caster wheel or two caster wheels, we chose a smaller one to make it easier to navigate around tight spaces and obstacles and to provide better stability and balance for the robot, especially when it has to change direction quickly or make sudden turns. Furthermore, with a smaller caster wheel, the weight of the robot is distributed more evenly, which helps to prevent tipping or loss of balance. Note that it is recommended to add a piece of rubber between the caster wheel and the ROMR's base frame to compensate for hyperstaticity.

2.2. Software description

The robot's main software is based primarily on the ROS framework, which runs on both Ubuntu 18.04 (ROS Melodic) and Ubuntu 20.04 (ROS Noetic). The ROS framework provides a set of tools, libraries, and conventions for building the robot system.
The software subsystems include the Arduino sketches, ODrive calibration programs, the ROS workspace containing the
ROMR universal robot description format (URDF) files for a Gazebo simulation, the Gazebo plugins files, the ROMR meshes,
the launch files, the joint and rviz configuration files, and the RPlidar packages. The files are listed in Table 6 and are published under the open-source GNU GPL v3 license, which allows the community to reproduce, modify, redistribute, and
republish them.


2.3. Control unit description

The control unit includes multiple options for controlling the robot. In addition to the existing hardware described in
SubSection 2.1, the control unit includes RC receivers and transmitters, an Android device running a ROS-mobile app, an
IMU sensor, an nRF24L01 + module, and an additional Arduino board. The RC receivers and transmitters are used to provide
manual control of the robot. The Android device running the ROS-mobile app serves as an alternative control interface for the
robot. The nRF24L01 + module provides wireless communication between the control unit and the robot. The additional
Arduino board is used to interface with the nRF24L01 + module and IMU sensor to handle wireless communication with
the robot in case of gesture-based teleoperation. SubSections 6.2.1, 6.2.2, 6.2.3, and Figs. 11, 12 and 14 provide insights into
the architecture of the control unit.

2.4. Technical specifications and features

The technical specifications of ROMR are summarized in Table 2. Furthermore, in Table 3, we present the tools and key
features of ROMR in line with other robots with comparable specifications. Initially, when envisioning the design, one of
our core goals was to develop a low-cost scalable platform that robotic developers and the open-source community could
easily adapt, to foster research in mobile navigation. For this reason, we tailored our design considerations based on this goal
such that the ROMR should be:

• Modular – To offer users the opportunity to easily integrate additional parts or units and reconfigure them to suit their needs.
• Portable and simple – To ensure that minimal, off-the-shelf hardware components can be used for its construction and replication. This allows users not to worry about purchasing costly components and instead focus on the system's functional design.
• Low-cost and open-source – To ensure affordability and commercialisation of the system, so that users can leverage the ROMR framework in any form to develop novel and non-trivial robotic applications.
• Versatile and suitable – To ensure that it takes minimal time to re-programme the robot for any useful purpose, whether navigation, logistics, etc., as well as to adapt to new processes and changes in the environment.
• Unique – To enable hobbyists and users to learn new tools, techniques and methods useful for accelerating the development of intelligent systems.

Table 2
ROMR technical specifications. L → Length, W → Width, H → Height, ⌀ → Wheel diameter.

Parameters                 Technical specifications

Robot dimensions           L × W × H = 0.46 m × 0.34 m × 0.43 m
Wheel dimensions           Drive wheel (⌀ = 0.165 m); caster wheel (⌀ = 0.075 m)
Inter-wheel distance       0.29 m
Robot weight               17.1 kg
Max. payload               90 kg
Max. speed                 Up to 3.33 m/s
Max. stable speed          < 2.5 m/s
Battery capacity           36 V, 4400 mAh
Motor type                 BLDC with 15 pole pairs (350 W × 2)
Ground clearance           0.065 m
Operation environment      Indoor and outdoor
Run time (full charge)     Approximately 8 h with the robot weight (17.1 kg) only

Table 3
Overview of the ROMR hardware and software tools.

Features                        Tools
Actuation                       ODrive 56 V V3.6 brushless DC motor controller, Nvidia Jetson Nano, and Arduino Mega Rev 3
Sensing & feedback              RPlidar A2, IMU, Depth cameras (Intel Realsense D435i & T265)
Operating system                Ubuntu 20.04 (ROS Noetic) or Ubuntu 18.04 (ROS Melodic)
Communication                   ROS architecture (ROS C/C++ & ROS Python libraries), WiFi 802.11n, USB & Ethernet (for debugging)
Navigation & drive interfaces   ROS navigation stack, position & joint trajectory controller, joystick, rqt-plugin, ROS-Mobile (Android devices), hand gesture, web-based GUI
SLAM                            Hector-SLAM, Cartographer, Gmapping, RTAB-Map, etc.
Simulation & visualisation      Gazebo, Rviz, MATLAB


3. Design files summary

The ROMR design files are categorised into three different units, (a) the mechanical unit, (b) the software unit, and (c) the
power, sensors and electronics unit. Each of these units is briefly described in Tables 4–6. Table 4 describes the additively
manufactured part (3D printed) and the 3D CAD models of the aluminium profiles and their accessories. The CAD files were
used to generate the universal robot description format (URDF) [23] description of the ROMR in order to simulate it using the
ROS framework [22]. These parts were designed using the Solid Edge CAD tool. Table 5 lists all off-the-shelf (OTS) electronics
components. Table 6 contains the software files. All design and construction files can be downloaded at our repository:
https://doi.org/10.17605/OSF.IO/K83X7.

3.1. Mechanical unit

The mechanical unit includes the additively manufactured parts, the 3D CAD models of the aluminium profiles, and their accessories, as summarised in Table 4. The labels Pn, with n = 1, 2, ..., refer to the individual parts.

• P1–P4, P7–P9 and P12 are aluminium profiles and accessories used for the robot's chassis construction.
• P5 and P6 are used to cover the base of the robot, where the electronic components are placed.
• P10 is the full CAD assembly of the ROMR.

Table 4
Summary of the ROMR mechanical structure unit.

Des. Description File type Open S. license File location


P1   Alum. 40 × 40 mm slot 8     .stp              GNU GPL v3   https://osf.io/qpb2v
P2   Alum. 40 × 80 mm slot 8     .stp              GNU GPL v3   https://osf.io/zmqe6
P3   Corner bracket I-type       .stp              GNU GPL v3   https://osf.io/63jwa
P4   Mounting bracket I-type     .stp              GNU GPL v3   https://osf.io/ncbu9
P5   Bottom cover                .stp              GNU GPL v3   https://osf.io/ae9um
P6   Top cover                   .stp              GNU GPL v3   https://osf.io/9v5j4
P7   Swivel caster               .stp              GNU GPL v3   https://osf.io/g7pxb
P8   Screw/bolt                  .stp              GNU GPL v3   https://osf.io/jcefn
P9   Corner bracket cover cap    .stp              GNU GPL v3   https://osf.io/sfgb9
P10  ROMR full assembly          .asm              GNU GPL v3   https://osf.io/jybzf
P11  0.42 × 0.32 × 0.15 m box    .stp              GNU GPL v3   https://osf.io/5v3ra
P12  Lock nut                    .stp              GNU GPL v3   https://osf.io/dk3ha
P13  Front hole plate            .stp              GNU GPL v3   https://osf.io/yhuw9
P14  RPLidar base holder         .stl (3D print)   GNU GPL v3   https://osf.io/envdh
P15  Drive wheel                 .stp              GNU GPL v3   https://osf.io/vwztk

Table 5
OTS electronics, sensors and control devices.

Des. Description File type Qty File location


P16 Nvidia Jetson Nano B01 64 GB png 1 https://osf.io/72xtm
P17 Arduino Mega Rev 3 png 1 https://osf.io/d3qmj
P18 ODrive V3.6 56 V png 1 https://osf.io/jghnv
P19 IMU (Invensense MPU-9250 9DOF) png 1 https://osf.io/k2h35
P20 Intel Realsense D435i camera png 1 https://osf.io/xu368
P21 Intel Realsense T265 camera png 1 https://osf.io/6cj5r
P22 nRF24L01 + PA + LNA module png 1 https://osf.io/43kd6
P23 Turnigy 2.4 GHz 9X 8-Channel V2 transmitter & receiver png 1 https://osf.io/3wu2x
P24 125 mm & 225 mm M2M, M2F, F2F GPIO wires png 24 https://osf.io/kv982
P25 RPLidar A2 M8 png 1 https://osf.io/pej62
P26 36 V Lithium Ion battery 4400mAh png 1 https://osf.io/umskh
P27 Power bank 2400 mA png 1 https://osf.io/z98gv

Table 6
Software files.

Des. Name File Name Description Type Open source license File location
S1 sketches.zip Folder containing the Arduino sketches Arduino sketches GNU GPL v3 https://osf.io/r5cgp
S2 romr_robot.zip Folder containing the ROMR ROS files ROS files GNU GPL v3 https://osf.io/e4syc


• P11 and P13 are used for material carriage and for mounting the RGB-D cameras, respectively.
• P14 is 3D-printed to attach the RPlidar sensor.
• P15 is the ROMR drive wheel from a consumer hoverboard scooter.

3.2. Power, sensors, and electronics units

Table 5 summarises the electronics, sensors and control devices used, as well as the power sources. P16–P27 are off-the-shelf (OTS) components from different vendors. All vendors are listed in Table 7.

• P16 is the main brain of the robot. It handles the high-level control tasks required to run all the sensing, perception, planning and control modules.
• P17 is one of the most successful open-source platforms, with relatively easy-to-use free libraries compared to other open-source micro-computers. It is used for the low-level control, sending the control commands to P18, which in turn drives and steers P15. Furthermore, it handles the communication between P16 and P18, as well as with any future compatible devices that support ROS serial communication, e.g., Raspberry Pi, STM32, etc.
• P18 is a high-performance, open-source brushless direct current (BLDC) motor driver from ODrive Robotics [24]. It performs all the computations required to drive the two inbuilt hoverboard brushless DC motors with five hall-effect sensors. The sensors are used for motor position feedback. Note that in the case of P18 reaching end-of-life (EOL), the ODrive Pro found at https://odriverobotics.com/shop/odrive-pro can be used as a replacement motor driver, since it provides even more advanced features than the one used in this work.
• P19 is a 9-axis inertial measurement unit (IMU) sensor used for tracking, localization and gesture-based control tasks.
• P20 and P21 are 3D vision cameras for visual perception, depth sensing, tracking, localization and mapping tasks. Both cameras generate 3D image data of the task environment, process the data and then publish it to the appropriate topics in the ROS network.
• P22 and P23 are communication devices for wireless control of the ROMR.
• P24 are general-purpose input-output (GPIO) connection wires.
• P25 is a 2D lidar with a 360-degree field of view (FOV) and a maximum range of 16 m, operating at a frequency of 10 Hz. It is used for generating a 2D occupancy grid map of the robot's operational environment and for collision detection and avoidance.
• P26 and P27 are the system's power sources for the motors, the sensors, and the control boards.
• S1 and S2 are the folders containing the Arduino control programs and the ROMR ROS files, respectively.

Table 7
BOM for building ROMR, and the respective links of where they were purchased. The BOM reflects only the component prices and does not include labour costs
(purchasing, manufacturing, marketing, . . .).

Designator Qty Unit cost (€) Total cost (€) Source of material Material type
Profile 40x40L I-type slot 8 (1.98 m long) 1 26.49 26.49 www.motedis.at Other
Profile 40x80L I-type slot 8 (1.1 m long) 1 25.64 25.64 www.motedis.at Other
Corner bracket I-type 20 0.51 10.20 www.motedis.at Other
Mounting bracket I-type 40 0.18 7.20 www.motedis.at Other
Bottom & Top cover (50 x 50 cm) 1 6.99 6.99 https://www.obi.at/ Other
Screw/bolt 100 0.13 12.64 www.motedis.at Other
Corner bracket cover cap 20 0.18 3.60 www.motedis.at Other
Swivel caster 1 4.51 4.51 www.motedis.at Other
Hoverboard brushless DC motor wheels 2 24.50 49.00 www.voltes.nl Other
Nvidia Jetson Nano B01 64 GB 1 260.22 260.22 www.amazon.de Other
Arduino Mega Rev 3 1 32.78 32.78 www.amazon.de Other
ODrive V3.6 56 V 1 249.00 249.00 www.odriverobotics.com Other
IMU (MPU-9250) 1 15.92 15.92 www.distrelec.at Other
nRF24L01 + PA + LNA module 1 9.99 9.99 www.amazon.de Other
Turnigy 2.4 GHz 9X 8-Channel V2 transmitter & receiver 1 69.99 69.99 www.hobbyking.com Other
125 mm & 225 mm M2M, M2F, F2F GPIO wires 1 3.99 3.99 www.amazon.de Other
36 V Lithium-Ion battery 4400mAh 1 33.63 33.63 https://de.aliexpress.com Other
22nF capacitors 6 0.05 0.30 https://www.conrad.at Other


4. Bill of materials (BOM) summary

Table 7 provides a summary of the bill of materials. The lidar, the depth cameras, the Turnigy 2.4 GHz 9X 8-Channel V2 transmitter & receiver, the 22 nF capacitors, and the MPU-9250 sensor were sourced in the laboratory. The BOM reflects only the component prices and does not include labour costs (purchasing, manufacturing, marketing, etc.).

5. Build instructions

Once the materials described in Table 4 and Table 5 are available, the subsequent task is to assemble them accordingly. To
do that, the sequence of steps presented in this section should be followed. Some basic tools such as pliers, screwdrivers, a
wire stripper, a 3D printer, a saw, a soldering iron and others are required for building the hardware.

5.1. Hardware build instruction

The ROMR chassis is built using the parts described in Table 4. In assembling the chassis, we prioritised the system's compactness, so as to maintain an acceptable mode of operation for a light and reliable system without compromising stability. The mechanical construction is done in several steps as follows:

1. Top base assembly: For this step, the parts required are P1, P3, and P8, as referenced in Table 4. First, cut two pieces each of 0.38 m, 0.26 m and 0.24 m length from P1. At each end of a piece, use a tap to cut an M8 thread in a clearance hole of 0.008 m diameter. A description of how to use a tap can be found at https://www.wikihow.com/Use-a-Tap. Assemble the parts as illustrated in Fig. 2, following the direction of the arrows. The first block shows the components required, the middle block shows the exploded view with the interconnection of the components, and the last block shows the outcome of the assembly.
2. Bottom and top chassis assembly: Parts required in this step are P2, P3, P4, P8, P9 and the result of step 1. Assemble the parts as illustrated in Fig. 3.
3. Mounting of the caster and the drive wheels: Parts required are P4, P7, P8, P15 and the base from step 2. Note that P8 (M8) and P8 (M6) are required to mount P15 and P7 firmly to the base, respectively. Assemble the parts as illustrated in Fig. 4.

Fig. 2. Top base chassis assembly.

Fig. 3. Assembling the top and bottom chassis.


Fig. 4. Mounting of the caster and the drive wheels to the base frame.

4. Mounting the bottom base plate: The base plate is a rectangular green plastic plate that holds and protects all the electronics subsystems. Parts required are P5, P8 (M3) and the resulting base from step 3. These parts are assembled as shown in Fig. 5.
5. Mounting the internal electronics, the power subsystems and the RGB-D cameras to the bottom base plate: Parts required are P13, P17, P18, P19, P20, P21, P22, P23, P26, and the result from step 4. A couple of M3 screws (P8) are required for mounting the electronic components at their respective positions. Fig. 6 illustrates the procedure.
6. Mounting the top cover, external electronics, and final coupling: Parts required are P1, P3, P4, P6, P8 (M3), P8 (M8), P9, P11, P16, P25, and the resulting hardware from step 5. Fig. 7 illustrates the procedure.

Fig. 5. Mounting of the bottom plate to the base frame for attaching the electronics subsystems.

Fig. 6. Mounting of the internal electronics and power subsystems to the bottom base plate.


Fig. 7. Mounting of the external electronics and the final coupling.

5.2. General connection and wiring instruction

The electronics, vision and sensor subsystems of ROMR are composed of several components interconnected by energy links and bidirectional or unidirectional information links, as shown in Fig. 8. Each link either sends or receives information from the interconnected components. The instructions below show how the system wiring was done.

1. ODrive BLDC controller (P18): The ODrive controller has to be wired to the motors as illustrated in Fig. 9. Each of the three phases of the motors has to be connected to the motor outputs M0 and M1. The order in which the motor phases are connected is not important; the ODrive controller will figure it out during the calibration phase. However, after calibration, the order should not be changed. If it is changed, re-calibrate the motors.

Fig. 8. The ROMR hardware architecture is based on an ODrive board (light green) to actuate the motors, an Nvidia Jetson Nano (light blue) as a computing interface for high-level tasks, and an Arduino Mega Rev3 (orange) as a low-level computing interface.


Fig. 9. Internal wiring of the ODrive board. On the bottom, the three phases of the motors (M0 and M1) are connected. On the top, 22nF capacitors are used
as noise filters for the hall sensors.

Furthermore, the hoverboard motors are equipped with five hall-sensor wires, coloured red, black, blue, green and yellow, for position feedback. Unfortunately, the ODrive controller has no noise-filtering capacitors, and consequently the hall-sensor signals are susceptible to noise. To obtain consistent and clean readings from the hall-effect sensors, noise-filtering capacitors must be connected to the corresponding J4 pinouts of the ODrive, as described in Table 8.
The required value of each filtering capacitor is approximately 22 nF. If exactly 22 nF is not available, connecting two 47 nF capacitors (C1 = C2 = 47 nF) in series to obtain approximately 23.5 nF will also work. In addition, a 50 W power resistor is required if the robot runs on a battery; it prevents the ODrive from shutting down unexpectedly as a result of energy regenerated into the battery. The resistors are usually supplied together with the ODrive controller.
2. Arduino to ODrive interconnection: The ODrive communicates with the Arduino through a serial port, i.e., a universal asynchronous receiver/transmitter (UART). The UART pinouts are GPIO 1 (TX) and GPIO 2 (RX), which should be connected to the Arduino's RX (pin 18) and TX (pin 17), respectively. The grounds (denoted GND) of the Arduino and the ODrive controller boards also need to be connected.
3. Power distribution: It is recommended to use two separate power sources for the ODrive controller and the Jetson Nano to avoid a ground loop [25]. The 36 V, 4400 mAh lithium-ion battery is wired directly to the ODrive motor controller, and an additional power-bank battery (P27) is used to power the Jetson Nano, which requires only 5 V at 2500 mA.
4. The nRF24L01+ module, MPU-9250 and Arduino connections: As shown in Fig. 14, the control unit consists of P17, P19, and P22. These parts are required for the wireless transmission of data. The MPU-9250 and the Arduino are connected with four cables: the ground (GND), the power supply (VCC) and two cables for the I2C communication (SDA, SCL). The SDA pin of the MPU-9250 is connected to the SDA (pin 20) of the Arduino, and the SCL pin to the SCL (pin 21) of the Arduino. A power supply between 2.4 V and 3.6 V is needed; as a consequence, the 3.3 V power supply pin of the Arduino is recommended to be used [26].

Table 8
Hall sensor wiring at the J4 signal port of the Odrive.

Hall wire J4 signal port


Red 5V
Yellow A
Blue B
Green Z
Black GND


Fig. 10. The nRF24L01+ module and its pinout [27]. The module is used for the wireless communication between the Arduino at the remote control unit and the Arduino at the robot unit (see Fig. 14).

Fig. 11. Setting up the RC control on the Turnigy 9x: received signal amplitude plotted against pulse width in microseconds. The red line indicates PWM activation, the blue line indicates throttle control and the green line indicates steering control.

The communication between the nRF24L01+ module and the Arduino is established via an SPI interface; Fig. 10 shows a detailed description of the nRF24L01+ pinout.
The SPI pins of the Arduino (Mega 2560 Rev3) are MISO → pin 50, MOSI → pin 51, and SCK → pin 52. In this work, the pins are connected as follows: nRF24L01+ GND → Arduino GND; nRF24L01+ VCC → Arduino 3.3 V; nRF24L01+ CE → Arduino digital 7; nRF24L01+ CSN → Arduino digital 8; nRF24L01+ MOSI → Arduino digital 51; nRF24L01+ SCK → Arduino pin 52; nRF24L01+ MISO → Arduino digital 50. Furthermore, at the robot unit (Fig. 14), the nRF24L01+ module and the Arduino are wired in the same way as at the control unit.
5. RC receiver (P23) wiring to Arduino: RC receivers are needed to drive the motors using pulse-width modulated (PWM) signals [28]. A typical RC receiver, such as the one used in this work (P23), has three kinds of pins: GND, 5 V, and the PWM signal pins. Hoverboard motors require three signal pins, i.e., throttle, steering and enable [29]. The receiver-to-Arduino wiring is therefore as follows: RC_GND → Arduino_GND, RC_5V → Arduino_5V, RC_CH1 → Arduino_port2, RC_CH2 → Arduino_port3, and RC_CH3 → Arduino_port18.

6. Operation instructions

To operate the ROMR in real-time for the first time, power ON the robot by pressing the ON–OFF switch beside the Jetson Nano board (P16) and start controlling it with the transmitter. Note that after powering ON the robot, a red LED blinks; wait until the blinking stops, after which the robot is ready to be used. However, if you have changed the default configuration, or wish to rebuild and reconfigure the robot from scratch to operate it in other modes, this section provides step-by-step instructions to get started. Before following these steps, make sure that all the robot parts are properly assembled and wired according to the instructions in Section 5.

Fig. 12. The ROMR real-time control and monitoring with a ROS-Mobile device. (a) System communication structure. (b) Control and monitoring from ROS-Mobile [30] or Android-based devices.

6.1. Initial configuration and setup instruction

This section provides details about the initial configuration of the ROMR. It is recommended to follow the instructions in this section very carefully, as they determine how well the system will perform.

6.1.1. Nvidia Jetson Nano set up and ROS installation


To set up the Nvidia Jetson Nano, some basic tools such as a microSD card (32 GB minimum recommended), a USB keyboard and mouse, a computer display (HDMI or DP) and a micro-USB power supply are required. The microSD card and micro-USB power supply are usually supplied with the Jetson Nano if you purchased the full development kit.
The setup instructions are as follows:

1. Download the Jetson Nano developer SD card image (JetPack) and write the image to the microSD card that usually comes along with the Jetson Nano board. The reference instructions for configuring the JetPack for the first time can be found at https://developer.nvidia.com/embedded/learn/get-started-jetson-nano-devkit.
2. Install ROS and its packages on the JetPack. The instructions for the installation are provided on the official ROS wiki page at http://wiki.ros.org/melodic/Installation/Ubuntu for ROS Melodic, which is supported by default by the JetPack. Alternatively, follow the instructions at https://github.com/Qengineering/Jetson-Nano-Ubuntu-20-image to configure an Ubuntu 20.04 OS image, and then install ROS Noetic from http://wiki.ros.org/noetic/Installation/Ubuntu.
3. Create a workspace. A workspace is a set of directories in which you can store the ROS code that you write. Instructions can be found at http://wiki.ros.org/catkin/Tutorials/create_a_workspace.
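As a minimal sketch, a catkin workspace can be created and initialised as follows (the directory name catkin_ws is illustrative):

    # Create the workspace directory structure
    mkdir -p ~/catkin_ws/src
    cd ~/catkin_ws
    # Build the (still empty) workspace and generate the setup files
    catkin_make
    # Source the workspace so that ROS can find packages placed in ~/catkin_ws/src
    source devel/setup.bash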

6.1.2. Setup the ODrive tool and calibrate the BLDC motors
To begin the initial configuration and calibration of the hoverboard BLDC motors, make sure that all the necessary wiring has been completed as described in subSection 5.2. Also, ensure that all relevant switches are turned ON and that the motors are positioned in such a way that they can move freely. The ODrive needs to be connected to the host computer (the Nvidia Jetson Nano). ODrive has a Python 3 programming interface, called "odrivetool", for configuring the BLDC motors and commanding them to move at a specific speed or through a specific number of rotations.
Therefore, Python 3 must be installed on the host computer before setting up "odrivetool". The setup instructions can be found at https://docs.odriverobotics.com. Alternatively, you can simply download and run the calibration script prepared at https://osf.io/awf9t, which avoids the tedious task of following the tutorial to calibrate the motors. The script configures the axes of the motors and their respective encoders, and sets motor parameters such as the velocity gain, the position gain, the bandwidth, and more. After successful calibration, you can test the motors from the "odrivetool" command line to ensure that they are properly configured and ready to receive velocity commands. First, start "odrivetool" and, from its command line, send the following commands to the motors:
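The listing itself was not preserved in this copy. A minimal sketch of such a test, assuming the first motor is wired to axis 0 and recent ODrive v3.6 firmware (property names such as input_vel differ between firmware versions), is:

    # Start the interactive odrivetool shell (installed with: pip3 install odrive)
    odrivetool
    # At the odrivetool prompt the connected board appears as odrv0; for example:
    #   odrv0.axis0.requested_state = AXIS_STATE_CLOSED_LOOP_CONTROL
    #   odrv0.axis0.controller.input_vel = 2    # spin the motor at 2 turns/s
    #   odrv0.axis0.controller.input_vel = 0    # stop the motor
    #   odrv0.axis0.requested_state = AXIS_STATE_IDLE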

Repeat the same test with the second motor, which is connected to axis 1 of the ODrive board. If the configuration and calibration of the motors are correct, both motors should spin until they receive a commanded velocity of 0 or an idle-state command. At any point during calibration and testing with the interactive "odrivetool", use the code below to list calibration errors and to clear them.
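The exact listing is not reproduced here; in the odrivetool shell, the standard utility calls for this purpose are (again, names vary slightly between firmware versions):

    odrivetool                      # open the interactive shell if it is not already running
    # At the odrivetool prompt:
    #   dump_errors(odrv0)          # list axis, motor, encoder and controller errors
    #   odrv0.clear_errors()        # clear the errors before re-running the calibration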

6.1.3. Install Arduino integrated development environment (IDE) and connect to ROS
The Arduino IDE allows one to write software programs (sketches) and upload them to the Arduino board for robot control. The steps to set up the IDE and connect it to ROS are described below:

1. First, download the Arduino IDE from https://www.arduino.cc/en/Guide and follow the on-screen instructions to set it up.
2. Integrate the Arduino to communicate with ROS via a rosserial node. The rosserial_arduino package enables the Arduino to communicate with the Jetson Nano via a USB-A male to USB-B male cable. The setup instructions can be found at http://wiki.ros.org/rosserial_arduino. Note: it is advisable to define a udev rule for the USB devices, so that they are recognized and configured automatically when plugged in.
3. Launch the ROS serial server by running the code below at the command line to ensure that the setup was successful.
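The listing did not survive extraction; based on the port and baud rate mentioned below, it corresponds to the standard rosserial_python invocation:

    # With roscore running in another terminal, start the rosserial server.
    # The port (/dev/ttyACM0) and baud rate (115200) are the values used in this work;
    # adjust them to match your own setup.
    rosrun rosserial_python serial_node.py _port:=/dev/ttyACM0 _baud:=115200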

For a detailed explanation of the above rosserial_python node, visit the ROS wiki at http://wiki.ros.org/rosserial_python#serial_node.py. Make sure you check the port to which your Arduino is connected and the baud rate of your device; in our case they are ttyACM0 and 115200, respectively, but they may differ in your setup. Always take note of them.
4. Connect the Arduino to the ODrive. First, install the ODriveArduino library: clone or download the repository https://github.com/odriverobotics/ODrive/tree/master/Arduino. From the Arduino IDE, select Sketch → Include Library → Add .ZIP Library and select the enclosed zip folder. Run the "ODriveArduinoTest.ino" sketch with the motors connected to ensure that everything is properly configured and ready to accept commands. If everything went well, the motors should move accordingly.

6.1.4. The MPU-9250 and nRF24L01 + modules setup


As shown in Fig. 14, the Arduino at the control unit reads the MPU-9250 data and forwards it to the robot via the nRF24L01+ module. To run the Arduino sketch, the "FaBo 202 9Axis MPU9250" library by Akira Sasaki, released under the Apache license, version 2.0, must be installed, as well as the "rf24" library by TMRh20 Avamander, released under the GNU general public license. The FaBo library is used for reading the data measured by the MPU-9250 sensor, while the rf24 library is needed to use the nRF24L01+ module. Both libraries can be found in the library manager of the Arduino IDE.
The Arduino at the robot unit primarily forwards the raw MPU-9250 data that it receives from the nRF24L01 + module to
the Jetson Nano. This is done by publishing the data to the ‘‘imu/data_raw” topic of the ROS system, which is running on the
Jetson Nano. The Jetson Nano converts the raw data into velocity commands that include a target linear velocity and a target
rotation speed of the robot. The Arduino receives these velocity commands by subscribing to the ‘‘cmd_vel” topic. After the
Arduino receives the velocity commands, it converts them into target speed values of the motors that are set on the ODrive
board.


6.1.5. Setup the RPlidar


The RPlidar sensor provides the scan data required for mapping, localization and navigation purposes. It is connected to
the Jetson Nano or the host computer through a USB serial port. The procedure for its setup is summarized in the following
steps.

1. Clone the RPlidar ROS packages https://github.com/Slamtec/rplidar_ros to your ROS catkin workspace source directory
and run the code below to build the rplidarNode and rplidarNodeClient.
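A sketch of this step, assuming the catkin workspace created earlier is located at ~/catkin_ws:

    cd ~/catkin_ws/src
    git clone https://github.com/Slamtec/rplidar_ros.git
    cd ~/catkin_ws
    catkin_make                     # builds rplidarNode and rplidarNodeClient
    source devel/setup.bash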

2. Check the authority of the rplidar serial port by typing at the command window:
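The original command is not shown in this copy; the rplidar_ros documentation checks the permissions of the USB serial devices with a listing such as:

    ls -l /dev | grep ttyUSB        # shows the USB serial devices and their permissions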

Take note of the port to which the USB is connected, e.g., .../ttyUSB0. Add write permission to the USB port:
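Assuming the lidar enumerates as /dev/ttyUSB0 (adjust to the port noted above), the usual command is:

    sudo chmod 666 /dev/ttyUSB0     # grant read/write access to the lidar serial port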

3. Launch the RPlidar node to view and test if the setup was successful.
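The launch file below is the standard viewer launch shipped with the rplidar_ros package:

    roslaunch rplidar_ros view_rplidar.launch   # starts rplidarNode and opens an Rviz view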

If correctly set up, you will obtain an output similar to the one displayed in Fig. 15b with the lidar scan represented as red
dots.

After the above setups, the ROMR is ready for experimentation.

6.2. Experimentation & remote operation instruction

To allow a human operator to intuitively control the ROMR, we developed three remote control techniques. In this section,
we provide step-by-step instructions on how these techniques can be implemented.

6.2.1. ROMR teleoperation from remote-control (RC) devices


By default, ROMR is configured to operate in RC mode. However, if the default configuration has been altered, the following steps must be taken to reconfigure it. Before these steps, ensure that the ODrive controller has already been calibrated and configured to accept commands (see sub-subSection 6.1.2 for instructions). Also ensure that the RC receiver is properly wired according to the instructions in subSection 5.2.

1. Upload the ‘‘romr_remote_control.ino” sketch to Arduino. Before that, the ‘‘Metro” library has to be included in the Arduino
IDE library. The ‘‘Metro” library can be downloaded from https://github.com/thomasfredericks/Metro-Arduino-Wiring.
2. With the motors switched off, move the RC transmitter sticks and monitor the received signals in the Arduino serial plotter. If there is communication between the receiver and the transmitter, you will obtain a response similar to the one shown in Fig. 11.

6.2.2. ROMR control from Android-based device


One of the main features of the ROMR is the ability to be teleoperated from any Android-based device. The idea is to alleviate the need for complex robot teleoperation devices such as a joystick, an RC transmitter, etc., and to provide an intuitive way of controlling the robot by simply touching the Android device screen. To achieve this, we leveraged the framework developed by Nils Rottmann et al. [30]. The setup is straightforward. First, make sure that the Arduino has been set up to communicate with ROS; if not, follow the instructions in Section 6. "rosserial" is essential for this section, so make sure that the "rosserial" python node (rosserial_python serial_node.py) is running properly. The whole communication structure is described in Fig. 12a.

As illustrated in Fig. 12, the "rosserial" python node allows all the compatible connected electronics to communicate directly with the robot using ROS topics and messages. All the information between the interconnected systems is communicated with the help of the rosserial package. Make sure that the robot is switched ON, the battery is connected, and the wheels are free to spin. If you have changed the default ODrive calibration, make sure that the calibration is completed before continuing. Open the downloaded ROS-Mobile App, which enables ROS to control the robot's joint velocities. The App supports linear (forward and backward) and angular (rotation around the z-axis) movements. See Fig. 12b for the setup. An SSH connection has to be established between the devices, and all the devices have to be on the same wireless network. The steps are summarised as follows:

1. Download the ROS-Mobile App from the Google Playstore.


2. Configure the IP address. First, connect the robot and the Android device to the same wireless network. From the command window terminal, type "ifconfig"; this will display the IP address, e.g., 192.168.1.15.
3. At the ‘‘master” node URI of the ROS-Mobile App, enter the IP address and 11311 for the ‘‘master” port. Ensure that roscore
is running, and click on the connect button.
4. Once the above steps are completed, upload the "ros_mobile_control.ino" sketch to the Arduino. While roscore is still running, run the rosserial python node in a separate terminal:
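The listing is missing from this copy; it corresponds to the rosserial invocation already used above, with the Arduino port noted below:

    # Bridge the Arduino (here on /dev/ttyACM0) to the ROS master so that cmd_vel
    # messages from the ROS-Mobile joystick reach the motor controller
    rosrun rosserial_python serial_node.py _port:=/dev/ttyACM0 _baud:=115200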

Take note of the . . ./dev/ttyACM0 Arduino port. It may be different in your case.
5. At the "details" tab of the ROS-Mobile App, select "Add widget", select "joystick" and set the XYZ-coordinates accordingly. Click on the "viz" tab to visualise and control the robot. Ensure that the rosserial python node is running and that the cmd_vel topic is being subscribed to. Once done, the robot can be controlled by simply touching the respective coordinates on the screen.

6.2.3. Gesture-based control of the ROMR


Unlike the traditional or ‘‘ready-to-use” robot control approaches such as a joystick or the ROS rqt plugin, a gesture-based
approach has the potential to control the robot in a very intuitive way. This strategy allows the operator to focus on the robot
instead of the controller. The goal is for a non-robotics expert to be able to remotely navigate the robot depending on the
direction in which the operator’s hand is tilted. For example, if the hand is tilted forward (pitch angle), the robot should move
forward. If the hand is tilted to the side (roll angle), the robot should rotate. Since the MPU-9250 sensor does not measure the orientation of the operator's hand directly, but only the acceleration and the rotation speed of the sensor, the IMU sensor (P19) data must be fused to determine the orientation of the operator's hand and the tilt angle of the sensor.
We implemented four gestures with different hand motions (see Fig. 13). We used the IMU sensor (P19) to measure the
hand gestures needed and map them onto the robot’s linear and angular velocities. The data collected by the IMU sensor is
sent via a wireless connection to the robot, which uses it to calculate the movement commands for the motors. We leveraged
the framework proposed in [31,32] to achieve the gesture-based control strategy.

Fig. 13. ROMR real-time control and monitoring based on hand movement: (a) hand tilted forward → the robot moves forward; (b) hand tilted backwards → the robot moves backwards; (c) hand tilted to the right → the robot rotates in a clockwise direction; (d) hand tilted to the left → the robot rotates in a counter-clockwise direction.


Fig. 14. Block diagram illustrating the architecture and flow of information during the gesture control approach.

Fig. 15. ROMR simulation platform. (a) The operational environment model with the robot and sensor models. The blue lines are the lidar sensor scan of the operational environment in Gazebo. (b) Rviz visualisation. The red dotted lines are the lidar scan showing the location of obstacles (or objects) within the robot environment.

A detailed overview of the hardware and software architecture and the data flow is shown in Fig. 14. The components are
divided into a control unit and a robot unit, as the components of these two units are physically located in different places. While the control unit is attached to the user's arm/hand, the robot unit is on board the ROMR.
The step-by-step instructions to operate the robot based on gesture demonstrations are as follows:

1. At the control unit, upload the ‘‘IMUDataNRF24L01_Transmitter.ino” sketch to Arduino, and at the robot unit, upload
‘‘NRF24L01Receiver_PC_WheelController_WheelMonitoring.ino” sketch to the Arduino.
2. From the host computer, run roscore to start the ROS master, and then:
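The missing listing corresponds to starting the rosserial bridge described in the following sentence, with the same assumptions as before (port /dev/ttyACM0, baud 115200):

    # Connect the robot-side Arduino to the ROS master running on the Jetson Nano
    rosrun rosserial_python serial_node.py _port:=/dev/ttyACM0 _baud:=115200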


to start the rosserial python node and establish a connection between the Arduino and the Jetson Nano. Depending on the port to which the Arduino is connected on the Jetson Nano, the port ttyACM0 must be adjusted accordingly.
3. Finally, execute the following script to start the complementary filter node:
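The script itself is not reproduced here. Assuming the complementary filter from the ROS imu_tools stack is used to fuse the raw IMU data published on imu/data_raw, the node can be started as:

    # Fuses accelerometer and gyroscope readings into an orientation estimate
    rosrun imu_complementary_filter complementary_filter_node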

By switching the robot ON, the motors are automatically powered and ready to be controlled with the IMU sensor. Depending on the direction in which the IMU sensor is tilted, a corresponding movement of the robot is obtained, as shown in Fig. 13a-d. During operation, attention must be paid that the hand is not tilted by more than about 90°.

7. Validation and characterization

ROMR has been successfully developed, tested and validated both in simulation and in real-world scenarios. Experiments
were performed to characterise its performance, robustness and suitability for research, navigation and logistics applications. The evaluation results are presented in this section. Furthermore, the validation video can be viewed at https://osf.io/ku8ag.

7.1. Simulation scenarios

Although the development of ROMR focused on real-world applications, it is also important to have a 3D simulation
model of the robot, to enable users to work in virtual environments to explore tools, techniques and methods. Furthermore,
the simulation model could also enable the ROMR users to become familiar with 3D simulation and visualisation tools such
as Gazebo [33] and Rviz [34].
The simulation platform includes three parts: the environment model, the robot model, and the sensors model. For the
environmental model, we created the floor plan of our laboratory environment using the Gazebo model editor (see Fig. 15a).
Taking advantage of the ROS framework [22], the ROMR model was implemented in accordance with the unified robot
description format (URDF) [23]. The URDF is an extensible markup language (XML) format that describes all the kinematic and dynamic properties of the robot and its physical elements, such as the links, the joints, the actuators, and the sensors [35].
We generated the URDF of the robot including the sensors model using the 3D CAD models described in Table 4. Further,
all the models were verified with several simulation tests, as depicted in Fig. 15. The step-by-step procedure for this simulation is as follows:

1. Download the ROMR ROS files at https://osf.io/e4syc to your catkin workspace and build it.
2. Open three terminal windows, and execute the following in each of the terminals:
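The listing did not survive extraction; a sketch of the first two terminals, with illustrative package and launch-file names (check the downloaded romr_robot files for the actual names), is:

    # Terminal 1: start the ROS master
    roscore
    # Terminal 2: spawn the ROMR model and its sensors in the Gazebo world
    roslaunch romr_robot romr_world.launch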

to launch the ROMR world (operational environment model in Gazebo [33]) with the robot spawned and the sensor model
active (see Fig. 15a). At the third terminal run the following node to visualise in Rviz (see Fig. 15b).
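For the third terminal, Rviz is typically started with a configuration file shipped with the package, for example (configuration file name illustrative):

    # Terminal 3: visualise the robot, TF tree and sensor topics in Rviz
    rosrun rviz rviz -d $(rospack find romr_robot)/rviz/romr.rviz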

3. Navigate the robot within the operational environment using the ROS rqt plugin, keyboard or the framework described
in sub-subSection 6.2.2.


7.2. General system validation test

This section provides information about the robustness of the system, obtained by completing different tests with the robot and its sub-components. The outcome of each test is presented in Table 9.

7.2.1. Payload capacity test


In Fig. 16, the result of the payload capacity test carried out as described in Table 9 is presented. The robot is teleoperated to move along linear and angular trajectories with different loads, ranging from the robot weight only (17.1 kg) to 90 kg. The goal is to verify how much load the robot can carry without affecting the controller. Although the ROMR can carry a load of up to 90 kg, it is not recommended to operate it continuously at maximum load, in order to extend its life span. All the load tests were carried out with the robot moving on a flat surface; thus, we did not evaluate the performance on irregular or unstructured surfaces.

7.2.2. Stability test


We performed this test to evaluate the stability of the robot at high speed and different turning radii. We tested the robot's stability both in Gazebo simulation and in the real world at four different speeds: 0.5 m/s, 1.0 m/s, 1.5 m/s, and 2.5 m/s. For each speed, we tested the robot at different turning radii (0.5 m, 1.0 m, 1.5 m, 2.0 m, and 2.5 m), three different payloads (the ROMR weight, 25 kg, and 85 kg), and three different positions of the centre of gravity (-0.1 m, 0.0 m, and 0.1 m). Note that the positions of the centre of gravity (pCOG) are defined relative to the midpoint of the ROMR's base frame. The centre of gravity (COG) is slightly altered by placing the loads (25 kg and 85 kg) at positions 0.1 m (front), -0.1 m (behind), and 0.0 m (COG unaltered) from the midpoint of the ROMR base frame. The results are summarized in Table 10.

Table 9
General systems validation test.

Test name | Test purpose | Test process | Expected result | Outcome

Chassis test | To verify the solidity and stability of the robot's chassis | Complete five different movement tasks at high speed with all the components mounted | All the subsystems must be stable and rigid throughout the test | Passed
Payload test | To verify the maximum load capacity the robot can carry without affecting the controller | Place different loads on the robot and check the response | The robot should be able to convey the load up to the maximum capacity | Carried up to 90 kg (see Fig. 16)
Reconfiguration test | To verify how long it takes and how easy it is to dismantle and re-assemble the system | Dismantle all the subsystems including the chassis and wiring and re-assemble them. Record the time taken to complete the process | Should not take more than 3 h | Took about 1 h 15 min
Battery life test | To evaluate how long the battery can power the robot for a long mission task | Operate the robot continuously with all the electronics parts active for a long period of time | The battery should last up to the maximum capacity | The battery lasted for about 8 h
ROMR stability test | To verify the stability of the robot when driven at high speed and at small turning radii | Drive the robot forward and backwards, at high speed, and in a circular path of different radii. Record the linear and angular velocity data | The robot should be stable throughout the stability test process | See SubSection 7.2.2 and Fig. 17 for the results

Fig. 16. Validation of the maximum load capacity of the ROMR. Shown are (a) ROMR weight only (17.1 kg), (b) 16 kg, (c) 25 kg, (d) 85 kg, and (e) 90 kg.


Table 10
Minimum turning radius of the robot for different linear velocities, payload weights, and positions of the centre of gravity (pCOG).

Linear vel. (m/s)   Payload (kg)          pCOG (m)   Turning radius (m)   Stability
0.5                 ROMR weight (17.1)     0.0        0.5                  stable
0.5                 25                     0.1        1.0                  stable
0.5                 85                    -0.1        1.5                  stable
1.0                 ROMR weight (17.1)     0.0        2.0                  stable
1.0                 25                     0.1        2.5                  stable
1.0                 85                    -0.1        1.5                  stable
1.5                 25                    -0.1        1.0                  stable
1.5                 85                     0.1        2.5                  stable
1.5                 ROMR weight (17.1)     0.0        1.5                  stable
2.5                 25                    -0.1        1.5                  stable
2.5                 85                     0.1        2.5                  stable
2.5                 ROMR weight (17.1)     0.0        0.5                  unstable

Fig. 17 shows the results of the stability test. The robot was driven along circular paths at different linear velocities, turning radii, payload weights, and pCOG. The odometry, command-velocity and IMU data were recorded in a rosbag file and used for the analysis. At the lowest speed of 0.5 m/s, the robot showed minor oscillations when turning at the smallest radius of 0.5 m; however, these oscillations were not significant enough to cause the robot to lose control or become unstable. At the higher velocities of 1.0 m/s and 1.5 m/s, the robot remained stable even at the smallest turning radius of 0.5 m. The robot remained stable up to a linear velocity of 2.5 m/s; at 2.5 m/s, it began to show signs of instability, such as tilting and sliding. The maximum stable linear velocity of the robot can therefore be taken as approximately 2.5 m/s.
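For offline analysis, the topics stored in the rosbag file can be exported to CSV, for example with rostopic's bag-playback option. A brief sketch is given below; the bag-file and topic names are placeholders and must be adapted to the actual recording.

# Export selected topics from the recorded bag file to CSV (file and topic names are placeholders)
rostopic echo -b stability_test.bag -p /odom > odom.csv
rostopic echo -b stability_test.bag -p /imu/data > imu.csv
rostopic echo -b stability_test.bag -p /cmd_vel > cmd_vel.csv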


Fig. 17. The ROMR stability test. The robot was controlled to follow circular paths at different linear velocities, turning radii, payload weights, and positions of the centre of gravity. As shown in sub-figures (c)–(f), the robot became unstable at about 215 s, when the linear velocity increased to 2.5 m/s (see the red rectangles). (a) depicts the robot's trajectory as it follows the circular path. (b) shows the robot's position on the x (red) and y (blue) axes at each time stamp. (c) shows the linear velocity in m/s (blue) and the roll angle in radians (dark red) at each time stamp. (d) shows the angular velocity of the robot about the x (blue), y (green) and z (red) axes at each time stamp. (e) shows the orientation about the x (red) and y (yellow) axes, respectively. (f) shows the linear acceleration along the x (green) and y (cyan) axes.


To conduct the stability test, the following steps should be taken (a hedged sketch of the corresponding commands is given after this list):

1. Launch the robot in the Gazebo environment.

2. Start the additional nodes that are needed only when launching the robot in the real world; for the Gazebo simulation, they are not necessary.

3. Start the stability test by launching the test node. The robot trajectory and a CSV file containing the data needed for further analysis are generated at the end of the test.

4. If needed, record the IMU, odometry, and velocity data in a rosbag file for further analysis.
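The commands below are a minimal sketch of these four steps. The ROMR-specific package, launch-file and node names (romr_ros, romr_gazebo.launch, romr_bringup.launch, stability_test.py) are illustrative assumptions and should be replaced with the names used in the repository; the topic names follow common ROS conventions.

# Step 1: launch the robot in the Gazebo environment (illustrative launch-file name)
roslaunch romr_ros romr_gazebo.launch

# Step 2: real robot only - start the driver nodes (illustrative launch-file name);
# not required for the Gazebo simulation
roslaunch romr_ros romr_bringup.launch

# Step 3: start the stability-test node (illustrative node name); the trajectory plot
# and a CSV file with the recorded data are generated when the test finishes
rosrun romr_ros stability_test.py

# Step 4 (optional): record the IMU, odometry and command-velocity data for later analysis
rosbag record -O stability_test.bag /imu/data /odom /cmd_vel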

A video showing the result of the above test can be found at https://osf.io/wcd4n.

7.3. Application of the ROMR for SLAM

As stated earlier, ROMR provides a framework for evaluating and developing SLAM algorithms. The SLAM problem is to build a map of an unknown environment (mapping) while simultaneously keeping track of estimates of the robot's pose (the position x, y and the orientation) within it. Given a series of control inputs and sensor observations [19,36], the map is built and the pose is estimated. We evaluated the Hector-SLAM algorithm [37] with our RPlidar sensor (P25) to build a map of our real laboratory environment, and used the adaptive Monte Carlo localisation (AMCL) [38] approach to localise the robot within the built map. The advantage of Hector-SLAM over other 2D SLAM techniques such as Gmapping [39] and Google Cartographer [40] is that it requires only laser-scan data and does not need odometry to build the map. To generate the map, the following steps have to be followed:

1. Set up the RPlidar as described in sub-subSection 6.1.5.


2. Download or clone the Hector-SLAM packages at https://github.com/tu-darmstadt-ros-pkg/hector_slam.git to your ROS
workspace, and set the coordinate frame parameters according to the instruction at the ROS wiki page http://wiki.ros.
org/hector_slam. Build the ROS workspace including the Hector-SLAM and RPlidar packages (catkin_make), then proceed
to the next step.
3. Open four terminal windows and run the required nodes, one in each terminal window (a hedged sketch of these commands is given after this list).


While all the nodes are running, navigate the robot around the environment using any of the control approaches implemented in sub-subSections 6.2.1, 6.2.2, and 6.2.3. While building the map, it is recommended to move at a low speed so that a high-quality map is created.
4. After the mapping is completed, save the map from the fourth terminal (see the command sketch after this list). Take note of the location where the map is saved; it will be required for localisation and autonomous navigation. The saved map can then be viewed on your screen.
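The following is a hedged sketch of the commands for steps 3 and 4. The rplidar_ros and hector_slam commands follow the standard ROS packages; the ROMR bring-up launch-file name is an illustrative assumption, and the map name mymap is a placeholder.

# Terminal 1: bring up the ROMR base (illustrative launch-file name)
roslaunch romr_ros romr_bringup.launch
# Terminal 2: start the RPlidar driver (standard rplidar_ros package)
roslaunch rplidar_ros rplidar.launch
# Terminal 3: start Hector-SLAM (example launch file from hector_slam_launch;
# the coordinate-frame parameters must be set as described above)
roslaunch hector_slam_launch tutorial.launch
# Terminal 4: visualise the laser scans and the growing map
rosrun rviz rviz

# Step 4: save the finished map (writes mymap.pgm and mymap.yaml to the current directory)
rosrun map_server map_saver -f mymap
# View the saved map, e.g. by serving it with map_server and displaying it in Rviz
rosrun map_server map_server mymap.yaml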

Fig. 18 shows the map built with the ROMR in the virtual laboratory environment, and Fig. 19a shows the map built in the real laboratory environment. For localisation, the employed AMCL approach uses a particle filter to track the pose of the robot [41]. It maintains a probability distribution over the set of all possible robot poses [38] and updates this distribution using data from the ROMR odometry and the RPlidar scan (P25). Fig. 19b shows the localisation of the ROMR within the 2D occupancy grid map, where the dark green clusters denote the AMCL particles representing the estimates of the robot's location.
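A minimal sketch of how the saved map and AMCL can be brought up for localisation is given below. The amcl_diff example launch file ships with the ROS navigation stack and assumes a differential-drive odometry model; its parameters (laser topic, frame names, particle counts) would still need to be tuned for the ROMR, and mymap.yaml is the placeholder map from the previous step.

# Terminal 1: serve the previously saved occupancy grid map
rosrun map_server map_server mymap.yaml
# Terminal 2: start AMCL with the differential-drive example configuration
# (parameters must be adapted to the ROMR's topics and frames)
roslaunch amcl amcl_diff.launch
# Terminal 3: visualise the particle cloud and the estimated pose in Rviz
rosrun rviz rviz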

7.4. Proposed maintenance for the ROMR

Finally, to extend the life span of the ROMR, both predictive and corrective maintenance are necessary. The aim is to keep the robot working at maximum efficiency. Before each operation, predictive maintenance is proposed: listening for abnormal noise, visually inspecting all parts of the robot, checking the energy storage level, and checking for vibrations, mechanical defects, improper connections, calibration errors, etc. This reduces the probability of failure during operation. Corrective maintenance, on the other hand, should be performed after a failure has been detected.

Fig. 18. Generation of the 2D occupancy grid map of the environment using the 360° lidar sensor (P25) and the Hector-SLAM algorithm [37]. (a) Virtual laboratory world in Gazebo; the blue lines represent the lidar scan. (b) Occupancy grid map of the operational environment in Rviz. The pale grey areas indicate the unoccupied (free) spaces that the robot can navigate, the black lines represent occupied areas that are not traversable by the robot, and the green line represents the robot's trajectory.


Fig. 19. Applying ROMR to a real-world SLAM problem. (a) The 2D occupancy grid map of the operational environment generated with the P25 lidar and the Hector-SLAM algorithm; the robot's trajectory is represented by the green line. (b) Localising the ROMR within the map with the adaptive Monte Carlo localisation (AMCL) algorithm; the dark green clusters are the AMCL particles that represent the estimates of the robot's location.

8. Conclusion

In this paper, we presented ROMR, a ROS-based open-source mobile robot for research and industrial applications. We provided detailed information about the hardware design, the architecture, the operating instructions, and the advantages it offers compared to commercial platforms. The entire design utilises commercially available off-the-shelf electronic components, additive manufacturing technologies and aluminium profiles to speed up re-prototyping of the framework for custom or general-purpose applications. We implemented several control techniques that enable non-robotics experts to operate the robot easily and intuitively. Furthermore, we demonstrated the applicability of the ROMR to logistics problems by implementing navigation and simultaneous localisation and mapping (SLAM) algorithms, which are fundamental prerequisites for autonomous robots. The experimental validation of the robustness and performance is illustrated in the video at https://osf.io/ku8ag. Future work will focus on porting the whole platform to ROS 2. As an open-source platform, ROMR grants the scientific community permission to use all the design files published at https://doi.org/10.17605/OSF.IO/K83X7. This open-source strategy supports rapid progress in the development of intelligent mobile robots and dexterous systems.

CRediT authorship contribution statement

Linus Nwankwo: Conceptualization, Software, Writing - original draft. Clemens Fritze: Software, Writing – review &
editing. Konrad Bartsch: Construction and CAD design. Elmar Rueckert: Supervision, Validation, Writing – review & editing.

Declaration of Competing Interest

The authors declare the following financial interests/personal relationships which may be considered as potential com-
peting interests: This project has received funding from the Deutsche Forschungsgemeinschaft (DFG, German Research
Foundation) No #430054590 (TRAIN).

Acknowledgements

This project has received funding from the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) No
#430054590 (TRAIN).

References

[1] R. Sell, A. Rassõlkin, R. Wang, T. Otto, Integration of autonomous vehicles and Industry 4.0, Proceedings of the Estonian Academy of Sciences, vol. 68, pp.
389–394, 12 2019, DOI: 10.3176/proc.2019.4.07, URL: http://vana.kirj.ee/public/proceedings_pdf/2019/issue_4/proc-2019-4-389-394.pdf.
[2] P.A. Hancock, I. Nourbakhsh, J. Stewart, On the future of transportation in an era of automated and autonomous vehicles, Proc. Nat. Acad. Sci. 116 (16)
(2019) 7684–7691, https://doi.org/10.1073/pnas.1805770115.
[3] R. Cupek, M. Drewniak, M. Fojcik, E. Kyrkjebø, J.C.-W. Lin, D. Mrozek, K. Øvsthus, A. Ziebinski, Autonomous Guided Vehicles for Smart Industries – The
State-of-the-Art and Research Challenges, in: V.V. Krzhizhanovskaya, G. Závodszky, M.H. Lees, J.J. Dongarra, P.M.A. Sloot, S. Brissos, J. Teixeira (Eds.),
Computational Science – ICCS 2020, Springer International Publishing, Cham, 2020, pp. 330–343.
[4] A. Markis, M. Papa, D. Kaselautzke, M. Rathmair, V. Sattinger, M. Brandstötter, Safety of Mobile Robot Systems in Industrial Applications,, 05 2019,
10.3217/978-3-85125-663-5-00, URL: https://www.researchgate.net/publication/337339431_Safety_of_Mobile_Robot_Systems_in_Industrial_
Applications.
[5] G. Fragapane, R. de Koster, F. Sgarbossa, J.O. Strandhagen, Planning and control of autonomous mobile robots for intralogistics: Literature review and
research agenda, Eur. J. Oper. Res. 294 (2) (2021) 405–426, https://doi.org/10.1016/j.ejor.2021.01.019.


[6] J. Zhong, C. Ling, A. Cangelosi, A. Lotfi, X. Liu, On the Gap between Domestic Robotic Applications and Computational Intelligence, Electronics 10 (7)
(2021) 793, https://doi.org/10.3390/electronics10070793, URL: https://www.mdpi.com/2079-9292/10/7/793.
[7] J. Palacín, E. Rubies, E. Clotet, The Assistant Personal Robot Project: From the APR-01 to the APR-02 Mobile Robot Prototypes, Designs 6 (4) (2022) 66,
https://doi.org/10.3390/designs6040066, URL: https://www.mdpi.com/2411-9660/6/4/66.
[8] H. Unger, T. Markert, E. Müller, Evaluation of use cases of autonomous mobile robots in factory environments, Procedia Manufacturing, vol. 17, pp.
254–261, 2018, doi: 10.1016/j.promfg.2018.10.044. 28th International Conference on Flexible Automation and Intelligent Manufacturing (FAIM2018),
June 11–14, 2018, Columbus, OH, USA: Global Integration of Intelligent Manufacturing and Smart Industry for Good of Humanity.
[9] V.M. Pawar, J. Law, C. Maple, Manufacturing Robotics: The Next Robotic Industrial Revolution,, 2016, URL: https://www.ukras.org.uk/wp-content/
uploads/2021/01/UKRASWP_ManufacturingRobotics2016_online.pdf.
[10] R.D. Atkinson, Robotics and the Future of Production and Work, 2019.
[11] L. Pagliarini, H. Lund, The future of Robotics Technology, J. Robot. Networking Artif. Life 3 (2017) 270, https://doi.org/10.2991/jrnal.2017.3.4.12, URL:
https://www.researchgate.net/publication/315986147_The_future_of_Robotics_Technology.
[12] G.I. Fragapane, D.A. Ivanov, M. Peron, F. Sgarbossa, J.O. Strandhagen, Increasing flexibility and productivity in Industry 4.0 production networks with
autonomous mobile robots and smart intralogistics, Ann. Oper. Res. 308 (2022) 125–143, https://doi.org/10.1007/s10479-020-03526-7.pdf.
[13] F. Grimminger, T. Flayols, J. Fiene, A. Badri-Spröwitz, L. Righetti, A. Meduri, M. Khadiv, J. Viereck, M. Wuthrich, M. Naveau, V. Berenz, S. Heim, F.
Widmaier, An Open Torque-Controlled Modular Robot Architecture for Legged Locomotion Research, IEEE Robot. Autom. Lett. (2020), https://doi.org/10.1109/LRA.2020.2976639. URL: https://arxiv.org/abs/1910.00093.
[14] N.B. Hui, D.K. Pratihar, Design and Development of Intelligent Autonomous Robots, Springer Berlin Heidelberg, Berlin, Heidelberg, 2010, pp. 29–56,
https://doi.org/10.1007/978-3-642-11676-6_3.
[15] D. Betancur-Vásquez, M. Mejia-Herrera, J. Botero-Valencia, Open source and open hardware mobile robot for developing applications in education and
research, HardwareX 10 (2021), https://doi.org/10.1016/j.ohx.2021.e00217 e00217.
[16] W. Jo, J. Kim, R. Wang, J. Pan, R.K. Senthilkumaran, B.-C. Min, SMARTmBOT: A ROS2-based Low-cost and Open-source Mobile Robot Platform, arXiv
preprint arXiv:220308903, 2022, URL: https://doi.org/10.48550/arXiv.2203.08903.
[17] R. Prinz, R. Bulbul, J. Scholz, M. Eder, G. Steinbauer-Wagner, Off-Road Navigation Maps for Robotic Platforms using Convolutional Neural Networks,
AGILE: GIScience Ser. 3 (2022) 55, https://doi.org/10.5194/agile-giss-3-55-2022, URL: https://agile-giss.copernicus.org/articles/3/55/2022/.
[18] S. Kolski, Mobile Robots: Perception & Navigation, IntechOpen, London, United Kingdom, Feb 2007, 10.5772/36, URL: https://doi.org/10.5772/36.
[19] U. Frese, R. Wagner, T. Röfer, A SLAM overview from a user's perspective, KI 24 (2010) 191–198, https://doi.org/10.1007/s13218-010-0040-4.
[20] S. Thrun, Probabilistic Algorithms in Robotics, AI Mag. 21 (2000) 07.
[21] J. Shabbir, T. Anwer, A survey of deep learning techniques for mobile robot applications, arXiv preprint arXiv:180307608, 2018, URL: https://arxiv.org/
abs/1803.07608.
[22] M. Quigley, K. Conley, B. Gerkey, J. Faust, T. Foote, J. Leibs, R. Wheeler, A. Ng, ROS: an open-source Robot Operating System, vol. 3, 01 2009, URL: http://
robotics.stanford.edu/ang/papers/icraoss09-ROS.pdf.
[23] wiki.ros.org, Learning URDF Step by Step, ROS Wiki, URL: http://wiki.ros.org/urdf/Tutorials, available online.
[24] odriverobotics.com, Getting Started, ODrive Documentation, URL: https://docs.odriverobotics.com/v/0.5.4/getting-started.html, available online.
[25] odriverobotics.com, ODrive Pro Documentation, URL: https://docs.odriverobotics.com/v/latest/ground-loops.html, available online.
[26] G. Coviello, G. Avitabile, Multiple Synchronized Inertial Measurement Unit Sensor Boards Platform for Activity Monitoring, IEEE Sens. J. (2020),
https://doi.org/10.1109/JSEN.2020.2982744. URL: https://www.researchgate.net/publication/340110168_Multiple_Synchronized_Inertial_
Measurement_Unit_Sensor_Boards_Platform_for_Activity_Monitoring.
[27] lastminuteengineers.com, How nRF24L01+ Wireless Module Works & Interface with Arduino, URL: https://lastminuteengineers.com/nrf24l01-
arduino-wireless-communication/.
[28] D. Madison, How to Use an RC Controller with an Arduino, Parts Not Included (blog), URL: https://www.partsnotincluded.com/how-to-use-an-rc-controller-with-
an-arduino/, available online, August 27, 2020.
[29] L. Kaul, Hoverboard motors turned into an RC skater, Arduino Blog, URL: https://blog.arduino.cc/2019/06/10/hoverboard-motors-turned-into-an-rc-
skater/, available online, June 10, 2019.
[30] N. Rottmann, N. Studt, F. Ernst, E. Rueckert, ROS-Mobile: An Android application for the Robot Operating System, arXiv preprint arXiv:201102781,
2020.
[31] G. Fu, E. Azimi, P. Kazanzides, Mobile Teleoperation: Feasibility of Wireless Wearable Sensing of the Operator’s Arm Motion, IEEE/RSJ International
Conference on Intelligent Robots and Systems (IROS) (January 2022), https://doi.org/10.1109/IROS51168.2021.9636838.
[32] S. Li, J. Jiang, P. Ruppel, H. Liang, X. Ma, N. Hendrich, F. Sun, J. Zhang, A Mobile Robot Hand-Arm Teleoperation System by Vision and IMU, ArXiv, 03
2020, URL: https://doi.org/10.48550/arXiv.2003.05212.
[33] gazebosim.org, Simulate before you build, Gazebosim Wiki, URL: https://gazebosim.org/home, available online.
[34] wiki.ros.org, 3D visualization tool for ROS, ROS Wiki, URL: http://wiki.ros.org/rviz, available online August 16, 2022.
[35] F. Sanfilippo, O. Stavdahl, P. Liljeback, SnakeSIM: A ROS-based rapid-prototyping framework for perception-driven obstacle-aided locomotion of snake
robots, in: 2017 IEEE International Conference on Robotics and Biomimetics (ROBIO), pp. 1226–1231, 2017. DOI: 10.1109/ROBIO.2017.8324585.
[36] E. Rueckert, Simultaneous localisation and mapping for mobile robots with recent sensor technologies, Master's thesis, Graz University of Technology,
2010, URL: https://cps.unileoben.ac.at/wp/MScThesis2009Rueckert.pdf.
[37] S. Kohlbrecher, J. Meyer, T. Graber, K. Petersen, U. Klingauf, O. Von Stryk, Hector Open Source Modules for Autonomous Mapping and Navigation with
Rescue Robots, pp. 624–631, 01 2014, DOI: 10.1007/978-3-662-44468-9_58.
[38] D. Fox, W. Burgard, F. Dellaert, S. Thrun, Monte Carlo Localization: Efficient Position Estimation for Mobile Robots, pp. 343–349, 01 1999, URL: http://
robots.stanford.edu/papers/fox.aaai99.pdf.
[39] G. Grisetti, C. Stachniss, W. Burgard, Improved Techniques for Grid Mapping With Rao-Blackwellized Particle Filters, IEEE Trans. Rob. 23 (1) (2007) 34–
46, https://doi.org/10.1109/TRO.2006.889486, URL: https://ieeexplore.ieee.org/document/4084563.
[40] W. Hess, D. Kohler, H. Rapp, D. Andor, Real-Time Loop Closure in 2D LIDAR SLAM, in: 2016 IEEE International Conference on Robotics and Automation
(ICRA), pp. 1271–1278, 2016, URL: https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/45466.pdf.
[41] I. Rekleitis, A particle filter tutorial for mobile robot localization, Technical Report, Centre for Intelligent Machines, McGill University (2004), URL: https://www.cim.mcgill.ca/yiannis/particletutorial.pdf.


Linus Nwankwo is pursuing his PhD in robotics at the Chair of Cyber-Physical Systems (CPS), Montanuniversität Leoben,
Austria. Prior to joining CPS as a PhD student, he worked as a research intern at the Department of Electrical and Computer
Engineering, Technische Universität Kaiserslautern, Germany. In 2020, he obtained his Master of Science (M.Sc.) degree in
Automation and Robotics, a speciality in control for Green Mechatronics (GreeM) at the University of Bourgogne Franche-Comte
(UBFC), France. His research interests include robot dynamics modelling, sensor fusion, simultaneous localization & mapping
(SLAM), and path planning.

Clemens Fritze is a master’s student in Mechatronics at the Johannes Kepler Universität Linz. Prior to his master’s program, he
studied Mechanical Engineering for his bachelor’s degree at the Montanuniversität Leoben, Austria. His research interest is in
robotics.

Konrad Bartsch is a senior technician at the chair of Cyber-Physical-Systems, Montanuniversität Leoben, Austria. He studied
Building Electronics and Mechanical Engineering at Höhere Technische Bundeslehranstalt Graz-Gösting, Austria. His research
interests include cyberphysical systems, modern technologies, machine learning and robotics.

Elmar Rueckert has been the chair of Cyber-Physical-Systems Institute at the Montanuniversität Leoben, Austria since March
2021. He received his PhD in computer science at the Graz University of Technology in 2014. He worked for four years as a senior
researcher and research group leader at the Technical University of Darmstadt. Thereafter, he worked for three years as an
assistant professor at the University of Lübeck. His research interests include stochastic machine and deep learning, robotics and
reinforcement learning and human motor control. For more information visit https://cps.unileoben.ac.at/univ-prof-dr-elmar-
rueckert/.

