
Preprints (www.preprints.org) | NOT PEER-REVIEWED | Posted: 17 October 2018 | doi:10.20944/preprints201810.0372.v1

Peer-reviewed version available at Electronics 2018, 7, 430; doi:10.3390/electronics7120430

Article
PiBot: an open low-cost robotic platform with camera for STEM education
Julio Vega 1,†,‡, José M. Cañas 1,‡
1 Rey Juan Carlos University; [email protected]
* Correspondence: [email protected]; Tel.: +34-914-888-755
† Correspondence concerning this article should be addressed to Julio Vega, Department of Telematic Systems
and Computation, Rey Juan Carlos University, Camino del Molino S/N, 28934 Fuenlabrada, Madrid, Spain.
‡ These authors contributed equally to this work.

Abstract: This paper presents PiBot, a robotic platform developed to improve the teaching of robotics with vision to secondary-school students. Its computational core is the Raspberry Pi 3 controller board, and the greatest novelty of the prototype is the support developed for the powerful camera mounted on board, the PiCamera. An open software infrastructure written in Python was implemented so that students may use this camera, or even a webcam, as the main sensor of the robotic platform. Higher-level commands have also been provided to improve the learning outcomes of beginners. In addition, a 3D-printable PiBot model and its counterpart for the Gazebo simulator were developed and are fully supported. They are publicly available, so that students and educational centers that do not have the physical robot, or cannot afford its cost, can nevertheless practice, learn or teach robotics using these open platforms: the DIY-PiBot and/or the simulated PiBot.

Keywords: Teaching Robotics; Science teaching; STEM; robotic tool; Python; Raspberry Pi; PiCamera; vision system

1. Introduction
The appearance of robotic devices in the mass market, such as robotic vacuum cleaners and mops, as well as numerous existing applications and home-automation services, has made this technology increasingly present in the daily routine of society, not to mention other frequently automated tasks: withdrawing money at the ATM, automatic payment in supermarkets, or the massive use of the Internet for shopping, banking, and much more.
Furthermore, autonomous cars and drones make this technology more visible and reinforce its appeal. In fact, the short- and mid-term future will be marked by industrial production dominated by intelligent machines [1]. The presence of humans in these intelligent factories tends to be increasingly reduced and will eventually be symbolic and sporadic.
There is no doubt that a machine's capacity for taking optimal decisions in real time while simultaneously handling an enormous quantity of data is far greater than that of a human being. The so-called Industry 4.0 [2] involves the integration of complex robotic systems in factories (Figure 1 right), in logistics and in what is known as the Internet of Things, where sophisticated automatons handle an immense quantity of data to take strategic decisions for companies.
These mobile and intelligent robots need, in addition to a large computational capacity, a complex sensory system to act intelligently, not only in factories but also in human-robot interaction at a general level [3].

© 2018 by the author(s). Distributed under a Creative Commons CC BY license.



Figure 1. The Fourth Industrial Revolution: intelligent robots at Glory Ltd.

The fixed automation of structured production chains is giving way to an unpredictable world and a totally unstructured reality, which makes evident the need for a wide complementary range of sensors and actuators to attain complete autonomy [4].
Although the visual sensory modality was for years not the most used in mobile robotics (sonar and/or laser were more common sensors), at present it has become the most widely used, and it will foreseeably remain the most common in the long term because of the possibilities it offers and the computing power of current machines. Cameras are low-cost devices which are potentially very rich computationally, since they provide a lot of information.
However, visual capacity in robots, in contrast to that of living beings, is not easy to achieve. The main difficulty lies in extracting useful information from the large amount of data that a camera provides, for which good algorithms are needed.
Summarizing, the advance of Artificial Intelligence (AI), robotics and automation in society [5], and the future of work and industry in particular, converge in what is already known as the fourth industrial revolution. According to analyses by the University of Oxford [6] and the professional services firm Deloitte [7], almost half of all jobs could be occupied by robots in the next 25 years. Furthermore, as the McKinsey Global Institute shows in its latest report on the global economy [8], robots will perform the work of about 800 million jobs by 2030.

Figure 2. Different robotic prototypes for working in different educational areas

It is therefore of vital importance to incorporate technology, and specifically robotics with vision systems, into the pre-university educational system, since today's youngest students will be those who, within a decade, have to confront a labour market that will demand profiles related to the automation of systems [9]. From the educational point of view, robotics is a field where many areas converge: electronics, physics (Figure 2 left), mechanics (Figure 2 right), computer science, telecommunications, mathematics, etc.
That is why robotics is growing in importance in pre-university education, either as a field of knowledge in itself or as a tool to present technology and other subjects to young students in an attractive way. Furthermore, robotics has the power to motivate students, and this allows us to bring technology closer to boys and girls [10], using robots as a tool to present basic concepts of science [11], technology, engineering and mathematics (STEM) [12]. Students learn, almost through playing, notions which are difficult and complex to explain or to assimilate through the classic masterclass [13,14].

To support this increasing presence of educational robotics, there are many teaching frameworks used to teach robotics to children, from those focused on primary education to more powerful ones oriented to secondary education and high school. They are usually composed of a concrete robotic platform, that is to say a robot, which is programmed in a certain language using software tools. Different exercises, challenges or projects are then proposed to the students (practice activities). They teach the basic operation of sensors and actuators, and the rudiments of programming.

2. Educational robots
Most of the robots found among commercial educational platforms are closed. It is worth mentioning the well-known Lego kits, which have been present for some years in educational robotics, in different versions: Mindstorms RCX, NXT, EV3 and WeDo [14,15].
Nevertheless, Arduino boards appeared some years ago in an effort to work around the closed-platform limitation, providing cheaper and more adaptable robotic platforms. Arduino is a free hardware board to which a wide variety of low-cost robotic components can be added [15-19]. Thus, beginning with a basic and affordable Arduino platform, teachers and students can freely adapt it to their needs, developing an effective and low-cost robot, as described in [20-23].

Figure 3. Robots Thymio, VEX IQ and VEX CORTEX

Other platforms are Thymio (Figure 3 left) [24-26], Meet Edison's robots, and the VEX robots (Figure 3 middle and right), as well as simulated environments such as TRIK-Studio [19,27] or Robot Virtual Worlds (RVW) [28].
In addition, different software environments can be found. Lego has its own option, the EV3 software, as Arduino does with the simple Arduino-IDE text language; not to mention Scratch [23,29] or its variants: Blockly [30], Bitbloq or VPL. All of them contain graphic blocks that typically connect in sequence in a graphic editor. Languages such as the mentioned Arduino-IDE language, or C++ (on which Arduino is based), are not suitable for pre-university students due to their complexity, but they are widely used at university level.
Exploring the existing literature, we found many other works presenting robotic platforms for educational purposes and their underlying philosophy. In [31], the authors focused on a 6 Degree-of-Freedom (DOF) serial robotic arm as a platform for training purposes. They derived the kinematic and dynamic models of the robot to facilitate the controller design. It includes an on-board camera to scan the arm workspace.
Alers and Hu presented in [32] the AdMoVeo robotic platform, developed for the purpose of teaching industrial design students basic programming skills. It is a platform which lets students explore their creativity and their passions in graphical and behavioral design.
Jamieson asked in [17] whether Arduino is a platform suitable for teaching an embedded systems course to computer engineers and computer scientists. He described a project-based-learning embedded systems course that he taught, identifying which topics it covered compared to the IEEE/ACM recommendations. He concluded that students expressed high praise for the Arduino platform and that the students' final projects were better and more creative than those of previous years.

In [33], the authors presented eBug, a low-cost and open robotics platform designed for undergraduate teaching and academic research in areas such as multimedia smart sensor networks, distributed control, mobile wireless communication algorithms and swarm robotics. This prototype used the Atmel AVR XMEGA 8/16-bit microcontroller.
Miniskybot was presented in [34]: a mobile robot aimed at educational purposes which is 3D-printable on low-cost RepRap-like machines, fully open source (including its mechanics and electronics), and designed exclusively with open source tools. It is based on an 8-bit PIC16F876A microcontroller.
Nevertheless, there is no system, and even less a guided one, that maintains a constant level of motivation and challenge, especially one where vision plays an important role. In fact, the majority of the kits and robotic platforms on the market are focused on performing a few specific tasks, or are designed to arouse the interest of the youngest and of university students in robotics, but not so that students in pre-university courses acquire correct and complete training in programming, something which is in great demand and widespread in almost any degree. Although other kits exist which are more specialized in specific scientific fields [35], the proposed framework goes further and provides all the necessary open tools, for both students and teachers [36], required to develop a complete academic year in a versatile way, putting at their disposal numerous sophisticated algorithms, including vision, with a pleasant and intuitive interface.
In addition, an enormous gap has been identified between the level of academic training at university level in scientific and technological degrees and the official curriculum implemented at pre-university levels, specifically in science subjects at Secondary Education level. Thus, this work proposes to mitigate this academic gap by developing a complete teaching framework for robotics with vision, which does not exist today, integrating:

1. A Raspberry Pi-based open hardware platform, economically suitable for secondary education centers to satisfy the needs of a complete class, but at the same time standardized and powerful, allowing the execution of robotics-with-vision algorithms.
2. An open software infrastructure that is simple and intuitive for young students to manage, but at the same time powerful and versatile, incorporating enough resource libraries to provide practical exercises on programming robots with vision, sufficient in both number and complexity so as to continuously motivate students [37], as well as diverse examples.
3. A wide repertoire of practice activities that can be followed during a complete academic year and that includes sufficient and properly staggered sessions for correct assimilation by the students [38].

3. Design of the PiBot tool for STEM education

After the analysis of the most relevant available educational robots, the design of the newly proposed robot is described in this section. It takes advantage of new possibilities offered by different technologies and aims to overcome some limitations observed in current platforms (such as having no camera, or not being usable with programming languages like Python). It is not intended for primary education or the first year of secondary education, where visual languages like Scratch are a better starting point. Instead, it is designed for secondary education above 12 years of age, and even for introductory university courses.
Better tools improve the learning process in kids. The PiBot educational tool follows a three-part architecture, as shown in Figure 4: the robot platform, the software drivers and the exercises. The robot and the drivers can be seen as the infrastructure for the exercises, which can be organized in courses or levels and focus on different aspects of robotics.

Figure 4. Architecture of the PiBot tool: hardware (platform) and software (drivers and exercises)
The creation of the PiBot tool has followed several design principles:

1. Low cost (under 180 euros), to make it affordable for most schools and students.
2. Open: first, the robot hardware should be easily assembled by the students themselves, who may also make most of the pieces with a 3D printer. This way the assembly of a PiBot can be an educational activity, and one of interest to the makers community. Second, the drivers should be open source and publicly available.
3. Compatibility with the common sensors and actuators of (Arduino-based) educational robots. This way, if an Arduino-based robot is already available, the transition to the PiBot is quite affordable; and, in any case, the acquisition of components for the PiBot is very simple, given the wide availability of components for Arduino.
4. Vision included in an easy way. Cameras are very useful sensors, and this platform should expose students to vision in an easy and practical way.
5. Support for not only the real robot but also a simulated one. This way, even with no physical platform, the PiBot tool may be used to teach and learn robotics.
6. Python as the programming language, because of its simplicity, its expressive power, and because it is widely used at higher levels of education and programming.

4. PiBot robotic platform

Robots are typically composed of a computer or a microprocessor, several sensors, actuators and some form of connectivity. The sensors provide information about the environment, the computer runs the robot software, and the actuators allow the robot to do things like moving itself or performing actions in the world.

4.1. Hardware design

The block diagram of the PiBot hardware is shown in Figure 5. The main computer is a Raspberry Pi 3 controller board (Figure 7 middle). It is more powerful than Arduino processors, keeps the cost low, and runs a functional operating system based on Linux: specifically, the Raspbian Stretch distribution. This allows the use of standard development tools from the Linux community, as well as the use of the PiCamera.
The sensors mounted onboard the PiBot are:
• An ultrasound sensor, model HC-SR04 (Figure 6 left)
• Infrared sensors
• Motor encoders
• The Raspberry PiCamera (Figure 6 right). It is connected to the computer through a dedicated data bus. Its technical details are included in Table 1.
The US, IR and encoder sensors are connected to the Raspberry Pi board through several GPIO (General Purpose Input/Output) ports. These ports allow the connection and control of several devices at the same time and require some configuration to serve as data inputs or outputs [39].
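As a concrete illustration, the following minimal Python sketch reads the HC-SR04 ultrasound sensor through two GPIO ports with the standard RPi.GPIO library. It is only a sketch under assumed wiring: the trigger and echo pin numbers are not the actual PiBot assignments.

import time
import RPi.GPIO as GPIO

TRIG = 23  # assumed BCM pin for the trigger line
ECHO = 24  # assumed BCM pin for the echo line

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def read_ultrasound():
    # a 10-microsecond pulse on TRIG starts one measurement
    GPIO.output(TRIG, True)
    time.sleep(0.00001)
    GPIO.output(TRIG, False)
    start = end = time.time()
    # ECHO stays high for a time proportional to the echo travel time
    while GPIO.input(ECHO) == 0:
        start = time.time()
    while GPIO.input(ECHO) == 1:
        end = time.time()
    # sound travels at ~343 m/s; halve the round trip to get the distance
    return (end - start) * 34300.0 / 2.0  # distance in cm

print("%.1f cm" % read_ultrasound())
GPIO.cleanup()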
The actuators mounted onboard the PiBot are two DC motors (Parallax Feedback 360° High Speed Servo; Figure 7 left). They provide the PiBot with differential-drive movement. The motors include encoders and are connected to the main processor through the GPIO ports.

Figure 5. Hardware design of the PiBot robot

Figure 6. Ultrasonic sensor model HC-SR04, IR sensors and Raspberry PiCamera


Sensor type: Sony CMOS 8-Mpx
Sensor size: 3.6 x 2.7 mm (1/4" format)
Pixel count: 3280 x 2464 (active pixels)
Pixel size: 1.12 x 1.12 um
Lens: f = 3.04 mm, f/2.0
Angle of view: 62.2 x 48.8 degrees
SLR lens equivalent: 29 mm

Table 1. PiCamera (v2.1 board) technical intrinsic parameters

All these components are assembled into a body made of 3D-printable pieces. The 3D-printable models of all the chassis pieces are publicly available1. The body also houses a 10,000 mAh battery which provides power to all the onboard electronic devices. An official list of components and some tentative providers are also available at the same webpage, so that anyone can buy the components, print the pieces and build a PiBot.

4.2. Simulated robot

For the simulation of the PiBot platform, the Gazebo simulator2 has been selected. It is an open source robotics simulator powered by Open Robotics and the de facto standard in the robotics scientific community. It provides a physics engine, so collisions and realistic movements are supported.

1 https://github.com/JdeRobot/JdeRobot/tree/master/assets/PiBot
2 http://gazebosim.org

Figure 7. Motors, Raspberry Pi board and PiBot made with 3D-printable pieces

The students may program an exercise and run their code seamlessly on either the physical PiBot or the simulated PiBot inside Gazebo, at will. The student code lies on top of the PiBot API (Application Programming Interface), which is used to get sensor readings and to command actuator orders. The API is exactly the same in both cases: in the first, some drivers connect to the physical devices; in the second, other drivers exchange messages with the simulator to implement the same functions.
In order to support this new robot, a 3D model was developed (Figure 8). In addition, several plugins were integrated for the simulation of the onboard camera, the distance sensor (sonar) and the IR sensors. IR support has been implemented using small cameras: each IR sensor consists of a 4x4 pixel camera plus additional code that computes the virtual IR measurement from the values of those pixels. Movement was also supported with the corresponding Gazebo plugin, which additionally provides a 2D position sensor (like encoders).
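A minimal sketch of that virtual-IR idea is shown below: a tiny grayscale image is reduced to a binary reading, as a real digital IR sensor would deliver. The 4x4 grayscale layout matches the description above, but the threshold value and the output convention are assumptions.

import numpy as np

def virtual_ir(pixels, threshold=128):
    # pixels: 4x4 array of grayscale values rendered by the tiny virtual camera
    # returns 1 when a dark line is under the sensor, 0 otherwise
    return 1 if np.asarray(pixels, dtype=float).mean() < threshold else 0

print(virtual_ir(np.zeros((4, 4))))  # dark surface under the sensor -> 1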
The 3D PiBot model and all the developed plugins are publicly available3,4.

Figure 8. PiBot robot simulated in Gazebo

3 https://github.com/JdeRobot/JdeRobot/tree/master/assets/gazebo
4 https://github.com/JdeRobot/JdeRobot/tree/master/src/drivers/gazebo/plugins/pibot

5. Software infrastructure

Python was chosen as the programming language for supporting the PiBot because of its simplicity, its expressive power, and because it is widely used at higher levels of education and in many industries (in conjunction with powerful libraries). It is a textual, interpreted, object-oriented language, easier to learn than other widely used languages such as C/C++ or Java, yet very powerful at the same time. It is a real-world language, but one accessible to pre-university students.
With the proposed educational tool, the students program their exercises in Python by writing a file, for example exercise.py, with a text editor. That program uses the PiBot Application Programming Interface (API) to control the robot; the API contains a set of natural methods to read the measurements of the robot sensors (US, IR, camera) and methods to give commands to the robot actuators (DC motors). The most important API methods are detailed in Table 2.
Two different libraries have been developed to support that API: one runs onboard the PiBot Raspberry Pi, the other communicates with the simulated robot inside Gazebo. As the programming interface is the same in both cases, the student's application works interchangeably on the physical platform and on the simulated one. The actual robot is selected by specifying it in the configuration file.
Using this API, students concentrate on the algorithm they are developing, on the robot's intelligence, avoiding low-level details such as ports and connectivity with the robot, which are kept in the library configuration file.

Actuators: RightMotor(V), LeftMotor(V), move(V, W)
Sensors: readUltrasound, readInfrared, getImage, getColoredObject(color), getDistancesFromVision, getRobotPosition

Table 2. PiBot Application Programming Interface (API)

The API methods can be divided into raw methods and cooked methods. Raw methods provide access to a single device, like readUltrasound, readInfrared or getImage; RightMotor(V) commands a desired speed to the right motor alone, as LeftMotor(V) does for the left one. The cooked methods provide a simpler and more compact way to control the whole robot, plus two vision functions that extract useful information from the image in an easy way. They are detailed later.
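To make the workflow concrete, the following exercise.py sketch stops and turns when an obstacle is close, using only method names from Table 2. The import path, the constructor and the units of the readings are assumptions, not the documented interface.

from pibot import PiBot  # hypothetical module name

robot = PiBot()
while True:
    distance = robot.readUltrasound()  # distance to the nearest obstacle
    if distance > 20:
        robot.move(0.2, 0.0)   # free path: advance with V = 0.2, W = 0
    else:
        robot.move(0.0, 1.0)   # obstacle ahead: rotate in place (V = 0)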

5.1. Drivers for the real PiBot

To support the real PiBot, two modules were programmed, as shown in Figure 9. One handles the PiCamera and the other deals with the GPIO devices (US sensor, IR sensors and motors). They were programmed in Python using standard libraries available in the Python community, and they are publicly available5. The image processing functionality also relies on OpenCV.

Figure 9. Connection of the library with the real PiBot
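Conceptually, the camera side of this driver grabs frames and hands them over as OpenCV-compatible buffers. A minimal sketch with the standard picamera library follows; the resolution and the direct-to-numpy capture are illustrative choices, not the actual driver code.

import numpy as np
from picamera import PiCamera

camera = PiCamera()
camera.resolution = (320, 240)

def get_image():
    # capture the current frame straight into a numpy buffer, ready for OpenCV
    frame = np.empty((240, 320, 3), dtype=np.uint8)
    camera.capture(frame, format='rgb')
    return frame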

5.2. Drivers for the simulated PiBot

To support the simulated PiBot on Gazebo, a specific library was developed which connects with the previously mentioned plugins, exchanging messages through the ICE communication layer. It obtains sensor readings and camera images through network interfaces built in the JdeRobot project6. It is also publicly available7.

5 https://github.com/JdeRobot/JdeRobot/tree/master/src/drivers/PiBot/real

Figure 10. Connection of the library with the simulated PiBot

5.3. Movement control

Regarding the motors, beyond the raw methods RightMotor(V) and LeftMotor(V), a new cooked method is provided for simpler control of the robot movements: move(V, W). This method accepts as parameters the desired linear speed V and the desired rotation speed W, and internally translates them into commands to the left and right motors so that the whole robot moves according to V and W. It takes into account the geometry of the PiBot and its wheels.
This function provides general 2D movement control: the PiBot may rotate without displacement (setting V = 0 and using W) either left or right (depending on the sign of W), may advance in a straight line (setting W = 0 and using V) either backwards or forwards (depending on the sign of V), and may move in generic arcs, advancing and rotating at the same time.
It is a speed control, which is more useful than position-based control when programming reactive behaviors.
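The internal translation can be sketched with the classic differential-drive inverse kinematics; here the wheel radius R and wheel separation L are assumed values, and the two motor functions are stand-ins for the raw methods of Table 2.

R = 0.03  # wheel radius in meters (assumed)
L = 0.12  # wheel separation in meters (assumed)

def RightMotor(w):  # stand-in for the raw API method
    print("right wheel: %.2f rad/s" % w)

def LeftMotor(w):   # stand-in for the raw API method
    print("left wheel: %.2f rad/s" % w)

def move(V, W):
    # each wheel's linear speed is V plus/minus the rotation contribution W*L/2;
    # dividing by R converts it into an angular wheel speed
    RightMotor((V + W * L / 2.0) / R)
    LeftMotor((V - W * L / 2.0) / R)

move(0.2, 0.0)  # straight line
move(0.0, 1.0)  # rotation in place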

5.4. Vision support

One advantage of the PiBot educational tool is its support for the camera, which allows many new possible exercises with vision and vision-based behaviors. It also introduces the students to computer vision in a simple and natural way. Two functions (getColoredObject(color) and getDistancesFromVision) have been included so far in the PiBot API to easily extract useful information from images, because the raw method getImage and raw pixel processing are too complex for high school students. They have been implemented in a vision library which performs complex image processing, hides all that complexity inside, and is really simple and intuitive to use. It internally employs the OpenCV library, a standard in the computer vision community.

6 https://jderobot.org
7 https://github.com/JdeRobot/JdeRobot/tree/master/src/drivers/PiBot/Gazebo

First, the cooked method getColoredObject(color) accepts the desired color as an input parameter and filters, in the current camera image, all the pixels of that color (some colors are already predefined in the library: orange, red, blue...). It delivers as output the position of the colored object inside the image (its mean X and Y values) and its size (the number of detected pixels of that color). It works with single objects, as can be seen in Figure 11.

Figure 11. getColoredObject function for orange color

It uses the HSV color space and OpenCV filtering methods. This function of the PiBot API allows for exercises like Object-Following, which is detailed in the next section.
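A sketch of such a color filter along those lines, using OpenCV and the HSV color space, is given below. The HSV bounds shown for orange are assumptions; the library predefines its own values.

import cv2
import numpy as np

def get_colored_object(image_bgr, lower_hsv, upper_hsv):
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # no pixels of that color in view
    return xs.mean(), ys.mean(), xs.size  # mean X, mean Y, size in pixels

# usage with assumed HSV bounds for orange:
# result = get_colored_object(frame, (10, 100, 100), (25, 255, 255))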
Second, the cooked method getDistancesFromVision computes the distance to the obstacles in front of the PiBot and provides a depth map from the robot to the surrounding objects. A sonar sensor typically measures distances in just one direction; using the camera instead, the angular scope is extended to the camera's field of view (around 60 degrees).
The developed vision library contains an abstract model of the camera (pin-hole) and several projective geometry algorithms. The camera parameters are known (the K matrix and the camera's relative position inside the robot). As the PiBot has only a single camera, no stereo technique can be used for depth estimation. Instead, the implementation of the getDistancesFromVision method assumes that all objects lie on the floor and that the floor surface has a uniform color (ground hypothesis). It sweeps all the columns of the current image starting from the bottom. When the first edge pixel is found in a column, it is backprojected into 3D space using ray tracing and the pin-hole camera model. The intersection of that ray with the floor plane is the estimated 3D position of the edge, and its distance to the robot is then computed. In this way, the 3D point corresponding to the bottom pixel of each obstacle in the image can be obtained (Figure 12).

Figure 12. Ground Hypothesis assumes all objects are on the floor
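The per-column computation can be sketched as follows. The intrinsic matrix K, the camera height and the assumption that the optical axis is parallel to the floor are all illustrative; the real library uses the calibrated PiBot values and the full camera pose.

import numpy as np

K = np.array([[313.0, 0.0, 160.0],  # assumed pin-hole intrinsics for 320x240
              [0.0, 313.0, 120.0],
              [0.0, 0.0, 1.0]])
CAM_HEIGHT = 0.10  # assumed camera height above the floor, in meters

def distance_for_column(edges, u):
    # edges: binary edge image; u: the image column being swept
    rows = np.nonzero(edges[:, u])[0]
    if rows.size == 0:
        return None  # no edge pixel found in this column
    v = rows.max()  # first edge pixel seen from the image bottom
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # backprojected ray
    if ray[1] <= 0:
        return None  # ray points above the horizon, never meets the floor
    t = CAM_HEIGHT / ray[1]  # scale at which the ray crosses the floor plane
    x, _, z = t * ray  # 3D point on the floor, in the camera frame
    return float(np.hypot(x, z))  # distance from the robot, in meters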

For instance, Figure 13 shows on the left side the image coming from the camera, with the white floor (the battery appearing in the image was safely ignored, since only green pixels were taken into account for explanatory purposes in this test). On the right side, the estimated depths for the green object are displayed as red points, and the field of view is also shown as a white trapezoid. The estimated distances are reasonably consistent and correct.

Figure 13. Example of a visual sonar reading with an object at 25 cm, shown using the 3D scene simulator

This getDistancesFromVision function of the PiBot API allows for exercises like robot navigation among obstacles, for instance the vision-based obstacle avoidance detailed in the next section.

6. Exercises for students using PiBot

Finally, a plan of activities using the PiBot is described.

6.1. Basic exercises

Students can begin by assembling the different components of the PiBot and reviewing some basic concepts of electronics, so that they have no problems when connecting components such as the infrared or ultrasound sensors (Figure 14).

Figure 14. Practice activity with PiBot to handle an ultrasonic sensor

6.2. Basic behaviors

The next step consists of complete robotics projects in which students combine everything previously learnt. Firstly, they can develop classic projects like line tracking using the infrared sensors (Figure 15) or navigation avoiding obstacles by means of ultrasound (Figure 16).
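A minimal line-tracking sketch in that spirit is shown below, assuming a robot object exposing the Table 2 methods. That readInfrared returns one binary reading per sensor (1 when the dark line is underneath) is an assumption about the interface.

def follow_line(robot):
    while True:
        left, right = robot.readInfrared()
        if left and right:
            robot.move(0.2, 0.0)   # line under both sensors: go straight
        elif left:
            robot.move(0.1, 0.8)   # drifted right: turn left
        elif right:
            robot.move(0.1, -0.8)  # drifted left: turn right
        else:
            robot.move(0.0, 0.5)   # line lost: rotate to search for it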

6.3. Vision-based behaviors

Secondly, students can develop more advanced robotics projects, using vision as the main sensor. Some projects developed so far are: following a colored object (Figure 17), line tracking (Figure 18) and bump-and-go (Figure 19).

Figure 15. Line-tracking practice task in both real and simulated PiBot platforms using the IR sensors

Figure 16. Navigation practice avoiding obstacles through US in PiBot

Figure 17. Navigation exercise of following a colored object using vision in both real and simulated
PiBot platforms

Figure 18. Line-tracking practice task in both real and simulated PiBot platforms using vision

Figure 19. Exercise of avoiding obstacles by visual depth estimation in PiBot

This past July, a robotics workshop was taught to ten teachers at the Fuenlabrada Campus of Rey Juan Carlos University (Madrid) (Figure 20), training them to use the developed framework with the PiBot as the robotic platform.

Figure 20. Workshop at Rey Juan Carlos University to train teachers for teaching with JdeRobot-Kids
framework using PiBot

7. Conclusions
This research focuses on incorporating robotics, and robots with vision, into the classroom to train pre-university students, satisfying the demands imposed by the Digital Age society and the motivational needs detected in students, who still study in an educational system yet to be adapted to the so-called Industrial Revolution 4.0.
Although there are numerous educational robotics kits on the market, most of them are aimed at very young students. They are generally based on building their own robotic platforms with their own programming environments, far from employing more standardized programming languages. They usually have somewhat limited capabilities, which means that these tools tend to sustain only a low level of motivation in students in the mid term (for instance, in students who have already followed an introductory robotics course). Furthermore, given the complexity involved in the processing of images, cameras are not usually included in educational robotics frameworks, despite their great versatility and extensive use in real-life applications.
After studying the current market of existing educational robotics kits and conducting an in-depth analysis of what the future holds in the short and mid term in terms of labor market demands, the authors (one of them an experienced Secondary Education teacher) detected a deficiency in the teaching-learning process of robotics at the pre-university curricular level. Therefore, a complete new educational tool was developed, which includes:

• A robotic platform based on the free hardware controller board Raspberry Pi 3. This platform was chosen for several reasons: low cost, power, versatility, standardization and the inclusion of a camera with its own data bus, the PiCamera. Thus, a fully functional robot, the PiBot, was developed, together with its counterpart for the Gazebo simulator and a DIY 3D-printable model. Thanks to the GPIO ports on the board, various sensors and actuators (both real and simulated) have been connected, in addition to its own camera.
• A software infrastructure developed in the Python language, which facilitates the students' programming of the robot, with simple and intuitive functions to handle the different sensors and actuators. At the same time, this infrastructure has great potential owing to its handling of a camera as a sensor.

• A wide set of exercises that supports students in their progression in learning to program robots with vision.

Regarding future lines of work, one intended short-term improvement is to extend the vision support: (a) developing new practical sessions with vision, such as the detection and tracking of people's faces, and implementing a visual memory in the PiBot; (b) mounting the camera on a servo, so that the current vision range could be extended to a wider field of view thanks to the movement of the camera.
It is also intended to develop support for the encoders of the PiBot motors, which would allow more sophisticated position-based navigation.
Finally, the authors are also working on supporting PiBot programming with the popular visual language Scratch, so that younger students can start programming this robot in a very simple way. With the same PiBot platform they could start learning robotics with Scratch and later jump to Python to face more appealing exercises.

References

1. Mies, G.; Zentay, P. Industrial Robots meet Industry 4.0. XII Hadmernok, 2017.
2. Schwab, K. The Fourth Industrial Revolution; World Economic Forum, 2016.
3. Vega, J.; Cañas, J. Sistema de atención visual para la interacción persona-robot. Workshop on Interacción persona-robot, Robocity 2030, pp. 91-110. ISBN: 978-84-692-5987-0, 2009.
4. Arbel, T.; Ferrie, F. Entropy-based gaze planning. Image and Vision Computing 2001, 19(11), 779-786.
5. Solove, D. The Digital Person: Technology and Privacy in the Information Age; 2004.
6. Frey, C.; Osborne, M. The future of employment: how susceptible are jobs to computerisation?; 2013.
7. Deloitte. From brawn to brains: The impact of technology on jobs in the UK; 2015.
8. McKinsey Global Institute. Jobs lost, jobs gained: workforce transitions in a time of automation, 2017.
9. UK-RAS. Manufacturing Robotics: The Next Robotic Industrial Revolution, 2016.
10. Rodger, S.H.; Walker, E.L. Activities to attract high school girls to computer science. National Science Foundation's Directorate for Education and Human Resources under the Model Projects for Women and Girls, 1996.
11. Altin, H.; Pedaste, M. Learning approaches to applying robotics in science education. Journal of Baltic Science Education 2013, 12, 365-377.
12. Mubin, O.; Stevens, C.J.; Shahid, S. A review of the applicability of robots in education. Technology for Education and Learning 2013.
13. Cerezo, F.; Sastrón, F. Laboratorios Virtuales y Docencia de la Automática en la Formación Tecnológica de Base de Alumnos Preuniversitarios. Revista Iberoamericana de Automática e Informática Industrial RIAI 2015, 12(4), 419-431. doi:10.1016/j.riai.2015.04.005.
14. Jiménez, E.; Bravo, E.; Bacca, E. Tool for experimenting with concepts of mobile robotics as applied to children education. IEEE Trans. Education 2010, 53, 88-95.
15. Navarrete, P.; Nettle, C.J.; Oliva, C.; Solis, M.A. Fostering science and technology interest in Chilean children with educational robot kits. Robotics Symposium and IV Brazilian Robotics Symposium (LARS/SBR), 2016 XIII Latin American. IEEE, 2016, pp. 121-126.
16. Araujo, A.; Portugal, D.; Couceiro, M.S.; Rocha, R.P. Integrating Arduino-based educational mobile robots in ROS. J Intell Robot Syst 2015, 77, 281-298.
17. Jamieson, P. Arduino for teaching embedded systems. Are computer scientists and engineering educators missing the boat? Miami University, Oxford, OH, 2012.
18. Chaudhary, V.; Agrawal, V.; Sureka, P.; Sureka, A. An experience report on teaching programming and computational thinking to elementary level children using Lego robotics education kit. Technology for Education (T4E), 2016 IEEE Eighth International Conference on. IEEE, 2016, pp. 38-41.
19. Filippov, S.; Ten, N.; Shirokolobov, I.; Fradkov, A. Teaching robotics in secondary school. IFAC-PapersOnLine 2017, 50, 12155-12160.

20. Junior, L.A.; Neto, O.T.; Hernandez, M.F.; Martins, P.S.; Roger, L.L.; Guerra, F.A. A low-cost and simple Arduino-based educational robotics kit. Cyber Journals: Multidisciplinary Journals in Science and Technology, Journal of Selected Areas in Robotics and Control (JSRC), December edition 2013, 3, 1-7.
21. Plaza, P.; Sancristobal, E.; Fernandez, G.; Castro, M.; Pérez, C. Collaborative robotic educational tool based on programmable logic and Arduino. Technologies Applied to Electronics Teaching (TAEE), 2016. IEEE, 2016, pp. 1-8.
22. Afari, E.; Khine, M. Robotics as an educational tool: Impact of LEGO Mindstorms. IJIET 2017, 7, 437-442.
23. Beyers, R.N.; van der Merwe, L. Initiating a pipeline for the computer industry: Using Scratch and LEGO robotics. Information Communication Technology and Society (ICTAS), Conference on. IEEE, 2017, pp. 1-7.
24. Mondada, F.; Bonani, M.; Riedo, F.; Briod, M.; Pereyre, L.; Rétornaz, P.; Magnenat, S. Bringing robotics to formal education: the Thymio open-source hardware robot. IEEE Robotics & Automation Magazine 2017, 24, 77-85.
25. Roy, D.; Gerber, G.; Magnenat, S.; Riedo, F.; Chevalier, M.; Oudeyer, P.Y.; Mondada, F. IniRobot: a pedagogical kit to initiate children to concepts of robotics and computer science. RIE 2015, 2015.
26. Magnenat, S.; Shin, J.; Riedo, F.; Siegwart, R.; Ben-Ari, M. Teaching a core CS concept through robotics. Proceedings of the 2014 Conference on Innovation & Technology in Computer Science Education. ACM, 2014, pp. 315-320.
27. Stone, A.; Farkhatdinov, I. Robotics education for children at secondary school level and above. Conference Towards Autonomous Robotic Systems. Springer, 2017, pp. 576-585.
28. Witherspoon, E.B.; Higashi, R.M.; Schunn, C.D.; Baehr, E.C.; Shoop, R. Developing computational thinking through a virtual robotics programming curriculum. ACM Transactions on Computing Education (TOCE) 2017, 18, 4.
29. Plaza, P.; Sancristobal, E.; Carro, G.; Castro, M.; Blázquez, M.; Muñoz, J.; Álvarez, M. Scratch as educational tool to introduce robotics. International Conference on Interactive Collaborative Learning. Springer, 2017, pp. 3-14.
30. Naya, M.; Varela, G.; Llamas, L.; Bautista, M.; Becerra, J.C.; Bellas, F.; Prieto, A.; Deibe, A.; Duro, R.J. A versatile robotic platform for educational interaction. Intelligent Data Acquisition and Advanced Computing Systems: Technology and Applications (IDAACS), 2017 9th IEEE International Conference on. IEEE, 2017, Vol. 1, pp. 138-144.
31. Manzoor, S.; Islam, R.U.; Khalid, A.; Samad, A.; Iqbal, J. An open-source multi-DOF articulated robotic educational platform for autonomous object manipulation. Robotics and Computer-Integrated Manufacturing 2014, 30, 351-362.
32. Alers, S.; Hu, J. AdMoVeo: A robotic platform for teaching creative programming to designers. In Learning by Playing. Game-based Education System Design and Development; 2009; pp. 410-421.
33. Dademo, N.; Lik, W.; Ho, W.; Drummond, T. eBug - an open robotics platform for teaching and research. Proceedings of the Australasian Conference on Robotics and Automation, 2011.
34. Gonzalez, J.; Valero, A.; Prieto, A.; Abderrahim, M. A new open source 3D-printable mobile robotic platform for education. Advances in Autonomous Mini Robots, 2012.
35. Schweikardt, E.; Gross, M.D. roBlocks: A robotic construction kit for mathematics and science education. Proceedings of the 8th International Conference on Multimodal Interfaces, ICMI '06, 2006.
36. Bers, M.U.; Ponte, I.; Juelich, C.; Viera, A.; Schenker, J. Teachers as designers: Integrating robotics in early childhood education. Information Technology in Childhood Education Annual 2002, 123-145.
37. Benitti, F. Exploring the educational potential of robotics in schools: A systematic review. Computers and Education 2012, 58, 978-988.
38. Ainley, J.; Enger, L.; Searle, D. Students in a digital age: Implications of ICT for teaching and learning. In International Handbook of Information Technology in Primary and Secondary Education; Voogt, J.; Knezek, G., Eds.; Springer US: Boston, MA, 2008; pp. 63-80.
39. Balachandran, S. General Purpose Input Output (GPIO). Technical report, ECE 480 Design Team 3, Michigan State University, 2009.
