
An important personal realization was that bringing together a team motivated to solve the problem was essential. Moving forward,
we had 22 members on ‘Team ACRV’, mainly
undergrad and PhD students. Managing the
team was a challenge in itself, especially since
we were based thousands of kilometres apart
in Adelaide, Brisbane and Canberra. The
rules changed and allowed more flexibility in
design for the competition in 2017, so we built
our robot from scratch, both software and
hardware. We conducted weekly full system
tests, enabling the comparison of updates
and improvements from a holistic viewpoint.
This process kept us from merely improving
subsystems and losing focus on our end goal.
This iterative and flexible approach meant
that when something went wrong, we had a
pretty good idea of what the cause was. In the end, our
solution was the only one that did not use
an industrial or humanoid arm. Instead, we
designed a Cartesian coordinate robot with a
claw and a suction gripper for a ‘hand’ and a
sliding mechanism that picked up objects from
above (pictured). We nicknamed our Cartesian
manipulation robot Cartman1.
The 2017 challenge had three stages.
First, robots picked specified objects from
an assortment of items and placed them in
boxes — the ‘pick’ task. Second, robots selected
target items out of a container and placed
them in storage — the ‘stow’ task. And third,
robots put items into storage and then lifted a
selection of them and put them into boxes — a
combined pick and stow task. Compared to
previous years, robots had less space to work
in, forcing them to deal with objects next to
or on top of each other. Another change was
that half of the objects in a task were only
revealed 45 minutes before the competition
started, so teams could not prepare in advance
by programming their robots to manipulate
specific objects. To tackle this last added
difficulty, we created a computer vision system
that could be trained on photos of new objects
taken from different angles, which were
fed into a deep neural network, a state-of-the-art
machine-learning technique. Although we didn't place in
the top three teams in the first two tasks, our
robot performed so well in the finals (the third
task) that we took home first place, including
a US$80,000 prize.
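As a rough illustration of that rapid-adaptation step, the short Python sketch below fine-tunes an ImageNet pre-trained classifier on a folder of photos of newly revealed items. It is a minimal sketch under assumed conditions, not the system we ran at the competition: the directory name new_item_photos, the object count and the hyperparameters are placeholders, and PyTorch and torchvision are assumed to be available.

# Minimal sketch, not the competition system: fine-tune a pre-trained
# classifier on a handful of photos of newly revealed items.
# 'new_item_photos' (one subfolder per object), the class count and all
# hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_NEW_OBJECTS = 16            # assumed number of newly revealed items
PHOTOS_DIR = "new_item_photos"  # hypothetical folder of photos taken from different angles

# Standard ImageNet-style preprocessing for the training photos.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

dataset = datasets.ImageFolder(PHOTOS_DIR, transform=preprocess)
loader = torch.utils.data.DataLoader(dataset, batch_size=8, shuffle=True)

# Start from a network pre-trained on ImageNet and replace only the final
# layer, so a usable model can be trained within a short preparation window.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_NEW_OBJECTS)

optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):          # a few quick passes over the small dataset
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

In the actual task, recognition alone is not enough: each new item also has to be located among objects lying next to or on top of each other, so a segmentation or detection network would be adapted in the same spirit.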
Robotics needs challenges that sit between
today's competitions and a big, unifying grand
challenge, such as the DARPA Robotics
Challenge. We recently proposed a Tidy Up My
Room Challenge, a teaser of which occurred at
the International Conference on Robotics and
Automation meeting in 2018. This challenge
asks, ‘How do you know that an object is
out of place?’ Visually, a book may look the
same on the floor or on the coffee table, yet
one place is considered ‘tidier’. The challenge
is multi-tiered, with increasing complexity
in perception, reasoning and manipulation.
It provides a way of benchmarking and
comparing robotic systems on a task level,
instead of focusing on sub-problems. This
framework allows researchers to explore a
wider design space, including robotic systems
that are soft, flexible and deformable, while
being less reliant on high-precision object
detection and localization. Fundamentally,
such challenges bring researchers together to
solve outstanding problems, getting us closer to
the robots of the future.
Jürgen Leitner1,2
1Australian Centre for Robotic Vision, Brisbane, Queensland, Australia. 2LYRO Robotics, Brisbane, Queensland, Australia.
e-mail: [email protected]
Published online: 11 March 2019
https://doi.org/10.1038/s42256-019-0031-6
References
1. Morrison, D. et al. in IEEE International Conference on Robotics
and Automation 7757–7764 (IEEE, 2018).
Acknowledgements
Team ACRV is funded by the Australian Research Council
(project number CE14010001).
