
Proceedings of the ASME 2018 International Mechanical Engineering Congress and Exposition
IMECE2018
November 9-15, 2018, Pittsburgh, PA, USA

IMECE2018-86686

DEVELOPMENT OF TELEPRESENCE TEACHING ROBOTS WITH SOCIAL CAPABILITIES

Mingshao Zhang, Southern Illinois University Edwardsville, Edwardsville, Illinois, USA
Pengji Duan, Southern Illinois University Edwardsville, Edwardsville, Illinois, USA
Zhou Zhang, CUNY New York City College of Technology, New York, New York, USA
Sven Esche, Stevens Institute of Technology, Hoboken, New Jersey, USA

ABSTRACT

A telepresence robot is a device that allows people to participate in video conferences on a moveable platform from a remote location. The users can remotely control the robot's motion and interact with each other through a video screen. Such systems, which were originally designed to promote social interaction between people, have become popular in various application areas such as office environments, health care, independent living for the elderly, and distance learning. Although there is ample published empirical work surrounding the use of telepresence and computer-mediated communication in education, few studies have examined telepresence robots in the classroom. Although some studies have indicated positive learning experiences and outcomes in education facilitated by telepresence robots, further research is needed to better identify the possible effects such approaches have on student learning and perceptions of instructor credibility.

In order to maximize the students' learning outcomes, it is very important to improve the usability of the telepresence robot platform for both the instructors and the students. In addition, the instructor credibility is also crucial to the overall learning experience. In the research presented here, an innovative remote teaching platform, which includes features of telepresence robots and social robots (which are autonomous robots that interact and communicate with humans by following social behaviors and rules associated with their roles), is developed. It is believed that telepresence robots equipped with the capabilities provided by social robots can improve the credibility of the instructor and the usability of the education platform, both of which contribute to the students' overall learning outcomes.
Key words: Telepresence Robots, Social Robots, Distance Learning, Usability, Learning Outcomes.

1 BACKGROUND AND RELATED WORK

"Telepresence" refers to technologies that allow a person to feel as if they were present, to have the appearance, or to have a similar effect as being present. For the users, telepresence provides stimuli (sound, video, haptic feedback, etc.) to enable the feeling of being in that remote location. It also gives the users the ability to affect that remote location through position, motion, voice, etc. The basic features of telepresence can be achieved using a simple videoconferencing platform. "Telepresence robot" has become a commonly known term to denote contemporary commercial robots designed specifically for remote presence and telecommunication [1]. Telepresence robots have gained widespread popularity because they provide the users with the function of being present as well as independent mobility through teleoperation of the robots.

Telepresence robots are suitable for a wide range of applications. Among those areas, three are particularly dominant in the literature: office environments, health care, and assisted living for the elderly [2]. For commercially available telepresence robots, there exists a fourth category: unspecified intended application areas.

Office Environment. This is the most commonly seen application, in both research and commercial products. The telepresence robots allow remote coworkers to join their local coworkers in meetings, which can save companies a lot of money for travelling. The robots also allow immediate access to the local site whenever the remote coworkers are needed, which saves time for travelling as well [3, 4, 5, 6, 7].

Health Care. Telepresence robots, when used in postoperative care, enable a better communication between the patients and their doctors. Related research stated that this can improve the patients' satisfaction and safety [8], as well as benefit the patients financially [9]. It is extremely helpful when consultation is immediately required, such as for brain surgery and stroke care at Intensive Care Units [10, 11].

Assisted Living for the Elderly. With an increasing elderly population globally, telepresence robots can provide many desirable features in private homes or residential facilities. They can facilitate long-term health monitoring of the elderly [12], they can allow remote consultations, and, more importantly, they promote social interactions among the elderly, who are prone to isolation and loneliness [13, 14, 15].

2 DESIGN CHALLENGES IN EDUCATIONAL APPLICATIONS

Telepresence robots enable video and audio transmission between remote sites, which provides unique opportunities in distance education. Regardless of the teaching quality and the students' learning outcomes, remote teachers can give lectures and provide their skills to the students, which is extremely helpful in underserved geographic areas as well as for students with disabilities and chronic illnesses [16]. It is also helpful in teaching language courses, allowing for instance foreign teachers to teach English through robots in South Korea without physically relocating to that country [17].

According to a consumer report published on February 23, 2018 [18], the best commercially available telepresence robots are shown in Figure 1. They provide one main common function, namely remote video conferencing. They are not designed specifically for education purposes, which makes them undesirable if used in distance education.

Figure 1 Best telepresence robots on the market; (a) Double Robotics, (b) Suitable Tech Beam, (c) iRobot Ava 500, (d) Kubi Classic

Challenges for Students. Contrary to their counterparts in workplace telepresence, telepresence robots used in the classroom share some similarities with the applications in assisted elderly living. The users (elderly, students with significant illnesses, disabilities or impairments) may have difficulties in continuously controlling the robots [19]. The wide range of differences between the students must also be taken into consideration, such as differences in mental maturity, cognitive development, familiarity with technology, socioeconomic backgrounds, etc. [20].

Challenges for Teachers. In many cases, using video-conferencing alone is not enough for teaching classes. "In-person" interaction with the individual students is crucial for the students' learning outcomes. The teachers also rely heavily on classroom equipment to assist their teaching, such as computers, microphones, white boards, document cameras, projectors, etc. Often, the teachers are no better than the students when it comes to their familiarity with technologies, and yet, in the setup of using telepresence robots for remote teaching, the teachers need to pay attention to the teaching materials as well as to interacting with the students via mobile telepresence robots, which can be challenging if an interface that is easy to use is not provided. In addition, the research presented here also provides the teachers an option of using classroom equipment (computer, microphone and projector) if needed.

Challenges for Communication. Maintaining high-quality audio and video communication between students and teachers is essential for this application. Poor audio quality negatively impacts the users' experience and interactions [2]. If only provided with a tele-conferencing function, the users sometimes will feel more like observers than real students. However, if provided with excessive functions without an interface that is easy to use, the technology itself will become an obstacle for conversation.

3 DESIGN METHODOLOGIES

The advancement of technology use in education has always been a popular research topic. Although the specific topics vary in different disciplines, one of the common focal points in different research efforts is to improve the students' learning outcomes. This is also the main target of the research presented here. While telepresence robots provide the unique advantage that the teachers do not need to be physically located in the same classroom as the students, they also raise a concern – the social interaction between the teachers and the students may potentially be hampered. It has been proven that the social interaction between students and teachers positively impacts the students' learning [21]. Interactions improve the students' motivation and help them to see the relevance of topics introduced by the teachers.

To maximize the students' learning outcomes, it is very important to improve the usability of the telepresence robot platform for both the instructors and the students, as well as to ensure smooth interactions. The research presented here was performed to specifically target the following three design challenges of using telepresence robots in a wide range of classroom teaching:

1. Design for the teachers, located remotely
2. Design for the students, located locally and needing to interact with the robot constantly
3. Design for smooth communication between the teachers and the students

4 PROTOTYPE DEVELOPMENT

4.1 "WALKING SKYPE" SETUP

The basic functions of the mobile telepresence robot presented here are not much different from those of the existing commercial products. It provides remote audio/video conferencing while being controlled to navigate through certain environments (i.e. the "walking Skype"). To ensure the flexibility of the telepresence robot in various classroom scenarios, the development was focused on two major components, namely the hardware and the software.

The physical system should have the following functionality:

• A docking system providing a connection with smart phones and tablets (including a variety of power charger connections if needed) and resting on a platform with adjustable height
• A wheel component that can navigate inside of a classroom
• A communication module between the smart phones/tablets and the robotic platform
The communication is accomplished by adding an Arduino component with built-in Bluetooth, through which the teachers can control the robots remotely. Refer to Figure 2 for the detailed design of the hardware component. Through testing, it was discovered that the communication quality cannot be guaranteed if one relies only on the microphone and speaker of the smart phones/tablets. Thus, an additional microphone array and speaker were added to the platform.

Figure 2 Design of mobile telepresence robot; (a) entire structure, (b) base
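To make this control path concrete, the following minimal sketch (written in Python with the pySerial package, purely as an illustration and not the actual app or firmware code of the platform presented here) shows how drive commands could be forwarded to the Arduino over the paired Bluetooth serial link; the port name and the one-byte command protocol are assumptions.

    import serial  # pySerial; a paired Bluetooth module appears as a serial port

    # Hypothetical one-byte command protocol understood by the Arduino sketch
    COMMANDS = {
        "forward": b"F",
        "backward": b"B",
        "left": b"L",
        "right": b"R",
        "stop": b"S",
    }

    def open_robot_link(port="/dev/rfcomm0", baudrate=9600, timeout=1.0):
        """Open the serial connection exposed by the paired Bluetooth module."""
        return serial.Serial(port, baudrate=baudrate, timeout=timeout)

    def send_drive_command(link, command):
        """Translate a high-level command into its one-byte code and send it."""
        if command not in COMMANDS:
            raise ValueError("unknown command: %s" % command)
        link.write(COMMANDS[command])
        link.flush()

    if __name__ == "__main__":
        with open_robot_link() as link:
            send_drive_command(link, "forward")
            send_drive_command(link, "stop")

On the robot side, the Arduino firmware would simply read these bytes from its Bluetooth serial port and drive the wheel motors accordingly.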
The software package is needed for the devices (i.e. the chosen smart phones or tablets) at both the teachers' and students' sides. It should enable the following functionalities:

• Voice and video communications between the educators and the students
• Voice and video recording and replay from both ends if needed
• The educators should be able to control the robots remotely through their devices.

Two sets of apps were developed as the user interface, one each for the Android and iOS platforms. WebRTC (a peer-to-peer communication protocol) was used to develop the apps.

4.2 DESIGN OF THE INTERFACE

The usability of new educational technologies plays a crucial role in the students' learning outcomes [22]. As mentioned above, two sets of apps (Android and iOS) are provided as the user interface for both the students' and the teacher's site. This provides flexibility to both end-users, and teaching is enabled as long as both ends have smart devices. The students should focus on learning the class materials, and thus the complexity of their interface was minimized (refer to Figure 3). The telepresence robot serves as an avatar of the teacher in the classroom, and thus the screen on the robot only shows the face of the teacher. This interface also provides a troubleshooting function in case any unexpected scenario interrupts the communication.

Figure 3 User interface on student site
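As a rough illustration of the WebRTC-based peer-to-peer session mentioned in Section 4.1, the sketch below exchanges an offer and answer between two peers using the Python aiortc library. This is only a stand-in for the Android/iOS WebRTC stacks used in the actual apps; both peers run in one process with dummy media tracks so that the sketch is self-contained, whereas the real apps would ship the SDP descriptions through a signaling server.

    import asyncio
    from aiortc import RTCPeerConnection
    from aiortc.mediastreams import AudioStreamTrack, VideoStreamTrack

    async def run_loopback():
        # In the real apps the two ends run on the teacher's and the student's
        # devices; here both peers live in one process for illustration.
        teacher = RTCPeerConnection()
        student = RTCPeerConnection()

        # Dummy tracks stand in for the teacher's camera and microphone
        teacher.addTrack(AudioStreamTrack())
        teacher.addTrack(VideoStreamTrack())

        @student.on("track")
        def on_track(track):
            print("student side received a %s track" % track.kind)

        # Offer/answer exchange; a deployment would pass these SDP blobs
        # through a signaling channel instead of handing them over directly
        await teacher.setLocalDescription(await teacher.createOffer())
        await student.setRemoteDescription(teacher.localDescription)
        await student.setLocalDescription(await student.createAnswer())
        await teacher.setRemoteDescription(student.localDescription)

        await asyncio.sleep(2)      # let a little media flow peer to peer
        await teacher.close()
        await student.close()

    if __name__ == "__main__":
        asyncio.run(run_loopback())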


Using the robot presented here, the teachers can teach the class remotely, requiring only a smart device (a cellphone or tablet suffices). The teachers can then focus on the class materials and the interaction with the students, while in the meantime controlling the robot to navigate within the classroom. It should be noted that multi-tasking can be a problem, especially for those teachers that are not familiar with new technologies.

In the application presented here, the teacher can still manually control the robot to navigate and interact with the students (refer to Figure 3). However, those functions are performed automatically most of the time. The robot can identify the voice source if the students raise any questions, scan the classroom to generate a map, give the teacher the option to click on a student's head if he/she wants to interact with that specific student closely, and navigate to the selected student automatically (refer to the following sections for more details). In the meantime, all the teachers need to do is focus on teaching.

Figure 4 User interface on teacher site

4.3 MICROPHONE ARRAY

It is natural for human teachers to focus on talking about the class materials while paying attention to the students' feedback. Addressing the students' questions and clearing up any confusion they might have in time is very important both for the teaching quality and the students' learning outcomes. Identifying the voice source, turning to the student that raised a question, and answering the question is rather straightforward for a human. However, it turns out to be much more difficult for a robot to accomplish.

Humans can locate the source of a sound with extreme precision (within two degrees). This is accomplished by the brain's ability to interpret the difference in the information received from both ears [23]. If the remotely located teachers only rely on the sound source captured by the mounted tablet (which usually has only one microphone), they may recognize the scenario, for instance a certain student raising a question during the lecture. However, they then face a dilemma, especially when the student with the question is not located within the field of view of the camera. The teachers can answer the question directly, without trying to find out who is asking and addressing the question in a “face-to-face” style. Alternatively, they can stop the lecture, manually turn the robot around until they have identified and located the student, maneuver the robot close to the student and finally address the problem directly with the student. Neither of the two scenarios is preferred.

To solve the above-mentioned problem – identifying and locating students with questions during the lecture – a microphone array [24] is used in the mobile telepresence robot presented here. This microphone array features seven high-performance digital microphones, which can be used to localize the voice source – i.e. to calculate both the direction and the distance. Refer to Figure 5 for a pilot test setup for the microphone array. In addition, using the seven audio channels, the system is capable of suppressing noise, cancelling acoustic echoes, and capturing far-field voices.
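The microphone array referenced in [24] ships with its own far-field processing, so the sketch below is only meant to illustrate the underlying idea of localizing a voice source from inter-microphone time differences: a GCC-PHAT delay estimate for a single microphone pair, converted into a bearing angle. The sampling rate and microphone spacing are assumed values, not the specifications of the array used here.

    import numpy as np

    SPEED_OF_SOUND = 343.0  # m/s at room temperature

    def gcc_phat(sig, ref, fs, max_tau):
        """Estimate the time delay between two microphone signals with GCC-PHAT."""
        n = sig.size + ref.size
        SIG = np.fft.rfft(sig, n=n)
        REF = np.fft.rfft(ref, n=n)
        cross = SIG * np.conj(REF)
        cross /= np.abs(cross) + 1e-15          # PHAT weighting: keep phase only
        cc = np.fft.irfft(cross, n=n)
        max_shift = max(1, int(fs * max_tau))
        cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
        shift = np.argmax(np.abs(cc)) - max_shift
        return shift / float(fs)                # delay in seconds

    def direction_of_arrival(sig, ref, fs=16000, mic_distance=0.06):
        """Convert the estimated delay for one microphone pair into a bearing angle."""
        max_tau = mic_distance / SPEED_OF_SOUND
        tau = gcc_phat(sig, ref, fs, max_tau)
        # Clip to the physically possible range before taking the arcsine
        ratio = np.clip(tau * SPEED_OF_SOUND / mic_distance, -1.0, 1.0)
        return np.degrees(np.arcsin(ratio))

    if __name__ == "__main__":
        fs = 16000
        t = np.arange(fs) / fs
        tone = np.sin(2 * np.pi * 440 * t)
        delayed = np.roll(tone, 2)              # simulate a 2-sample inter-mic delay
        print("estimated bearing: %.1f deg" % direction_of_arrival(tone, delayed, fs))

With seven microphones, the same pairwise estimates can be combined over several pairs to obtain both a more robust direction and a rough distance estimate.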
Figure 5 Microphone array to determine direction and distance of a voice source

4.4 SPEECH RECOGNITION

One of the main challenges of using telepresence robots is that some users may not be familiar with new technologies and thus have trouble with the user interface. Voice commands are often used as an alternative to address this problem [25, 26]. For example, a telepresence robot platform equipped with a natural human interface, autonomous navigation, hand-gesture recognition and speech recognition through cloud computing services provided by Google has been proven to be very effective [27].

The core component of voice commanding is understanding the instructions received by the microphone – i.e. speech recognition. In the application presented here, an intuitive and easy-to-use interface was developed, wherein the voice command is not used to control the robot. However, a speech recognition module (Google Assistant API [28]) is constantly running in the background, analyzing the audio received by the microphone array. This feature is combined with the previously mentioned voice source localization. The interface on the teacher site will show a notification if the speech recognition module indicates that a student has raised a question, by detecting words such as “Excuse me”, “Yes?”, “Hello”, “Sorry”, etc. The microphone array also provides the direction and distance of the voice source. The teacher can then initiate an auto-turn of the remotely located robot toward the direction of the voice source. Combining the location information and a facial detection module (refer to the section below), the teacher can choose the student if he/she decides to interact at a close distance. The robot automatically navigates close to the student to ensure a “face-to-face” interaction.

On the other hand, if the student raises a question using phrases not in the pre-defined set of words (such as “I'm confused”, “Could you explain that?”, “What about x, y, z?”, etc.), the teacher can manually initiate a function to find the last detected audio source, auto-turn to the student and follow the same procedure described above.

4.5 FACIAL DETECTION

Face detection and recognition are well-known and widely used techniques in autonomous robots as well as other applications [29]. They have also been adopted in the application fields of telepresence robots. They can be used as enhanced diagnostics techniques, for instance to monitor the health state of a patient [30] or to recognize a human face and his/her identity to provide a better experience in remote video-conferencing [31]. As the research in computer vision has advanced, it has been proven that including features such as face recognition in telepresence robot applications can be inexpensive and achieve high performance at the same time [32].

A state-of-the-art real-time object detection system, YOLO [33], was implemented in the system presented here. Using an open-source neural network framework (Darknet [34] with CUDA [35] and OpenCV [36]), the system can perform real-time face detection and use the result to identify a human within the field of view of a webcam (e.g. the cameras on the tablet used here). The result of the human detection is shown in Figure 6.
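The detection pipeline can be approximated with OpenCV alone, as in the following sketch, which loads a Darknet-format YOLO model through cv2.dnn and draws boxes around detected faces in a webcam stream. The configuration and weight file names refer to a hypothetical face-trained YOLO model; the system described above runs the Darknet framework itself with CUDA acceleration, so this is an illustration of the approach rather than the actual implementation.

    import cv2
    import numpy as np

    # Hypothetical file names for a YOLO network trained on faces (assumption)
    CFG_PATH = "yolo-face.cfg"
    WEIGHTS_PATH = "yolo-face.weights"
    CONF_THRESHOLD = 0.5

    def load_detector():
        """Load the Darknet-format YOLO model through OpenCV's DNN module."""
        net = cv2.dnn.readNetFromDarknet(CFG_PATH, WEIGHTS_PATH)
        out_names = net.getUnconnectedOutLayersNames()
        return net, out_names

    def detect_faces(net, out_names, frame):
        """Return bounding boxes (x, y, w, h) for detections above the threshold."""
        h, w = frame.shape[:2]
        blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
        net.setInput(blob)
        boxes = []
        for output in net.forward(out_names):
            for det in output:              # det = [cx, cy, bw, bh, objectness, class scores...]
                confidence = float(det[4]) * float(np.max(det[5:]))
                if confidence < CONF_THRESHOLD:
                    continue
                cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
                boxes.append((int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)))
        return boxes

    if __name__ == "__main__":
        net, out_names = load_detector()
        capture = cv2.VideoCapture(0)       # tablet / webcam stream
        while True:
            ok, frame = capture.read()
            if not ok:
                break
            for (x, y, bw, bh) in detect_faces(net, out_names, frame):
                cv2.rectangle(frame, (x, y), (x + bw, y + bh), (0, 255, 0), 2)
            cv2.imshow("face detection", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
        capture.release()
        cv2.destroyAllWindows()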
Figure 6 Face detection results on teachers' interface

In the future, the students' facial data will be collected and used as a training dataset for YOLO. The system will then be enabled to identify each student in the classroom. The teacher will be able to click on a student to either access the information of that student or command the robot to automatically navigate close to the selected student for a “face-to-face” interaction.

4.6 COMMUNICATION WITH EXISTING CLASSROOM EQUIPMENT

Many teachers find using chalkboards or whiteboards to be almost a thing of the past with the availability of projectors in many classrooms. Utilizing classroom equipment such as projectors in teaching can have many benefits. It provides greater teaching versatility, a better use of class time, better presentation, etc. This might not be an option if the teachers are not physically located in the classroom. It is believed that the ability to use in-classroom equipment such as projectors can improve student learning outcomes, especially in those STEM disciplines where communicating ideas through talking alone is often insufficient for the students to understand the concepts presented.

The interface on the teacher's site introduced here was designed to be user friendly, and adding too many features to it would hamper the goal of being easy to use. Therefore, a separate computer (located at the teacher's site) is needed if the teacher wishes to use the existing classroom equipment. Refer to Figure 7 for an example of the use of a classroom projector. The communication between the teacher's computer and the classroom projector, as well as the communication between the teacher's interface (either smart phone or tablet) and the telepresence robot, is controlled by the same server, albeit as two independent processes.

Figure 7 Communication with existing classroom equipment

4.7 MAPPING AND AUTOMATIC NAVIGATION

Simultaneous localization and mapping (SLAM) [37, 38] is a well-known technique in robotic mapping and navigation. It can construct and update a map of an unknown environment while simultaneously keeping track of a robot's location [39]. In the application presented here, it is desired that the robot can automatically navigate to any location chosen by the teacher. In addition, it is preferred that the robot can function in different classrooms – i.e. in an unknown environment.

To ensure that the interface for the teachers is easy to use, it is necessary to avoid exposing them to excessive information. Automatic mapping and navigation is essential in this project to ensure a better interaction between the teachers and the students. However, the teachers should focus on the class materials and control the robot to move to a certain student by simply clicking on the student's face and confirming the selection afterwards. The details of the map generation and the navigation path are hidden from the teacher in the default system setup.
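The paper does not specify which planner generates the path to the selected student, so the following sketch simply illustrates grid-based path planning of the kind commonly run on an occupancy map produced by laser-based SLAM: an A* search over a 2-D grid in which 0 marks free floor and 1 marks obstacles.

    import heapq

    def astar(grid, start, goal):
        """A* on a 2-D occupancy grid (0 = free, 1 = occupied); returns a list of cells."""
        rows, cols = len(grid), len(grid[0])

        def heuristic(a, b):                # Manhattan distance, admissible on a 4-connected grid
            return abs(a[0] - b[0]) + abs(a[1] - b[1])

        open_set = [(heuristic(start, goal), 0, start, None)]
        came_from = {}
        best_cost = {start: 0}

        while open_set:
            _, cost, cell, parent = heapq.heappop(open_set)
            if cell in came_from:
                continue
            came_from[cell] = parent
            if cell == goal:                # reconstruct the path back to the start
                path = [cell]
                while came_from[path[-1]] is not None:
                    path.append(came_from[path[-1]])
                return path[::-1]
            r, c = cell
            for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                    new_cost = cost + 1
                    if new_cost < best_cost.get((nr, nc), float("inf")):
                        best_cost[(nr, nc)] = new_cost
                        heapq.heappush(open_set, (new_cost + heuristic((nr, nc), goal),
                                                  new_cost, (nr, nc), cell))
        return None                         # no path found

    if __name__ == "__main__":
        # 0 = free floor, 1 = desks/obstacles; start = robot pose, goal = selected student
        classroom = [
            [0, 0, 0, 0, 0],
            [0, 1, 1, 1, 0],
            [0, 0, 0, 1, 0],
            [1, 1, 0, 1, 0],
            [0, 0, 0, 0, 0],
        ]
        print(astar(classroom, (0, 0), (4, 4)))

In the prototype, the start cell would correspond to the robot's current pose estimated by SLAM and the goal cell to the location of the selected student.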
Figure 8 Map generated using laser scanner

A robot simulation package called Gazebo [40] was chosen as the development tool here. Gazebo makes it possible to rapidly test algorithms, design robots, perform regression testing, and train AI systems using realistic scenarios. Future developments and improvements will be simplified by taking advantage of these functions of Gazebo. Refer to Figure 8 for the map of a random indoor environment generated using the Gazebo package and a laser scanner. Through the testing that was performed as part of the project described here, it was discovered that the quality of the maps resulting from a combination of a laser scanner and a camera is better than that obtained using either alone. However, laser scanners are usually very expensive, and thus they represent a large portion of the final price of a telepresence robot. In the application of classroom teaching, the indoor maps are usually static (i.e. they do not change rapidly) and lower-quality maps are acceptable for path planning and navigating. Part of the future research will focus on indoor mapping and navigation based only on the camera, thus eliminating the expense of a laser scanner. One example of path planning and automatic navigation is depicted in Figure 9, in which the microphone array identified the location of the voice source, the teacher selected the “goal” location by clicking on the student's face and the robot then generated a path for automatic navigation.

Figure 9 Path finding for automatic navigation

5 CONCLUSION AND FUTURE WORK

In the research presented here, a telepresence robot was developed specifically for educational purposes. In addition to the “standard” function of remote audio and video conferencing that is offered by most telepresence robots, the robot developed here is able to maneuver within a classroom and interact with specific students automatically. Using a microphone array combined with audio detection and speech recognition, the robot is able to help identify an individual student that needs a “face-to-face” interaction. The mobile robot generates maps of classrooms and has the ability to navigate automatically using a camera and/or a laser scanner. With the help of face detection, the teachers can maneuver the robot within a classroom and interact with students easily. The system also provides an additional function to enable the teachers to use the classroom equipment to assist in their teaching. Prototype testing with a small group of students has confirmed that the mobile telepresence robot presented here is easy to use and has great potential in educational applications.

The future research will focus on collecting data from real classroom teaching using the mobile telepresence robot and on improving the design. The first phase of data collection will start with the assembly of a pilot testing group (preferably students from diverse backgrounds). The testing will be focused on assessing the conversation quality between the remotely located teacher and the in-classroom students through the robot. Feedback will be collected and used for design improvements. The second phase will be a large-scale real classroom testing (preferably classes from various STEM disciplines). The teachers will be teaching remotely, with the presence of teaching assistants to avoid potential technical difficulties. A pre-test and post-test will be conducted to evaluate the students' learning outcomes. The results will be compared with the traditional classroom setup using the same pre- and post-test. Meanwhile, a questionnaire will be administered to evaluate the usability of the telepresence robot platform for both the instructors and the students as well as the instructor credibility.

REFERENCES

[1] Tsui, Katherine M., and Holly A. Yanco. "Design challenges and guidelines for social interaction using mobile telepresence robots." Reviews of Human Factors and Ergonomics 9.1 (2013): 227-301.
[2] Kristoffersson, Annica, Silvia Coradeschi, and Amy Loutfi. "A review of mobile robotic telepresence." Advances in Human-Computer Interaction 2013 (2013): 3.
[3] Lee, Min Kyung, and Leila Takayama. "Now, I have a body: Uses and social norms for mobile remote presence in the workplace." Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2011.
[4] Guizzo, Erico. "When my avatar went to work." IEEE Spectrum 47.9 (2010).
[5] Wang, Yulun, et al. "Protocol for a remotely controlled videoconferencing robot." U.S. Patent No. 9,375,843. 28 Jun. 2016.
[6] Cross, Matthew, and Tony L. Campbell. "Mobile robot for telecommunication." U.S. Patent No. 9,296,109. 29 Mar. 2016.
[7] Myodo, Emi, et al. "Issues and solutions to informal communication in working from home using a telepresence robot." ITE Transactions on Media Technology and Applications 6.1 (2018): 30-45.
[8] Ellison, Lars M., et al. "Postoperative robotic telerounding: a multicenter randomized assessment of patient outcomes and satisfaction." Archives of Surgery 142.12 (2007): 1177-81; discussion 1181.
[9] Gandsas, Alex, et al. "Robotic telepresence: profit analysis in reducing length of stay after laparoscopic gastric bypass." Journal of the American College of Surgeons 205.1 (2007): 72-77.
[10] Vespa, Paul M., et al. "Intensive care unit robotic telepresence facilitates rapid physician response to unstable patients and decreased cost in neurointensive care." Surgical Neurology 67.4 (2007): 331-337.
[11] Sucher, Joseph F., et al. "Robotic telepresence: a helpful adjunct that is viewed favorably by critically ill surgical patients." The American Journal of Surgery 202.6 (2011): 843-847.
[12] Coradeschi, Silvia, et al. "Giraffplus: Combining social interaction and long term monitoring for promoting independent living." 6th International Conference on Human System Interaction (HSI). IEEE, 2013.
[13] Labonte, Daniel, et al. "A pilot study on teleoperated mobile robots in home environments." IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 2006.
[14] Tsai, Tzung-Cheng, et al. "Developing a telepresence robot for interpersonal communication with the elderly in a home environment." Telemedicine and e-Health 13.4 (2007): 407-424.
[15] Koceski, Saso, and Natasa Koceska. "Evaluation of an assistive telepresence robot for elderly healthcare." Journal of Medical Systems 40.5 (2016): 121.
[16] Sheehy, Kieron, and Ashley A. Green. "Beaming children where they cannot go: telepresence robots and inclusive education: An exploratory study." Ubiquitous Learning: An International Journal 3.1 (2011).
[17] Kwon, Oh-Hun, et al. "Telepresence robot system for English tutoring." IEEE Workshop on Advanced Robotics and its Social Impacts (ARSO). IEEE, 2010.
[18] https://www.allhomerobotics.com/best-telepresence-robots/, accessed on April 30, 2018.
[19] Cha, Elizabeth, Samantha Chen, and Maja J. Mataric. "Designing telepresence robots for K-12 education." 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN). IEEE, 2017.
[20] Roschelle, Jeremy M., et al. "Changing how and what children learn in school with computer-based technologies." The Future of Children (2000): 76-101.
[21] Hurst, Beth, Randall Wallace, and Sarah Nixon. "The impact of social interaction on student learning." Reading Horizons (Online) 52.4 (2013): 375.
[22] Chang, Yizhe, et al. "Usability evaluation of a virtual educational laboratory platform." Computers in Education Journal 7.1: 24-36.
[23] Phillips, Dennis P., Chelsea K. Quinlan, and Rachel N. Dingle. "Stability of central binaural sound localization mechanisms in mammals, and the Heffner hypothesis." Neuroscience & Biobehavioral Reviews 36.2 (2012): 889-900.
[24] https://www.seeedstudio.com/ReSpeaker-Mic-Array-Far-field-w%2F-7-PDM-Microphones-p-2719.html, accessed on April 30, 2018.
[25] Nakano, Fumio. "Method and system for controlling an external machine by a voice command." U.S. Patent No. 5,230,023. 20 Jul. 1993.
[26] Van Kommer, Robert. "Mobile robot and method for controlling a mobile robot." U.S. Patent No. 6,584,376. 24 Jun. 2003.
[27] Do, Ha M., et al. "An open platform telepresence robot with natural human interface." IEEE 3rd Annual International Conference on Cyber Technology in Automation, Control and Intelligent Systems (CYBER). IEEE, 2013.
[28] https://developers.google.com/assistant/sdk/reference/rpc/, accessed on April 30, 2018.
[29] Zhang, Z., et al. "A virtual laboratory system with biometric authentication and remote proctoring based on facial recognition." Proceedings of the 2016 ASEE Annual Conference and Exposition, New Orleans, LA, USA. 2016.
[30] Pinter, Marco, Timothy C. Wright, and H. Neal Reynolds. "Enhanced Diagnostics for a Telepresence Robot." U.S. Patent Application No. 14/108,036.
[31] Schulz, Karsten. "Video conferencing system with physical cues." U.S. Patent No. 7,092,001. 15 Aug. 2006.
[32] Janard, Krit, and Worawan Marurngsith. "Accelerating real-time face detection on a raspberry pi telepresence robot." Fifth International Conference on Innovative Computing Technology (INTECH). IEEE, 2015.
[33] https://pjreddie.com/darknet/yolo/, accessed on April 30, 2018.
[34] https://pjreddie.com/darknet/, accessed on April 30, 2018.
[35] https://developer.nvidia.com/about-cuda, accessed on April 30, 2018.
[36] https://opencv.org/, accessed on April 30, 2018.
[37] Durrant-Whyte, Hugh, and Tim Bailey. "Simultaneous localization and mapping: part I." IEEE Robotics & Automation Magazine 13.2 (2006): 99-110.
[38] Bailey, Tim, and Hugh Durrant-Whyte. "Simultaneous localization and mapping (SLAM): Part II." IEEE Robotics & Automation Magazine 13.3 (2006): 108-117.
[39] Zhang, Zhou, et al. "Real-Time 3-D Reconstruction for Facilitating the Development of Game-based Virtual Laboratories." 2015 ASEE Annual Conference & Exposition. 2015.
[40] http://gazebosim.org/, accessed on April 30, 2018.
