ADVANCED HUMAN-ROBOT INTERACTION FOR LEARNING WITH
ROBOTIC PROCESS AUTOMATION
G. Lasso-Rodríguez1, R. Gil-Herrera2
1Universidad Americana de Europa (MEXICO)
2Universidad Americana de Europa (SPAIN)
Abstract
Teaching-learning processes supported by technology are a central issue in the modern educational and social context. In particular, process automation is not necessarily triggered by a problem; in many cases, it starts with an opportunity. Scientific curiosity and/or the longing for discovery are sometimes what allows innovative ideas to materialise. This article is motivated by a general question: can RPA (Robotic Process Automation) robots apply advanced HRI (human-robot interaction) in education? Recent experiments have demonstrated how RPA can be used in education as a resource to support teaching processes. This is already impressive, but can we go beyond that and empower the robots with additional means for teaching? Many ideas and/or AI (artificial intelligence) possibilities have not yet become evident. The literature has been reviewed, seeking evidence of RPA robots specifically applying facial recognition, eye tracking and ML (machine learning) in education, as part of visual and oral interaction with students. The results of that review led to the execution of our own experiments, interesting learning experiences and findings, which are presented throughout the document and illustrated with videos that prove the achieved results. The article concludes by commenting on the broader implications of the findings.
Keywords: Robot, RPA, learning, artificial intelligence, teaching, education.
1 INTRODUCTION
Technology is an enabler for improving teaching-learning processes. It can support post-learning activities related to measurement and evaluation, as well as sentiment analysis of feedback to monitor education quality; such administrative or management tasks would normally take considerable effort if performed manually [1]. Similarly, the tools employed during teaching, for example an LMS (Learning Management System) and/or specialized application modules (e.g. quizzes, surveys, instant scoring, ranking, videos), can generate a positive impact on student achievement. Teaching methods such as applying game elements (i.e. gamification) can increase student motivation and engagement too [2].
The varying personality and uniqueness of each student, influenced by changing social and technological behaviours, may nowadays make a teaching technique less effective for learning than it used to be. Newer technologies are increasingly accessible to everyone, regardless of age. There are primary education students with computers in their pockets, discussing homework with multiple classmates through videoconferences, and asking questions, e.g. about weather conditions, to their own mobile virtual assistants.
The rapid rate at which technological conditions are changing and impacting social interactions, education systems, media consumption, and work should be seen as an opportunity, or enabling factor, to adopt AI (artificial intelligence)-driven innovation strategies [3].
ER (educational robotics) can help keep students engaged and motivated with the robots, while also serving as a knowledge construction and assistive tool to improve results, mainly in specific fields such as STEM (Science, Technology, Engineering and Mathematics), promote the students’ well-being, and reduce the risk of dropout and/or social exclusion, provided the tasks are not overly complicated and pedagogical support is given [4].
A different perspective for robots associated with education consists of making the robot a protagonist in the teaching-learning process. Robot teacher, robot tutor, and robot instructor are different names for robots with HRI (human-robot interaction) and teaching capabilities. These can be mechanical (e.g. humanoid) or software-based (e.g. an RPA robot), normally emulating human behaviour and reliant on AI techniques, and they are usually subject to discussions related to morality, values, psychological welfare, and privacy, among others [5].
Proceedings of ICERI2019 Conference
11th-13th November 2019, Seville, Spain
ISBN: 978-84-09-14755-7
RPA’s classic definition covers the automation, by software robots, of processes consisting of a high volume of repetitive, standardized, rule-based and/or routine administrative white-collar tasks traditionally executed by humans.
The employment of RPA robots as teachers, or as support in teaching processes, is an innovative concept with great opportunities for further development [6]. Hence the question: can RPA robots apply advanced HRI in education? The specific objectives of this document are therefore: to review the technical literature, seeking advanced usage of HRI in education with RPA; to verify and validate whether RPA can be used to apply eye tracking, face recognition and ML in education; and to explain how these functionalities can be integrated with an RPA robot.
2 FRAMEWORK
The scientific literature about RPA is deemed scarce, an issue still attributed, this year, to RPA being a relatively new technology; RPA is more often implemented than investigated by researchers [7]. Much documentation is found on software providers’ websites, in summaries from consulting companies, corporate white papers, and reports on success stories and benefits [8].
There seems to be a lack of definite agreement, or common understanding, about RPA and what is possible with the technology. Misconceptions identified in some papers led to discarding parts of, or whole, articles. This was in some cases, and to some extent, understandable, given the rapidly increasing RPA adoption and AI research of the last couple of years. As an analysis of this matter is not in the scope of this writing, only two recent examples are mentioned here, from otherwise appreciated papers: ‘Communication Robot for Elderly based on Robotic Process Automation’ [9], with most of its content hardly fitting into RPA; and ‘Can Machine replace Man? - A conceptual study’ [10], which contradicts itself by isolating voice recognition and artificial intelligence from RPA.
Traditional RPA software has been evolving as a result of partnerships to strengthen AI and associated functionalities, which has also impacted its concept, now extended towards AI, cognitive computing, process mining and data analytics [7].
RPA is said to be industry agnostic; however, not much was available in relation to HRI in education, which is the interest of this article.
3 METHODOLOGY
The applied methodology is based on a comprehensive literature review focused on RPA using advanced HRI in the process of facilitating human learning, or the acquisition of knowledge. The ‘advanced’ attribute under research emphasizes AI and techniques such as eye tracking, facial recognition, and ML applications.
Even though, theoretically, there are almost endless technological possibilities with the use of robots, many are still to be proven, especially in education. Where no evidence is found for the HRI approaches in scope, our own experiments are initiated and the results documented.
4 RESULTS
UiPath was used as the RPA software for the experiments, due to its leading position in this year’s Gartner Magic Quadrant for RPA Software [11]; however, similar integrations could be achieved with software from other providers too.
Depending on the perspective and/or strictness with which a functionality is considered AI or not, just by developing a simple RPA robot you might already be making use of AI. For example, when automating in virtual and/or remote environments like VMware, Citrix or Remote Desktop, where screen objects are not easily identifiable, the robot can utilise Computer Vision without the user realising the technology that lies behind it. Similarly, this can occur with legacy application interfaces based on Flash and Silverlight.
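The core idea behind such image-based automation can be sketched as follows: instead of addressing a UI element through a selector, the robot searches the screen image for a known picture of the element. The tiny grids below stand in for real screenshots and are purely illustrative; this is not UiPath’s actual Computer Vision implementation, which relies on far more robust ML-based matching.

```python
# Illustrative sketch: locating a UI element by image rather than by selector.
# The "screen" and "template" are tiny grayscale grids standing in for real images.

def find_template(screen, template):
    """Return (row, col) of the first exact match of template inside screen, or None."""
    th, tw = len(template), len(template[0])
    sh, sw = len(screen), len(screen[0])
    for r in range(sh - th + 1):
        for c in range(sw - tw + 1):
            if all(screen[r + i][c + j] == template[i][j]
                   for i in range(th) for j in range(tw)):
                return (r, c)
    return None

screen = [
    [0, 0, 0, 0, 0],
    [0, 9, 9, 0, 0],
    [0, 9, 9, 0, 0],
    [0, 0, 0, 0, 0],
]
button = [[9, 9],
          [9, 9]]

print(find_template(screen, button))  # (1, 1)
```

Once the element’s position is found, the robot can click or type at those coordinates, which is why this approach works even when the remote environment exposes nothing but pixels.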
RPA gets even more powerful when integrated with additional AI and cognitive automation, leveraging different algorithms and technology approaches to start dealing with unstructured data and handling unplanned situations [12]. However, experience proves that the usage of modern technology does not bring solutions automatically. For example, no magic happens after applying ML with RPA robots: the results will only be as good as the models are, and even with good ML applications, if the data is not good enough, this will be reflected in the quality of the robot’s outputs, decisions or actions.
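The “garbage in, garbage out” point above can be made concrete with a deliberately minimal sketch: the same 1-nearest-neighbour classifier is trained twice on the same points, once with correct labels and once with partially corrupted ones. The data and labels are invented for illustration only.

```python
# Minimal sketch: the robot's decisions are only as good as the training data.

def predict(train, point):
    """1-NN: return the label of the closest training point."""
    return min(train, key=lambda t: abs(t[0] - point))[1]

def accuracy(train, test_set):
    return sum(predict(train, x) == y for x, y in test_set) / len(test_set)

test_set = [(1, 'low'), (2, 'low'), (8, 'high'), (9, 'high')]

clean_train = [(0, 'low'), (3, 'low'), (7, 'high'), (10, 'high')]
noisy_train = [(0, 'low'), (3, 'high'), (7, 'low'), (10, 'high')]  # two labels flipped

print(accuracy(clean_train, test_set))  # 1.0
print(accuracy(noisy_train, test_set))  # 0.5
```

The identical algorithm drops from perfect to coin-flip accuracy purely because of label noise, which is exactly the effect that propagates into the robot’s outputs and decisions.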
Data scientists and/or engineers develop top-notch ML models, which afterwards have to be used by the robots and integrated with end-user applications. These last tasks could take double the effort/time. With new RPA tools, like AI Fabric (see Fig. 1), according to Prabhdeep Singh, VP of AI at UiPath, ‘you could do this in 30 seconds… you upload your models to the cloud (e.g. GCP – Google Cloud Services), no need to worry about manageability, … or versioning. As an RPA developer, you point to the models that have been uploaded, in the Orchestrator, and you can easily include them as part of your workflows’. This kind of tool eases earlier challenges in ML scalability and maintenance for robots [13]. AI Fabric has raised high expectations; it has been under development this year and a public preview is expected before the year ends.
Figure 1. UiPath’s Orchestrator ‘ML skills’ from AI Fabric. Source: Own elaboration.
AI Fabric allows you to use your own skills, or pre-trained skills from UiPath or third parties, by simple drag and drop, like other objects in Studio. The ML skills can receive data back to keep improving the model.
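From a workflow’s point of view, invoking such a cloud-hosted ML skill amounts to serialising the inputs and posting them to the skill’s endpoint. The sketch below shows only the serialisation step; the skill name, payload schema and endpoint are assumptions made for illustration, not UiPath’s actual AI Fabric API.

```python
import json

# Hypothetical sketch: packaging input for an ML skill deployed behind a
# cloud endpoint. Field names and schema are illustrative assumptions.

def build_prediction_request(skill_name, features):
    """Serialise model inputs for a hypothetical 'ML skill' HTTP endpoint."""
    return json.dumps({"skill": skill_name, "inputs": features})

payload = build_prediction_request("attention-model",
                                   {"gaze_losses": 3, "session_min": 12})
print(payload)
# In a real workflow, the robot would POST this payload to the skill's
# endpoint (e.g. via an HTTP request activity) and parse the prediction
# from the response; that call is omitted here as the endpoint is hypothetical.
```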
Models can also be built to analyse eye-gaze data. Eye-tracking technologies have already been used to better understand how humans pay attention during interaction with mechanical robots. Point-of-gaze data can be tracked online, or collected for gaze-pattern analysis, e.g. to improve engagement and HRI capabilities [14].
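A basic building block of such analysis is deciding whether the gaze dwelt on a screen region long enough to count as attention. The sketch below assumes gaze samples arrive as (timestamp_seconds, x, y) tuples in screen coordinates; the exact sample format depends on the eye-tracking SDK used, so this is an illustration rather than vendor code.

```python
# Sketch of offline gaze-pattern analysis over (timestamp, x, y) samples.

def attention_on_region(samples, region, min_seconds):
    """Return True if gaze stayed inside region for at least min_seconds in a row."""
    x0, y0, x1, y1 = region
    start = None  # timestamp when the current in-region streak began
    for t, x, y in samples:
        if x0 <= x <= x1 and y0 <= y <= y1:
            if start is None:
                start = t
            if t - start >= min_seconds:
                return True
        else:
            start = None  # streak broken: gaze left the region
    return False

samples = [(0.0, 100, 100), (0.5, 110, 105), (1.2, 115, 98), (1.4, 500, 400)]
print(attention_on_region(samples, (50, 50, 200, 200), 1.0))  # True
print(attention_on_region(samples, (50, 50, 200, 200), 2.0))  # False
```

A robot teacher could run such a check periodically over the live gaze stream to detect when the student’s attention drifts away from the lesson material.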
The same can be achieved with RPA robots. Depending on the eye-tracking device and software, you may or may not have to spend on additional licensing; some providers charge more if storage of data, or commercial use, is intended.
In our experiments, a Tobii Eye Tracker 4C (gaming.tobii.com/product/tobii-eye-tracker-4c) device was tested (watch video under Fig. 2), in conjunction with an SDK from Tobii AB (github.com/Tobii/CoreSDK), as well as a Microsoft API (docs.microsoft.com/en-us/uwp/api/windows.devices.input.preview) for UWP (Universal Windows Platform) development. Detailed requirements and instructions are provided at the above addresses.
Figure 2. RPA robot applying HRI with eye tracking. Source: Own elaboration.
Another interesting functionality to enrich the HRI in RPA is to incorporate facial expressions or face
recognition capabilities.
The ability to recognise faces is nowadays a fact in multiple RPA software products; however, it is normally implemented not for HRI but for end-user identification in attended RPA workflows, where only specific people are authorized to operate the robot.
We experimented using a decade-old webcam (watch video under Fig. 3), which happened to be good enough, together with the UiPath Face Recognition Framework (github.com/andumorie/uipath-face-recognition-framework) and Microsoft Facial Recognition (azure.microsoft.com/en-us/services/cognitive-services/face). These and many other AI components are easily accessible and available with documentation at UiPath Go! (go.uipath.com).
Figure 3. RPA robot applying HRI with facial recognition. Source: Own elaboration.
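The identity check underlying both attended-workflow authorization and the HRI scenario can be reduced to comparing face embeddings, i.e. numeric vectors produced by a recognition service, with a similarity threshold. The vectors below are made up for illustration; real embeddings would come from a service such as the Microsoft Face API, and real thresholds must be tuned.

```python
import math

# Sketch of face verification: compare a freshly captured face embedding
# against a stored one. The vectors and threshold are illustrative only.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def same_person(stored, captured, threshold=0.9):
    """True if the two embeddings are similar enough to be the same face."""
    return cosine_similarity(stored, captured) >= threshold

stored = [0.2, 0.8, 0.1, 0.5]          # enrolled student's embedding
captured_same = [0.21, 0.79, 0.12, 0.5]  # new capture, same person
captured_other = [0.9, 0.1, 0.7, 0.0]    # new capture, different person

print(same_person(stored, captured_same))   # True
print(same_person(stored, captured_other))  # False
```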
Some AI functionalities are currently only available online via third-party cloud services (e.g. APIs), which in some cases means entering into a subscription or consumption-based contract.
The three main AI options mentioned above can be useful for a rich HRI experience, either combined or applied independently, in multiple ways, by the robot teacher. For example, if the robot loses the student’s eye-gaze for a certain time, it can verify the student’s face to ensure it is still talking to the same person. If it is the same person and the student still loses focus frequently, an ML model can be executed to find the pattern leading to the loss of attention. The robot can be trained to act with the student in different ways, depending on its findings.
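The decision loop just described can be sketched as a small policy function, with the three AI capabilities reduced to its inputs. In a real workflow each input would come from the corresponding eye-tracking, face-recognition, or ML component; the action strings and the "fatigue" cause are hypothetical names chosen for illustration.

```python
# Sketch of the robot teacher's per-cycle decision logic. Inputs stand in
# for the eye-tracking, face-recognition, and ML components respectively.

def tutor_action(gaze_present, is_same_student, predicted_cause):
    """Decide the robot teacher's next step for one interaction cycle."""
    if gaze_present:
        return "continue lesson"
    if not is_same_student:
        return "pause and re-identify student"
    # Attention lost but same student: act on the ML model's predicted cause.
    if predicted_cause == "fatigue":
        return "suggest a short break"
    return "ask an engaging question"

print(tutor_action(True, True, None))        # continue lesson
print(tutor_action(False, False, None))      # pause and re-identify student
print(tutor_action(False, True, "fatigue"))  # suggest a short break
```

Keeping the policy separate from the sensing components in this way makes it easy to train or reconfigure how the robot reacts, without touching the eye-tracking or face-recognition integrations.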
5 CONCLUSIONS
After reviewing the literature, no meaningful articles were found regarding HRI in education with RPA. However, at the current pace at which the technology is advancing, new RPA articles combining education and HRI will likely be released in 2020.
It has been demonstrated that RPA robots can apply advanced HRI, namely ML, eye tracking, and facial recognition, in education, as part of visual and oral interaction with students. Moreover, information about the specific tools used, with links to integration details and instructions for testing, as well as additional AI components, has been shared.
Preparation and maintenance of additional AI functionalities for the robot teacher may require extra effort when these are not available out of the box in the RPA software, which eventually implies a higher project cost.
The outcomes of the tests performed, supported by the videos, draw new lines for what is evidently possible, and also raise expectations for coming developments of HRI for learning with RPA.
REFERENCES
[1] G. Chauhan, P. Agrawal and Y. Meena, "Aspect-Based Sentiment Analysis of Students’ Feedback to Improve Teaching–Learning Process," Information and Communication Technology for Intelligent Systems, vol. 107, pp. 259-266, 2019.
[2] D. Orhan Göksün and G. Gürsoy, "Comparing success and engagement in gamified learning experiences via Kahoot and Quizizz," Computers & Education, vol. 135, pp. 15-29, 2019.
[3] Observatory of Public Sector Innovation (OPSI), "Hello, World: Artificial Intelligence and its use in the Public Sector," Organisation for Economic Co-operation and Development (OECD), 2019.
[4] L. Daniela and M. Lytras, "Educational Robotics for Inclusive Education," Technology, Knowledge and Learning, vol. 24, no. 2, pp. 219-225, 2019.
[5] M. Smakman and E. Konijn, "Robot Tutors: Welcome or Ethically Questionable?," in Advances in Intelligent Systems and Computing, vol. 1023, Springer, Cham, 2019, pp. 376-386.
[6] G. Lasso-Rodríguez and R. Gil-Herrera, "Robotic Process Automation Applied to Education: A New Kind of Robot Teacher?," in ICERI2019: 12th annual International Conference of Education, Research and Innovation, Sevilla, 2019. Submitted.
[7] L. Ivančić, D. Suša Vugec and V. Bosilj Vukšić, "Robotic Process Automation: Systematic Literature Review," Business Process Management: Blockchain and Central and Eastern Europe Forum, vol. 361, pp. 280-295, 2019.
[8] S. Madakam, R. Holmukhe and D. Jaiswal, "The Future Digital Work Force: Robotic Process Automation (RPA)," Journal of Information Systems and Technology Management, vol. 16, 2019.
[9] T. Kobayashi, K. Arai, T. Imai, S. Tanimoto, H. Sato and A. Kanai, "Communication Robot for Elderly based on Robotic Process Automation," in 2019 IEEE 43rd Annual Computer Software and Applications Conference (COMPSAC), 2019.
[10] N. Afza and D. Kumar, "Can Machine replace Man? – A conceptual study," Asia Pacific Journal of Research, vol. 1, no. XCV, pp. 52-58, 2018.
[11] Gartner Research, "Magic Quadrant for Robotic Process Automation Software," 2019.
[12] M. Kirchmer and P. Franz, "Value-Driven Robotic Process Automation (RPA)," Business Modeling and Software Design, vol. 356, pp. 31-46, 2019.
[13] P. Singh, Interviewee, UiPath AI Fabric Vision: Bringing AI & RPA Together - #GoogleNext19. [Interview]. 10 April 2019.
[14] K. Ijuin, K. Jokinen, T. Kato and S. Yamamoto, "Eye-gaze in Social Robot Interactions – Grounding of Information and Eye-gaze Patterns," in The 33rd Annual Conference of the Japanese Society for Artificial Intelligence, Niigata, 2019.