Fig. 1. Mobile robot Hercules and a screenshot of the existing 2D graphical user interface.

Source publication
Article
Full-text available
This paper discusses some problems related to the use of a head-mounted display (HMD) for remote control of mobile robots by a human operator, and also presents a possible solution. Specifically, it considers the new HMD device called Oculus Rift, which is a very interesting device because of its very good parameters and low price. The device is desc...

Contexts in source publication

Context 1
... important discovery was that the stereovision cameras on the mobile robot Hercules (Fig. 1a) are not placed in a very good location, because both cameras see the last link of the arm from a very close distance (Fig. 1b) and produce extremely different views of it that are almost impossible for the brain to fuse into a single stereo view. The arm occupies a significant portion of the pictures and thus breaks the overall depth ...
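The fusion problem described in this context follows directly from stereo geometry: for a pinhole stereo pair, image disparity is inversely proportional to object depth, so an arm link sitting a few tens of centimetres in front of the cameras yields a disparity far larger than that of the rest of the scene and well beyond what the eyes can comfortably fuse. The short Python sketch below illustrates the relationship; the focal length, baseline and depths are illustrative assumptions, not parameters of the Hercules robot.

```python
# Illustrative stereo-disparity calculation (assumed parameters, not Hercules data).
# For a pinhole stereo pair with baseline B, focal length f (in pixels) and object
# depth Z, the horizontal disparity is d = f * B / Z.

def disparity_px(focal_px: float, baseline_m: float, depth_m: float) -> float:
    """Horizontal pixel disparity of a point at depth_m in front of the stereo pair."""
    return focal_px * baseline_m / depth_m

if __name__ == "__main__":
    focal_px = 700.0      # assumed focal length in pixels
    baseline_m = 0.12     # assumed camera baseline (12 cm)

    for depth_m in (0.15, 0.5, 2.0, 10.0):
        d = disparity_px(focal_px, baseline_m, depth_m)
        print(f"depth {depth_m:5.2f} m -> disparity {d:7.1f} px")
    # A nearby arm link (~0.15 m) yields hundreds of pixels of disparity,
    # while the scene a few metres away stays within a comfortably fusible range.
```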

Similar publications

Conference Paper
Full-text available
For numerous real-world applications, teleoperated unmanned ground vehicles (UGVs) can quite successfully assist a human in fulfilling her mission objectives. It is important for the specialists to get an overview of the site quickly and with means as intuitive as possible. Our approach to an intuitive human-machine interface for visually teleop...

Citations

... Numerous studies to develop VR/AR-based GCSs have been done before. Kot and Novák (2014) proposed a system to control a robot using a VR device, namely the Oculus Rift. García et al. (2017) proposed a VR system to precisely control UUV movement with an external joystick. ...
... Gong et al. (2017) designed an AR system to display video from a Raspberry Pi camera mounted on a humanoid robot. Finally, Kot et al. (2018) proposed a system to view robot perspectives using HoloLens, with a separate joystick to control robot movement. Hoppenstedt et al. (2019) proposed a mixed reality (MR) based system to visualize the flight data of a UAV that is controlled directly by the ROS platform. ...
Article
Full-text available
Human–robot interaction (HRI), which studies the interaction between robots and humans, appears as a promising research idea for the future of smart factories. In this study, HoloLens as ground control station (HoloGCS) is implemented, and its performance is discussed. HoloGCS is a mixed reality-based system for controlling and monitoring unmanned aerial vehicles (UAV). The system incorporates HRI through speech commands and video streaming, enabling UAV teleoperation. HoloGCS provides a user interface that allows operators to monitor and control the UAV easily. To demonstrate the feasibility of the proposed systems, a user case study (user testing and SUS-based questionnaire) was performed to gather qualitative results. In addition, throughput, RTT, latency, and speech accuracy were also gathered and analyzed to evaluate quantitative results.
... Virtual Control Room Model: In the Virtual Control Room Model, the user is placed in a virtual room that serves as a supervisory command and control center for a remote robot. Within the control room, the user is able to interact with displays and objects in the virtual space itself, and can view 3D stereo video streams projected on the room's walls, thus still allowing the user to see from the robot's perspective with stereoscopic depth perception [50,55] ...
... Uses include providing users with a means of understanding a robot's current state, in which case a virtual 3D model of a real robot mimics a physical robot's joint configurations in real time. Visualization Robots are particularly useful in situations of limited situational awareness, when the user cannot directly see the physical robot (or portions of the robot), such as in remote teleoperation tasks [50] (see Figure 3-A). ...
... Internal Reading VDEs display data returned from internal sensors (e.g., battery levels, robot temperatures, wheel speeds). Making this information more readily available to users may enhance situational awareness and prevent mishaps such as robots running out of battery in the middle of mission-critical tasks or traveling too fast through environmental hazards [50] (see Figure 3-A in which robot battery levels are displayed next to the stereo video stream). Internal Readiness VDEs display data about sensors and actuators, such as whether a sensor is ready to collect data or whether an actuator is ready to function, and if not, why (e.g., whether a sensor is disconnected, an actuator is experiencing a fault, etc.). ...
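As a loose illustration of the kind of data an Internal Reading VDE surfaces, the sketch below defines a minimal telemetry record and a threshold check matching the battery and speed mishaps mentioned above; the field names and limits are hypothetical and not part of the cited taxonomy or interface.

```python
# Hypothetical internal-sensor telemetry record for an Internal Reading VDE overlay.
from dataclasses import dataclass

@dataclass
class InternalReadings:
    battery_pct: float      # remaining battery charge, 0-100 %
    motor_temp_c: float     # motor temperature in degrees Celsius
    wheel_speed_mps: float  # current wheel speed in metres per second

def warnings(r: InternalReadings,
             min_battery_pct: float = 20.0,
             max_speed_mps: float = 1.5) -> list[str]:
    """Return human-readable warnings to render next to the video stream."""
    msgs = []
    if r.battery_pct < min_battery_pct:
        msgs.append(f"LOW BATTERY: {r.battery_pct:.0f}%")
    if r.wheel_speed_mps > max_speed_mps:
        msgs.append(f"SPEED LIMIT EXCEEDED: {r.wheel_speed_mps:.1f} m/s")
    return msgs

print(warnings(InternalReadings(battery_pct=12.0, motor_temp_c=41.0, wheel_speed_mps=1.8)))
```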
Article
Full-text available
Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI) has been gaining considerable attention in HRI research in recent years. However, the HRI community lacks a set of shared terminology and framework for characterizing aspects of mixed reality interfaces, presenting serious problems for future research. Therefore, it is important to have a common set of terms and concepts that can be used to precisely describe and organize the diverse array of work being done within the field. In this paper, we present a novel taxonomic framework for different types of VAM-HRI interfaces, composed of four main categories of virtual design elements (VDEs). We present and justify our taxonomy and explain how its elements have been developed over the last 30 years as well as the current directions VAM-HRI is headed in the coming decade.
... For example, research shows that visualizations that provide the location where sensors are collecting data help operators understand the context of their task, thereby improving performance [7], [21]. Similarly, in our study, roboticists perceived that visual debugging tools helped them understand how the internal workings of the robot related to the real world. Other research has looked into providing visualizations of spatial regions to communicate sensed, inherent and user-defined spatial regions to users [4], [48], [49]. In our study, four (50%) of the RViz users and five (62.5%) of the AR users expressed a greater comprehension of the robot's spatial environment when this data was displayed. ...
... Augmented reality can be an effective tool to convey robot sensor, state and task data to the user. This has been illustrated through the exploration of visualizing robot state and intent [9]- [13], [20], [21]. These works demonstrate the value of displaying digital twins and spatial data to convey important information to the user. ...
... However, there has been limited work specifically focusing on visualization design choices supporting HRI roboticists during the debugging process. Some early systems have been developed to help users monitor the inner state and behavior of mobile robots, manipulators, and robotic swarms [20], [22]-[24]. But these applications apply perspective 3D, which removes the depth cue of stereopsis and potentially detaches the data from its true context. ...
Conference Paper
Full-text available
... Virtual Control Room Model: In the Virtual Control Room Model, the user is placed in a virtual room that serves as a supervisory command and control center for a remote robot. Within the control room, the user is able to interact with displays and objects in the virtual space itself, and can view 3D stereo video streams projected on the room's walls, thus still allowing the user to see from the robot's perspective with depth [45,50] (see ...
... Uses include providing users with a means of understanding a robot's current state, in which case a virtual 3D model of a real robot mimics a physical robot's joint configurations in real time. Visualization Robots are particularly useful in situations of limited situational awareness, when the user cannot directly see the physical robot (or portions of the robot), such as in remote teleoperation tasks [45] (see Figure 3-A). ...
... Internal Reading VDEs display data returned from internal sensors (e.g., battery levels, robot temperatures, wheel speeds). Making this information more readily available to users may enhance situational awareness and prevent mishaps such as robots running out of battery in the middle of mission-critical tasks or traveling too fast through environmental hazards [45] (see Figure 3-A). Internal Readiness VDEs display data about sensors and actuators, such as whether a sensor is ready to collect data or whether an actuator is ready to function, and if not, why (e.g., whether a sensor is disconnected, an actuator is experiencing a fault, etc.). ...
Preprint
Full-text available
Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI) has been gaining considerable attention in research in recent years. However, the HRI community lacks a set of shared terminology and framework for characterizing aspects of mixed reality interfaces, presenting serious problems for future research. Therefore, it is important to have a common set of terms and concepts that can be used to precisely describe and organize the diverse array of work being done within the field. In this paper, we present a novel taxonomic framework for different types of VAM-HRI interfaces, composed of four main categories of virtual design elements (VDEs). We present and justify our taxonomy and explain how its elements have been developed over the last 30 years as well as the current directions VAM-HRI is headed in the coming decade.
... All the computer-generated graphics are displayed in real time, with stereoscopic vision, and dynamically aligned with the user's head orientation by the Oculus Rift (2016), an HMD embedding a high-resolution screen and a tracking sensor. The outcomes presented in the work of Kot and Novák (2014) have been used to configure the HMD stereoscopic view for optimal eye accommodation and a reduced risk of image disparity problems. Binocular vision is, in fact, an important visual cue for enhancing the level of immersion of VR applications that isolate the users from their physical surroundings. ...
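The stereoscopic-view configuration mentioned here boils down to rendering the scene once per eye, with the virtual cameras separated by the interpupillary distance (IPD). The sketch below shows only that per-eye offset, with an assumed IPD and matrix convention; it is not the Oculus SDK API nor the exact setup used in the cited work.

```python
# Minimal sketch of per-eye view offsets for stereoscopic rendering (assumed IPD,
# column-vector convention); not the actual Oculus SDK configuration.
import numpy as np

def eye_view_matrix(head_view: np.ndarray, ipd_m: float, eye: str) -> np.ndarray:
    """Shift the head-centred view matrix by half the IPD along the eye's x axis."""
    half = ipd_m / 2.0
    offset = np.eye(4)
    offset[0, 3] = +half if eye == "left" else -half  # translate the world opposite to the eye
    return offset @ head_view

if __name__ == "__main__":
    head_view = np.eye(4)          # identity: head at the origin, looking down -z
    ipd_m = 0.064                  # assumed interpupillary distance (64 mm)
    left = eye_view_matrix(head_view, ipd_m, "left")
    right = eye_view_matrix(head_view, ipd_m, "right")
    print(left[0, 3], right[0, 3])  # +-0.032 m horizontal shift between the two renders
```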
Article
Full-text available
This study is aimed at evaluating the impact of different technical solutions of a virtual reality simulator to support the assessment of advanced human-machine interfaces for hydraulic excavators, based on a new coordinated control paradigm and haptic feedback. By mimicking the end-effector movements, the control is conceived to speed up the learning process for novice operators and to reduce the mental overload on those already trained. The design of the device can fail if ergonomics, usability and performance are not grounded on realistic simulations where the combination of visual, auditory and haptic feedback makes the users feel like being in a real environment rather than a computer-generated one. For this reason, a testing campaign involving 10 subjects was designed to discriminate the optimal setup for the hardware to ensure a higher immersion into the VR experience. Both the audio-video configurations of the simulator (head-mounted display and surround system vs. monitor and embedded speakers) and the two types of haptic feedback for the soil-bucket interaction (contact vs. shaker) are compared in three different scenarios. The performance of both the users and the simulator is evaluated by processing subjective and objective data. The results show how the immersive setup improves the users' efficiency and ergonomics without putting any extra mental or physical effort on them, while the preferred haptic feedback (contact) is not the most efficient one (shaker).
... This allows the operator to look around freely. Kot and Novak [29] use an HMD to render images from a stereovision camera, allowing for better depth estimation by the operator. In addition, they simultaneously display complementary information within the operator's field of view, for example an image from a rear camera and additional status icons. ...
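A rough picture of the in-view compositing described here (a main camera feed with a rear-camera picture-in-picture and status icons) can be given with plain array operations. The frame sizes, placement, and status flag in the sketch below are made-up illustration values, not details of the interface in [29].

```python
# Hypothetical compositing of a rear-camera picture-in-picture and a status marker
# onto the main camera frame (synthetic images, illustrative layout only).
import numpy as np

def compose(main: np.ndarray, rear: np.ndarray, low_battery: bool) -> np.ndarray:
    """Overlay the rear view in the top-right corner and a status square top-left."""
    out = main.copy()
    h, w = rear.shape[:2]
    out[10:10 + h, -10 - w:-10] = rear            # picture-in-picture rear view
    if low_battery:
        out[10:40, 10:40] = (255, 0, 0)           # solid warning square as a simple status icon
    return out

if __name__ == "__main__":
    main_frame = np.zeros((480, 640, 3), dtype=np.uint8)        # stand-in main camera image
    rear_frame = np.full((120, 160, 3), 128, dtype=np.uint8)    # stand-in rear camera image
    frame = compose(main_frame, rear_frame, low_battery=True)
    print(frame.shape, frame[15, -20], frame[15, 15])
```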
Preprint
Corner cases for driving automation systems can often be detected by the system itself and subsequently resolved by remote humans. There exists a wide variety of technical approaches to how remote humans can resolve such issues. Across multiple domains, however, no common taxonomy of those approaches has developed yet. As the scale of automated driving systems continues to increase, a uniform taxonomy is desirable to improve communication within the scientific community, but also beyond it, with policymakers and the general public. In this paper, we provide a survey of recent terminologies and propose a taxonomy for remote human input systems, classifying the different approaches based on their complexity.
... Teleoperation interfaces themselves have been a topic of interest within the HRI literature for several decades [43], with work in this space ongoing, especially with relation to accessible design [20,44], multi-robot control [27,28], and immersive AR/VR control [1,5,26,29]. However, there has been little work examining teleoperation interfaces for SARs, despite the unique needs faced by teleoperators (i.e., therapists) in contexts like therapy for CWA. ...
Conference Paper
Full-text available
Therapist-operated robots can play a uniquely impactful role in helping children with autism practice and acquire social skills. While extensive research within Human-Robot Interaction has focused on teleoperation interfaces for robots in general, little work has been done on teleoperation interface design for robots in the context of therapy for children with autism. Moreover, while clinical research has shown the positive impact robots can have on children with autism, much of that research has been performed in a controlled environment, with little understanding of the way these robots are used in practice. We analyze archival data of therapists teleoperating robots as part of their regular therapy sessions, to (1) determine common themes and difficulties in therapists’ use of teleoperation interfaces, and (2) provide design recommendations to improve therapists’ overall experience. We believe that following these recommendations will help maximize the effectiveness of therapy for children with autism when using Socially Assistive Robotics and the scale at which robots can be deployed in this domain.
... Teleoperation interfaces themselves have of course been a topic of interest within the HRI literature for several decades [33], with work in this space ongoing, especially with relation to accessible design [14,34], multi-robot control [20,21], and immersive AR/VR control [1,3,19,22]. However, there has been little work examining teleoperation interfaces for SARs, despite the unique needs faced by teleoperators (i.e., therapists) in contexts like ASD therapy. ...
Conference Paper
Full-text available
Therapist-operated robots can play a uniquely impactful role in helping children with Autism Spectrum Disorder (ASD) practice and acquire social skills. While extensive research within Human Robot Interaction has focused on teleoperation interfaces for robots in general, little work has been done on teleoperation interface design for robots in the context of ASD therapy. Moreover, while clinical research has shown the positive impact robots can have on children with Autism, much of that research has been performed in a controlled environment, with little understanding of the way these robots are used "in the wild". We analyze archival data of therapists teleoperating robots as part of their regular therapy sessions, to (1) determine common themes and difficulties in therapists' use of teleoperation interfaces, and (2) provide design recommendations to improve therapists' overall experience. We believe that following these recommendations will help maximize the effectiveness of ASD therapy with Socially Assistive Robots and the scale at which it can be deployed.
... In medicine, there are numerous examples of surgical simulators, such as the LapSim laparoscopy simulator. A review presents the various commercially available headsets [15]. In the field of research, these headsets are used to create a virtual environment and to train for assembly [49] or in medicine [24,34], or as an interface for teleoperating robots [56]. ...
Thesis
Full-text available
Cobotics, or collaborative robotics, has the advantage of keeping the human involved in carrying out a task where their expertise is required. In addition, cobotics makes it possible to amplify the operator's strength and increase their dexterity, thus providing assistance to the agent. It finds applications in industry on assembly lines and in medicine for microsurgery and minimally invasive surgery. In industry, several micro-assembly tasks requiring the manipulation of objects from a hundred micrometres to a few millimetres in size are still performed by hand, using precision tweezers. However, these trades, which have not evolved much, face drawbacks related to the ergonomics of the workstation and the practice of their activities. The aim of this thesis is to develop a robotic human/machine interface to facilitate fine manipulation. Taking up the form and function of precision tweezers, a new instrumented active tool is proposed. It aims to facilitate all work using this type of tweezers by adding robotic functions such as gripping-force control or haptic feedback. It can also serve as a master device in a teleoperation chain and drive a micromanipulator robot. This tool will notably be offered to craftsmen, but could also address certain needs encountered by surgeons who also use such tweezers.
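The gripping-force control mentioned in the abstract can be pictured as a simple closed loop around a force sensor in the tweezer tips. The proportional-integral sketch below is a generic illustration with assumed gains and a toy actuator model, not the controller developed in the thesis.

```python
# Generic PI loop tracking a target gripping force (assumed gains and a toy
# first-order actuator model); illustrative only, not the thesis controller.

def simulate_grip_force(target_n: float = 0.5, steps: int = 500, dt: float = 0.001,
                        kp: float = 4.0, ki: float = 60.0) -> float:
    force_n = 0.0      # measured gripping force (N)
    integral = 0.0
    for _ in range(steps):
        error = target_n - force_n
        integral += error * dt
        command = kp * error + ki * integral          # actuator command
        force_n += (command - force_n) * dt / 0.02    # toy first-order response, tau = 20 ms
        force_n = max(force_n, 0.0)                   # tips cannot pull on the part
    return force_n

if __name__ == "__main__":
    print(f"force after 0.5 s: {simulate_grip_force():.3f} N")  # settles near the 0.5 N target
```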