Mobile robot platforms, (a) 3D model, (b) Real robot.

Source publication
Article
Full-text available
This paper presents an Augmented Reality software suite aiming to support the operator’s interaction with flexible mobile robot workers. The developed AR suite, integrated with the shopfloor’s Digital Twin, provides the operators with a) virtual interfaces to easily program the mobile robots, b) information on the process and the production status, c...

Context in source publication

Context 1
... introduced mobile dual-arm workers consist of two robot manipulators placed on a two-axis (vertical and rotational) moving torso. The torso is placed on an omnidirectional mobile platform that provides mobility to the robot resource, as shown in Figure 3. The MRPs can navigate autonomously between different workstations of the factory. ...
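The omnidirectional base described in this excerpt can be illustrated with a standard mecanum-wheel inverse kinematics sketch. This is a generic textbook formulation, not taken from the source paper; the wheel radius and geometry values are hypothetical.

```python
def mecanum_wheel_speeds(vx, vy, wz, r=0.05, lx=0.25, ly=0.20):
    """Inverse kinematics for a 4-mecanum-wheel omnidirectional base.

    vx, vy: platform linear velocity (m/s); wz: yaw rate (rad/s).
    r: wheel radius (m); lx, ly: half wheelbase / half track width (m).
    All dimensions here are illustrative, not from the cited platform.
    Returns wheel angular speeds (rad/s) in order FL, FR, RL, RR.
    """
    k = lx + ly
    w_fl = (vx - vy - k * wz) / r
    w_fr = (vx + vy + k * wz) / r
    w_rl = (vx + vy - k * wz) / r
    w_rr = (vx - vy + k * wz) / r
    return w_fl, w_fr, w_rl, w_rr
```

For pure forward motion all four wheels spin at the same rate, while a pure sideways command spins diagonal wheel pairs in opposite directions, which is what gives the platform its omnidirectional mobility.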

Citations

... Michalos et al. [33] proposed a method to analyze and enhance industrial workplaces using immersive virtual reality. Aivaliotis et al. [34] and Kousi et al. [35] presented an augmented reality software suite aiming at supporting the operator's interaction with flexible mobile robot workers. Other scholars discussed the applications of digital twin technology to robotic assembly lines [36][37][38]. ...
Article
Full-text available
Practical teaching in intelligent manufacturing majors faces a shortage of equipment and instructors, as well as problems such as high equipment investment, high material loss, high teaching risk, and difficulty in implementing internships, observing production, and reproducing results. Taking the electrical automation technology, mechatronics technology, and industrial robotics technology majors as examples, we design and establish a virtual simulation teaching platform for intelligent manufacturing majors using the synergy of a cloud computing platform, edge computing technology, and terminal equipment. The platform includes six virtual simulation modules: electrician electronics and PLC control, virtual-real combination of typical intelligent manufacturing production lines, a dual-axis collaborative robotics workstation, digital twin simulation, virtual disassembly of industrial robots, and a flexible magnetic yoke axis production line. The platform covers virtual simulation teaching content for basic principle experiments, advanced application experiments, and advanced integration experiments in intelligent manufacturing majors. To test the platform's effectiveness for practical engineering teaching, we organized a teaching practice activity involving 246 students from two parallel classes in three different majors. Through a one-year teaching application, we analyzed grades for the 7 core courses of the three majors over one academic year, the proportion of participation in competitions and innovation activities, the number of awards and professional qualification certificates, and subjective questionnaires from the testers.
The analysis shows that learners who used the proposed virtual simulation teaching platform outperformed learners under the traditional teaching method in academic performance, participation in competitions and innovation activities, and awards and certificates by more than 13%, 37%, 36%, 27%, and 22%, respectively. The virtual simulation teaching platform established in this paper therefore shows clear advantages in addressing the "three highs and three difficulties" of practical engineering teaching, and questionnaire feedback from the testers indicates that the platform can effectively alleviate the shortage of practical training equipment, stimulate interest in learning, and help broaden and improve learners' knowledge systems.
... Focusing on safety, legibility, and efficiency, Chadalavada et al. [15] studied how robots (forklifts) in industrial logistics applications can communicate their navigational intent using spatial augmented reality so that humans can intuitively understand the robot's intent and feel safe near the robot. Aivaliotis et al. [16] and Hietanen et al. [17] both proposed an augmented reality-based way to show information about the process and the status of production and increase safety awareness by superimposing an active safety zone around the robot. Similarly, Li et al. [12] also adopted AR devices to propose a bidirectional human-robot safe interaction framework with the extraction of human states and robot states, including velocity control and prediction of potential collisions. ...
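The active safety zones and velocity control mentioned in these works can be sketched as a simple speed-and-separation rule in the spirit of ISO/TS 15066. This is a generic illustration, not any cited paper's implementation, and all thresholds are hypothetical.

```python
def scaled_speed(distance_m, v_max=1.0, stop_dist=0.5, slow_dist=1.5):
    """Scale robot speed with human-robot separation distance.

    Inside stop_dist: halt. Between stop_dist and slow_dist:
    ramp speed up linearly. Beyond slow_dist: full speed.
    The thresholds are illustrative placeholders, not certified values.
    """
    if distance_m <= stop_dist:
        return 0.0
    if distance_m >= slow_dist:
        return v_max
    return v_max * (distance_m - stop_dist) / (slow_dist - stop_dist)
```

An AR overlay of such zones would simply render the `stop_dist` and `slow_dist` radii around the robot so that the operator sees the same boundaries the controller enforces.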
Article
The integration of human-robot collaboration yields substantial benefits, particularly in terms of enhancing flexibility and efficiency in a range of mass-personalized manufacturing tasks, such as small-batch customized product inspection and assembly/disassembly. Meanwhile, as human-robot collaboration spreads more broadly across manufacturing, unstructured scenes and operator uncertainties are increasingly involved and must be considered. Consequently, it becomes imperative for robots to act in a safe and adaptive manner rather than relying solely on pre-programmed instructions. To tackle this, a systematic solution for safe robot motion generation in human-robot collaborative activities is proposed, leveraging mixed-reality technologies and Deep Reinforcement Learning. The solution covers the entire collaboration process, starting with an intuitive interface that facilitates bare-hand task command transmission and scene coordinate transformation before the collaboration begins. In particular, mixed-reality devices are employed as effective tools for representing the state of humans, robots, and scenes. This enables the learning of an end-to-end Deep Reinforcement Learning policy that addresses uncertainties in robot perception and decision-making in an integrated manner. The proposed solution also implements simulation-to-reality policy deployment, along with motion preview and collision detection mechanisms, to ensure safe robot motion execution. It is hoped that this work will inspire further research in human-robot collaboration that unleashes and exploits the powerful capabilities of mixed reality.
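A minimal form of the motion preview and collision check mentioned in this abstract can be sketched as a clearance test between a planned straight-line motion segment and a tracked human position. This is a generic geometric check, not the paper's DRL-based method, and the clearance value is hypothetical.

```python
import math

def point_segment_distance(p, a, b):
    """Shortest distance from point p to segment a-b (3D coordinate tuples)."""
    ab = [b[i] - a[i] for i in range(3)]
    ap = [p[i] - a[i] for i in range(3)]
    denom = sum(c * c for c in ab)
    # Clamp the projection parameter so we stay on the segment.
    t = 0.0 if denom == 0 else max(0.0, min(1.0, sum(ap[i] * ab[i] for i in range(3)) / denom))
    closest = [a[i] + t * ab[i] for i in range(3)]
    return math.dist(p, closest)

def motion_is_safe(start, goal, human_pos, clearance=0.8):
    """Preview a straight-line motion; reject it if it passes too close to the human."""
    return point_segment_distance(human_pos, start, goal) >= clearance
```

In a mixed-reality preview, a rejected motion would be rendered to the operator before execution rather than discovered at runtime.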
... Several approaches towards the efficient interaction of human and robotic resources in HRC scenarios have been presented in the literature. Existing approaches are based on Augmented Reality (AR) techniques [13][14][15], smartphones [16], tablets [17,18], smartwatches [19] and voice-based interfaces [20]. In this paper, special emphasis is given to voice-based approaches because human operators can interact with the robot without stopping their actions or using their hands to operate the required interfaces. ...
Article
Full-text available
Quality inspection plays a vital role in current manufacturing practice, since the need for reliable and customized products is high on the agenda of most industries. Under this scope, solutions enhancing human-robot collaboration, such as voice-based interaction, are at the forefront of efforts by modern industries to embrace the latest digitalization trends. Current inspection activities are often based on the manual expertise of operators, which has proven to be time-consuming. This paper presents a voice-enabled ROS2 framework for enhancing the collaboration of robots and operators in quality inspection activities. A robust ROS2-based architecture is adopted to support the orchestration of the process execution flow. Furthermore, a speech recognition application and a quality inspection solution are deployed and integrated into the overall system, showcasing its effectiveness in a case study from the automotive industry. The benefits of this voice-enabled ROS2 framework are discussed, and it is proposed as an alternative way of inspecting parts in human-robot collaborative environments. To measure the added value of the framework, a multi-round testing process took place with different parameters for the framework's modules, showcasing reduced cycle time for quality inspection processes, robust HRI using voice-based techniques, and accurate inspection.
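The orchestration of a voice-driven execution flow described in this abstract can be illustrated with a plain-Python state machine. This is a hypothetical sketch, not the paper's actual ROS2 architecture; the state names and voice commands are invented for illustration.

```python
# Hypothetical states and voice commands for an inspection cycle;
# the real framework orchestrates such a flow over ROS2 nodes and topics.
TRANSITIONS = {
    ("IDLE", "start inspection"): "MOVE_TO_PART",
    ("MOVE_TO_PART", "arrived"): "INSPECT",
    ("INSPECT", "pass"): "IDLE",
    ("INSPECT", "fail"): "REWORK",
    ("REWORK", "done"): "IDLE",
}

def step(state, voice_command):
    """Advance the inspection flow; unrecognized commands keep the current state."""
    return TRANSITIONS.get((state, voice_command), state)
```

Keeping the transition table explicit makes it easy to audit which spoken commands are valid in each state, which matters when the operator's hands and eyes are busy with the part.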
... Communication based on direct verbal and visual synchronous communication technologies further promotes remote collaboration [8]. Regarding physical interactions, advances not only in AR and robots [9] but also in collaborative robots (cobots) [10] enable remote operation or dividing tasks between cobots and humans. Integrating these technologies could enable telework in part manufacturing by forming new modes of collaboration in which teleworking manufacturing employees collaborate in varying degrees with human colleagues, supported by additional on-site technologies, to perform the required human tasks. ...
Article
Full-text available
A central element of industrial production is the manufacturing of finished parts from raw material. Even in highly automated environments, processes like milling still rely on human intervention. On-site human operators play a crucial role in ensuring continuous operation and part quality through tasks such as setup and maintenance. This reliance on human involvement makes part manufacturing vulnerable to workforce reductions, whether due to unforeseen circumstances like pandemics or to staff shortages. However, new modes of telework collaboration based on interactive systems that comprise visualization and communication technologies, collaborative robots, fast internet, and remote control of machine tools have the potential to overcome these challenges. Consequently, a conceptual framework is proposed that investigates how such modes and systems need to be designed to share the respective tasks between teleworking and on-site employees. Because the interactions and systems are highly complex, and since reduced-workforce situations often occur suddenly, a high degree of usability must be ensured to enable quick ramp-up and reliable operation. Therefore, an interdisciplinary approach between manufacturing engineering, ergonomics/human factors, and human–computer interaction investigates how the concept of human-centered design (HCD) needs to be adapted to ensure this usability. While the initial study focuses on how to integrate human workers in the design of such a system, it also highlights the need to examine different collaboration modes and application scenarios.
... Experiments took place specifically focused on safety, performance and ergonomics, and the AR-based concept is proven to enhance the overall performance, which can certainly play a vital role in the quality of the assembled product. AR applications have also been proposed by Michalos et al. (2018) and by Aivaliotis et al. (2022), covering multiple aspects of human robot collaborative scenarios such as easy robot ...
... Assembly guidance provision approach (Aivaliotis et al. 2022). ...
Article
Full-text available
Quality control methods and techniques have been investigated in different fields of manufacturing during the last decades. The introduction of robots into manufacturing processes has led to the rapid deployment of robotic applications, increasing research interest in quality control in such industrial environments. This paper summarizes and presents a review of recent research and progress on quality control technologies, focusing on robot-based production systems. The role of different parameters affecting the control of quality in robotic tasks is also discussed, incorporating the impact of operator support systems in human robot collaborative environments. Research gaps and implications of quality control in robotic applications are also described, and a future outlook of research in the particular field is provided.
... In the industrial sector, XR technologies are mainly used to assist and support operators during assembly activities (Vanneste et al. 2020), maintenance (Mourtzis, Siatras, and Angelopoulos 2020), and training (Menn et al. 2020). In recent years, with the increasing use of collaborative robots, these technologies have also been employed to enhance safe and efficient collaboration between humans and robots (Aivaliotis et al. 2023). The industrial XR market is expected to reach $76 billion in 2025 (BIS Research 2018). ...
... It uses a simple and intuitive way to interact with hardware, without requiring special programming skills or long-term training for human operators. The suggested software was developed for Microsoft's HoloLens mixed reality head-mounted display, integrated with the Robot Operating System, and tested in a case study in the automotive industry [3]. The above research techniques are, however, not yet mature, and their practicality remains limited. ...
... However, in this study, the user was only allowed to enter the workspace when the robot was stopped, and the boundaries, which conveyed the actual configuration of the robot, were displayed even when the user was in a safe space. In [44], the same situation is observed, where only boundaries related to the actual configuration are displayed and no data concerning the user's state is shown. In [45], the authors made use of visual and acoustic information to warn users about the robot's intention, making the planned trajectories easier to understand. ...
Preprint
Full-text available
Safety is the main concern in human-robot collaboration (HRC) in work environments. Standard safety measures based on reducing robot speed affect the productivity of collaboration and do not adequately inform workers about the state of the robot, leading to stressful situations due to uncertainty. To grant the user control over safety, we investigate audio, visual, and audio-visual mixed reality (MR) displays that inform about the boundaries of zones with different levels of hazard. We describe the design of the hazard displays for a scenario of collaboration with a real robot. We then report an experimental user study with 24 users, comparing the performance and user experience (UX) obtained with the auditory display, the visual display, and the audio-visual display resulting from combining both. Findings suggest that all modalities are suitable for HRC scenarios, yielding similar performance during collaboration. However, distinct qualitative results were observed between displays, indicating differences in the UX obtained.
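The graded hazard zones studied here can be sketched as a simple distance-to-zone mapping that an audio, visual, or audio-visual display would then render to the user. The sketch is illustrative only; the zone radii are hypothetical, not the study's calibrated boundaries.

```python
# Hypothetical zone radii in meters; a real deployment would calibrate these
# to the robot's reach and stopping distance.
ZONES = [(0.5, "danger"), (1.2, "warning"), (float("inf"), "safe")]

def hazard_zone(distance_m):
    """Map the user's distance from the robot to a hazard-zone label."""
    for radius, label in ZONES:
        if distance_m < radius:
            return label
```

Each display modality would present the same labels differently, e.g. a rendered boundary for the visual display and an escalating tone for the auditory one.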
... A promising development is the use of virtual reality/augmented reality/mixed reality (VR/AR/MR, or XR) tools, which allow the user to not only visualize data in an intuitive manner but also interact with other (H)DTs for different purposes [127], e.g., co-creation [128] and co-simulation [129]. An example is that Aivaliotis et al. proposed an AR suite integrated with the DT of the shop floor, for developing human-robot interaction using Microsoft® HoloLens® [130]. Geng et al. also created a modular DT system that integrates VR/AR for remote control and virtual machining [131]. ...
Article
In the past decade, human digital twins (HDTs) have attracted attention both in digital twin (DT) applications and beyond. In this paper, we discuss the concept and the development of HDTs, focusing on their architecture, key enabling technologies, and (potential) applications. Based on the literature, we identify personal data, model, and interface as three key modules in the proposed HDT architecture, supported by a data lake of human data and a model and interface library. Regarding the key enabling technologies that support the HDT functions, we envision that the internet of things (IoT) infrastructure, data security, wearables, human modeling, explainable artificial intelligence (AI), minimum viable sensing, and data visualization are closely associated with the development of HDTs. Finally, we investigate current applications of HDTs, with a particular emphasis on the opportunities that arise from leveraging HDTs in the field of personalized product design.
... This method integrated process decomposition, communication, and motion planning modules for indicative task execution, which matched users' skills to the requirements of production tasks. In addition, Aivaliotis et al. [111] developed an AR software suite for HRC with mobile robot platforms. Humans could define robot motion and navigation goals in a virtual interface. ...
Article
Full-text available
Robots are expanding from industrial applications to daily life, in areas such as medical robotics, rehabilitative robotics, social robotics, and mobile/aerial robotics systems. In recent years, augmented reality (AR) has been integrated into many robotic applications, including medical, industrial, human–robot interactions, and collaboration scenarios. In this work, AR for both medical and industrial robot applications is reviewed and summarized. For medical robot applications, we investigated the integration of AR in (1) preoperative and surgical task planning; (2) image-guided robotic surgery; (3) surgical training and simulation; and (4) telesurgery. AR for industrial scenarios is reviewed in (1) human–robot interactions and collaborations; (2) path planning and task allocation; (3) training and simulation; and (4) teleoperation control/assistance. In addition, the limitations and challenges are discussed. Overall, this article serves as a valuable resource for working in the field of AR and robotic research, offering insights into the recent state of the art and prospects for improvement.