Figure 1 - uploaded by Sandra Hirche
Interaction with the virtual environment via haptic interfaces in the dyadic condition

Source publication
Conference Paper
Full-text available
In order to enable intuitive physical interaction with autonomous robots, as well as in collaborative multi-user virtual reality and teleoperation systems, a deep understanding of human-human haptic interaction is required. In this paper, the effect of haptic interaction in single and dyadic conditions is investigated. Furthermore, an energy-based...

Similar publications

Article
Full-text available
This paper addresses the stability of time-delayed force-reflecting displays used in human-in-the-loop virtual reality interactive systems. A novel predictive haptic-device model-based approach is proposed. The developed solution is stable and robust, and requires neither an estimate of the time delay nor any knowledge of its behavior. It appl...
Article
Full-text available
We present a modular wearable interface for haptic interaction and robotic teleoperation. It is composed of a 3-DoF fingertip cutaneous device and a 1-DoF finger kinesthetic exoskeleton, which can be used together as a single device or separately as two devices. The 3-DoF fingertip device is composed of a static body and a mobile platform. The mobi...
Article
Full-text available
Maintaining a stable haptic interaction with virtual environments, especially with physically-based deformable objects, has long been an active area of research. We address this issue by presenting a comprehensive haptic system architecture and virtual reality simulation, where a physically-based modeling using the Finite Element Method (FEM) co...
Article
Full-text available
The design review process of new products is time consuming, requires the collaboration and synchronization of activities performed by various experts having different competences and roles, and is today performed using different tools and different product representations. In order to improve the performance of the overall product design process,...

Citations

... If we applied common methods of inferring emergent (i.e., not fixed a priori) roles and control strategies, we might incorrectly infer leadership roles based on the signs/directions of interaction force/torque 33,34,40,41 and power 42, or based on which person moved earlier, when in actuality these trends reflect whether the task is to decrease or increase the novice's step frequency. Other time-domain methods for identifying roles, e.g. ...
Article
Full-text available
Physical human–robot interactions (pHRI) often provide mechanical force and power to aid walking without requiring voluntary effort from the human. Alternatively, principles of physical human–human interactions (pHHI) can inspire pHRI that aids walking by engaging human sensorimotor processes. We hypothesize that low-force pHHI can intuitively induce a person to alter their walking through haptic communication. In our experiment, an expert partner dancer influenced novice participants to alter step frequency solely through hand interactions. Without prior instruction, training, or knowledge of the expert’s goal, novices decreased step frequency 29% and increased step frequency 18% based on low forces (< 20 N) at the hand. Power transfer at the hands was 3–700 × smaller than what is necessary to propel locomotion, suggesting that hand interactions did not mechanically constrain the novice’s gait. Instead, the sign/direction of hand forces and power may communicate information about how to alter walking. Finally, the expert modulated her arm effective dynamics to match that of each novice, suggesting a bidirectional haptic communication strategy for pHRI that adapts to the human. Our results provide a framework for developing pHRI at the hand that may be applicable to assistive technology and physical rehabilitation, human-robot manufacturing, physical education, and recreation.
... Humans perform collaborative tasks in a wide range of everyday actions [1], [2], [3], [4], [5], [6], [7], [8], from moving furniture together or tango dancing, to more asymmetric collaborations such as a therapist assisting a patient in physical training or a violin teacher holding their student's arm to demonstrate how to perform bowing movements. These collaborations rely on haptic communication; recent results suggest that the exchange of haptic information between connected humans improves the performance of both partners [9], [10], [11] and their motor learning [12]. ...
... We define "haptic communication" as the exchange of information through sensory feedback elicited from physical contact 33. Haptic communication has been used to explain the benefits of human-human hand interactions for performing several types of upper-limb tasks [34][35][36][37][38][39][40][41][42][43]. Our results suggest that the signs of interaction force and power may encode the direction (increase or decrease) in which to alter step frequency. ...
Preprint
Full-text available
Physical human-robot interactions (pHRI) often provide mechanical force and power to aid and alter human walking without requiring voluntary effort from the human. Alternatively, we propose that principles of physical human-human interactions (pHHI) can inspire pHRI that aids walking by engaging human sensorimotor processes. We hypothesize that low-force hand interactions can intuitively induce people to alter their own walking. Our experiment paradigm is based on partner dancing: an expert partner dancer influences novice participants to alter step frequency solely through hand interactions. Without prior instruction or training, novices decreased step frequency by 29% and increased step frequency 18% based on low forces (< 20 N) at the hands. Power transfer at the hands was 10-100x smaller than that exerted by the lower limbs to propel locomotion, suggesting that the expert did not mechanically alter the novice’s gait. Instead, the direction of hand forces and power may communicate information about desired walking patterns. Finally, the expert altered arm stiffness to match that of the novice, offering a design principle for pHRI to alter gait. Our results provide a framework for developing pHRI with wide-ranging applications, including assistive technology and physical rehabilitation, human-robot manufacturing, physical education, and recreation.
... However, this does not necessarily lead to worse performance [5]. In fact, compared to solo performance, human-human interaction can result in better tracking accuracy [4], [6], [7]. Similarly, compared to bimanual interaction, dyads have achieved faster motions in both discrete [8] and cyclical [9] aiming tasks and displayed similar performance and adaptation rates during rhythmic tasks [10]. ...
Article
Full-text available
During daily activities, humans routinely manipulate objects bimanually or with the help of a partner. This work explored how bimanual and dyadic coordination modes are impacted by the object's stiffness, which conditions inter-limb haptic communication. For this, we recruited twenty healthy participants who performed a virtual task inspired by object handling, where we looked at the initiation of force exchange and its continued maintenance while tracking. Our findings suggest that while individuals and dyads displayed different motor behaviours, which may stem from the dyad's need to estimate their partner's actions, they exhibited similar tracking accuracy. For both coordination modes, increased stiffness resulted in better tracking accuracy and more correlated motions, but required a larger effort through increased average torque. These results suggest that stiffness may be a key consideration in applications such as rehabilitation, where bimanual or external physical assistance is often provided.
... where v is the velocity of frame B. Similar metrics exist in the literature, where the authors in [9], [22] use power in scalar form to distinguish interaction patterns in the virtual environment. Note that positive power implies that the agent is applying force in the direction of velocity, thereby contributing to the motion. ...
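The scalar power metric referenced in this snippet can be sketched as a dot product of force and velocity; a minimal illustration (the function name and numeric values below are hypothetical, not taken from the cited works):

```python
import numpy as np

def interaction_power(force, velocity):
    """Scalar interaction power P = F . v at the interaction point.

    Positive P: the agent pushes along the direction of motion and
    contributes energy; negative P: the agent resists the motion and
    absorbs energy.
    """
    return float(np.dot(force, velocity))

# Hypothetical sample: force mostly aligned with velocity -> positive power
force = np.array([4.0, 1.0, 0.0])      # N
velocity = np.array([0.5, 0.0, 0.0])   # m/s
P = interaction_power(force, velocity)
print(P)  # 2.0: the agent actively drives the motion (2.0 W)
```

The sign of P is what distinguishes the interaction patterns: an agent with consistently positive power is acting as a driver, one with negative power as a resistor or follower.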
Preprint
Physical Human-Human Interaction (pHHI) involves the use of multiple sensory modalities. Studies of communication through spoken utterances and gestures are well established. Nevertheless, communication through force signals is not well understood. In this paper, we focus on investigating the mechanisms employed by humans during the negotiation through force signals, which is an integral part of successful collaboration. Our objective is to use the insights to inform the design of controllers for robot assistants. Specifically, we want to enable robots to take the lead in collaboration. To achieve this goal, we conducted a study to observe how humans behave during collaborative manipulation tasks. During our preliminary data analysis, we discovered several new features that help us better understand how the interaction progresses. From these features, we identified distinct patterns in the data that indicate when a participant is expressing their intent. Our study provides valuable insight into how humans collaborate physically, which can help us design robots that behave more like humans in such scenarios.
... However, it is yet unclear what characteristics to instill in a robot to make it more humanlike in overground pHRI. In physical human-human interaction (pHHI), humans interact physically solely through haptic information exchange, even without any verbal communication [7][8][9][10]. Whether the contact is directly between humans, such as while dancing or assisting the elderly, or is through an object, such as while moving a piece of furniture together, pHHI studies [7,[9][10][11][12][13] suggest that motor intent is communicated in the form of coupled forces and movements. ...
Article
Full-text available
Many anticipated physical human-robot interaction (pHRI) applications in the near future are overground tasks such as walking assistance. For investigating the biomechanics of human movement during pHRI, this work presents Ophrie, a novel interactive robot dedicated for physical interaction tasks with a human in overground settings. Unique design requirements for pHRI were considered in implementing the one-arm mobile robot, such as the low output impedance and the ability to apply small interaction forces. The robot can measure the human arm stiffness, an important physical quantity that can reveal human biomechanics during overground pHRI, while the human walks alongside the robot. This robot is anticipated to enable novel pHRI experiments and advance our understanding of intuitive and effective overground pHRI.
... In a previous study, Macdorman et al. [22] mentioned that robots with anthropomorphic appearances and facial expressions have strong comprehensive capabilities and can adapt to human inertial thinking. Humans can understand corresponding states from the nonverbal behaviors of humanoid robots, similar to interacting with other human beings [23], [24]. Therefore, generating natural facial expressions and head motions for humanoid robots is essential for effective human-robot interactions [25], and the most direct and simple method for natural action generation is to make the robot imitate the facial expressions and head motions of human beings. ...
Article
Full-text available
The ability of a humanoid robot to imitate facial expressions with simultaneous head motions is crucial to natural human-robot interaction. This mirrored behavior from human beings to humanoid robots places high demands on similarity and real-time performance. To fulfill these needs, this paper proposes a real-time robotic mirrored behavior of facial expressions and head motions based on lightweight networks. First, a humanoid robot that can change the state of its facial organs and neck through servo displacement is developed to achieve the mirrored behavior of facial expressions and head motions. Second, to overcome the high latency caused by deep learning models running on embedded devices, a lightweight deep learning network is constructed for detecting facial feature points, which reduces model size and improves running speed without affecting the performance of the model. Finally, a mapping relationship of 68 facial feature points to optimal servo displacements is established to realize the mirrored behavior from human beings to humanoid robots. The experimental results show that the facial feature point recognition method based on the lightweight model performs better than other state-of-the-art methods, and our head motion tracking method maintains high accuracy compared with the gold-standard optical motion capture system NOKOV. Overall, our method ensures accurate and real-time generation of robot mirrored behavior and provides a useful reference for efficient and natural interaction between humans and robots.
... One factor could be the number of effectors involved in the task. Dyads tend to outperform solos if the task can be performed unimanually by two individuals 7,8,13,14 . This can be explained by the biomechanical advantages associated with the involvement of more effectors during dyadic cooperation, i.e., two arms from two agents versus one arm from one agent. ...
Preprint
Full-text available
Collaboration frequently yields better results in decision making, learning, and haptic interactions than when these actions are performed individually. However, is collaboration always superior to solo actions, or do its benefits depend on whether collaborating individuals have different or the same roles? To answer this question, we asked human subjects to perform virtual-reality collaborative and individual beam transportation tasks. These tasks were simulated in real-time by coupling the motion of a pair of hand-held robotic manipulanda to the virtual beam using virtual spring-dampers. For the task to be considered successful, participants had to complete it within temporal and spatial constraints. While the visual feedback remained the same, the underlying dynamics of the beam were altered to create two distinctive task contexts which were determined by a moving pivot constraint. When the pivot was placed at the center of the beam, two hands contribute to the task with symmetric mechanical leverage (symmetric context). When the pivot was placed at the left side of the beam, two hands contribute to the task with asymmetric mechanical leverage (asymmetric context). Participants performed these task contexts either individually with both hands (solo), or collaboratively by pairing one hand with another one (dyads). We found that dyads in the asymmetric context performed better than solos. In contrast, solos performed the symmetric context better than dyads. Importantly, we found that two hands took different roles in the asymmetric context for both solos and dyads. In contrast, the contribution from each hand was statistically indistinguishable in the symmetric context. Our findings suggest that better performance in dyads than solos is not a general phenomenon, but rather that collaboration yields better performance only when role specialization emerges in dyadic interactions.
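The virtual spring-damper coupling described in the abstract can be sketched as a simple force law between a manipulandum handle and its attachment point on the virtual beam; the function name and the stiffness and damping values below are illustrative assumptions, not the study's parameters:

```python
import numpy as np

def coupling_force(x_handle, v_handle, x_beam, v_beam, k=500.0, b=5.0):
    """Virtual spring-damper rendering the connection between a
    hand-held manipulandum and its attachment point on the beam.

    k: spring stiffness (N/m), b: damping (N*s/m) -- illustrative values.
    Returns the 2-D force applied to the beam attachment point; the
    manipulandum renders the equal and opposite reaction force to the hand.
    """
    return k * (x_handle - x_beam) + b * (v_handle - v_beam)

# Hypothetical sample: handle 1 cm ahead of the attachment point, both at rest
f = coupling_force(np.array([0.01, 0.0]), np.zeros(2),
                   np.zeros(2), np.zeros(2))
print(f)  # [5. 0.] -> 5 N pulling the beam toward the handle
```

A coupling of this form transmits each partner's motion to the shared object while also feeding the object's state back to the hand, which is what makes haptic negotiation between the two hands possible.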
... Perhaps engaging similar sensorimotor mechanisms as light touch, haptic communication has been used to explain the benefits of human-human hand interactions for performing constrained physical tasks, but mostly during upper-limb manipulation. Human dyads have been shown to perform better than the worse partner in reaching (Reed et al., 2006; Reed and Peshkin, 2008), object manipulation (van der Wel et al., 2011; Mojtahedi et al., 2017; Jensen et al., 2021), target tracking (Wegner and Zeaman, 1956; Glynn and Henning, 2000; Feth et al., 2009; Ganesh et al., 2014; Melendez-Calderon et al., 2015), and force tracking (Masumoto and Inui, 2013). In studies that measured each person's force or torque contribution independently through coupled robotic devices, human partners have been shown to exert opposing forces or torques that cancel each other (Reed et al., 2006; van der Wel et al., 2011; Madan et al., 2015; Melendez-Calderon et al., 2015; Mojtahedi et al., 2017). ...
Article
Full-text available
Principles from human-human physical interaction may be necessary to design more intuitive and seamless robotic devices to aid human movement. Previous studies have shown that light touch can aid balance and that haptic communication can improve performance of physical tasks, but the effects of touch between two humans on walking balance have not been previously characterized. This study examines physical interaction between two persons when one person aids another in performing a beam-walking task. 12 pairs of healthy young adults held a force sensor with one hand while one person walked on a narrow balance beam (2 cm wide × 3.7 m long) and the other person walked overground by their side. We compare balance performance during partnered vs. solo beam-walking to examine the effects of haptic interaction, and we compare hand interaction mechanics during partnered beam-walking vs. overground walking to examine how the interaction aided balance. While holding the hand of a partner, participants were able to walk further on the beam without falling, reduce lateral sway, and decrease angular momentum in the frontal plane. We measured small hand force magnitudes (mean of 2.2 N laterally and 3.4 N vertically) that created opposing torque components about the beam axis and calculated the interaction torque, the overlapping opposing torque that does not contribute to motion of the beam-walker's body. We found higher interaction torque magnitudes during partnered beam-walking vs. partnered overground walking, and correlation between interaction torque magnitude and reductions in lateral sway. To gain insight into feasible controller designs to emulate human-human physical interactions for aiding walking balance, we modeled the relationship between each torque component and motion of the beam-walker's body as a mass-spring-damper system. Our model results show opposite types of mechanical elements (active vs. passive) for the two torque components.
Our results demonstrate that hand interactions aid balance during partnered beam-walking by creating opposing torques that primarily serve haptic communication, and our model of the torques suggests control parameters for implementing human-human balance aid in human-robot interactions.
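The mass-spring-damper modeling described in the abstract can be sketched as a linear least-squares fit of a torque component against the walker's motion; the function name, signal names, and synthetic parameter values below are illustrative, not the study's data:

```python
import numpy as np

def fit_mass_spring_damper(torque, angle, ang_vel, ang_acc):
    """Fit tau ~= I*acc + b*vel + k*angle by linear least squares.

    Returns (I, b, k): inertia-, damper-, and spring-like coefficients
    relating a torque component to the walker's frontal-plane motion.
    """
    A = np.column_stack([ang_acc, ang_vel, angle])
    coeffs, *_ = np.linalg.lstsq(A, torque, rcond=None)
    return tuple(coeffs)

# Synthetic check: generate torque from known parameters and recover them
t = np.linspace(0, 10, 1000)
angle = 0.1 * np.sin(2 * np.pi * 0.5 * t)   # rad, sway-like oscillation
vel = np.gradient(angle, t)                  # rad/s
acc = np.gradient(vel, t)                    # rad/s^2
tau = 2.0 * acc + 0.5 * vel + 30.0 * angle   # N*m, known (I, b, k)
I, b, k = fit_mass_spring_damper(tau, angle, vel, acc)
print(I, b, k)  # should recover approximately 2.0, 0.5, 30.0
```

The signs of the fitted coefficients are what distinguish passive, spring-like elements from active, energy-injecting ones, which is the distinction the abstract draws between the two torque components.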
... [1], [5], [6], [9], [10]. Haptic communication involves exchange of force or velocity signals between the partners who are simultaneously engaged in perception (including haptic perception) and motor action. ...
... Through the haptic channel, the members can coordinate contributions, communicate intentions, negotiate roles, or adapt behaviors. Thus, haptic communication is thought to facilitate the development of a cooperative strategy and a shared action plan that is not available to dyad members when they work alone [1], [5]. ...
... Whether the communication pathway that we have modeled actually underlies the performance benefit enjoyed by dyads will have to be empirically validated. A comparison of dyad performance across conditions with and without haptic feedback requires an apparatus capable of rendering the force transmitted across a linkage connecting the two object halves, such as that developed for the experiments presented in [5].
Fig. 7: The forces F1 and F2 applied by each member of a dyad to track a unit step reference: (A) with haptic feedback and (B) without haptic feedback. The resulting internal force is shown in (C). ...