Conference Paper

Touchless Haptic Feedback for Supernatural VR Experiences

Authors:
Jonatan Martinez, Daniel Griffiths, Valerio Biscione*, Orestis Georgiou, and Tom Carter
Ultrahaptics Ltd., Bristol, United Kingdom
*Valerio.Biscione@ultrahaptics.com
ABSTRACT
Haptics is an important part of the VR space as seen by the plethora
of haptic controllers available today. By using a novel ultrasonic
haptic device, we developed and integrated mid-air haptic sensations
into a VR game experience without requiring the user to wear or hold
any equipment. The compelling experience combines visual,
audio and haptic stimulation in a supernatural narrative in which
the user takes on the role of a wizard apprentice. By using different
haptified patterns we could generate a wide range of sensations
which mimic supernatural interactions (wizard spells). We detail
our methodology and briefly discuss our findings and future work.
Keywords: haptics; HCI; VR; ultrasound; supernatural.
Index Terms: H.5.1 [Human computer interaction (HCI)]:
Interaction devices—Haptic devices; H.5.2 [Human computer
interaction (HCI)]: Interaction paradigms—Virtual reality
1 INTRODUCTION
To create fully immersive experiences in Virtual Reality (VR),
all five senses (sight, sound, touch, smell, and taste) must perceive the
digital environment. These senses need not always accurately
reproduce reality since our imagination can often fill in the gaps.
Moreover, many sensory modalities can be augmented,
supplemented, or even substituted. With great advancements in
current VR technologies, it is mostly the visual and auditory
modalities that have been adequately addressed to date. Haptics,
one of the most intimate of senses, has to date mostly been realized
in hand-held controllers that vibrate, or actuate within a limited set
of degrees of freedom. Possible peripheral alternatives include
specially designed controllers, haptic wearable gloves, electrical
muscle stimulation pads, and even full-body exoskeleton suits.
Haptic feedback can significantly increase the performance of
VR applications [3]. In this extended abstract, we present and
demonstrate a touchless (i.e., non-wearable) haptic platform and its
effective use in rendering supernatural haptic feedback in VR. The
haptic device we have developed combines off-the-shelf hand
tracking solutions with the advanced manipulation of focused
ultrasound to remotely stimulate receptor structures in various parts
of the hand. This novel haptic technology has come a long way
since early haptic displays [1] and can now be actuated within an
expanded functional space so as to adequately produce extraordinary
haptic effects that can be leveraged in VR applications.
The goal of our present research was not the accurate haptic
reproduction of complex 3D objects (e.g., a coffee cup) but rather
the haptification of abstract forces like the shooting of lightning
bolts from your hands (e.g., like a wizard), thereby enhancing the
overall VR experience. Since such effects are attributed to forces
beyond scientific understanding or the laws of nature (and hence
are supernatural) they present great challenges to developers with
regard to rendering them using existing haptic devices. Moreover,
since these haptic effects are not something we can relate to direct
past experiences we have to supplement our active ultrasonically
induced haptics with those of audio-visual pseudo-haptics [2].
Our results suggest a significant amplification and heightened realism
of the desired haptic sensations; namely, that we have adequately
rendered the haptic perception of what one might expect magical spells
of lightning, fire, and wind to feel like. That is, when presented with
a sample of haptic data points, ordered in both space and time relative
to the hand and fused together with other audio-visual haptic cues, the
brain is sufficiently convinced to accept that this is indeed a
supernatural experience. Crucial to this illusory effect is that our
haptic engine and actuator device have a large enough functional space,
and can rapidly explore and display it.
Here, we will limit the discussion to the demonstrator’s physical
and haptic sensation design, and defer further in-depth technical
and user studies for future work. Our main contribution is the
demonstration of the capabilities of state-of-the-art touchless mid-
air haptic feedback hardware and software, and a discussion on
best-practices for its efficient application in the rendering of
supernatural forces in VR/AR games and experiences.
2 DEMONSTRATION SETUP
The demonstration is set on a tabletop and consists of the HTC Vive
VR headset with headphones, a single Lighthouse and a gaming
laptop, an Ultrahaptics Evaluation Kit (UHEV1), a LEAP Motion
controller, and an adjustable pedestal stand tilted at an angle of 45
degrees (see Figure 1). The LEAP Motion and UHEV1 are pre-
calibrated (with a fixed, known position relative to one another). To
align the graphics to the UHEV1 and the LEAP, we place the Vive
controller on a known position on the UHEV1. When the trigger on
the Vive controller is pressed, the camera is moved so that the
UHEV1 is aligned with a known point in the virtual scene (the stand
where the magic orbs appear).
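A minimal sketch of this alignment step (illustrative Python, not the demo's actual Unity code; the function name and the anchor coordinates are hypothetical): when the trigger fires, the camera rig is shifted by the offset between the virtual anchor and the tracked controller position, so the physical device and its virtual counterpart line up.

```python
# Known scene position of the virtual anchor (the stand where the magic
# orbs appear). The coordinate values are hypothetical, for illustration.
VIRTUAL_ANCHOR = (0.0, 0.9, 0.5)

def align_camera_rig(rig_position, controller_position, anchor=VIRTUAL_ANCHOR):
    """Return a new camera-rig position such that the physical UHEV1
    (marked by the Vive controller resting on a known spot on it)
    coincides with the virtual anchor point in the scene.

    Shifting the rig by (anchor - controller) moves the whole virtual
    world so the real device lines up with its virtual counterpart.
    """
    return tuple(r + (a - c)
                 for r, a, c in zip(rig_position, anchor, controller_position))
```

For example, a controller tracked at (0.1, 0.9, 0.4) with the rig at the origin would shift the rig to (-0.1, 0.0, 0.1).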
We have designed the setup, user interface, and gameplay of the
demonstrator so as to showcase the strengths of the components being
used in synergy. Namely, we made sure that the
position of the LEAP Motion controller camera has a direct and
clear field of view (FoV) of the user’s palms. Most of the game
interactions are also designed to be actioned with open palms. This
improves the hand tracking accuracy of the LEAP camera and also
improves the haptic sensations achieved by focused ultrasound
vibrations, since the Pacinian corpuscles are most dense there.
Finally, we opted for a seated game rather than a standing one, since
sitting allows the user to relax while performing her magical spells.
As a result, both the LEAP camera and the Ultrahaptics device have
been angled at 45 degrees (see Figure 1).

Figure 1: Photo of the demonstrator setup, which includes the UHEV1
device, the LEAP Motion controller, and a VR capture screen showing
the young wizard shooting lightning out of his bare hands.
3 GAME AND INTERACTION TECHNIQUES
The game is designed to mimic the physical setup described above, but
everything is graphically dressed with a dungeon-like theme, and the
user's avatar hands are rendered as an all-white skeleton [4].
The game narrative was organized around the concept that the
user is an apprentice wizard learning the lightning spell. The book
of spells, resting on a pedestal in front of her, contains visual
instructions, and a voiceover was added to make the experience
more compelling. Each spell can be absorbed through a “magic
spherical orb”, which the user is prompted to touch. The use of the
lightning spells (triggered by a close-open hand gesture)
“accidentally” results in the release of some flying insects (in this
case scarabs) which the user is prompted to squash by first learning
the fire spell (absorbing the fire orb) and then hitting, grasping, or
slapping the scarabs in mid-air with either of her fiery hands. This
in turn “accidentally” results in part of the desk catching fire. The
user is then instructed to extinguish the fire with the wind spell
while hovering her open hands over the fire. Once the fire is out
the user is informed that her session is over and the book of spells
magically closes. Screenshots are shown in Figure 2.
4 HAPTICS
A phased array composed of 16x16=256 ultrasonic transducers at
40 kHz was used to focus ultrasound at one or more specified
locations in three dimensions, within a region 90 degrees wide and up
to 70 cm from the device. The (x,y,z) coordinates for creating the focus point
(e.g., at the center of the user’s palm or fingertip) are obtained from
the LEAP Motion API. For a tactile sensation to be perceived, the
acoustic radiation force at a focus point is amplitude modulated at
200 Hz so as to create small localized skin displacements that are
perceived by the user as a strong haptic effect. The haptic refresh rate
of this device is an impressive 16 kHz, which is fast enough to allow
very rapid updates of the focus point coordinates and to smoothly
display complex curves and patterns on the users’ hands (see Figure
3). These complex curves are what we have used to emulate the
desired supernatural effects. The Ultrahaptics device comes with an
API written in C++ and C# with Unity bindings which has (x,y,z)
position coordinates as one of its main inputs needed to generate up
to 8 amplitude-modulated focus points, thereby making the real-
time haptification of hand gestures straightforward, as long as they
are i) detected by the LEAP Motion, and ii) within the focusing
region of the Ultrahaptics device.
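The update scheme above can be sketched as follows (illustrative Python, not the Ultrahaptics API; `focus_point_stream` and its parameters are hypothetical): focus coordinates are refreshed at the device rate, while a 200 Hz intensity envelope rides on top of the 40 kHz carrier.

```python
import math

DEVICE_RATE_HZ = 16_000   # focus-point refresh rate of the device
MODULATION_HZ = 200       # amplitude-modulation frequency felt as vibration

def focus_point_stream(path, duration_s):
    """Yield (x, y, z, intensity) samples at the device refresh rate.

    `path` maps time (seconds) to an (x, y, z) focus position, e.g. a
    point sweeping across the palm. The intensity envelope is a 200 Hz
    sinusoid in [0, 1]; the hardware applies it on top of the 40 kHz
    ultrasonic carrier.
    """
    n = int(duration_s * DEVICE_RATE_HZ)
    for i in range(n):
        t = i / DEVICE_RATE_HZ
        x, y, z = path(t)
        intensity = 0.5 * (1.0 + math.sin(2.0 * math.pi * MODULATION_HZ * t))
        yield (x, y, z, intensity)
```

With an 80:1 ratio between the refresh rate and the modulation frequency, the coordinates can move many times within one perceptible vibration cycle, which is what makes smooth curves and patterns possible.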
All user interactions described in the previous section have been
haptified using focused ultrasound. By moving the focus of the
ultrasound across the palm with a certain pattern, we could generate
different haptic sensations. These haptic sensations were designed
using just a single focus point printed on each hand (two focus
points in total), moving along a predefined path or skipping through
several points on the surface of the palm, thus generating a “haptic
pattern” of movement. To achieve this, we had to constantly update in
software the coordinate system in which the focus point was printed,
so as to form planes parallel to the user’s palms. Standard rotation and
translation matrices were used for this operation in order to create
a left and right palm coordinate system relative to the LEAP
motion’s frame of reference. The intensity of the haptic sensations
was kept fixed across all haptic effects.
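The palm-relative coordinate update can be sketched as follows (illustrative Python using standard rotation and translation matrices; the function and parameter names are hypothetical, and numpy is assumed): pattern points defined in a palm-local plane are mapped into the tracker's frame using the palm basis vectors available from hand-tracking data.

```python
import numpy as np

def palm_to_device(points_local, palm_origin, palm_x, palm_y, palm_normal):
    """Map 2D pattern points defined in the palm plane into the LEAP
    Motion frame of reference.

    `points_local` is an (N, 2) array-like of coordinates (metres) in
    the palm plane; `palm_x`, `palm_y`, `palm_normal` are orthonormal
    basis vectors of the palm frame, and `palm_origin` is the palm
    centre. A rotation matrix built from the basis vectors maps local
    points into the tracker frame, followed by a translation.
    """
    points_local = np.asarray(points_local, dtype=float)
    R = np.column_stack([palm_x, palm_y, palm_normal])   # 3x3 rotation
    pts3 = np.hstack([points_local, np.zeros((len(points_local), 1))])
    return pts3 @ R.T + np.asarray(palm_origin, dtype=float)
```

Re-running this per frame keeps the haptic pattern "glued" to the moving palm, with separate left- and right-hand frames.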
We designed several haptic patterns to match the following
interactions: hovering the hand over the orb and touching the orb
(these varied slightly for the lightning, fire, and wind orbs); the
lightning spell; the fire spell (when touching the scarabs to kill
them); and the wind spell. A schematic of the patterns for four of
these sensations is shown and described in Figure 3. The haptic
perception of these effects is enhanced by using different graphical
and auditory cues [2]. For example, for the fire spell (Figure 3c),
the movement of the spiral along the figure 8 path is synchronized
with a graphical representation of a small flame following the same
figure 8 path on the user’s palm, together with the typical crackling
and sizzling sound of flames, with stereo 3D sound coming from the
user's hands. As a further example, when absorbing the lightning
from the orb, sparks are seen to be generated at semi-random
positions on the avatar’s palm. The haptic sensation is enhanced by
also using a synchronized sound effect of sparkling electricity.
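As a minimal sketch of such a layered pattern, the fire spell's spiral-on-a-figure-8 path can be sampled as below (illustrative Python; the frequencies and radii are hypothetical, not the demo's tuned values): a slow figure-of-8 centre motion is composed with a faster circular motion around it.

```python
import math

def fire_spell_point(t, loop_hz=1.0, spin_hz=8.0,
                     path_radius=0.03, spiral_radius=0.008):
    """Sample the fire-spell pattern at time t (seconds): a small
    circular "spiral" whose centre traces a figure-of-8 in the palm
    plane. Radii are in metres.
    """
    a = 2.0 * math.pi * loop_hz * t
    # Figure-of-8 centre: a Lissajous curve with a 1:2 frequency ratio.
    cx = path_radius * math.sin(a)
    cy = 0.5 * path_radius * math.sin(2.0 * a)
    # Faster circular component around the moving centre.
    b = 2.0 * math.pi * spin_hz * t
    return (cx + spiral_radius * math.cos(b),
            cy + spiral_radius * math.sin(b))
```

Sampling this at the device refresh rate and transforming each point into the palm frame yields the composed "path within a path" motion used for the supernatural effects.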
5 CONCLUSION
In this extended abstract, we have shown and described the
world’s first demonstration of mid-air haptics within a supernatural
narrative in VR. We have shown how a varied set of sensations can
be evoked by changing the movement pattern of an ultrasonic focus
point, thus creating a compelling experience. To that end, we have
learned that a compelling haptic experience can be obtained by
combining different levels of moving paths, e.g., a haptic line
moving anti-clockwise, whose center is also following a path
around the palm. Such complex ultrasonic manipulations have only
recently been unlocked. As a future direction for this work, we
therefore propose that an even greater range of sensations can be
obtained by combining complex paths from multiple focus points.
REFERENCES
[1] Takayuki Iwamoto et al., "Non-contact method for producing tactile
sensation using airborne ultrasound," Haptics: Perception, Devices
and Scenarios (2008): 504-513.
[2] Anatole Lecuyer, "Simulating haptic feedback using vision: a survey
of research and applications of pseudo-haptic feedback," Presence:
Teleoperators and Virtual Environments 18.1 (2009): 39-53.
[3] Robert Stone, "Haptic feedback: a brief history from telepresence
to virtual reality," Haptic Human-Computer Interaction (2001): 1-16.
[4] https://www.youtube.com/watch?v=GqgFRPs0wxA
Figure 3: Four examples of the haptic patterns used in the demo. All
the sensations were projected on the surface of the palm. a) touching
the magic orb: skipping through multiple random haptic points; b)
lightning spell: a haptic point moving from the wrist to the index
fingertip; c) fire spell: a spiral following the path of a figure of 8;
d) wind spell: a rotating line follows a rotating path.
Figure 2: Screenshots during the orb activation and lightning spell.