Article

Force shading for haptic shape perception

... First, performance of the hardware can be assessed using human subject testing, usually via methods that assess human performance of tasks in the haptic virtual environment, or that measure the individual's perception of the qualities of the virtual environment. To this end, researchers have studied the effects of software on the haptic perception of virtual environments (Rosenberg & Adelstein, 1993; Millman & Colgate, 1995; Morgenbesser & Srinivasan, 1996). Morgenbesser and Srinivasan (1996), for example, looked at the effects of force shading algorithms on the perception of shapes. It is also common to compare performance of tasks in a simulated environment with a particular device to performance in an equivalent real-world environment (Buttolo et al., 1995; West & Cutkosky, 1997; Richard et al., 1999; Shimojo et al., 1999; Unger et al., 2001). ...
Chapter
Full-text available
The word "haptic" refers to the sense of touch. This sense is essentially twofold, including both cutaneous touch and kinesthetic touch. Cutaneous touch refers to the sensation of surface features and tactile perception and is usually conveyed through the skin. Kinesthetic touch sensations, which arise within the muscles and tendons, allow one to interpret where one's limbs are in space and in relation to him or her. The sense of touch is one of the most informative senses that humans possess. Mechanical interaction with a given environment is vital when a sense of presence is desired, or when a user wishes to manipulate objects within a remote or virtual environment with manual dexterity. The haptic display, or force-reflecting interface, is the robotic device that allows the user to interact with a virtual environment or teleoperated remote system. The haptic interface consists of a real-time display of a virtual or remote environment and a manipulator, which serves as the interface between the human operator and the simulation. The application of haptic interfaces in areas such as computer-aided design (CAD) and manufacturing (CAD/CAM), design prototyping, and allowing users to manipulate virtual objects before manufacturing them enhances production evaluation. Along the same lines, the users of simulators for training in surgical procedures, control panel operations, and hostile work environments benefit from such a capability. Haptic interfaces can also be employed to provide force feedback during execution of remote tasks such as telesurgery or hazardous waste removal. With such a wide range of applications, the benefits of haptic feedback are easily recognizable.
... In addition to the interaction paradigms described above, additional techniques have been proposed for displaying surface properties such as shape, friction, and texture of virtual objects. Morgenbesser and Srinivasan (1996) have proposed force-shading methods to smooth the feel of polyhedral objects by eliminating the force discontinuities at polygonal boundaries. Salcudean and Vlaar (1994), Salisbury et al. (1995), Chen et al. (1997) and Green and Salisbury (1997) have proposed techniques to simulate friction on smooth surfaces. ...
... Minsky (1995) presented a method to simulate 2-D haptic textures by perturbing the direction of reaction force. In contrast to the simulation of surface properties that add roughness to a smooth surface, the force-shading technique (Morgenbesser & Srinivasan, 1996;Basdogan et al., 1997;Ruspini et al., 1997) eliminates the surface roughness. When the objects are represented as polyhedrons, the surface normal and the force magnitude are usually discontinuous at the edges. ...
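The force-shading idea quoted above lends itself to a compact sketch. The Python fragment below is illustrative only; the function names, stiffness value, and penalty-based contact model are assumptions, not the formulation of Morgenbesser and Srinivasan. It redirects a penalty force along a barycentrically interpolated vertex normal, the haptic analogue of Phong shading:

```python
# Sketch: force shading on a triangle mesh (assumed data layout, not the
# cited papers' exact algorithm).
import numpy as np

def shaded_force(contact, tri_verts, vertex_normals, penetration, k=0.5):
    """Penalty force redirected along the barycentrically interpolated normal."""
    a, b, c = tri_verts
    v0, v1, v2 = b - a, c - a, contact - a
    d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
    d20, d21 = v2 @ v0, v2 @ v1
    denom = d00 * d11 - d01 * d01
    w1 = (d11 * d20 - d01 * d21) / denom      # barycentric weight of vertex b
    w2 = (d00 * d21 - d01 * d20) / denom      # barycentric weight of vertex c
    w0 = 1.0 - w1 - w2                        # barycentric weight of vertex a
    n = w0 * vertex_normals[0] + w1 * vertex_normals[1] + w2 * vertex_normals[2]
    n /= np.linalg.norm(n)                    # "shaded" surface normal
    return k * penetration * n                # magnitude from penetration depth
```

Because only the force direction changes while the contact stays on the flat triangle, the perceived curvature is purely illusory, which is exactly the effect the excerpts describe.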
Article
Full-text available
Computer haptics, an emerging field of research that is analogous to computer graphics, is concerned with the generation and rendering of haptic virtual objects. In this paper, we propose an efficient haptic rendering method for displaying the feel of 3-D polyhedral objects in virtual environments (VEs). Using this method and a haptic interface device, users can manually explore and feel the shape and surface details of virtual objects. The main component of our rendering method is the "neighborhood watch" algorithm that takes advantage of precomputed connectivity information for detecting collisions between the end effector of a force-reflecting robot and polyhedral objects in VEs. We use a hierarchical database, multithreading techniques, and efficient search procedures to reduce the computational time such that the haptic servo rate after the first contact is essentially independent of the number of polygons that represent the object. We also propose efficient methods for displaying surface properties of objects such as haptic texture and friction. Our haptic-texturing techniques and friction model can add surface details onto convex or concave 3-D polygonal surfaces. These haptic-rendering techniques can be extended to display dynamics of rigid and deformable objects.
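The connectivity-exploiting search described in this abstract can be loosely illustrated as a greedy walk over triangle adjacency; the data structures (adjacency lists, centroids) and the walk itself are assumptions for illustration, not the authors' published algorithm:

```python
# Sketch: local search over precomputed adjacency so per-frame cost stays
# independent of the total polygon count after first contact.
import numpy as np

def closest_triangle_local(point, last_tri, neighbors, centroids):
    """Greedy walk from the last contacted triangle toward the nearest one."""
    current = last_tri
    while True:
        best = current
        best_d = np.linalg.norm(point - centroids[current])
        for t in neighbors[current]:              # test only the local neighborhood
            d = np.linalg.norm(point - centroids[t])
            if d < best_d:
                best, best_d = t, d
        if best == current:                       # local minimum reached
            return current
        current = best
```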
... If the operational point is not touching the feature, yet it is below the surface height, the force displayed corresponds to that penetration distance. In addition, the piecewise-linear surface created by interpolation between radii can be smoothed using force shading [29]. The simple method described above creates a ripple effect on the surface due to the discrete skeleton. ...
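A minimal sketch of the penetration-based force described in this excerpt, assuming a height profile piecewise-linearly interpolated between sampled radii; the sampling layout, gain, and function names are illustrative:

```python
# Sketch: force magnitude from penetration below an interpolated height profile.
import numpy as np

def penetration_force(s, z, s_samples, radii, k=0.8):
    """Force magnitude for a point at arc position s and height z."""
    surface_z = np.interp(s, s_samples, radii)   # piecewise-linear surface height
    depth = surface_z - z                        # how far below the surface we are
    return k * depth if depth > 0.0 else 0.0     # displayed along the local normal
```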
Chapter
Full-text available
This work develops a methodology for building haptic reality-based modeling systems by exploiting the complementary goals of haptic exploration and display. While the generally unstructured nature of haptic exploration makes it difficult to develop and control autonomous robotic fingers, simultaneous consideration of exploration and display provides a way to characterize the required sensing, control, finger geometry, and haptic modeling components. Both the exploring robot and haptic display must complement an appropriate virtual model, which can accurately recreate remote or hazardous environments. We present an example of haptic exploration, modeling and display for surface features explored using a three-degree-of-freedom spherical robotic fingertip with a tactile sensor.
... Brain neurons exhibit a tremendous complexity that severely hinders the ability of scientists to understand their connections and mechanisms. Each human brain is estimated to contain roughly 10^11 neurons, and this complexity grows by orders of magnitude for neural dendrites (Figure 1). Therefore, visualization and interpretation of even a small portion of the brain becomes a daunting task. ...
... Early proxy-based methods suffered discontinuities at convex regions of the constraint surface, because the proxy would suddenly jump to distant locations. Morgenbesser et al. [11] introduced force shading techniques to smooth forces on piecewise linear constraint surfaces. Later, Ruspini et al. [14] improved force shading to deal with cases where the proxy is in contact with multiple intersecting shaded planes at the same time. ...
Conference Paper
Full-text available
Neural connections in the brain are arranged in dense, complex networks of filiform (i.e. thread-like) structures. Navigation along complex filiform networks entails problems including path following, path selection at complex branching nodes, and the combination of large-scale with fine-scale exploration. This work proposes haptic interaction techniques to aid in the navigation along such complex filiform structures. It introduces a constraint-based path following approach with smooth node transition and intuitive branch selection during fine-scale exploration, as well as user-friendly camera control and visual aids for large-scale exploration. We demonstrate the successful application of these techniques to the analysis and understanding of dense neural networks.
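The constraint-based path following described above can be illustrated with a simple polyline proxy and a spring pulling the probe toward it; branch selection and smooth node transitions are omitted, and all names and gains here are assumptions:

```python
# Sketch: probe constrained to a filiform segment via a closest-point proxy.
import numpy as np

def polyline_proxy(probe, pts):
    """Closest point to `probe` on the polyline through `pts`."""
    best, best_d = pts[0], np.inf
    for a, b in zip(pts[:-1], pts[1:]):
        ab = b - a
        t = np.clip((probe - a) @ ab / (ab @ ab), 0.0, 1.0)
        q = a + t * ab                        # closest point on this segment
        d = np.linalg.norm(probe - q)
        if d < best_d:
            best, best_d = q, d
    return best

def guidance_force(probe, pts, k=0.4):
    return k * (polyline_proxy(probe, pts) - probe)   # spring toward the path
```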
... Moreover, the studies on human perception conducted in virtual environments have demonstrated that manipulation of force cues can also significantly alter our perception of object size, shape, and surface. For example, the direction of the force vector reflected through a haptic device was altered in real-time to generate illusory bumps or troughs on an otherwise flat surface [30], [31]. Using this concept, when force cues of a hole (bump) were combined with the geometric cues of a bump (hole), it has been shown that humans perceive a hole (bump) [32], [33]. ...
Article
Full-text available
In virtual/augmented/mixed reality (VR/AR/MR) applications, rendering soft virtual objects using a hand-held haptic device is challenging due to the anatomical restrictions of the hand and the ungrounded nature of the design, which affect the selection of actuators and sensors and hence limit the resolution and range of forces displayed by the device. We developed a cable-driven haptic device for rendering the net forces involved in grasping and squeezing 3D virtual compliant (soft) objects being held between the index finger and thumb only. Using the proposed device, we investigate the perception of soft objects in virtual environments. We show that the range of object stiffness that can be effectively conveyed to a user in virtual environments (VEs) can be significantly expanded by controlling the relationship between the visual and haptic cues. We propose that a single variable, named Apparent Stiffness Difference, can predict the pattern of human stiffness perception under manipulated conflict, which can be used for rendering a range of soft objects in VEs larger than what is achievable by a haptic device alone due to its physical limits.
... According to Salisbury, "haptic rendering is the process of computing and generating forces in response to user interactions with virtual objects" [20]. Early haptic rendering research investigated models for user-object interaction through points of action, and in this context, god-object methods [21] as well as shape and texture representation methods [22,23] have been proposed. Subsequently, models that approximate surface contact with the skin were examined, driven by the demand for realistic grasping operations with the finger or hand. ...
Article
Full-text available
The tactile information to be presented to the user during interaction with a virtual object is calculated by simulating the contact between the object model and user model. The flexible skin tissue of human hands causes a distributed force on the contact area and results in deformation of the skin tissue. This is the target contact state that should be presented by the device. However, most multipoint haptic displays do not have sufficient degrees of freedom to represent the target contact state. In this paper, the concept of deformation matching is proposed and formulated, whereby the output force is calculated to minimize the error between the target skin deformation and skin deformation owing to the device output force. For comparison, the conventional concept of force matching is also formulated. The difference in human perception between these two methods in the expression of friction was investigated through experiments using subjects, in which a pin-array tactile display capable of stimulating 128 points was used. It was demonstrated that the perception of the friction coefficient was more sensitive and the perception of the friction direction was more accurate in the deformation matching than in the force matching.
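The deformation-matching concept in this abstract can be sketched as a least-squares problem: choose pin forces that minimize the error between the target skin deformation and the deformation predicted by a compliance model. The linear compliance matrix and the non-negativity clipping below are modeling assumptions, not the paper's exact formulation:

```python
# Sketch: deformation matching for a pin-array display via least squares.
import numpy as np

def deformation_matching(C, d_target):
    """Pin forces f minimizing ||C f - d_target||, clipped so pins only push."""
    f, *_ = np.linalg.lstsq(C, d_target, rcond=None)
    return np.clip(f, 0.0, None)

# Usage with an assumed 128-pin display and a toy linear compliance model.
rng = np.random.default_rng(0)
C = rng.random((128, 128)) * 1e-3 + np.eye(128) * 1e-2   # skin compliance (assumed)
d_target = rng.random(128) * 1e-4                        # target deformation field
forces = deformation_matching(C, d_target)
```

Force matching, by contrast, would fit the device output directly to a target force field rather than to the deformation it produces.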
... Research in human-machine interfaces suggests that LFFs can be used for haptic rendering of control buttons [5]. Also, the modulation of the direction of a force has been reported to affect the perception of curvature [2]. ...
Conference Paper
Full-text available
Lateral force fields (LFFs) have been used before to generate haptic textures [3]. We propose that LFFs can be used to study haptic shape perception. We present preliminary results of an experiment in which human subjects interact with realistic LFFs. The LFFs encode shape information in the magnitude of unidimensional force vectors. Subjects explore the LFFs and classify them into haptic categories. We found that subjects can consistently perform this classification. This and subjects’ qualitative judgments of the stimuli suggest that haptic interaction with LFFs resembles the experience of touching a real 3D object.
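An LFF of the kind described can be derived from a shape function z = S(x, y) by making the lateral force proportional to the negative slope of S, with no vertical component. A minimal sketch, with an assumed bump shape and gain:

```python
# Sketch: lateral force field from the gradient of a shape function S(x, y).
import numpy as np

def lateral_force(x, y, k=0.6, eps=1e-4):
    S = lambda x, y: 0.01 * np.exp(-(x**2 + y**2) / 0.002)   # a small bump (assumed)
    dSdx = (S(x + eps, y) - S(x - eps, y)) / (2 * eps)       # numerical slope in x
    dSdy = (S(x, y + eps) - S(x, y - eps)) / (2 * eps)       # numerical slope in y
    return np.array([-k * dSdx, -k * dSdy, 0.0])             # lateral only: Fz = 0
```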
... [25] Thompson (1997). [26] Avila and Sobierajski (1996). [27] Barbagli, Devengenzo and Salisbury (2003). ...
Research
Full-text available
This project aims at the design of a haptic device called a pantograph. The device is designed as a planar mechanism with 2 degrees of freedom. Its basic feature is a haptic rendering system, which allows users to receive force feedback and understand the shape of virtual objects using virtual reality. Haptic devices are widely used in many different areas such as aviation, medicine, entertainment, and education, and the device designed in this project will be used for educational purposes together with simulation systems. The device was designed on the basis of five-bar planar mechanisms and was programmed with the help of Matlab and Arduino. It is designed to be portable, configurable, easy to program, and low cost. The device must be able to operate in a working space of 100 mm × 100 mm and apply an isotropic force of at least 2.5 N to the user. In order to obtain sufficient information about the project and the technology used, a literature survey was conducted. Based on the results of this research, a mechanical design of the device was established and drawings were made using the SolidWorks program. Then, position and force analyses were performed to ensure that the device meets the optimum working requirements. Following the completion of these stages, the production phase began. Some of the parts to be used in the device were manufactured using a 3D printer, while others were purchased. After assembly of the parts, the programming phase started and, with the completion of this phase, the device was made ready for use.

In this process, we would like to express our appreciation to our supervisor, Associate Professor Yiğit TAŞÇIOĞLU, for his guidance and encouragement throughout our senior design project. He has been there to listen to our problems and gave his valuable support and advice every time we reached a stalemate. His guidance has assisted us in all stages of our project, including design, production, and testing. We also thank all academic members of the Mechanical Engineering Department for their endless guidance, support, and professional attitude towards us throughout our undergraduate program. It is also worth emphasizing the invaluable contribution of Assistant Eren ERGÜL and his understanding attitude towards us. Finally, we wish to express gratitude to our families for their precious support.
... Strategies analogous to Gouraud or Phong shading, used for interpolating normals for lighting, can be developed for haptic rendering. The paper by Morgenbesser and Srinivasan (1996) was the first to demonstrate the use of force shading for haptic rendering. Using a technique similar to Phong shading, Salisbury et al. (1995) found that a smooth model could be perceived from a coarse three-dimensional model. ...
Chapter
Full-text available
This chapter will introduce the fundamental metaphor for haptic interaction: single-point contact or three-degree-of-freedom haptic rendering. Due to its fundamental aspect, we first review some of the introductory concepts about human-computer interaction through a haptic display.
... The model can be developed from basic everyday principles, or parameterized to represent only certain desired aspects [31]. The characteristics of human touch allow, in some cases, the use of simplified physical models to generate virtual objects that offer tactile stimuli competing in realism with real objects, as was demonstrated in [32]. ...
Thesis
Full-text available
The success of many teleoperation tasks depends heavily on the skills of the operator and his ability to perceive the work environment. Visual feedback, in many cases, is not sufficient e.g. when the image quality of the work environment is low, occlusions occur in the display, or when the task involves contact forces associated with visually unnoticeable small clearances. To compensate for these shortcomings, haptic devices emerge as an alternative to visual feedback, in which touch interaction with the user produces force-feedback. This thesis presents the development and modeling of a haptic interface system of five degrees of freedom for the teleoperation of robot manipulators, focusing on those that work in hazardous or hostile environments for humans. The interface is developed from the coupling of two commercial haptic devices “Novint Falcon”, with three degrees of freedom each. The system resulting from the coupled devices is modeled as a parallel manipulator capable of providing the operator with 3D force feedback (in three dimensions) and torque feedback in two directions. To demonstrate the effectiveness of the developed haptic system, a virtual environment is implemented with the aid of computer graphics techniques and libraries such as OpenGL, ODE and Chai3D. The kinematic and dynamic models of a serial manipulator Schilling Titan IV, with six degrees of freedom, are implemented in the virtual environment, including its interaction with virtual objects for the evaluation of typical teleoperation tasks. Nonlinear controllers are implemented in the virtual serial manipulator, including computed torque and sliding mode control.
... 1. Finding a set of active constraints. 2. Starting from its old position, the algorithm identifies the new position as the point on the active constraints that is closest to the current avatar position. Force shading: Morgenbesser and Srinivasan [42] introduced the force shading concept, which refers to a controlled variation in the direction of the force vector displayed by the haptic renderer for the purpose of creating the illusion of a non-flat shape on a nominally flat surface. This makes it possible to create the illusion of a smoothly curved shape that is actually represented as a mesh of polygons. When virtual objects are represented with polygonal meshes, god-object or proxy approaches can be used for finding the contact point and the force value when a collision occurs. ...
... The next two sections of this chapter introduce psychophysical analysis methods for measuring human sensory limits and explain how they can be used to achieve the objectives of this research. The last section briefly explains how analysis of variance can be used to analyze the significance of the relationship between the control variables and the results. (Footnote: The orientation of the reaction force can be modified by force shading techniques to remove the discontinuities between discrete haptic surfaces [27].) ...
... where L is the cost function to be minimized, l_1, l_2, l_3 are Lagrange multipliers, and (A_i, B_i, C_i, D_i) are the homogeneous coefficients of the equations of the constraint planes on which the proxy lies. The 'force shading' technique (the haptic equivalent of Phong shading) introduced by Morgenbesser and Srinivasan refined the above algorithm for rendering smooth objects [11]. Mesh-based haptic rendering is not amenable to object scaling, as the constraint equations for the planes (A_i, B_i, C_i, D_i) must be recomputed. ...
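For a single active constraint plane, the minimization referred to in these excerpts has a closed-form solution: the new proxy is the orthogonal projection of the haptic interface point onto the plane. A sketch of that one-plane case (multiple intersecting planes, as handled in the literature cited above, are omitted):

```python
# Sketch: proxy update as projection onto one constraint plane Ax+By+Cz+D=0.
import numpy as np

def proxy_on_plane(p_hip, plane):
    A, B, C, D = plane
    n = np.array([A, B, C])
    return p_hip - (n @ p_hip + D) / (n @ n) * n   # orthogonal projection
```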
Conference Paper
Full-text available
In this work, we address the issue of virtual representation of objects of cultural heritage for haptic interaction. Our main focus is to provide a haptic access of artistic objects of any physical scale to the differently abled people. This is a low-cost system and, in conjunction with a stereoscopic visual display, gives a better immersive experience even to the sighted persons. To achieve this, we propose a simple multilevel, proxy-based hapto-visual rendering technique for point cloud data which includes the much desired scalability feature which enables the users to change the scale of the objects adaptively during the haptic interaction. For the proposed haptic rendering technique the proxy updation loop runs at a rate 100 times faster than the required haptic updation frequency of 1KHz. We observe that this functionality augments very well to the realism of the experience.
... The results, reported in terms of the maximum angular changes between adjacent polygons for rendering smooth objects, suggested that the new shading algorithm can significantly reduce a user's sensitivity to discontinuities in polygonal models. Compared to other shading schemes (e.g., [30]), the new shading algorithm can provide both tactile (contact location) and force shading. The thresholds obtained in [29] were incorporated into a 3D force and contact-location rendering platform and evaluated with a 3D object identification task [31]. ...
... where L is the cost function to be minimized, l_1, l_2, l_3 are Lagrange multipliers, and A, B, C, D are the homogeneous coefficients of the constraint plane equations. The 'force shading' technique (the haptic equivalent of Phong shading) introduced by Morgenbesser and Srinivasan refined the above algorithm for rendering smooth objects [20]. One common problem with the mesh-based representation arises when the object is not fully enclosed by the bounding planes and a small hole remains, into which the IHIP sinks during the rendering process. ...
... To provide the user with an impression of a smooth transition between triangles, several approaches have been presented. A low-complexity approach presented in [24], [25] is called "force shading". As with Gouraud shading in graphical rendering, we define a vector field that substitutes for the face normal at an arbitrary point on the triangle. ...
Conference Paper
Full-text available
Real-time cloth simulation involves many computational challenges, particularly in the context of haptic applications, where high frame rates are necessary for obtaining a satisfactory tactile experience. In this paper, we present a real-time cloth simulation system which offers a compromise between a realistic physically based simulation of fabrics and a haptic application with high requirements in terms of computation speed. We emphasize the architecture and algorithmic choices for obtaining the best compromise in the context of haptic applications. A first implementation using a haptic device demonstrates the features of the proposed system and leads to the development of new approaches for haptic rendering using the proposed approach.
... The characteristics of the human haptic system allow in some cases the use of simplified physical models to render haptic objects that compete in realism with actual physical objects (Flanagan and Lederman, 2001; Minsky, 1995; Morgenbesser and Srinivasan, 1996; Robles-De-La-Torre and Hayward, 2001). Another possibility is the recording of ground data and replaying it as a function of state variables and/or time (Okamura et al., 2000). ...
... Strategies analogous to Gouraud or Phong shading, used for interpolating normals for lighting, can be developed for haptic rendering. The paper by Morgenbesser and Srinivasan (1996) was the first to demonstrate the use of force shading for haptic rendering. Using a technique similar to Phong shading, Salisbury et al. (1995) found that a smooth model could be perceived from a coarse three-dimensional model. ...
... Virtual reality technologies are now becoming sufficiently developed that simulations of cultural or architectural sites, or virtual environments for design, especially using immersive displays, are successful in making users feel immersed in the environment. Haptic technology also allows for more direct manipulation, which could be specifically relevant to 3D shape conceptualization [7], [8]. Haptic feedback encompasses the modalities of force feedback, tactile feedback, and proprioceptive feedback [1]. ...
Article
Full-text available
A novel approach to the synthesis of and interaction with virtual environments is presented. In the following paper, the description of a novel multi-point VE-haptic system is given. The system has been developed with the intent of interacting with an interactive VE, based on the constructivist approach. In what follows, the main features of the VE and of the interface are given. A special focus has been given to the design philosophy: several performance optimizations in terms of user interaction and object manipulation, joined device workspace, isotropy, arm interference, continuous force, and VE and inter-arm positional coherence have been performed during the design.
... Strategies analogous to Gouraud or Phong shading, used for interpolating normals for lighting, can be developed for haptic rendering. The paper by Morgenbesser and Srinivasan (1996) was the first to demonstrate the use of force shading for haptic rendering. Using a technique similar to Phong shading, Salisbury et al. (1995) found that a smooth model could be perceived from a coarse three-dimensional model. ...
... These thresholds are quite poor considering that the average visual discrimination threshold for vector directions was only 3.25° [3]. The relatively poor discriminability of force direction found in [3] explains why techniques such as "force shading" (rendering force direction by averaging adjacent surface normals [20]) have succeeded in creating a smoother feel of polyhedral objects without introducing noticeable artifacts. ...
Article
The authors report an experiment in which twenty-five participants discriminated force vectors presented along five directions (up, left, right, diagonally up left, diagonally up right). The force vectors were presented with a three degree-of-freedom force-feedback device. A three-interval one-up three-down adaptive procedure was used. The five reference force-direction conditions were presented in randomly interleaved order. The results show an average force-direction discrimination threshold of 33° regardless of the reference-force direction. Position data recorded at a nominal sampling rate of 200 Hz revealed a 10.1 mm average displacement of the fingertip between the start and end positions in a trial. The average maximum deviation from the starting position within a trial was 21.3 mm. We conclude that the resolution with which people can discriminate force direction is not dependent on the direction of the force per se. These results are useful for designers of haptic virtual environments.
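For readers unfamiliar with the adaptive procedure named in this abstract, the following sketch shows the skeleton of a one-up three-down staircase, which converges near the 79.4%-correct point. The simulated observer and all parameter values are placeholders, not the authors' protocol (in particular, the three-interval presentation is not modeled):

```python
# Sketch: one-up three-down adaptive staircase with a placeholder observer.
import random

def subject_correct(delta):
    """Placeholder simulated observer: more likely correct at larger deltas."""
    return random.random() < min(0.99, 0.5 + delta / 80.0)

def run_staircase(start=40.0, step=2.0, n_reversals=8):
    delta, correct_run, last_dir, reversals = start, 0, None, []
    while len(reversals) < n_reversals:
        if subject_correct(delta):
            correct_run += 1
            if correct_run < 3:
                continue                    # no change until 3 correct in a row
            correct_run, direction = 0, -1  # 3 correct: make the task harder
        else:
            correct_run, direction = 0, +1  # 1 wrong: make the task easier
        if last_dir is not None and direction != last_dir:
            reversals.append(delta)         # direction change = reversal
        last_dir = direction
        delta = max(0.1, delta + direction * step)
    return sum(reversals[-6:]) / len(reversals[-6:])   # threshold estimate

print(run_staircase())
```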
... The answer is a qualified "yes." "Force shading", a haptic counterpart of bump mapping in computer graphics, presents a non-flat shape on a nominally flat surface by varying the force vector direction in haptic rendering [Morgenbesser andSrinivasan 1996][Robles-De-La-Torre andHayward 2001]. To our knowledge, such phenomena have been qualitatively measured only by 3D devices, and a quantitative comparison to 2D devices has not been made. ...
Conference Paper
Full-text available
Is a 2-dimensional (2D) force feedback device capable of presenting 3-dimensional (3D) shapes? The answer is a qualified "yes." "Force shading", a haptic counterpart of bump mapping in computer graphics, presents a non-flat shape on a nominally flat surface by varying the force vector direction in haptic rendering [Morgenbesser and Srinivasan 1996][Robles-De-La-Torre and Hayward 2001]. To our knowledge, such phenomena have been qualitatively measured only by 3D devices, and a quantitative comparison to 2D devices has not been made. We compare thresholds of human shape perception of the plane experimentally, using 2D and 3D force feedback devices.
... The lateral-force-based haptic illusion is well known (Minsky et al. [11], Robles-De-La-Torre and Hayward [12], Levesque and Hayward [13], Robles-De-La-Torre [14], Smith et al. [15]). Similar research has also been conducted on force shading [16,17]. By presenting the lateral force distribution shown in the upper part of Fig. 1, the device presents the bump illusion shown in the lower part of Fig. 1 to the user. ...
Conference Paper
Full-text available
In this paper we discuss the tactile information required for touchscreen interfaces, and propose a lateral-force-based 2.5-dimensional haptic display. The use of a lateral-force-based display enables virtual bump sensations to be presented. Recently, touchscreen interfaces have become popular worldwide. However, few touchscreens provide tactile feedback, and little such hardware has been commercialized. From our analysis, "press," "guide," and "sweep" movements convey essential information for touchscreens, but existing devices have operational disadvantages in presenting this information. We intend to cope with these disadvantages by introducing the new idea of a lateral-force-based display. We developed an experimental prototype of a lateral-force-based display, investigated 2.5-dimensional sensations, realized part of this essential information, and confirmed the need to control the displayed force according to spatial wavelength. Furthermore, we propose a method for measuring spatial wavelengths employing multi-resolution analysis based on the Haar wavelet transform.
... In the haptics literature there exist several approaches to rendering contact forces. In [17] an effective point-based rendering algorithm was first introduced, and it was subsequently improved by others, e.g., [18], [19]. Ruspini et al. [15], [16] extended the algorithm to support contacts of arbitrary shapes. In contrast to the well-known proxy method, a recent approach (see [14]) suggests computing the contact forces by solving the Signorini contact problem employing finite elements. ...
Conference Paper
Full-text available
The EU-funded RTD project "HAPTEX" addresses the challenge of developing a Virtual Reality (VR) system for the realistic and accurate rendering of the physical interactions of humans with textiles, through the real-time generation of artificial visual and haptic stimuli. This challenge concerns the development of both the SW and the HW components of the VR system, and it implies a substantial advancement in the understanding of the mechanisms underlying the human haptic perception of fine physical properties like those of textiles. This paper reports some important details relating to the technical implementation of the developed HW and SW components, with special emphasis on the issues related to their integration into a single VR system. Furthermore, some preliminary results relating to the functional tests carried out on the integrated system are also reported.
... The characteristics of the human haptic system allow in some cases the use of simplified physical models to render haptic objects that compete in realism with actual physical objects (Minsky, 1995; Morgenbesser & Srinivasan, 1996; Robles-De-La-Torre & Hayward, 2001; Flanagan & Lederman, 2001). Another possibility is the recording of ground data and replaying it as a function of state variables and/or time (Okamura et al., 2000). ...
Article
Full-text available
Haptic interfaces enable person-machine communication through touch, and most commonly, in response to user movements. We comment on a distinct property of haptic interfaces, that of providing for simultaneous information exchange between a user and a machine. We also comment on the fact that, like other kinds of displays, they can take advantage of both the strengths and the limitations of human perception. The paper then proceeds with a description of the components and the modus operandi of haptic interfaces, followed by a list of current and prospective applications and a discussion of a cross-section of current device designs.
... In the interpolating normal force shading mode, the surface normal vector used for calculating haptic force feedback instead interpolates three different normal vectors assigned to the vertices of the triangle (the interpolation is based on the current position of the surface contact point in Barycentric coordinates). This interpolating normal force shading results in a smooth transition of the force direction when the trajectory of the contact point passes across a triangle to an adjacent triangle, making sharp edges feel smoother to users [8]. The vertex normals are the same normals used to calculate the surface shading values that are interpolated for Gouraud shading. ...
Article
We investigated the effect of changing two haptic rendering characteristics, the magnitude of friction and presence of haptic normal force shading, in combination with the role of stereoscopic displays and shadows, on the 3D drawing task of tracing curves on 3D virtual surfaces. Eleven subjects used a PHANToM haptic arm to trace target curves on surfaces with a variety of systematically varied geometric characteristics. We found that a higher simulated friction force increased drawing accuracy without slowing down the drawing speed by a statistically significant amount. The use of smooth, interpolated haptic normal force shading, which was combined with visual smooth shading (Gouraud shading), improved speed slightly but did not significantly affect accuracy. Stereo displays improved both drawing speed and accuracy, whereas the presence or absence of the shadow of the drawing tool had no statistically significant effect on either measure. In addition to the effects of these controllable characteristics of the virtual reality interface, we found several significant interactions with the geometric characteristics.
... Research in human-machine interfaces suggests that LFFs can be used for haptic rendering of control buttons [5]. Also, the modulation of the direction of a force has been reported to affect the perception of curvature [2]. ...
Article
Full-text available
Lateral force fields (LFFs) have been used before to generate haptic textures [3]. We propose that LFFs can be used to study haptic shape perception. We present preliminary results of an experiment in which human subjects interact with realistic LFFs. The LFFs encode shape information in the magnitude of unidimensional force vectors. Subjects explore the LFFs and classify them into haptic categories. We found that subjects can consistently perform this classification. This and subjects' qualitative judgments of the stimuli suggest that haptic interaction with LFFs resembles the experience of touching a real 3D object.

INTRODUCTION: Margaret Minsky's work on lateral force fields (LFFs) indicates that when human subjects interact with such fields they experience them as haptic textures [3]. In general, an LFF associates forces with positions in a way that is not ordinarily found in haptic interaction with objects. For example, consider the association of the position (x, y, z) of a manipulandum with a force field defined in an x-y plane. Surely, a mechanical device can be constructed to produce such an association, but this requires using sliders, springs, etc. For the purposes of this paper, we will consider LFFs which consist of forces whose magnitudes are proportional to the slope of a function that depends on the (x, y) position of a haptic manipulandum. Minsky used LFFs to generate haptic textures, but we propose that the idea is more general and can be used to generate haptic shapes, too. An LFF can be specified from a function z = S(x, y) that describes the local shape of a 3D object. For example, the slope (the partial derivative of S with respect to x, y, or both) can be used to modulate the magnitude of force vectors with components in the x and y axes, but not in the z axis. Can haptic interaction with this LFF elicit the perception of touching a 3D object? It seems likely that some of Minsky's stimuli (for example, her virtual gratings) could be haptically perceived by human subjects not only as textured surfaces, but also as 3D shapes with small-sized features. The same may be true of some stimuli that Minsky labeled as "bumps", but she did not use them to study haptic shape perception. Research in human-machine interfaces suggests that LFFs can be used for haptic rendering of control buttons [5]. Also, the modulation of the direction of a force has been reported to affect the perception of curvature [2].
... (b) Force shading [1] is similar to Phong shading. The direction of force is altered to the smoothed normal at the nearest point. ...
Article
A piecewise linear polygonal model has only G⁰ continuity, so users can easily feel the edges when using a haptic device to touch a solid represented by coarse polygonal meshes. To produce an appealing haptic sensation for smooth solids, a large number of polygons are needed in conventional approaches. This, however, slows down computation and consumes much more memory. In this paper, we present a method to generate smooth feedback force in haptic interaction with coarse polygonal meshes. Our method calculates the interaction force based on Gregory patches, which are locally constructed from n-sided polygons and ensure G¹ continuity across patch boundaries. During real-time haptic interaction, the contact point is continuously tracked on the locally constructed Gregory patches, which thus generate smooth haptic forces to be rendered. Our method is validated on various models with comparison to conventional force rendering techniques.
... The results, reported in terms of the maximum angular changes between adjacent polygons for rendering smooth objects, suggested that the new shading algorithm can significantly reduce a user's sensitivity to discontinuities in polygonal models. Compared to other shading schemes (e.g., [28]), the new shading algorithm can provide both tactile (contact location) and force shading. The thresholds obtained in [27] were incorporated into a 3D force and contact-location rendering platform. ...
Conference Paper
Full-text available
The effect of contact location information on the perception of virtual edges was investigated by comparing human edge sharpness discrimination under the force-alone and force-plus-contact-location conditions. The virtual object consisted of the 2D profile of an edge with two adjoining surfaces. Edge sharpness JNDs for both conditions increased from about 2 to 7 mm as the edge radii increased from 2.5 to 20.0 mm, and no significant difference was found between the two conditions. A follow-up experiment with the contact-location alone condition resulted in higher (worse) edge sharpness discrimination thresholds, especially at higher edge radius values. Our results suggest that contact location cues alone are capable of conveying edge sharpness information, but that force cues dominate edge sharpness perception when both types of cues are available.
... It has been found that force-feedback information alone can elicit complex perceptions relating to haptic texture and shape [3] [4] [5] [6] [7]. Such perceptions can be considered as haptic perceptual illusions of texture and shape. ...
Conference Paper
Full-text available
Achieving realistic rendering of thin and spatially sharp objects (needles, for example) is an important open problem in computer haptics. Intrinsic mechanical properties of users, such as limb inertia, as well as mechanical and bandwidth limitations in haptic interfaces make this a very challenging problem. A successful rendering algorithm should also provide stable contact with a haptic virtual object. Here, perceptual illusions have been used to overcome some of these limitations to render objects with perceived sharp features. The feasibility of the approach was tested using a haptics-to-vision matching task. Results suggest that lateral-force-based illusory shapes can be used to render sharp objects, while also providing stable contact during virtual object exploration.
Article
A technique is presented for providing users with realistic haptic feedback as they interact with virtual rigid objects. The technique imposes no restrictions on the algorithms used to simulate the virtual environment. The force control loop is decoupled from the simulation by employing a local model of the interaction updated periodically by the virtual environment. This local model includes a proxy of the virtual object manipulated by the user and a geometric description of the motion constraints imposed on the proxy by neighboring objects. Two schemes are employed to coordinate the proxy and the haptic device. A planar haptic interaction system, including a rigid mass that can be translated and rotated in the presence of constraints, is used to evaluate the performance of the proposed technique.
Chapter
The term haptics originates from the 19th century, where it was used mainly in relation to psychophysics research. It is derived from the Greek word haptikos, which means “able to touch/grasp”. Today it is used to describe all tactile (related to skin deformation), kinesthetic (related to muscle forces) and proprioceptive (related to joint positions) sensations in the body. An important aspect to note about haptics is that it involves both a passive receptive and an active explorative component, thus, requiring bi-directional input and output.
Article
This work was motivated by the need for perceptualizing nano-scale scientific data, e.g., those acquired by a scanning probe microscope, where collocated topography and stiffness distribution of a surface can be measured. Previous research showed that when the topography of a surface with spatially varying stiffness is rendered using the conventional penalty-based haptic rendering method, the topography perceived by the user could be significantly distorted from its original model. In the worst case, a higher region with a smaller stiffness value can be perceived to be lower than a lower region with a larger stiffness value. This problem was explained by the theory of force constancy: the user tends to maintain an invariant contact force when s/he strokes the surface to perceive its topography. In this paper, we present a haptization algorithm that can render the shape of a mesh surface and its stiffness distribution with high perceptual accuracy. Our algorithm adaptively changes the surface topography on the basis of the force constancy theory to deliver adequate shape information to the user while preserving the stiffness perception. We also evaluated the performance of the proposed haptization algorithm in comparison to the constraint-based algorithm by examining relevant proximal stimuli and carrying out a user experiment. Results demonstrated that our algorithm could improve the perceptual accuracy of shape and reduce the exploration time, thereby leading to more accurate and efficient haptization.
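A loose sketch of the force-constancy reasoning in this abstract, under strong simplifying assumptions (1-D profile, linear stiffness, and a naive compensation relative to a reference stiffness); the published haptization algorithm is more elaborate than this:

```python
# Sketch: under force constancy, a user stroking with constant force F feels
# the contour z(x) = h(x) - F / k(x), so varying stiffness distorts height.
import numpy as np

def perceived_profile(h, k, F=1.0):
    return h - F / k                          # constant-force contour

def compensated_height(h, k, F=1.0, k_ref=1.0):
    return h + F / k - F / k_ref              # naive compensation (assumed)

x = np.linspace(0, 1, 5)
h = np.full_like(x, 2.0)                      # flat 2 mm surface
k = np.array([1.0, 1.0, 0.5, 1.0, 1.0])      # softer patch in the middle
print(perceived_profile(h, k))                # soft patch feels lower
print(perceived_profile(compensated_height(h, k), k))   # flat again
```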
Article
The edge effect is a problem that has to be tackled when performing haptic interaction with discontinuous primitives. In this paper, an innovative algorithm is designed to render a smooth haptic feedback force with a locally constructed C¹ continuous Gregory patch. The continuous Gregory patch is generated from an n-sided polygon, which is determined by a real-time contact region prediction method. The contact region prediction algorithm, derived from the dynamic model of the haptic device, is able to deal with the inconsistency between the local nearest point and the global nearest point when obtaining the potential contact region. The parametric patch is achieved in three steps employing boundary generation, height model interpolation, and Gregory patch construction. For better shape preservation, the patch construction algorithm respects the height model of the contact region. The generated patch is continuous on boundaries and can render continuous feedback force as the proxy point transits between different patches. Since the presented scheme needs fewer primitives than conventional methods, it consumes less memory and runs more efficiently. The experimental results have shown that smooth haptic force can be achieved with the proposed method. Meanwhile, the motion predictor also presents good performance in the validating experiment. [DOI: 10.1115/1.4007170]
Conference Paper
Full-text available
Fig. 1 (caption): Sectional image of the stimulus. Angle α defines the size of each tangent plane that approximates the cylindrical surface. Black dots denote cursor positions and dark gray arrows show feedback force directions for the Cylindrical (left) and Linear (right) conditions.
Fig. 3 (caption): Contour surface for a constant force. When the subject touches the stimulus shape with constant force F, the locus of the cursor follows the surface shown in solid lines.

Introduction: The effect of spatial resolution in haptic rendering [1] remains poorly understood. How much resolution is actually required? How sensitive are human beings? To answer such questions, we conducted psychophysical experiments to measure absolute thresholds of haptic smoothness in perceiving curved surfaces presented by two point-contact type force feedback devices with different spatial resolutions.

Experiments. Apparatus: The stimulus (Fig. 1) was a polygonal approximation of a cylindrical surface by tangent planes. It was displayed haptically using PHANToM 1.0 T-type (nominal positional resolution: 0.07 mm) and A-type (0.03 mm) devices (SensAble Technologies, Inc.). Subjects touched the stimulus and responded "yes" if they felt it to be a smooth cylindrical surface and "no" if not. The resolution angle α was varied in 1-degree increments, and threshold angles of surface smoothness were measured by the method of limits. To avoid the effect of haptic bump mapping, the direction of feedback force was force shaded [2]. Parameters: radius of stimulus {30, 60, 90, 120} mm; surface stiffness 0.5 N/mm; force shading {Cylindrical, Linear}. In the Cylindrical condition, the feedback force vector magnitude was the surface stiffness (0.5 N/mm) times the penetration depth of the cursor into the stimulus (mm), and its direction was that of an actual cylinder. In the Linear condition, the feedback force vector was a linear interpolation of the two vectors to the first and second tangent planes nearest to the cursor (Fig. 1). The force applied by subjects was also recorded at several points on the stimulus.

Results: Fig. 2 shows that the threshold angle is positively proportional to the curvature (1/radius) (R = 0.7-0.9). The analysis of covariance on the regression lines showed that (1) the inclinations do not differ statistically (p < 0.4), (2) the two lines for the Linear condition do not differ (p < 0.9), and (3) the line pairs {T-Cyl., A-Cyl.} and {T-Cyl., T-Linear} differ statistically at the 1% level. The magnitude of applied force was almost constant over the surface (about 0.7-1.0 N) but depended strongly on the individual subject; it is weakly positively proportional to the radius.

Discussion: Since the applied force has a constant magnitude, the shape that the subject actually touches should be a "contour surface" for force magnitude F (Fig. 3). Let d be the maximum difference between the contour surface and a cylindrical surface inscribed in it. Fig. 4 shows d calculated for the measured threshold angles and their force magnitudes. In conditions other than T-Cyl., d is independent of curvature and nearly equal to 0.1 mm. Interestingly, the distributions shown by SD seem to be "digitized" by the spatial resolution of the equipment.

Conclusion and Future Work: When the direction of feedback force is C¹ or more continuous, the perceived haptic smoothness of a cylindrical surface is determined by the height of the "bumps" on the surface, and its absolute threshold is about 0.1 mm. This value is useful for haptic display design and shape simplification. Our future work includes experiments on perceiving doubly curved surfaces and on human sensitivity in the time domain.
Conference Paper
This paper presents the development of a virtual sculpting system, with the goal of enabling the user to create a freeform model by carving a virtual workpiece with a virtual tool while providing haptic interface during the sculpting process. A virtual reality approach is taken to provide stereoscopic viewing and force feedback, thus making the process of model creation in the virtual environment easier and more intuitive. The development of this system involves integrating techniques and algorithms in geometric modeling, computer graphics, and haptic rendering. Multithreading is used in an attempt to address the different update rates required in the graphic and haptic displays.
Article
Full-text available
One of the major research challenges of this century is the understanding of the human brain. In this field, simulation-based research is gaining importance, and a large amount of money is being spent on huge international projects such as The Human Brain Project [1] and The Blue Brain [2]. The behavior of the brain and, therefore, the behavior of brain simulations depend to a large extent on the neural topology. Neural elements are organized in a connected, dense, complex network of thread-like (i.e., filiform) structures. The analysis of a computer-based simulation using just the visual modality is a highly complex task due to the complexity of the neural topology and the large amounts of multi-variable and multi-modal data generated by computer simulations. In this paper, we propose the use of haptic devices to aid in the navigation along these neural structures, helping neurobiologists in the analysis of neural network topologies. However, haptic navigation constrained to complex filiform networks entails problems when these structures have high-frequency features, noise, and/or complex branching nodes. We address these issues by presenting a new user-adapted search haptic method that uses the forces exerted by the users to infer their intentions. In addition, we propose a specific calibration technique to adapt the haptic navigation to the user's skills and to the data. We validate this approach through a perceptual study. Finally, we show the application of our method to the analysis of dense and complex filiform structures in the neurobiology context. Additionally, our technique could be applied to other problems such as electronic circuit and graph exploration.
Article
Full-text available
We propose a haptic rendering technology for touchable video. Our touchable video technique allows users to feel the sense of touch while probing directly on 2D objects in video scenes or manipulating 3D objects brought out from video scenes using haptic devices. In our technique, a server sends video and haptic data as well as the information of 3D model objects. The clients receive video and haptic data from the server and render the 3D models. A video scene is divided into small grid cells, and each cell has tactile information corresponding to a specific combination of four attributes: stiffness, damping, static friction, and dynamic friction. Users can feel the sense of touch when they directly touch cells of a scene using a haptic device. Users can also examine objects by touching or manipulating them after bringing out the corresponding 3D objects from the screen. The touchable video technique proposed in this paper lets users fully experience haptic-audio-visual effects directly on the video scenes of movies or home-shopping video content.
Conference Paper
Virtual coupling, a spring-damper system between the haptic probe and its virtual representation, the proxy, is one of the most common approaches for haptic rendering. We have extended the virtual coupling by updating the spring stiffness, sometimes used to simulate compliance of a material, depending on the direction between the proxy and the probe. This anisotropic variation of the stiffness is used in exploring inhomogeneities beneath the surface allowing detection of rigid structures even when they are obscured by another structure beneath the surface. In addition, we also compensate for the energy variation of the spring to maintain passivity and increase realism. User studies were performed to survey the success rate in the detection of obscured rigid bodies beneath the surface with the modified virtual coupling algorithm and the improvement of shape perception for sub-surface objects with the additional energy compensation term providing gradient information. We also discuss potential benefits of the proposed methods as basic extensions to well-known haptic rendering algorithms which are both simpler and yield improved performance over traditional deformation simulation techniques.
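A minimal sketch of a virtual coupling with the direction-dependent stiffness described above; the blend between lateral and normal stiffness, the gains, and the function names are illustrative assumptions, not the paper's formulation:

```python
# Sketch: spring-damper virtual coupling with anisotropic stiffness.
import numpy as np

def coupling_force(probe, proxy, probe_vel, b=0.01, k_lateral=0.3, k_normal=1.0,
                   surface_normal=np.array([0.0, 0.0, 1.0])):
    d = proxy - probe                             # probe-to-proxy offset
    if np.linalg.norm(d) < 1e-9:
        return -b * probe_vel
    u = d / np.linalg.norm(d)
    # Blend stiffness by how aligned the offset is with the surface normal.
    k = k_lateral + (k_normal - k_lateral) * abs(u @ surface_normal)
    return k * d - b * probe_vel                  # spring toward proxy plus damping
```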
Article
Full-text available
Haptic/force feedback devices are an important type of three-dimensional haptic display whose application fields include computer-aided design and surgical simulation. With such a device, three elements of feedback force (magnitude, direction, and displacement) can be controlled independently to give haptic illusions as well as natural haptic simulation of virtual objects. To date, however, shape perception factors have not been well studied. This paper presents experiments on the effect of displacement in plane shape perception with a point-contact type force feedback device. The subjects changed the height h of a smooth mountainous stimulus shape (width 2w and stiffness s) to determine parameters and their thresholds for the shape perceived as flat. The feedback force direction was fixed upward, the same as for a horizontal plane, to isolate the effect of displacement. A shape is felt to be flat (1) if its height is smaller than the absolute threshold value h_at = 0.034 · w + 0.022 (where s is 0.25-0.5 N/mm and w is 20-40 mm), or (2) if the force for its height h, calculated as h · s, is smaller than 0.18 N (where s is 0.25-0.5 N/mm and w is 5-20 mm).
Article
In pictures, every object is displayed in 2D space. Seeing the 2D image, people can perceptually reconstruct and understand information regarding the scene. To enable users to haptically interact with an object that appears in the image, the present study proposes a geometry-based haptic rendering method. More specifically, our approach is intended to estimate haptic information from the object’s structure contained in an image while preserving the two-dimensional visual information. Of the many types of objects that can be seen in everyday pictures, this paper mainly deals with polyhedron figures or objects composed of rectangular faces, some of which might be shown in a slanted configuration in the picture. To obtain the geometric layout of the object being viewed from the image plane, we first estimate homographic information that describes a mapping from the object coordinate to the target image coordinate. Then, we transform the surface normals of the object face using the extrinsic part of homography that locates the face of the object we are viewing. Because the transformed normals are utilized for calculating the force in the image space, we call this process normal vector perturbation in the 2D image space. To physically represent the estimated normal vector without distorting the visual information, we employed a lateral haptic rendering scheme in that it fits with our interaction styles on 2D images. The active force value at a given position on the slanted faces is calculated during the interaction phase. To evaluate our approach, we conducted an experiment with different stimulus conditions, in which it was found that participants could reliably estimate the geometric layout that appears in the picture. We conclude with explorations of applications and a discussion of future work.
Article
The use of spatial (geographic) information is becoming ever more central and pervasive in today's internet society, but most of it is currently inaccessible to visually impaired users. Access to visual maps is severely restricted for visually impaired and blind people, owing to their difficulty in interpreting graphical information. Thus, alternative ways of presenting a map have to be explored in order to improve the accessibility of maps. Other types of sensory perception, like touch and hearing, may work as a substitute for vision in the exploration of maps. The use of multimodal virtual environments seems to be a promising alternative for people with visual impairments. The present paper introduces a tool for automatic multimodal map generation with haptic and audio feedback using OpenStreetMap data. For a desired map area, an elevation map is automatically generated and can be explored by touch, using a haptic device. A sonification and a text-to-speech (TTS) mechanism also provide audio navigation information during the haptic exploration of the map.
Article
The vertex and edge effects caused by the polygonal representation of a virtual object can result in energy leaks and make the system unstable. In this paper, we focus on how to eliminate the vertex and edge effect by developing a new haptic rendering algorithm for when the probe moves across vertices. We propose a method to eliminate it by using a vector median filter weighted with a shape measure. This approach has a number of advantages, including force perpendicular to the triangle, fast computation, least energy cost, and smooth transitions. The developed method is validated on various models with comparison to conventional force rendering techniques.
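The vector median filter named in this abstract can be sketched in a few lines: among candidate normals, return the one minimizing the (optionally weighted) sum of distances to all others. How the shape-measure weights are computed is left as an assumed input here:

```python
# Sketch: (weighted) vector median over candidate face normals near a vertex.
import numpy as np

def vector_median(vectors, weights=None):
    """Return the input vector minimizing the weighted sum of distances to all."""
    V = np.asarray(vectors, dtype=float)
    w = np.ones(len(V)) if weights is None else np.asarray(weights, dtype=float)
    costs = [np.sum(w * np.linalg.norm(V - v, axis=1)) for v in V]
    return V[int(np.argmin(costs))]
```

Unlike componentwise averaging, the output is always one of the input normals, which avoids inventing a direction that belongs to no incident face.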
Article
Full-text available
Haptic display with force feedback is often necessary in several virtual environments. To enable haptic rendering of large datasets we introduce Continuously-Adaptive Haptic Rendering, a novel approach to reduce the complexity of the rendered dataset. We construct a continuous, multiresolution hierarchy of the model during the pre-processing and then at run time we use high-detail representation for regions around the probe pointer and coarser representation farther away. We achieve this by using a bell-shaped filter centered at the position of the probe pointer. Using our algorithm we are able to haptically render one to two orders of magnitude larger datasets than otherwise possible. Our approach is orthogonal to the previous work done in accelerating haptic rendering and thus can be used with them.
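The bell-shaped filter idea in this abstract can be sketched as a Gaussian fall-off of mesh detail with distance from the probe pointer; the hierarchy lookup is omitted, and the function name and parameter values are illustrative assumptions:

```python
# Sketch: level-of-detail selection falling off with distance from the probe.
import numpy as np

def detail_level(region_center, probe, max_level=10, sigma=0.05):
    """Resolution level for a mesh region: full near the probe, coarse far away."""
    d = np.linalg.norm(region_center - probe)
    return int(round(max_level * np.exp(-d * d / (2.0 * sigma * sigma))))
```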
Article
This work is aimed at rendering an object described by dense point cloud data without a precomputed polygonal mesh or surface normal information. We use a proxy-based rendering technique with the proxy collocated with the HIP in free space. To avoid the proxy sinking into the object during collision, a sphere of sufficiently large radius is selected as the proxy so as to prevent it passing through the point cloud. Once collision is detected, we minimize a cost function depending on the current HIP and proxy positions and find a new goal position for the proxy corresponding to the local minimum of the cost function. Our rendering algorithm continuously evaluates a tangential vector for the proxy to move over the object surface during collision. The penetration depth is calculated from the proxy to the HIP and is used to calculate the reaction force for the haptic device. We used our technique to render several dense point-cloud-based models. We also use the surface normal evaluated at the point of contact to shade the object surface when it is shown visually.
Conference Paper
Displaying haptic and tactile information on a touchscreen is one of the key technologies for human interfaces. In this paper we propose a method that allows the user to simultaneously feel both large bump and small textures through a screen. Our method employs lateral force and direction-controlled mechanical vibration (around 0-400 Hz). The technology allows not only geometrical shapes but also textures to be felt. Our experiments revealed that with our proposed method almost all people can simultaneously detect geometry and texture information on a touchscreen. In addition, providing adequate direction for the vibration enhances the feelings of softness.
Article
Full-text available
Haptic displays are emerging as effective interaction aids for improving the realism of virtual worlds. Being able to touch, feel, and manipulate objects in virtual environments has a large number of exciting applications. The underlying technology, both in terms of electromechanical hardware and computer software, is becoming mature and has opened up novel and interesting research areas. In this paper, we clarify the terminology of human and machine haptics and provide a brief overview of the progress recently achieved in these fields, based on our investigations as well as other studies. We describe the major advances in a new discipline, Computer Haptics (analogous to computer graphics), that is concerned with the techniques and processes associated with generating and displaying haptic stimuli to the human user. We also summarize the issues and some of our results in integrating haptics into multimodal and distributed virtual environments, and speculate on the challenges for the future.
Article
Full-text available
New image processing algorithms are presented facilitating creation of virtual models of real objects: finding straight lines and rough image registration. Lines are found via Fast Hough Transform using Hartley Transform. Lines and color around them are used for rough image registration. Fuzzy logic is used to account for color.