Fig. 5. Rotate interaction gesture (uploaded by Silviu Butnariu)
Source publication
Article
Full-text available
Natural User Interfaces (NUI) are a relatively new area of research that aims to develop natural and intuitive human-computer interfaces using voice commands, hand movements and gesture recognition, similar to communication between people, which also involves body language and gestures. This paper presents a naturally designed workspace which a...

Contexts in source publication

Context 1
... software configuration for the VR environment (Fig. 5) is designed as a modular distributed architecture based on the strict separation of its VR system management into layers. The proposed architecture provides methods through which objects in the virtual environment can be added, manipulated or removed. Microsoft Windows is the operating system used for the VR and CAD systems. The VR 3D database ...
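The paper gives no code for this layer, but a minimal sketch can illustrate the idea of a management layer that mediates all add/manipulate/remove operations on the virtual scene. All names here (SceneObject, SceneManager) are hypothetical, not taken from the actual system.

```python
# Hypothetical sketch of the management layer: a single component through
# which scene objects are added, manipulated or removed, keeping the VR
# 3D database behind one interface. All names are illustrative.
from dataclasses import dataclass


@dataclass
class SceneObject:
    """A CAD model instance placed in the VR scene."""
    name: str
    position: tuple = (0.0, 0.0, 0.0)
    rotation: tuple = (0.0, 0.0, 0.0)   # Euler angles, degrees


class SceneManager:
    """Management layer mediating all access to the scene database."""

    def __init__(self):
        self._objects = {}

    def add(self, obj):
        self._objects[obj.name] = obj

    def manipulate(self, name, position=None, rotation=None):
        obj = self._objects[name]
        if position is not None:
            obj.position = position
        if rotation is not None:
            obj.rotation = rotation

    def remove(self, name):
        self._objects.pop(name, None)
```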
Context 2
... user can rotate the scene by using several bimanual gestures. In the first case, the index finger of the left hand is pointing forward (PF) and the right hand is used to orient the viewpoint by performing rotation gestures (Fig. 5). In the second case, the left hand remains in the same PF state, while the right hand performs a continuous rotation about one of the axes of the coordinate system: moving left/right rotates about the X axis, up/down about the Y axis, and forward/backward about the Z axis. The right palm velocity is acquired ...
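A minimal sketch of this second rotation mode, assuming simplified stand-ins for the sensor's frame data rather than the actual Leap Motion SDK types: the left hand's pointing-forward (PF) state acts as the trigger, and the dominant component of the right palm's velocity selects the rotation axis. The gain and dead zone are assumptions, not values from the paper.

```python
# Sketch of the continuous-rotation mode described above. The left hand's
# pointing-forward (PF) state gates the interaction; the dominant component
# of the right palm's velocity picks the rotation axis (left/right -> X,
# up/down -> Y, forward/backward -> Z).
def rotation_command(left_is_pf, right_palm_velocity, dead_zone=20.0):
    """Return (axis, angular_rate) or None.

    left_is_pf: True while the left index finger is in the PF state.
    right_palm_velocity: (vx, vy, vz) of the right palm, e.g. in mm/s.
    """
    if not left_is_pf:
        return None                      # trigger gesture not active
    axis, speed = max(zip("XYZ", right_palm_velocity),
                      key=lambda pair: abs(pair[1]))
    if abs(speed) < dead_zone:           # suppress sensor jitter
        return None
    return axis, 0.01 * speed            # simple velocity-to-rate gain


# Example: right palm moving mostly upward rotates about the Y axis.
print(rotation_command(True, (5.0, 180.0, -12.0)))   # -> ('Y', 1.8)
```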

Similar publications

Article
Full-text available
Automatically identifying protein conformations can yield multiple candidate structures. Potential candidates are examined further to cull false positives. Individual conformations and the collection are compared when seeking flaws. Desktop displays are ineffective due to their limited size and resolution. Thus, a user must sacrifice large-scale content b...

Citations

... In connection with Leap Motion, one study, "Design Review of CAD Models using a NUI Leap Motion Sensor", used the Leap Motion device as a motion-detection sensor to replace commands given through other devices such as a mouse, demonstrating that using Leap Motion in simulation software can make it more interactive and intuitive [9]. There is also a paper showing that hand gestures are preferred over the mouse and keyboard for the design interface. ...
... Research conducted by Florin G. et al. developed a design-review method for CAD models using the NUI Leap Motion sensor as a controller. The study describes a CAD model design display that can be controlled with hand movements via Leap Motion, including zoom, pan, and rotate [9]. Furthermore, research conducted by Aron S. et al. integrates speech recognition and hand-gesture control for interaction in Augmented Reality (AR) design practice. ...
Article
Architecture is part of the art of life and works best when it seems to give expression to the life that inhabits it. As technology has developed, the field of architecture has also seen many technology-supported advances, such as in the creation of architectural designs. Designers are no longer required to draw design options as side views on large amounts of paper; instead, they use simulation applications that display architectural designs in 3D on a monitor, allowing them to be viewed more realistically. After many innovations emerged around simulation applications, which generally still use hardware such as a mouse, these innovations shifted toward connecting the simulation application with the virtual world, where hardware such as a mouse is replaced by gestures, that is, movements of the hands or other parts of the body. While fields such as medicine already use virtual tool interaction, the architecture field still relies on the mouse and keyboard. One virtual interaction device able to detect fingers and palms for interacting with 3D simulation applications is Leap Motion. Leap Motion is becoming widely used for various ways of interacting with the hands because it is more interactive and engaging; many even consider it usable for hand rehabilitation or therapy. However, despite these many innovations, there has been no work on a hand-gesture command system using Leap Motion for simulation applications that design architecture in 3D views. Based on this, the authors designed and developed hand-gesture commands using Leap Motion for an architectural design app. The hand-gesture command system for the Floorplanner simulation application was tested using the System Usability Scale with users from architectural, informatics, and unfamiliar backgrounds. The tests showed that the system runs properly, although it requires getting used to the device first, and it obtained a System Usability Scale score of 59.167 on a scale from 0 to 100, which means it is still accepted by users, falling into the "Marginally Acceptable" category.
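For context on the reported score, this is how a single respondent's System Usability Scale score is conventionally computed; the responses below are made up for illustration.

```python
# System Usability Scale (SUS) scoring: odd-numbered items contribute
# (response - 1), even-numbered items contribute (5 - response), and the
# sum is scaled by 2.5 onto the 0-100 range. Example responses are invented.
def sus_score(responses):
    """responses: 10 Likert answers (1-5), items 1..10 in order."""
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5


print(sus_score([4, 2, 4, 3, 4, 2, 3, 2, 4, 3]))  # -> 67.5
```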
... Therefore, static gestures could mainly be used to invoke CAD commands for model-creation tasks, while dynamic gestures could be used for model manipulation and modification [27]. In some studies [3,34,92], static gestures and dynamic gestures are combined for CAD manipulation instructions. For example, users can translate a model by closing the left hand into a fist (i.e., a static gesture) and moving the right hand in the direction they want to translate it (i.e., a dynamic gesture) [92,93]. In this process, the left-hand gesture plays the role of a trigger signal for the translation task while the right-hand gesture controls the direction and distance of the translation. ...
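A minimal sketch of this fist-trigger translation scheme, assuming a grab-strength measure of the kind hand-tracking SDKs expose; the parameter names and the 0.9 threshold are assumptions, not from the cited studies.

```python
# Sketch of the combined static + dynamic gesture: a closed left fist
# (static) arms the translation, and the right hand's frame-to-frame
# displacement (dynamic) supplies direction and distance.
def translate_model(model_position, left_grab_strength, right_palm_delta,
                    fist_threshold=0.9):
    """Return the model position, moved by the right hand's displacement
    only while the left hand is closed into a fist."""
    if left_grab_strength < fist_threshold:
        return model_position            # trigger not active: no change
    return tuple(p + d for p, d in zip(model_position, right_palm_delta))


# Example: fist closed, right hand moved 10 mm along +X.
print(translate_model((0.0, 0.0, 0.0), 0.95, (10.0, 0.0, 0.0)))
```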
Article
Full-text available
Computer-aided design (CAD) systems have advanced to become a critical tool in product design. Nevertheless, they still primarily rely on the traditional mouse and keyboard interface. This limits the naturalness and intuitiveness of the 3D modeling process. Recently, a multimodal human–computer interface (HCI) has been proposed as the next-generation interaction paradigm. Widening the use of a multimodal HCI provides new opportunities for realizing natural interactions in 3D modeling. In this study, we conducted a literature review of a multimodal HCI for CAD to summarize the state-of-the-art research and establish a solid foundation for future research. We explore and categorize the requirements for natural HCIs and discuss paradigms for their implementation in CAD. Following this, factors to evaluate the system performance and user experience of a natural HCI are summarized and analyzed. We conclude by discussing challenges and key research directions for a natural HCI in product design to inspire future studies.
... To command the Jaco robotic system, the Leap Motion controller was used because, based on [33], it allows accurate tracking of hand movements. The Leap Motion controller can detect the position, orientation and velocity of the hands, along with the position, orientation and velocity of each finger [34]. The user can send translation (X axis) and rotation (roll, pitch, yaw angles) commands to the Jaco robotic arm by moving the right hand. ...
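A minimal sketch of such a mapping, assuming an illustrative HandSample structure and gains rather than the study's actual interface.

```python
# Sketch of mapping right-hand tracking data to Jaco arm commands: palm
# translation along the sensor's X axis drives arm translation, and the
# hand's roll/pitch/yaw drive rotation. HandSample and both gains are
# illustrative assumptions.
from dataclasses import dataclass


@dataclass
class HandSample:
    x: float        # right palm position along the X axis (mm)
    roll: float     # hand orientation angles (radians)
    pitch: float
    yaw: float


def jaco_command(sample, linear_gain=0.001, angular_gain=0.5):
    """Convert one tracking sample into an arm command dictionary."""
    return {
        "translate_x": linear_gain * sample.x,              # metres
        "rotate": (angular_gain * sample.roll,
                   angular_gain * sample.pitch,
                   angular_gain * sample.yaw),              # radians
    }


print(jaco_command(HandSample(x=120.0, roll=0.2, pitch=-0.1, yaw=0.4)))
```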
Chapter
A hybrid brain-computer interface (BCI) is a system that combines multiple biopotentials or different types of devices with a typical BCI system to enhance the interaction paradigms and various functional parameters. In this paper we present the initial development of a hybrid BCI system based on steady-state visual evoked potentials (SSVEP), eye tracking and hand gestures, used to command a robotic arm for manipulation tasks. The research aims to develop a robust system that allows users to manipulate objects by means of natural gestures and biopotentials. Two flickering boxes with different frequencies (7.5 Hz and 10 Hz) were used to induce the SSVEP for the selection of target objects, while eight channels were used to record the electroencephalographic (EEG) signals from the user's scalp. Following the selection, users were able to manipulate objects in the workspace by using the Leap Motion controller to send commands to a Jaco robotic arm.
... The use of small markers on the users' hands for motion capture has also been tested [15] [18] [27]. A more recent alternative to such sensors is the Leap Motion [10] [2] [31]. This device only tracks hand and arm motions, but its accuracy is superior to that of the sensors mentioned above. ...
... In [23] the authors propose an integrated environment in which the user may interact with the desktop CAD system by using a tablet in the virtual environment: to aid in the visualization of the assembly, users are able to select and see inside all the selected parts by positioning the tablet on the virtual object. For the visualization and transformation of a complete assembly, users can use rotate, pan and zoom gestures with a Leap Motion sensor [10], with different types of depth colour cameras [5] [28] [33], or with the aid of hand markers [15]. The first target creative users, supporting them in sketching the conceptual model of the object as in [16], or mechanical engineers in creating solids as in [15] and [30]. ...
... [10] [13] [28] [33] represent objects through meshes in the VR environment, possibly directly linked to the CAD representation, as shown in Table 1. Even if more limited in the range of operations allowed, many systems use meshes as the only shape representation [17] [31] [18] [27] [24] [25] [39]. ...
Article
Full-text available
Human-Computer Interaction is the study of how to design, evaluate, and implement interactive computer systems, systems that can hold a dialogue with humans. This research designed and implemented a Leap Motion setup that turns hand movements into digital input, which can replace the function of the mouse or keyboard and provide a natural alternative for using hand movements as interactions between humans and computers, allowing users to input gesture commands to an application in place of a mouse or keyboard. Human gesture recognition serves as an interaction technique for delivering more natural interpretation and for communicating with computers. The aim of this research is to assess the usability of Leap Motion, that is, how effective the Leap Motion controller is at hand-gesture recognition, to address important problems in the human use of the mouse or keyboard. Testing and implementation in this research focused on tracking, detection, and dynamic and static gesture recognition. For the six recognized gestures (double outward swipe, tap, swipe, clap, circular and fly control tap), the accuracy of gesture interpretation reached 87.62%. The data used for the recognition model performed very well, although testing with the Leap Motion visualizer showed that hand-gesture recognition is often sensitive to poor resolution.
Chapter
Full-text available
Unmanned aircraft system (UAS) sensor operators must maintain performance while tasked with multiple operations and objectives, yet they are often subject to boredom and to the consequences of the prevalence effect during area-scanning and target-identification tasks. Adapting training scenarios to accurately reflect real-world scenarios can help prepare sensor operators for their duty. Furthermore, integrating objective measures of cognitive workload and performance, by evaluating functional near-infrared spectroscopy (fNIRS) as a non-invasive tool for monitoring higher-level cognitive functioning, allows for quantitative assessment of human performance. This study sought to advance previous work on the assessment of cognitive and task performance in UAS sensor operators in order to evaluate expertise development and responsive changes in mental workload.
Chapter
In this paper, we propose a Natural User Interface (NUI) based on the Leap Motion controller which enables intuitive manipulation of a 6-DOF Jaco robotic arm. By using this NUI, we aim to study the effects of hand movements and gestures for interaction with a robotic arm. We present a qualitative evaluation of the proposed NUI system through trials in which subjects carried out a standard robotic-arm manipulation session using both the conventional joystick interface and the NUI-based one. The results are discussed in the last part of the paper.