May 2024 · 9 Reads
This page lists the scientific contributions of an author who either does not have a ResearchGate profile or has not yet added these contributions to their profile. It was automatically created by ResearchGate to record this author's body of work, in pursuit of our goal of maintaining the most comprehensive scientific repository possible. In doing so, we process publicly available (personal) data relating to the author as a member of the scientific community.
November 2023 · 32 Reads · 5 Citations
IEEE Transactions on Visualization and Computer Graphics
Integrated hand-tracking on modern virtual reality (VR) headsets can be readily exploited to deliver mid-air virtual input surfaces for text entry. These virtual input surfaces can closely replicate the experience of typing on a Qwerty keyboard on a physical touchscreen, thereby allowing users to leverage their pre-existing typing skills. However, the lack of passive haptic feedback, unconstrained user motion, and potential tracking inaccuracies or observability issues encountered in this interaction setting typically degrade the accuracy of user articulations. We present a comprehensive exploration of error-tolerant probabilistic hand-based input methods to support effective text input on a mid-air virtual Qwerty keyboard. Over three user studies we examine the performance potential of hand-based text input under both gesture and touch typing paradigms. We demonstrate typical entry rates in the range of 20 to 30 wpm and average peak entry rates of 40 to 45 wpm.
October 2023 · 38 Reads · 3 Citations
July 2023 · 730 Reads
Fingertips are one of the most sensitive regions of the human body and provide a means to dexterously interact with the physical world. To recreate this sense of physical touch in virtual or augmented reality (VR/AR), high-resolution haptic interfaces that can render rich tactile information are needed. In this paper, we present a wearable electrohydraulic haptic interface that can produce high-fidelity multimodal haptic feedback at the fingertips. This novel hardware can generate high-intensity fine tactile pressure (up to 34 kPa) as well as a wide range of vibrations (up to 700 Hz) through 16 individually controlled electrohydraulic bubble actuators. To achieve such high-intensity multimodal haptic feedback at such high density (16 bubbles/cm²) at the fingertip using an electrohydraulic haptic interface, we integrated a stretchable substrate with a novel dielectric film and developed a design architecture wherein the dielectric fluid is stored at the back of the fingertip. We physically characterize the static and dynamic behavior of the device. In addition, we conduct psychophysical characterization of the device through a set of user studies. This electrohydraulic interface demonstrates a new way to design and develop high-resolution multimodal haptic systems at the fingertips for AR/VR environments.
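The control signal for one such actuator can be pictured as a static pressure offset with a vibration component riding on top. The sketch below is a hypothetical illustration only, not the authors' drive electronics; only the 34 kPa and 700 Hz limits come from the abstract, and the waveform shape, sample rate, and function name are assumptions.

```python
import math

MAX_PRESSURE_KPA = 34.0   # peak fine tactile pressure reported for the device
MAX_FREQ_HZ = 700.0       # upper end of the reported vibration range

def bubble_waveform(pressure_kpa, freq_hz, duration_s, sample_rate=8000):
    """Target pressure samples for one electrohydraulic bubble actuator:
    a DC pressure offset plus a sinusoidal vibration component.
    Inputs are clamped to the limits reported in the paper."""
    p = min(max(pressure_kpa, 0.0), MAX_PRESSURE_KPA)
    f = min(max(freq_hz, 0.0), MAX_FREQ_HZ)
    n = int(duration_s * sample_rate)
    # The vibration rides on half the static pressure so samples stay non-negative.
    return [p / 2 * (1 + math.sin(2 * math.pi * f * t / sample_rate))
            for t in range(n)]

samples = bubble_waveform(pressure_kpa=20.0, freq_hz=250.0, duration_s=0.01)
print(len(samples), round(max(samples), 1))
```

Sixteen such channels, one per bubble, would be updated independently to render spatial tactile patterns across the fingertip.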
April 2023 · 8 Reads · 2 Citations
April 2023 · 16 Reads · 3 Citations
March 2023 · 33 Reads · 9 Citations
October 2022 · 17 Reads · 3 Citations
October 2022 · 29 Reads · 4 Citations
October 2022 · 15 Reads · 9 Citations
... Other criteria included more than ten citations for publications older than ten years and reported results related to the design of 3D UIs. We selected articles that address the design of 2D/3D UIs in/for 3D environments, help answer the research questions, or reiterate design guidelines; as this work lies at the conjunction of HCI and VR, we drew on highly cited works from reputable journals and conferences such as ACM's CHI [25][26][27][28][29][30][31][32], IEEE's VR and TVCG [21,22,[33][34][35][36][37][38][39][40][41][42][43][44], as well as other conference and journal sources [13,[45][46][47][48][49][50][51][52][53][54][55][56][57][58], published between the early 2000s and 2023. This meta-review aims to provide an updated overview of significant works relating to 3D UIs in VR, establishing an end-to-end dimensional perspective for a set of reiterated 3D UI design recommendations to guide future development. ...
October 2023
... However, it is also possible that we will see a shift in the way users interact with mid-air keyboards, which could involve methods inspired by swipe-based smartphone typing [27], such as the work by Dudley et al. [28]; novel ergonomic raycast techniques tailored to XR, akin to the CD gain of a mouse [29]; or even entirely new interaction modalities that capitalize on the unique sensing and ML capabilities of future XR devices. ...
November 2023
IEEE Transactions on Visualization and Computer Graphics
... They often mimic traditional keyboard layouts (e.g., QWERTY) and are positioned within the user's field of view. Input typically involves either hand-based input [21], controller-based pointing [22], or eye gaze [22], [23] for key selection. The appeal of this approach lies in its relative simplicity, potential for leveraging existing user familiarity, and minimal hardware requirements beyond standard XR devices. ...
April 2023
... Then, we calculated the results for the entire phrases. This approach matches how text entry results are typically analyzed in most studies, e.g., [6,8,9,10,11,12,13,15,16,17,20,21,23,24,25,28,68,69], and thus enables comparisons of our results with a wide range of work in the literature. ...
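The phrase-level metrics that such text entry studies typically report can be computed with two standard conventions: words per minute with five characters per word, and an uncorrected error rate based on minimum string distance. This is a generic sketch of those common formulas, not the specific analysis pipeline of the cited work.

```python
def wpm(transcribed, seconds):
    """Words per minute, using the standard convention of five
    characters per word and |T| - 1 inter-character intervals."""
    return (len(transcribed) - 1) / seconds * 60.0 / 5.0

def msd(a, b):
    """Minimum string distance (Levenshtein) via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1,
                           prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def error_rate(presented, transcribed):
    """Uncorrected error rate: MSD normalised by the longer string."""
    return msd(presented, transcribed) / max(len(presented), len(transcribed))

print(round(wpm("the quick brown fox", 9.0), 1))  # 24.0
print(round(error_rate("hello world", "helo world"), 3))
```

Computing these per phrase and then averaging, as described above, is what makes results comparable across the cited body of text entry literature.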
March 2023
... Todi et al. used model-based RL to adapt item ranking in a 2D menu [43]. Yu et al. studied using RL to determine the timing of intelligent assistance during object selection tasks in MR [46]. Gebhardt et al. explored using RL to adapt the visibility of 2D labels in VR based on gaze data [20]. ...
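The general pattern behind such RL-driven UI adaptation can be illustrated with a toy epsilon-greedy bandit that learns which label-visibility level a simulated user prefers. The visibility levels, reward values, and learning loop below are invented for illustration and bear no relation to the cited systems' actual state spaces or algorithms.

```python
import random

# Toy stand-in for RL-driven UI adaptation: an epsilon-greedy bandit
# chooses a label-visibility level; reward simulates noisy user feedback.
random.seed(0)
LEVELS = ["hidden", "dimmed", "full"]
TRUE_REWARD = {"hidden": 0.2, "dimmed": 0.9, "full": 0.6}  # assumed preferences

q = {a: 0.0 for a in LEVELS}       # running value estimate per action
counts = {a: 0 for a in LEVELS}

for step in range(500):
    if random.random() < 0.1:       # explore: try a random level
        a = random.choice(LEVELS)
    else:                           # exploit: pick the current best estimate
        a = max(q, key=q.get)
    reward = TRUE_REWARD[a] + random.gauss(0, 0.1)  # noisy observed feedback
    counts[a] += 1
    q[a] += (reward - q[a]) / counts[a]             # incremental mean update

best = max(q, key=q.get)
print(best)
```

Real adaptive UIs replace the fixed reward table with signals such as gaze data or task performance, and the bandit with richer (often model-based) RL over interface states, but the explore/exploit structure is the same.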
October 2022
... Palmer et al. [7] proposed mappings for relocating forces from the thumb and index finger to the wrist and demonstrated their benefits during pick-and-place tasks when visual feedback is limited. Pezent et al. [8] proposed "Tasbi", a wrist-worn device that combines vibrotactile and squeeze stimuli during augmented and virtual reality interactions, and showed multiple applications where these relocated tactile stimuli are combined with visual feedback to create visual pseudo-haptic illusions of tactile interactions [9]. Tanaka et al. [10] introduced a system in which they attached electrodes to the wrist and the back of the hand to apply electro-tactile stimuli to the nerves while keeping the palmar side unobstructed. ...
October 2022
... This new condition will allow the decoupling of the positional drift issue from the accuracy of the body pose, allowing for a more in-depth study of the perceptual results. We believe future studies that integrate hand tracking like RotoWrist [30] or data-driven methods for self-avatar animation would be valuable to provide more insight into how animation fidelity impacts the SoE and user performance in VR. ...
December 2021
... Research on input devices using wrist motion without directly attaching to the hand is also increasing. With the growing interest in vibrotactile feedback in wearable wristband devices, Chase et al. [234] used information transfer as a metric to explore the signal variation space within a single vibrotactile actuator (e.g., frequency, amplitude, and modulation). Typical control systems rely on digital on/off control, limiting the degrees of freedom available when designing haptic experiences and allowing only inflation/deflation at a set rate. ...
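The signal variation space mentioned above (frequency, amplitude, modulation) can be illustrated with an amplitude-modulated drive signal for a single actuator. This is a generic sketch of one point in that space, with assumed parameter names and sample rate, not the stimulus set used by Chase et al.

```python
import math

def vibrotactile_signal(carrier_hz, mod_hz, amplitude, duration_s, sample_rate=4000):
    """Amplitude-modulated drive signal for a single vibrotactile actuator:
    a carrier sinusoid whose envelope varies at the modulation frequency.
    Varying carrier_hz, amplitude, and mod_hz spans the three signal
    dimensions mentioned in the text."""
    n = int(duration_s * sample_rate)
    out = []
    for t in range(n):
        s = t / sample_rate
        envelope = amplitude * (0.5 + 0.5 * math.cos(2 * math.pi * mod_hz * s))
        out.append(envelope * math.sin(2 * math.pi * carrier_hz * s))
    return out

sig = vibrotactile_signal(carrier_hz=200, mod_hz=5, amplitude=1.0, duration_s=0.2)
print(len(sig))
```

Continuous control over such parameters, rather than digital on/off switching, is what opens up the larger design space for haptic experiences that the snippet contrasts against.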
July 2021
... Currently, there is no specific structure for the Metaverse. Although some requirements and elements of these new systems have been investigated [60][61][62], further research is needed on additional required tools and infrastructures to improve efficiency, reliability, and safety. Figure 5 provides an aggregated presentation of the foundations, requirements, infrastructures, features, components, and ecosystem of the Metaverse. ...
May 2021
... However, mid-air virtual keyboards also face significant challenges. The absence of physical surfaces and haptic feedback makes precise key targeting difficult, leading to higher error rates [24]. Extended use frequently causes fatigue and discomfort in the arms and shoulders, particularly with hand-based input [25]. ...
November 2020