Figure 1
Two screenshots of the Flappy Bird game implementation, showing the game at the start (a) and during the game after the first pipe (b).

Source publication
Conference Paper
With smartphones being a prime example, touchscreens have become one of the most widely used interfaces for interacting with computing systems. Compared to other touchscreen devices, smartphones pose additional challenges, as the hand that interacts with the device is commonly also used to hold it. Consequently, determining how fingers of the hand hol...

Contexts in source publication

Context 1
... understand how single-tap input influences the area where users can comfortably tap the screen, we implemented a clone of Flappy Bird; see Figure 1. We used Flappy Bird as the study apparatus because the game has the unique property that taps can be performed anywhere on the GUI. ...
Context 2
... game consists of two screens: a start screen and a play screen. The start screen contains a "Play" and a "Leaderboard" button; see Figure 1a. The "Leaderboard" button was implemented in a traditional way. ...
Context 3
... "Leaderboard" was implemented using the Play Games Services 1 provided by Google. In the game phase, see Figure 1b, the bird needs to be steered tough the vertical pipes. The bird continuously moves to the right, while gravity pulls the bird down. ...

Citations

... Nevertheless, foldable smartphones are lighter and smaller than tablets. The user comfort zone and reachable zone of a foldable phone differ from those of a tablet (Mayer, Le, Funk, & Henze, 2019). Foldable smartphones should therefore be defined as mid-screen mobile devices between regular smartphones and tablets (iiMedia Research, 2022). ...
... Based on common hand postures used to grip a smartphone with one hand, Le et al. [27] characterized the region of the phone that could be comfortably reached. Mayer et al. [33] give further evidence through their analysis of 45 million touch events collected during touch-based gameplay, where they identified a region of the screen most likely to be comfortable to tap, known as the "sweet spot." Based on this evidence, Reflow takes a spatially-dependent approach to personalizing touch interactions by modifying the position and size of UI elements. ...
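As an illustration of what identifying such a sweet spot can involve (not the cited work's actual analysis), the mode of a large set of logged touch points can be estimated with a 2D histogram:

import numpy as np

def sweet_spot(xs, ys, bins=50):
    # Estimate the most likely tap location as the peak of a 2D
    # histogram of touch coordinates. Illustrative only; the cited
    # analysis of 45 million touch events is more involved.
    hist, x_edges, y_edges = np.histogram2d(xs, ys, bins=bins)
    ix, iy = np.unravel_index(np.argmax(hist), hist.shape)
    # Return the center of the densest bin.
    return ((x_edges[ix] + x_edges[ix + 1]) / 2,
            (y_edges[iy] + y_edges[iy + 1]) / 2)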
... If the resting position is assumed to be constant for a single user (which previous research has shown is plausible [26, 33]), we can estimate its location as a model parameter. To do this, in addition to the standard Fitts' model parameters a and b, we also learned the resting position by fitting Equation 1 to a dataset of observation pairs using non-linear ordinary least squares. ...
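Equation 1 is not reproduced in the excerpt. Assuming a conventional Fitts'-law form, MT = a + b * log2(D/W + 1), with the distance D measured from the unknown resting position, such a fit could look like the following sketch (all names and constants are hypothetical):

import numpy as np
from scipy.optimize import curve_fit

def movement_time(targets, a, b, rx, ry, w=9.0):
    # Fitts-style model in which the distance to the target is
    # measured from an unknown resting position (rx, ry), estimated
    # jointly with a and b. The target width w is held fixed here.
    tx, ty = targets
    d = np.sqrt((tx - rx) ** 2 + (ty - ry) ** 2)
    return a + b * np.log2(d / w + 1)

# Synthetic demo: recover the parameters from noisy observations.
rng = np.random.default_rng(0)
tx = rng.uniform(0, 60, 200)    # target x positions (mm)
ty = rng.uniform(0, 130, 200)   # target y positions (mm)
true = (0.2, 0.15, 45.0, 30.0)  # a, b, resting x, resting y
times = movement_time((tx, ty), *true) + rng.normal(0, 0.01, 200)
params, _ = curve_fit(movement_time, (tx, ty), times,
                      p0=[0.1, 0.1, 30.0, 60.0])
print(params)  # approximately recovers `true`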
Preprint
Touch is the primary way that users interact with smartphones. However, building mobile user interfaces where touch interactions work well for all users is a difficult problem, because users have different abilities and preferences. We propose a system, Reflow, which automatically applies small, personalized UI adaptations, called refinements, to mobile app screens to improve touch efficiency. Reflow uses a pixel-based strategy to work with existing applications, and improves touch efficiency while minimally disrupting the design intent of the original application. Our system optimizes a UI by (i) extracting its layout from its screenshot, (ii) refining its layout, and (iii) re-rendering the UI to reflect these modifications. We conducted a user study with 10 participants and a heuristic evaluation with 6 experts and found that applications optimized by Reflow led to, on average, 9% faster selection time with minimal layout disruption. The results demonstrate that Reflow's refinements are useful UI adaptations that improve touch interactions.
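The three-stage pipeline can be pictured with a skeleton like the one below; the Element type, the minimum-size rule, and all names are invented for illustration and do not reflect Reflow's actual pixel-based implementation:

from dataclasses import dataclass, replace

MIN_TOUCH_PX = 48  # illustrative minimum target size, not Reflow's rule

@dataclass(frozen=True)
class Element:
    # A UI element recovered from a screenshot (invented type):
    # a label plus a bounding box in pixels.
    label: str
    x: int
    y: int
    w: int
    h: int

def refine_layout(layout):
    # Stage (ii) as a toy rule: grow undersized targets to a minimum
    # touch size. Reflow's real optimizer instead balances touch
    # efficiency against disruption of the original design.
    return [replace(e, w=max(e.w, MIN_TOUCH_PX), h=max(e.h, MIN_TOUCH_PX))
            for e in layout]

# Stages (i) and (iii), extracting the layout from raw pixels and
# re-rendering the screenshot, require vision models and are beyond a
# short sketch; the refinement stage shows the data flowing between them.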
... Each trial begins by tapping a 9 by 9 mm start button on the screen, always using the right thumb. The button is positioned 38 mm from the bottom and 8 mm from the right side of the device, putting it near the comfortable "sweet spot" [16]. This normalizes the initial grip on the phone. ...
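Placing a button with a fixed physical size requires converting millimetres to pixels through the display density. A small sketch, with the density and screen dimensions assumed rather than taken from the cited study:

MM_PER_INCH = 25.4

def mm_to_px(mm, dpi):
    # Convert a physical length to pixels for a given display density.
    return round(mm / MM_PER_INCH * dpi)

# Assumed device: 420 dpi, 1080 x 2280 px (not the study hardware).
dpi, screen_w, screen_h = 420.0, 1080, 2280
size = mm_to_px(9, dpi)                  # 9 mm button edge in px
x = screen_w - mm_to_px(8, dpi) - size   # 8 mm from the right edge
y = screen_h - mm_to_px(38, dpi) - size  # 38 mm from the bottom edge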
Article
We investigate the performance of one-handed touch input on the side of a mobile phone. A first experiment examines grip change and subjective preference when reaching for side targets using different fingers. Results show all locations can be reached with at least one finger, but the thumb and index finger are most preferred and require less grip change for positions along the sides. Two follow-up experiments examine taps and flicks using the thumb and index finger in a new two-dimensional input space. A side-touch sensor is simulated with a combination of capacitive sensing and motion tracking to distinguish touches on the lower, middle, or upper edges. When tapping, index finger and thumb speeds are similar, with the thumb more accurate and comfortable; the lower edge is most reliable and the middle edge most comfortable. When flicking with the thumb, the upper edge is fast and rated highly.
... Touch input on the back of the device enables a wide range of use cases, such as preventing shoulder surfing during authentication [16], improving reachability by moving the front-screen content [44], 3D object manipulation [4, 85], user-defined gesture input [86], and zooming gestures in single-handed grips [62]. To understand interaction beyond the touchscreen, researchers investigated finger placement [51], movements [47, 50, 56], and on-device gestures [107] to propose guidelines for BoD interaction. ...
Article
While advances in mobile text entry enable smartphone users to type almost as fast as on hardware keyboards, text-heavy activities are still not widely adopted. One reason is the lack of shortcut mechanisms. In this article, we determine shortcuts for text-heavy activities, elicit shortcut gestures, implement them for a fully touch-sensitive smartphone, and conduct an evaluation with potential users. We found that experts perform around 800 keyboard shortcuts per day, which are not available on smartphones. Interviews revealed the lack of shortcuts as a major limitation that prevents mobile text editing. Therefore, we elicited gestures for the 22 most important shortcuts for smartphones that are touch-sensitive on the whole device surface. We implemented the gestures for a fully touch-sensitive smartphone using deep learning and evaluated them in realistic scenarios to gather feedback. We show that the developed prototype is perceived as intuitive and faster than recent commercial approaches.