Conference Paper

Tangible newspaper for the visually impaired users


Abstract

This paper outlines a novel interaction technique that allows visually impaired users to examine the layout of complex paper documents (e.g. a newspaper page). The document is placed on a desk in the view of a camera mounted above it. The tip of the user's index finger is tracked by means of computer vision algorithms. As the user moves the finger over the document, the layout of its logical units (articles) is revealed through a simple sonification of their boundaries and speech synthesis. The paper describes and evaluates the implementation of a test application. The main advantage of our system is the direct mapping of the kinaesthetic experience to the layout of the physical document, enabling the user to employ any strategy to traverse the document. The prototype has been implemented using the ARToolKit library.
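The core mapping the abstract describes — from a tracked fingertip position to a logical unit (article) of the page — can be sketched as a point-in-polygon lookup over the document's region layout. This is a minimal illustrative sketch, not the authors' implementation: the region names, coordinates, and the standard ray-casting test are all assumptions of our own.

```python
# Hedged sketch: map a tracked fingertip to an article region of the page.
# Region shapes and names below are hypothetical, for illustration only.

def point_in_polygon(x, y, polygon):
    """Ray-casting test: is point (x, y) inside the polygon (list of (px, py))?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray extending to the right of (x, y).
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def region_under_finger(x, y, regions):
    """Return the name of the article region containing the fingertip, or None."""
    for name, polygon in regions.items():
        if point_in_polygon(x, y, polygon):
            return name
    return None

# Hypothetical page layout: two rectangular article regions in camera coordinates.
layout = {
    "headline": [(0, 0), (640, 0), (640, 120), (0, 120)],
    "article_1": [(0, 120), (320, 120), (320, 480), (0, 480)],
}

print(region_under_finger(100, 60, layout))   # fingertip over the headline
print(region_under_finger(500, 300, layout))  # fingertip over empty space -> None
```

A real system would feed the result to a speech synthesizer, and play a short sonification cue whenever the returned region changes (i.e. the finger crosses a boundary).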


... Sporka et al. [24] present the Tangible Newspaper, a tangible system designed for users with visual impairments. ...
... Then, through TTS synthesis it is possible to present to the user, via audio, the content of the selected area of the newspaper. (Figure 2.10: Example of a Data Matrix for the Tangible Newspaper, taken from [24].) The overall architecture of the system thus consists of the pointer, a Data Matrix code attached to the index finger as shown in Figure 2.10, and a camera. The camera serves two purposes: detecting the position of the pointer, and identifying the formatting of the newspaper page, which allows the pointer's position to be mapped to an area of the newspaper, identifying the text the user wants read aloud. ...
... This interface has problems with its validation method (shaking the cube), as users had some difficulty understanding it and mostly preferred to gesture in front of the correct answer. (Figure 2.11: Example of a cube as a tangible interface, taken from [24].) 2.4.1.3 Sensetable: Many tangible interface systems use cameras to locate objects. ...
... We have elaborated this idea in the prototype of our system Tangible Newspaper [19], which allows blind users to explore the layout of the document rather than its actual content. The document is placed on a desk in the view of a camera placed above it. ...
... Fig. 3 shows the schema of the system. From the initial user study described in [19], we were able to conclude that the users were capable of exploring the spatial relations and positioning of the polygonal regions within the presented documents. ...
Conference Paper
The number of people using computers has been steadily increasing in recent years. Not all potential users have all the capabilities that allow them to use computers without obstacles. This is especially true for handicapped and elderly users. For this class of users, a special approach to the design and implementation of user interfaces is needed. The missing capabilities of these users must be substituted by capabilities that these users do have. In most cases, the use of sounds and speech offers a natural solution to this problem. The paper discusses an outline of the problems related to such special user interfaces and gives examples of applications of user interfaces using special forms of speech and related acoustic communication.
... The previous studies aimed at supporting the visually impaired are diverse and include research on reading and writing [3,[8][9][10][11][12][13][14], which are difficult for the visually impaired, as well as on various activities such as supporting visually impaired children to play [5], enjoy soccer games [15], record with a camera [16], create music [17], play games [18], and use virtual reality (VR) systems [19][20][21]. ...
Article
Full-text available
BACKGROUND: Visually impaired people have been considered only as "receivers" of support; few studies have considered them as "givers" of support to sighted people. OBJECTIVE: To support the walking of sighted people using information available specifically to the visually impaired. METHODS: Utilizing white cane usage data from visually impaired people to create inaccessibility maps for sighted people. RESULTS: A user study conducted with elderly people and their stakeholders, who had high accessibility needs, found that the maps were generally useful as long as they were within the user's area of interest. CONCLUSIONS: Although the proposed method should be practically beneficial to users, the system should draw on information beyond white cane use data, its accuracy should be improved, and more data should be collected from visually impaired people.
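The method summarized above can be illustrated with a small sketch: locations of intensive white cane use are bucketed into grid cells, and cells with repeated activity are flagged as likely obstacles on the inaccessibility map. The grid size, threshold, and data format are assumptions of our own, not taken from the paper.

```python
# Minimal sketch of an inaccessibility map built from white cane usage data.
# Cell size, threshold, and the (x, y) event format are hypothetical choices.

from collections import Counter

def inaccessibility_map(cane_events, cell_size=10.0, threshold=3):
    """Bucket (x, y) locations of white cane activity into grid cells;
    cells with at least `threshold` events are flagged as likely obstacles."""
    counts = Counter(
        (int(x // cell_size), int(y // cell_size)) for x, y in cane_events
    )
    return {cell for cell, n in counts.items() if n >= threshold}

# Hypothetical data: repeated cane activity clustered near one spot.
events = [(12, 14), (13, 15), (11, 16), (55, 80), (14, 13)]
print(inaccessibility_map(events))  # {(1, 1)}: four events fall in cell (1, 1)
```

In practice the flagged cells would be overlaid on a conventional map and shown to sighted users with high accessibility needs.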
... Furthermore, it was intuitive for VI users to obtain such auditory information with their fingers touching the tactile diagrams or other tangible interfaces. Early prototypes, such as KnowWhere [24], 3DFinger [34], and Tangible Newspaper [38], supported computer-vision-based tracking of a VI user's finger on 2D printed material (e.g., maps and newspapers) and retrieval of the corresponding speech information. Nanayakkara et al. [30] developed EyeRing, a finger-worn device with an embedded camera connected to an external micro-controller for converting printed text into speech output based on OCR and text-to-speech techniques. ...
Chapter
Traditional tactile diagrams for the visually-impaired (VI) use short Braille keys and annotations to provide additional information in separate Braille legend pages. Frequent navigation between the tactile diagram and the annex pages during the diagram exploration results in low efficiency in diagram comprehension. We present the design of FingerTalkie, a finger-worn device that uses discrete colors on a color-tagged tactile diagram for interactive audio labeling of the graphical elements. Through an iterative design process involving 8 VI users, we designed a unique offset point-and-click technique that enables the bimanual exploration of the diagrams without hindering the tactile perception of the fingertips. Unlike existing camera-based and finger-worn audio-tactile devices, FingerTalkie supports one-finger interaction and can work in any lighting conditions without calibration. We conducted a controlled experiment with 12 blind-folded sighted users to evaluate the usability of the device. Further, a focus-group interview with 8 VI users shows their appreciation for the FingerTalkie’s ease of use, support for two-hand exploration, and its potential in improving the efficiency of comprehending tactile diagrams by replacing Braille labels.
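The interactive audio labeling described above rests on matching a sensed color against the diagram's color tags. The sketch below shows one plausible form of that lookup, nearest-neighbor matching in RGB space; the color table, labels, and distance metric are illustrative assumptions, not details of the FingerTalkie device itself.

```python
# Hedged sketch of a color-to-audio-label lookup for a color-tagged tactile
# diagram. The tag colors and label texts below are hypothetical examples.

def nearest_label(rgb, color_table):
    """Match a sensed RGB reading to the closest tagged color (squared Euclidean)."""
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(color_table, key=lambda c: dist2(rgb, c))

# Hypothetical color tags on a tactile diagram, each keyed to a spoken label.
tags = {
    (255, 0, 0): "heart: pumps blood through the body",
    (0, 0, 255): "lungs: exchange oxygen and carbon dioxide",
}

reading = (240, 20, 10)                # a slightly noisy sensor reading
label = tags[nearest_label(reading, tags)]
print(label)  # -> "heart: pumps blood through the body"
```

Nearest-neighbor matching tolerates sensor noise, which is one reason discrete, well-separated tag colors make the lookup robust.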
... Some prototypes use a camera and image recognition algorithms to detect touch on raised-line maps [16,45,51], digital maps [28], regular visual paper documents [24], 3D printed maps [18] or tangible objects and tactile grids [43]. Beyond maps, camera-based finger tracking has been used for tangible newspapers [50], and to obtain audio descriptions for 3D printed graphics [47,48]. These works suggest that camera-based finger tracking can be successfully used in technology for VIP. ...
Conference Paper
Full-text available
Current low-tech Orientation & Mobility (O&M) tools for visually impaired people, e.g. tactile maps, possess limitations. Interactive accessible maps have been developed to overcome these. However, most of them are limited to the exploration of existing maps, and have remained in laboratories. Using a participatory design approach, we have worked closely with 15 visually impaired students and 3 O&M instructors over 6 months. We iteratively designed and developed an augmented reality map intended for use in O&M classes in special education centers. This prototype combines projection, audio output, and the use of tactile tokens, and thus allows both map exploration and construction by low vision and blind people. Our user study demonstrated that all students were able to successfully use the prototype, and showed high user satisfaction. A second phase with 22 international special education teachers allowed us to gain more qualitative insights. This work shows that augmented reality has potential for improving access to education for visually impaired people.
... In such a case (by definition) the visual modality may not be used and all information must be delivered by means of alternative methods and devices. We have also addressed this problem in [83,82] and [95]. ...
... Our latest direction of research is the use of tangible user interfaces to deliver graphic information. A prototype described in [13] allows users to explore the layout of a newspaper page by tracking their index finger on its surface. A camera, mounted above the table where the newspaper is placed, tracks the position of the finger, and the users receive an acoustic description in real time as they move around the page. ...
Article
The HCI group at the CTU has a long tradition of research on interaction methods for people with difficulties, especially for users with vision impairments. As of June 2007, our HCI group comprises 4 faculty members and 4 PhD students; however, numerous BSc and MSc students of the CTU are also involved in the research by direct appointment or through their coursework in HCI-related subjects and thesis projects. This paper provides a short summary and bibliography of the accessibility research at the Czech Technical University in Prague. It also presents an overview of our collaboration with industrial partners related to assistive technologies, as well as our training and networking activities. HCI for People with Vision Difficulties: Our HCI group is an integral part of the Computer Graphics Group [23], which determines the scope of our research. We address first of all the problem of delivering graphic information to visually challenged users. One of our first initiatives was to create a tool for exploring 2D graphic information in a semantic approach. The project was called Blind Information System. The goal of this project
... The ActiveCube of Watanabe et al. [17] has shown that tangible user interfaces are well suited to conveying spatial information to visually impaired and blind people. The project Tangible Newspaper [15] shows that tangible user interfaces are also well suited for obtaining an overview of complex documents such as maps. Boverman et al. [2] did promising research on exploring spatially sonified information with tangible user interfaces. ...
Conference Paper
Full-text available
Before venturing out into unfamiliar areas, most people scope out a map. But for the blind or visually impaired, traditional maps are not accessible. In our previous work, we developed the "Auditory Map", which conveys the location of geographic objects through spatial sonification. Users perceive these objects through the ears of a virtual listener walking through the presented area. Evaluating our system, we observed that the participants had difficulties perceiving the directions of geographic objects accurately. To improve localization, we introduce rotation to the Auditory Map. Rotation is difficult to achieve with traditional input devices such as a mouse or a digitizer tablet. This paper describes a tangible user interface which allows rotating the virtual listener using physical representations of the map and the virtual listener. First evaluation results show that our interaction technique is a promising approach to improving the construction of cognitive maps for visually impaired people.
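The effect of rotating the virtual listener can be sketched in a few lines: the azimuth of a map object relative to the listener's heading determines which ear it is heard in, via simple stereo panning. The function names, coordinate convention, and constant-power pan law below are illustrative assumptions, not taken from the Auditory Map implementation.

```python
# Minimal sketch, under our own assumptions, of listener rotation in a
# spatially sonified map: relative azimuth drives stereo panning.

import math

def relative_azimuth(listener_xy, listener_heading_deg, object_xy):
    """Bearing of the object relative to the listener's heading, in (-180, 180]."""
    dx = object_xy[0] - listener_xy[0]
    dy = object_xy[1] - listener_xy[1]
    bearing = math.degrees(math.atan2(dx, dy))  # 0 deg = straight ahead (north)
    rel = (bearing - listener_heading_deg + 180.0) % 360.0 - 180.0
    return rel if rel != -180.0 else 180.0

def stereo_gains(azimuth_deg):
    """Constant-power pan: map azimuth to (left, right) channel gains."""
    pan = max(-1.0, min(1.0, azimuth_deg / 90.0))  # clamp beyond hard left/right
    angle = (pan + 1.0) * math.pi / 4.0
    return math.cos(angle), math.sin(angle)

# An object due east of the listener is heard to the right...
print(relative_azimuth((0, 0), 0.0, (10, 0)))   # 90.0
# ...but after rotating the tangible map by 90 degrees, it sits straight ahead.
print(relative_azimuth((0, 0), 90.0, (10, 0)))  # 0.0
```

This is why rotation helps: instead of mentally transforming "the church is behind and to the left", the user physically turns the listener token until the object of interest sounds straight ahead.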
Article
In the medical field, there exists a serious problem with regard to communications between hospital staff and foreign patients. According to statistics, many countries worldwide have a low rate of literacy. Illiterate people engaging in multilingual communication face problems. Therefore, this situation requires the provision of support in various ways. Currently, medical translators accompany patients to medical care facilities, and the number of requests for medical translators has been increasing. However, medical translators cannot provide support at all times. Therefore, the medical field has high expectations from information technology. However, a useful system has yet to be developed and introduced in the medical field for practical use. In this chapter, we propose a multilingual communication support system called “M3.” M3 uses parallel texts and voice data to achieve high accuracy in communication between people speaking different languages. The Language Grid provides various parallel texts provided by a multilingual parallel text sharing system and parallel text providers. The proposed system can obtain and share parallel texts using Web services via the Language Grid.
Chapter
An important challenge for interaction designers is to understand factors that shape user experience, and novel approaches are being developed to establish user experience as a particular field of research. The previous attempts to provide a comprehensive theory of user experience have been focused on analyzing sensations and emotions as well as perceptions and behaviors. A holistic view of human experience is still lacking. I argue that a holistic view of the human being is needed to provide the appropriate theoretical foundations for user experience analyses in diverse contexts. In this chapter I introduce a theoretical framework for understanding human experience, and discuss how a holistic view revealing the fundamental human modes of being is contributing to and shaping user experience while people interact with information and communication technologies. Investigating user experience with the help of the framework facilitates interaction designers’ understanding of factors that shape a holistic user experience.
Conference Paper
In the medical field, a serious problem exists with regard to communication between hospital staff and patients. Currently, although a medical translator may accompany a patient to medical care facilities, round-the-clock or emergency support is difficult to provide due to increasing requests. The medical field has high expectations of information technology. We have therefore developed a support system for multilingual medical reception termed M3 and installed it in the Kyoto City Hospital in Japan. However, we found that our system cannot support illiterate people. If an illiterate person and another person speak different languages, it is difficult for the other person to communicate face to face with the illiterate person while explaining the meaning of texts shown on the display of the support system. This problem is specific to multilingual communication and needs to be solved. We have therefore developed a method to support illiterate people engaging in multilingual face-to-face communication. We use a text-to-speech function, operated through a selector switch, to help illiterate people perform operations using a touch screen. We performed an experiment to examine the effect of the proposed method, with the following results. (1) The questionnaire results show that the subjects were able to operate the selector switch easily; we conclude that the method using the selector switch has little effect on the operation of the system. (2) Retrieval time using the text-to-speech function is five times that of normal operation. We need to consider a structure that allows the required information to be retrieved easily when many readings of texts are required.
Conference Paper
Full-text available
Auditory icons add valuable functionality to computer interfaces, particularly when they are parameterized to convey dimensional information. They are difficult to create and manipulate, however, because they usually rely on digital sampling techniques. This paper suggests that new synthesis algorithms, controlled along dimensions of events rather than those of the sounds themselves, may solve this problem. Several algorithms, developed from research on auditory event perception, are described in enough detail here to permit their implementation. They produce a variety of impact, bouncing, breaking, scraping, and machine sounds. By controlling them with attributes of relevant computer events, a wide range of parameterized auditory icons may be created.
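A parameterized auditory icon in the spirit described above can be sketched as a damped sinusoid whose parameters follow event dimensions: frequency standing in for object size, decay rate for material hardness. The specific parameter mappings below are our own illustrative choices, not Gaver's published algorithms.

```python
# Hedged sketch of a parameterized impact sound: smaller objects ring higher,
# harder materials decay faster. Mappings are hypothetical, for illustration.

import math

def impact_samples(size, hardness, sample_rate=8000, duration=0.25):
    """Synthesize one impact as an exponentially decaying sinusoid."""
    freq = 1000.0 / size          # hypothetical size -> pitch mapping (Hz)
    decay = 8.0 * hardness        # hypothetical hardness -> damping (1/s)
    n = int(sample_rate * duration)
    return [
        math.exp(-decay * t) * math.sin(2 * math.pi * freq * t)
        for t in (i / sample_rate for i in range(n))
    ]

big_soft = impact_samples(size=4.0, hardness=0.5)    # low, slowly decaying thud
small_hard = impact_samples(size=0.5, hardness=4.0)  # high, quickly damped click

# The harder icon's envelope dies away faster than the softer one's.
print(abs(small_hard[1000]) < abs(big_soft[1000]))  # True
```

Because the controls are event dimensions rather than raw signal parameters, an interface can vary, say, file size or impact force continuously while the resulting sounds stay perceptually coherent.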
Article
Full-text available
This paper presents our vision of Human Computer Interaction (HCI): "Tangible Bits." Tangible Bits allows users to "grasp & manipulate" bits in the center of users' attention by coupling the bits with everyday physical objects and architectural surfaces. Tangible Bits also enables users to be aware of background bits at the periphery of human perception using ambient display media such as light, sound, airflow, and water movement in an augmented space. The goal of Tangible Bits is to bridge the gaps between both cyberspace and the physical environment, as well as the foreground and background of human activities. This paper describes three key concepts of Tangible Bits: interactive surfaces; the coupling of bits with graspable physical objects; and ambient media for background awareness. We illustrate these concepts with three prototype systems -- the metaDESK, transBOARD and ambientROOM -- to identify underlying research issues. Keywords tangible user interface, ambient media, gras...
Article
There have been many research efforts devoted to tangible user interfaces (TUIs), but it has proven difficult to create a definition or taxonomy that allows us to compare and contrast disparate research efforts, integrate TUIs with conventional interfaces, or suggest design principles for future efforts. To address this problem, we present a taxonomy which uses metaphor and embodiment as its two axes. This 2D space treats tangibility as a spectrum rather than a binary quantity. The further from the origin, the more "tangible" a system is. We show that this spectrum-based taxonomy offers multiple advantages. It unifies previous categorizations and definitions, integrates the notion of "calm computing," reveals a previously unnoticed trend in the field, and suggests design principles appropriate for different areas of the spectrum.
V. Němec, A. J. Sporka, P. Slavík: Haptic and Spatial Audio Based Navigation of Visually Impaired Users in Virtual Environment. In: 8th ERCIM International Workshop "User Interfaces For All", LNCS 3196, pp. 452-459, Springer-Verlag Berlin Heidelberg, Vienna, June 2004.
Z. Míkovec, P. Slavík: Tool for Pictorial Data Transformations. In: Intelligent Multimedia, Computing and Communications: Technologies and Applications of the Future. New York: John Wiley & Sons, 2001, pp. 10-19.
M. Fernström, L. Bannon, E. Brazil: An investigation of soft-button widgets using sound. In: Le Journées de Design Sonore 2004. URL: http://richie.idc.ul.ie/ eoin/research/fernstrom_et_al_jds04.pdf (retrieved 02/01/2005).