Disability and Rehabilitation: Assistive Technology
Journal homepage: https://www.tandfonline.com/loi/iidt20
To cite this article: Yue Lian, De-er Liu & Wei-zhen Ji (2023): Survey and analysis of the current status of research in the field of outdoor navigation for the blind, Disability and Rehabilitation: Assistive Technology, DOI: 10.1080/17483107.2023.2227224
To link to this article: https://doi.org/10.1080/17483107.2023.2227224
Published online: 04 Jul 2023.
ORIGINAL RESEARCH
Survey and analysis of the current status of research in the field of outdoor navigation for the blind
Yue Lian^a, De-er Liu^a and Wei-zhen Ji^b
^a School of Civil Engineering and Mapping and Engineering, Jiangxi University of Technology, Ganzhou, Jiangxi, China; ^b State Key Laboratory of Remote Sensing Science, Beijing Normal University, Beijing, China
ABSTRACT
Purpose: In this article, we comprehensively review the current situation and research on technology related to outdoor travel for blind and visually impaired people (BVIP), given the diverse types and incomplete functionality of navigation aids for the blind. This review aims to provide a reference for related research in the fields of outdoor travel for BVIP and blind navigation.
Materials and methods: We compiled articles related to blind navigation, of which 227 met the search criteria. From this initial set, 179 articles were selected, from a technical point of view, to elaborate on five aspects of blind navigation: system equipment, data sources, guidance algorithms, optimization of related methods, and navigation maps.
Results: The wearable form of assistive device for the blind has received the most research, followed by handheld aids. RGB data based on vision sensors is the most common source of navigation environment information. Object detection based on picture data is also particularly rich among navigation algorithms and associated methods, indicating that computer vision has become an important line of study in the field of blind navigation. However, research on navigation maps is relatively scarce.
Conclusions: In the study and development of assistive equipment for BVIP, there will be an emphasis on prioritizing attributes such as lightness, portability, and efficiency. In light of the upcoming driverless era, the research focus will be on the development of visual sensors and computer vision technologies that can aid in navigation for the blind.
IMPLICATIONS FOR REHABILITATION
Visual deficiency can easily lead blind and visually impaired people (BVIP) to develop psychological disorders.
There are few, if any, devices that meet the outdoor travel needs of BVIP in all respects.
There is no comprehensive summary and overview of the field of outdoor navigation for the blind.
The selection of appropriate assistive devices can help BVIP better understand information about their surroundings and make safer and more effective outdoor trips.
ARTICLE HISTORY
Received 26 July 2022
Revised 14 March 2023
Accepted 13 June 2023
KEYWORDS
Blind; blind navigation;
assistive devices; outdoor;
computer vision
Introduction
The World Health Organization's World Report on Vision (2019) indicates that the number of individuals with visual impairments worldwide exceeds 2.2 billion. China is home to the largest visually impaired population in the world, with its blind population constituting 20% of the global total [1]. In 2019, there were 17.31 million individuals with visual impairments in China, over 8 million of whom were completely blind, and the number is rising by 450,000 annually.
Blind corridors and barrier-free facilities have been constructed in
most outdoor areas, such as streets, parks, and roads to facilitate
travel for BVIP, who mainly use these corridors for outdoor navi-
gation. However, barrier-free facilities exhibit certain issues, such
as inadequate design, obstructions in circulation pathways, and
inadequate follow-up management of circulation pathways [2–6].
At the same time, visual impairment poses a significant cognitive barrier for blind people [7]. This barrier is the source of several uncertainties in the use of assistive devices and, ultimately, makes it harder for them to perceive the surrounding environment, making it very difficult for blind people to get around [8–10]. The problem is difficult to solve in a short period of time [11], which poses a huge challenge for the blind to travel safely. The visually impaired
population relies on their remaining senses to navigate their geo-
graphical surroundings [12]. However, their mobility is often
restricted compared to those without visual impairments [13].
These limitations can result in a narrow social circle and a
secluded lifestyle, ultimately leading to potential psychological
complications [14]. Due to the diversity of assistive devices and
their different functions, the selection of appropriate guide assist-
ive devices can assist BVIP in further exploring the environment
[15], traveling safely, and improving their quality of life.
CONTACT Yue Lian leeyelyly@qq.com School of Civil Engineering and Mapping and Engineering, Jiangxi University of Technology, Ganzhou, Jiangxi, 341000, China
2023 Informa UK Limited, trading as Taylor & Francis Group
Before the development of intelligent assistive devices for the blind, blind people frequently traveled outdoors using simple guide canes (Figure 6). The tip of the canes helps blind individuals detect and avoid obstacles found on the navigational path,
adjusting accordingly to their travel destination. Several research-
ers have studied various aspects of assisting BVIP travel, adding
different sensors, such as ultrasonic, infrared, and visual sensors
on the basis of traditional assistive devices for the blind, driven
by the Internet of Things, Big Data, artificial intelligence, and
other technologies. Furthermore, the sensory modalities employed
by BVIP have been augmented to include not only tactile sensa-
tions but also auditory and mental imagery, thereby enhancing
their ability to navigate the outdoor environment with safety and
confidence.
The BVIP places a greater emphasis on factors, such as dis-
tance and duration when engaging in outdoor navigation [16].
Additionally, the stress of travel and individual experiences can
significantly impact the BVIP’s outdoor travel experience. The find-
ings of Syed Rizal Alfam Wan and Mohamad Noh [17] suggest
that BVIP prioritizes safety when navigating outdoors, specifically
concerning road crossings, obstacles, individuals, and road condi-
tions, such as potholes. Furthermore, BVIP shows a preference for
walking stick-type aids during outdoor trips. Wu [18] conducted
an analysis of the outdoor travel needs of blind individuals, which
revealed that accessibility features, such as pavements, blind corri-
dors, zebra crossings, steps, and accessible ramps require greater
accuracy, comprehensiveness, and real-time geographical informa-
tion for BVIPs compared to the general population.
The guidance of BVIP on outdoor trips involves the use of positioning and object recognition and detection techniques to enable obstacle avoidance and path planning during navigation. The data required for these tasks primarily comprise visual images, three-dimensional point clouds, and infrared data. The related process is shown in Figure 1.
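As a rough illustration of this sense-decide-act process, the sketch below turns per-frame obstacle detections into a single guidance cue. All names here (`Obstacle`, `detect_obstacles`, `plan_step`) and the 2 m safety threshold are hypothetical stand-ins for illustration only, not components of any surveyed system.

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    label: str          # e.g. "pothole", "pedestrian"
    distance_m: float   # estimated distance from the user

def detect_obstacles(frame):
    """Stand-in for a vision-based detector; returns canned detections."""
    return [Obstacle("pothole", 1.5), Obstacle("pedestrian", 6.0)]

def plan_step(obstacles, safe_distance_m=2.0):
    """Convert detections into one audio/haptic guidance cue."""
    blocking = [o for o in obstacles if o.distance_m < safe_distance_m]
    if not blocking:
        return "continue straight"
    nearest = min(blocking, key=lambda o: o.distance_m)
    return f"stop: {nearest.label} at {nearest.distance_m} m"

print(plan_step(detect_obstacles(frame=None)))  # -> stop: pothole at 1.5 m
```

A real system would replace `detect_obstacles` with a model running over camera, point-cloud, or infrared input, and route the cue to a speech or vibration interface.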
Mack et al. [19] conducted a comprehensive review of accessi-
bility-related articles published in the CHI and ASSETS databases
from 1994 to 2019. The authors found that more than 43% of the
articles in the last decade were focused on issues related to blind
and visually impaired people (BVIP). These studies can be grouped
into three categories: system component/application technology, navigation mode/device, and navigation task. System component/application technology studies generally focus on the structure of navigation systems for the blind, as well as the techniques and
[20], assistive technology for BVIP can include various elements,
such as technology, devices, equipment, services, systems, proc-
esses, and environmental transformations, which can assist blind
individuals in engaging in production and living independently.
Kuriakose et al. [21] provide an overview of the current state of
navigation systems designed for visually impaired individuals and
classify these systems based on underlying technologies.
Fernandes et al. [22] describe navigation and localization technol-
ogies for BVIP and modify conventional navigation systems to
address the needs and limitations of visually impaired users.
Navigation mode/device research primarily focuses on the meth-
ods and aids used by BVIP to navigate outdoor environments. For
instance, Zhu Xiang Cheng categorizes blind navigation mode
into four groups, including guide dogs, handheld, wearable, and
mobile guide aids [11], while Wu et al. [23] divides guide robots
into five categories, including mobile guide robots, intelligent ter-
minal-based guide systems, wearable guide devices, handheld
guide instruments, and intelligent guide canes. Navigation aids
can be divided into two groups based on traditional and new
types, according to Cheng [24], while Wang et al. [25] classify
pathfinding aids into four groups, including electronic conducting
aids, mobile guide aids, intelligent canes, and intelligent guiding
systems. Dakopoulos et al. [26] classify navigation electronics into
Electronic Travel Aids (ETAs), Electronic Orientation Aids (EOAs),
and Position Locating Devices (PLDs). Tapu et al. [27] evaluate the
advantages and limitations of each technique in outdoor naviga-
tion systems. Navigation task research is arranged according to
the purpose of BVIP travel, the function of the system, or the pro-
cess of blind navigation. For example, Anandan et al. [28] categor-
ize navigation system tasks into three groups, including
environmental mapping, travel planning, and real-time navigation,
and navigation activities into five groups, including walking assis-
tants, travel aids, visual alternative navigation systems, navigation
systems, and aided navigation systems. El-Taher et al. [29] decom-
pose the navigation area into navigation phases and navigation
tasks, providing a systematic review of methodological, data set,
and task limitations in the literature.
Several researchers have conducted literature reviews on the
topic of BVIP navigation. Khan et al. [30] undertook a historical
analysis of 191 relevant articles, proposing an improved procedure
for reducing fatalities and serious injuries among BVIP. Horton
et al. [31] retrieved and systematically reviewed 36 articles on
hardware platforms, applications, and user testing of assistive
devices from the EBSCO literature database. In a study of 36
articles about the White Cane, Khan et al. [32] provided a specific
review of this widely used mobility aid for BVIP, while Prandi
et al. [33] analyzed 111 articles on barrier-free wayfinding and
navigation from various perspectives, including usage back-
ground, audience, technology, data types, and evaluation, to pro-
vide insights for designing barrier-free environments in the future.
All of these studies have made significant contributions to the field of blind navigation, but few reports have systematically summarized and sorted them out. In the article, the authors
[29] point out the limitations of review articles related to blind
navigation: (1) Incomplete collection of literature; (2) No discus-
sion of the use of relevant algorithms; (3) No comparison between
systems, algorithms used, and methods. Our study presents a classification approach for blind navigation research. This method is based on three primary categories, namely, system component/application technology, navigation mode/device, and navigation task. We further classify these categories into five subcategories,
which include guide aids, sources, and types of environmental
information data for the aids, algorithms used in the guiding pro-
cess, methods of guiding, and navigation maps used directly by
the BVIP. We apply this methodology to collect research articles
related to blind navigation research, which enables a more sys-
tematic and comprehensive analysis of the literature in this field.
Figure 1. Overview of the navigation system process for the blind.
2 L. YUE ET AL.
The collected articles were systematically arranged to provide a comprehensive overview and evaluation of the current state of
research on five distinct topics, with a particular focus on their
technical aspects. This serves as a crucial aid in expediting the
comprehension of industry advancements among fellow research-
ers in related fields, by highlighting the strengths and limitations
of the content. Moreover, this paper presents the research gaps
and key scientific challenges in the field of blind navigation, and
provides an outlook on the future prospects based on the current
era of cutting-edge technologies.
Methods
This study uses a systematic review approach [34] and draws on
the structure of Fernando N’s article [35] to collate the data sup-
port needed to conduct an analysis of the current state of
research in the field of outdoor navigation for the blind. A sys-
tematic literature review is a process of identifying, evaluating,
and interpreting all research relevant to a research question and
requires the researcher to summarize all available information
relevant to that research in a thorough and unbiased manner.
Acquisition of data
Determination of data source
The articles analyzed in this study were sourced from diverse aca-
demic disciplines. To obtain the relevant literature, a systematic
search was conducted across eight prominent journal databases,
including Web of Science, Science Direct, Engineering Village, IEEE
Xplore, Springer Link, CNKI, WANFANG Data, and VIP Journal,
selected from the university library based on their scope and
coverage of outdoor blind navigation literature.
Search of data
The keywords “blind navigation”, “outdoor”, and “guide” were
used in the Chinese journal database, and the keywords
“guidance”, “blind people”, “outdoor”, “blind”, “blind navigation”,
and “assist” were used in the English journal database to search
by the combination of title, abstract and keywords according to
the research content and scope.
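Purely as an illustration, the keyword combinations above can be assembled into fielded boolean query strings. The `build_query` helper below is a hypothetical sketch; each of the eight databases actually uses its own field syntax, so these are not the literal queries that were run.

```python
# Search terms taken from the text; the fielded query syntax is a
# generic illustration, not any specific database's syntax.
ENGLISH_TERMS = ["guidance", "blind people", "outdoor", "blind",
                 "blind navigation", "assist"]
CHINESE_TERMS = ["blind navigation", "outdoor", "guide"]  # translated

def build_query(terms, fields=("title", "abstract", "keywords")):
    """Combine quoted terms with OR across the given search fields."""
    quoted = " OR ".join(f'"{t}"' for t in terms)
    return " OR ".join(f"{field}:({quoted})" for field in fields)

print(build_query(CHINESE_TERMS))
```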
Selection of content
Inclusion criteria. The collected articles include the following content or conditions: (1) Studies related to blind navigation outdoors; (2) Relevant studies that are applicable both indoors and outdoors; (3) Research and design of navigation systems and devices for the blind; (4) Development of algorithms related to navigation processes; (5) Design of methods related to navigation processes; (6) Measurement related to blind navigation; (7) Concepts related to blind navigation; (8) Publication before and including 2021. The above are included respectively in each part of the navigation system process for the blind.
Exclusion criteria. Based on the research content of this paper, articles meeting any of the following conditions were excluded from our collection: (1) Reviews related to blind navigation; (2) Studies related to blind navigation indoors; (3) Aids for the blind that are not related to travel; (4) Research reports from the experimental program phase of blind navigation.
Collation of the literature
The articles obtained from the various journal databases were collated as follows:
Culling. (1) Remove repetitive articles; (2) Remove articles that can be searched but not viewed or downloaded; (3) Eliminate articles of poor quality.
Collating. Record the basic information of the collected articles, including (1) Title; (2) Abstract; (3) Publication date; (4) Reference citation; (5) Brief research content and research purpose of the article.
Results. Based on the above collation, a total of 227 articles were
obtained that matched the content of the study, and the specific
flow is shown in Figure 2.
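The culling and collating steps above can be sketched as a small routine. Deduplicating on a normalized title and using a `full_text_available` flag are our own assumptions about how duplicate and inaccessible articles might be identified; the record fields are the five listed in the text.

```python
def collate(articles):
    """Cull duplicates and inaccessible articles, then record basic
    information (title, abstract, date, citation, summary)."""
    seen, records = set(), []
    for art in articles:
        key = " ".join(art["title"].lower().split())  # normalized title
        if key in seen:                      # (1) remove repetitive articles
            continue
        if not art.get("full_text_available", True):
            continue                         # (2) drop un-viewable articles
        seen.add(key)
        records.append({f: art.get(f, "") for f in
                        ("title", "abstract", "date", "citation", "summary")})
    return records

demo = [
    {"title": "Smart Cane", "date": "2019"},
    {"title": "smart  cane", "date": "2019"},             # duplicate
    {"title": "Guide Robot", "full_text_available": False},
]
print(len(collate(demo)))  # -> 1
```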
The statistical graph of the literature obtained from the previ-
ous collection is shown in Figure 3. The data indicates a steady
rise in the number of literature available across various journal
databases from 2000 to 2021. Notably, there is a visible shift
from a lower number of relevant literature collections in the ear-
lier decade (before 2010) to a significant increase in the latter
decade. This trend reflects the growing concern for safe outdoor
travel of BVIP, which necessitates an examination and analysis of
the present research status in the area of outdoor blind naviga-
tion, in line with the general trend of social development.
Strategies for data classification
The collected literature is divided into categories as shown in Figure 4 below. The navigation systems and equipment component accounts for a relatively large share: 46% of the domestic (Chinese) literature and 55% of the literature from other countries.
The collected literature data are analyzed and classified based
on the article as a whole, and the current research status of out-
door blind navigation is analyzed from various perspectives, fol-
lowed by a summary and outlook based on its development
trend. Accordingly, 179 articles were chosen from the collected
articles and analyzed from the five topics listed below:
Topic 1: Navigation system devices for the blind. This topic is
based on the direct needs of the users, who are mostly in need
of hardware devices or software installed on their cell phones or
portable computers to carry them around when walking outdoors.
Figure 2. Flow chart of literature processing.
A SURVEY AND ANALYSIS OF NAVIGATION FOR THE BLIND 3
The literature referenced in this section is the blind navigation
systems and devices category in Figure 4.
Topic 2: Sources and types of environmental information data
for blind navigation. The data used by the user to sense and
obtain information about the surrounding environment is in the
form of infrared, point cloud, RGB, RGBD, and RFID. The division
of this part is based on information about the device used by the
user, and the literature referenced is consistent with the device
class of navigation systems for the blind.
Topic 3: Algorithmic aspects of blind navigation. In the design of assistive devices for the blind, various stages of the navigation system process, such as localization, detection and recognition, path planning, and obstacle avoidance, require the use of algorithms to ensure the accurate characterization of the perceived surroundings. The reference literature is the algorithms related to blind navigation in Figure 4.
Topic 4: Optimization of navigation methods for the blind. This
category is based on the navigation system process for the blind,
where higher accuracy is achieved by changing methods or strat-
egies in each part of the process. The reference literature is the
method optimization category in Figure 4.
Topic 5: Navigation Map for the Blind. The topic is proposed
as an alternative way for blind people to travel or to travel by
transforming senses, such as hearing and touch into a mental
map. The literature referenced in this section is the navigation
map section for the blind in Figure 4.
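The assignment of articles to the five topics could be approximated by keyword tagging, as in the sketch below. The keyword lists are illustrative guesses of our own, not the authors' actual coding scheme.

```python
# Hypothetical keyword lists for the five review topics.
TOPIC_KEYWORDS = {
    "devices":    ["wearable", "cane", "glasses", "robot", "helmet"],
    "data":       ["rgb", "rgbd", "infrared", "point cloud", "rfid"],
    "algorithms": ["detection", "recognition", "localization", "path planning"],
    "methods":    ["optimization", "strategy", "accuracy"],
    "maps":       ["map", "mental map"],
}

def tag_topics(abstract: str):
    """Return every topic whose keywords appear in the abstract."""
    text = abstract.lower()
    return [topic for topic, words in TOPIC_KEYWORDS.items()
            if any(w in text for w in words)]

print(tag_topics("A wearable RGBD system for obstacle detection"))
# -> ['devices', 'data', 'algorithms']
```

An article can carry several tags, which mirrors how a single system paper may contribute to the device, data, and algorithm discussions at once.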
Results
Navigation system devices for the blind
The literature related to navigation systems and devices for the blind was screened and organized into a total of 123 articles, accounting for 68% of the selected literature. This section is mainly categorized and discussed by type of equipment aid.
According to scholarly literature, the development of assistive
devices for the visually impaired can be traced back to the intro-
duction of the first guided electronic travel aid (ETA) in 1897.
From the mid-1960s to the 1990s, assistive devices [36] relied heavily on sensor arrays, such as ultrasonic, laser, and infrared, to detect and alert users to obstacles [37]. During the late 1990s to the early 2000s, CCD cameras and human-related sensor arrays were utilized to provide users with more comprehensive environmental information in the form of "displays" to aid in navigation. In recent years, cameras, LIDAR, point clouds, and radio frequency identification (RFID) have become the primary technologies employed in developing assistive devices for the visually impaired. These devices facilitate human-machine interaction and enhance the ease with which blind individuals can travel [25].
Figure 3. Yearly statistics of the number of journals in each journal inclusion database.
Figure 4. Percentage of domestic and foreign outdoor navigation references by category. (a) Percentage of domestic outdoor navigation references by category. (b) Percentage of foreign outdoor navigation references by category.
Many assistive devices for BVIP utilize sensors to detect
obstacles and guide them. These devices acquire and process
image or signal information, and convey this information through
tactile and auditory means, such as canes, belts, glasses, and hel-
mets. The design of such devices must take into account factors,
such as weight and comfort. However, due to the need for com-
plex functions on small objects, traditional assistive devices may
be limited in their effectiveness.
Analysis of data
For assistive devices for the blind, different classification bases
lead to different classification results. Chen et al. distinguished
between long-distance and near-distance guide aids [36], while
Zhu et al. [11] classified guide aids into four categories based on their mode of assistance. Additionally, Cheng [24] differentiated
between traditional and new types of assistive devices for the
blind based on their degree of novelty. In this paper, we present
a classification of assistive devices for visually impaired individuals
based on the way the device is used, namely, wearable aids,
handheld aids, and mobile aids. Based on statistical data gathered
from the research literature, as depicted in Figure 5, the category
of wearable aids has the highest number of articles at 55, fol-
lowed by handheld aids at 42, and mobile aids at 12.
Furthermore, there were 14 articles concerning other types of aids. It is worth noting that more has been written about wearable assistive devices, which are relatively diverse and mostly available in the form of composite wearables, while handheld aids primarily consist of rods and canes. Mobile aids, on the other hand,
are not as prevalent. The remaining category, comprising naviga-
tors and obstacle avoidance devices, is not classifiable as either
wearable, handheld, or mobile. These devices are primarily cre-
ated in China to aid the visually impaired in their travels.
Wearable aids
Numerous academics have conducted research on the topic of
wearable assistive devices. For instance, Lakshmanan et al. [38]
carried out a comprehensive analysis of wearable devices that
facilitate navigation for BVIP in outdoor settings. Various wearable
guide aids have been identified in the literature, including but
not limited to glasses, shoes, gloves, belts, helmets, bracelets,
undershirts, rings, headphones, and composite types. The relevant
types are introduced as shown in Table 1.
Handheld aids
Most of the handheld aids are canes, which are instruments used
by BVIP to extend the tactile sensation of the arms to understand
the surrounding ground environment. The National Standard of the People's Republic of China specifies that the blind cane comprises four essential components: wrist strap, handle, cane body, and cane tip [92], as depicted in Figure 6.
According to research, the cane has undergone several
changes in design, from the white cane in France after World War
I to the Hoover cane after World War II, and then to the straight
section and folding cane. In China, it has become a popular hand-
held guide aid due to its convenience and affordability. However,
the cane has limitations in detecting objects within a limited
radius of 1 m, and is unable to detect obstacles at the same
height or further away [15]. Additionally, it cannot detect floating
objects, such as strips or branches [93], which can potentially
endanger the safety of the blind person and hinder their effective
travel. To address these limitations, electronic smart guide canes
have been developed, comprising a cane head, control box,
cane body, and cane tip (chip reading part). The shape and func-
tion of the cane and rod aids for the blind are similar, with the
cane body serving as the primary component to connect corre-
sponding sensing equipment to aid the visually impaired in their
travels [15].
Within the corpus of collected articles, the earliest inquiry into
the realm of stick-related research dates back to 2014 with the
publication of the Location Guide DV [94]. This work employs
RFID technology to enable users to locate themselves.
Figure 5. Statistical chart of the classification of guide aids.
In addition, studies have investigated the incorporation of geomagnetic sensors into the sticks to facilitate navigation [95], as well as the
inclusion of sensors, such as ultrasonic waves, to provide guid-
ance for obstacle avoidance [96–98]. In 2019, Nav Cane electronic
aids were introduced [99], which were found to possess superior
navigation capabilities compared to traditional white canes.
Furthermore, certain scholars have pursued real-time video
processing through cloud video API to effectuate object detection
and obstacle avoidance while on the road [100].
In research taking the white cane as its subject, the earliest approach guides BVIP along the navigational path using RFID technology [101–105]. Furthermore,
additional sensors have been added to the white cane to enhance
its functionality, allowing for obstacle avoidance [106–113].
Table 1. Introduction to various types of assistive devices.

Navigation glasses for the blind
Introduction: A guide detection device that transmits ultrasonic waves and then receives them through an integrated circuit on the eyeglass frame and lenses for obstacle determination and distance measurement.
Specific studies: Smart glasses system based on vision and haptics [39]; bionic glasses [40,41]; 3D glasses [42]; intelligent navigation glasses [43–46]; smart glasses using deep learning and stereo cameras [47]; commercial goggle systems [48]; assistive navigation glasses based on vision and audition [49]; assistive smart glasses based on a vision API [50].
Advantages: All electronic circuits are integrated; small size, good performance, and quick response.
Disadvantages: Process is prone to failure; material and control mechanism are not sound; prone to some hazards during use.

Navigation helmet for the blind
Introduction: In the form of a helmet, it provides face recognition, object recognition, path planning, and obstacle avoidance for BVIP using IoT technology.
Specific studies: Small embedded interactive system [51]; head-mounted electronic eye device [52]; Smart Cap [53]; equipment designed as a hat [54]; haptic feedback of the rap head system [55]; GSM innovative helmet system [56].
Advantages: Closer to the eyes, more convenient for perceiving surrounding information.
Disadvantages: The degree of portability directly affects the user's comfort; wearing is not convenient and neat, affecting the overall dress.

Navigation shoes for the blind
Introduction: Guiding devices worn on the foot, mostly using ultrasonic and infrared sensors for obstacle avoidance detection.
Specific studies: Identify obstacles with a microcontroller [57]; tactile display mounted in the foot [58].
Advantages: Less binding; contains the functions of ordinary shoes.
Disadvantages: Short use cycle; the degree of lightness directly affects the comfort of the user, and the long-term business value and user experience are not high.

Navigation gloves for the blind
Introduction: Hand-worn guide devices.
Specific studies: Feedback gloves based on skin touch [59]; detect and avoid obstacles [60]; RFID-based auxiliary gloves [61].
Advantages: Portable.
Disadvantages: Binds the hands and affects daily behavior.

Navigation bracelet for the blind
Introduction: Hand-worn guide devices.
Specific studies: Arduino bracelet with integrated vibration motor [62]; smart bracelet [63]; vibro-tactile ring [64].
Advantages: Portable.
Disadvantages: Single function.

Other
Introduction: Other wearable guide devices.
Specific studies: Navigation backpacks for the blind [65]; finger tactile vibration ring [64]; ultrasonic assisted headset [66]; assistive device type not declared [67–91].
[Figure 6 diagram labels: L, the length of the blind cane; Lt, length of the blind cane body; L0, a wide band with red stripes; D, diameter of the cane body; Lt/3, a marked dimension on the cane body; labeled parts: cane tip, cane body, handle, wristband.]
Figure 6. Schematic diagram of the structure of the blind cane.
Mobile phones have also been integrated with the white cane to
enable navigation and obstacle avoidance [114–119]. Additionally,
the use of vision sensors and computationally capable equipment
has facilitated object detection and subsequent obstacle avoid-
ance [120–127]. Other advanced technologies, such as the
Internet of Things [128–130], echo location [131], and virtual real-
ity [132] have also been utilized to assist with navigation and obs-
tacle avoidance for BVIP.
Mobile aids
Mobile assistive devices for the blind are classified as devices that
can move autonomously, such as guide robots, guide cars, and
electronic guide dogs, which can guide BVIP forward and commu-
nicate with them. After analyzing the articles, it can be observed
that guided robots and guided vehicles are the two primary cate-
gories identified. In particular, Kee et al. [133] have devised an
autonomous robot based on PIC technology that can aid visually
impaired individuals in avoiding obstacles through vocal instruc-
tions. Additionally, a novel social mobile robot has been proposed
in another article [134] that enables visually impaired individuals
to move more effortlessly, thanks to its humanized design. Finally,
the use of a fuzzy controller has been explored by the authors of
yet another article to mitigate steering angle errors and improve
the guidance accuracy of the robot [135]. Researchers affixed
CMOS image sensors onto a robot equipped with wheels, which
utilized the sensors to detect markings on the road and provide
guidance to individuals with visual impairments [136]. Bao et al.
[137] proposed two ways to guide BVIP to their destination by combining GPS navigation with local pedestrian lane tracking for coarse-grained and fine-grained navigation, respectively. Megalingam
et al. [138] introduced an intelligent and efficient path-guiding
robot that can replace guide dogs to help BVIP outdoor travel.
Finally, Ming et al. [139] developed a four-wheeled robot
equipped with a deep learning-based navigation system, specific-
ally utilizing Faster RCNN, which has demonstrated promising
results in practical application.
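The coarse-grained GPS stage of such systems reduces to computing how far, and in which compass direction, the destination lies from the current fix. The sketch below is our own illustration (not code from [137]), using the standard haversine and initial-bearing formulas:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS-84 fixes."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def initial_bearing_deg(lat1, lon1, lat2, lon2):
    """Compass bearing (0-360 degrees) from the current fix to the destination."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
```

A guidance loop would announce the bearing relative to the user's heading and hand over to fine-grained lane tracking near the destination.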
In the realm of safe navigation techniques for the visually
impaired, various guided vehicle type aids have been developed
by scholars. Specifically, Kammath et al.’s design of a smart elec-
tric vehicle is equipped with the ability to comply with traffic sig-
nals [140]. In addition, Hosseini et al. proposed a guided bicycle [141], and Jiménez et al. designed a motion device with a controller [142]. All of these aids facilitate BVIP travel by providing guidance through the use of technology.
Others
In addition to the above types of assistive devices for the blind,
there are others in the form of instruments, such as navigators
[143–147], obstacle avoidance devices [148], detectors [149], etc.
In the collated articles, seven of the reviewed articles [28,150–155]
did not explicitly specify the type of assistive technology that was
evaluated. It is worth noting that the device analyzed in article
[155] has a relatively light weight, which could be addressed by
implementing integration into a microchip, thereby reducing the
device’s size, weight, and cost.
Analysis of results
According to the data analysis above and as shown in Table 2 and Figure 5, it is evident that BVIP exhibit a strong inclination towards guide aids characterized by lightweight construction, portability, precision, and longevity.
Numerous types of guide systems and aids possess diverse functionalities that provide BVIP with efficacious navigation information. Nonetheless, their widespread and efficacious application for
BVIP travel remains hindered by certain limitations. For instance,
some devices are bulky and expensive, while others offer only a
single function. Inadequate detection precision, low recognition
rates, and substandard positioning outcomes further impede the
devices’ usability. At the same time, insufficient user feedback has led to suboptimal usability, indicating a lack of systematic evaluation. How to enable BVIP to travel outdoors as independently as sighted people is a problem that relevant researchers need to consider.
Based on temporal evidence, the investigation into supporting
equipment has progressed from RFID positioning, ultrasonic dis-
tance measurement, to object detection and identification, remote
control, and other related functions. This gradual expansion has
enabled the realization of multifaceted features, which more
effectively cater to the demands of users engaging in outdoor
excursions while also ensuring the safety of BVIP outdoor travel.
Sources and types of environmental information data for blind
navigation
Data analysis
In any navigation system or device for the blind, different types
of sensors are needed to sense environmental changes, and cue
the BVIP to the destination by planning the traveling path. These
types of sensors include infrared sensors, vision sensors, ultrasonic
sensors, and other types; thus, the following types of data can be
acquired: ultrasonic data, point cloud data, infrared data, RGB,
RGBD data, and RFID tag data. The specific statistics are shown in
the following bar statistics (Figure 7).
Ultrasonic data is obtained by carrying ultrasonic sensors on
the assistive devices for the blind. The sensor determines the dis-
tance of an obstacle in its path by measuring the time interval
between the transmission of an ultrasonic signal and the recep-
tion of the signal’s reflection upon encountering the obstacle.
This process enables BVIP to avoid obstacles and safely reach their
Table 2. Comparison of advantages and disadvantages of different types of assistive devices.

Wearable devices
Advantages: hands free; multi-functional; rich in expressing information.
Disadvantages: high restraint; larger volume; heavier weight; user comfort is not strong.

Handheld assistive devices
Advantages: lightweight and easy to carry; acceptable price; widely used.
Disadvantages: relatively simple function; research and development limited by weight and other aspects; obstacle-avoidance effect is not strong.

Mobile devices
Advantages: no weight burden on the user; safer travel; can support the infirm.
Disadvantages: higher construction cost; difficult to use in complex terrain.
A SURVEY AND ANALYSIS OF NAVIGATION FOR THE BLIND 7
destination. However, the ultrasonic data cannot detect the exact
location of obstacles or identify the obstacle type. In 2007, the authors of [68] utilized three pairs of ultrasonic sensors to identify obstacles; the sensors transmitted the obstacle data to a microprocessor, which analyzed the data and generated sound signals to alert the user. Other researchers have also
employed ultrasonic sensors to detect obstacles and avoid them
[45,66,96–98,108,112,115,117,122,128,143,144,146]. Muneshwara
et al. [54] detected the obstacle’s position using ultrasonic signals.
Aarthi et al. [56] created a cueing device using ultrasonic sensors
and GSM to identify obstacles, but they did not verify the accur-
acy of obstacle detection.
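The time-of-flight relation underlying all of these devices is simple to state in code. The sketch below is our own minimal illustration (function names and the 1.5 m warning threshold are our choices): the measured echo time covers the round trip, so the obstacle distance is half the product of the echo time and the speed of sound (about 343 m/s in air at 20 °C).

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees Celsius

def echo_to_distance(echo_time_s: float) -> float:
    """Convert a round-trip ultrasonic echo time to obstacle distance in meters."""
    # The pulse travels to the obstacle and back, hence the division by two.
    return SPEED_OF_SOUND * echo_time_s / 2.0

def should_alert(echo_time_s: float, threshold_m: float = 1.5) -> bool:
    """Trigger a cue (sound or vibration) when an obstacle is within range."""
    return echo_to_distance(echo_time_s) < threshold_m
```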
The acquisition of infrared data is facilitated by the use of an
infrared sensor. The sensor operates by emitting signals and
receiving reflected signals upon encountering obstructions,
thereby acquiring pertinent information about the environment
being sensed. Scholars have proposed approaches for avoiding
obstacles through the installation of two infrared sensors on
robots [133]. Furthermore, Dhod et al. [109] have introduced a
novel technique that incorporates two infrared sensors with vary-
ing reference positions on assistive devices, enabling the detec-
tion of obstacles at different positions and enhancing the
precision of detection.
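The idea of mounting two infrared sensors at different reference positions [109,133] can be sketched as a simple fusion rule. The category labels below are hypothetical and purely illustrative, not taken from the cited designs:

```python
def classify_obstacle(low_ir_triggered: bool, high_ir_triggered: bool) -> str:
    """Combine two infrared detectors mounted at different heights into a
    coarse obstacle category (hypothetical mapping, for illustration only)."""
    if low_ir_triggered and high_ir_triggered:
        return "wall-like obstacle"          # blocks both beams
    if low_ir_triggered:
        return "low obstacle (e.g., curb)"   # only the lower beam reflects
    if high_ir_triggered:
        return "overhanging obstacle"        # only the upper beam reflects
    return "path clear"
```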
RFID is used to identify the pre-installed electronic tags
through radio waves with a read-write to carry out the transmis-
sion of data, thus guiding the BVIP to travel safely. A data man-
agement system is needed to manage this tag data and thus
provide BVIP with accurate road information. In several academic
papers [72,94,99,103,105,114,156], the authors suggest attaching
electronic RFID tags to guide tiles or affixing them to building
paths as a means of assisting visually impaired pedestrians in nav-
igating safely on blind paths. Subsequently, researchers have pro-
posed combining cell phone inertial navigation with RFID
technology to enable accurate positioning, path planning, and
navigation [78]. In addition, RFID technology has also been used
to correct errors in GPS positioning [101,104] as a complement to
navigation landmark marking [107].
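At its core, the tag-management layer described above is a lookup from tag ID to stored road information. A minimal hypothetical sketch (the tag IDs, location names, and cues are invented for illustration):

```python
# Hypothetical tag database: tag ID -> (location name, spoken cue).
TAG_DB = {
    "3000E200": ("Main St crossing", "Crosswalk ahead, press the signal button"),
    "3000E201": ("Bus stop 12", "Bus stop on your right"),
}

def handle_tag_read(tag_id: str) -> str:
    """Return the voice cue for a scanned tag, or a fallback for unknown tags."""
    entry = TAG_DB.get(tag_id)
    if entry is None:
        return "Unknown tag, continue on the tactile paving"
    location, cue = entry
    return f"{location}: {cue}"
```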
The optical and imaging devices in vision sensors, namely cam-
eras, are utilized to gather external environmental image informa-
tion in the form of RGB and RGBD data. This data aids in
obtaining a more precise representation of obstacle categories by
way of figurative objects. At this stage, vision sensors are divided
into monocular, binocular, trinocular, and circumferential cameras
by the number of lenses, which are important components of
computer vision. This type of data source is mostly used to deter-
mine the obstacle information in certain direction through rele-
vant object detection algorithms and models. For example, In the
initial investigation conducted in 2005, Vel
azquez et al. [39]
affixed two miniature cameras onto the glasses’ frame to identify
Figure 7. Classification of navigation aids based on different environmental information data sources.
Table 3. Navigation aids combining multi-source navigation environment information data (RGB, ultrasonic, RFID, infrared, point cloud, and RGBD), listed by year and reference: 2001 [91]; 2003 [157]; 2005 [141]; 2009 [150]; 2012 [102]; 2013 [158,140]; 2014 [73,159,121]; 2015 [74]; 2016 [82,131,43,123]; 2017 [118,152]; 2018 [160]; 2019 [154,138]; 2020 [60,153,126,57,129]; 2021 [155,46].
obstructions in the walking path; the captured data was used to establish the obstacle’s location in the scene and transfer it to a map. A computational device then processed the gathered information, which was conveyed to the BVIP through either a tactile or auditory mode. Son
et al. [48] presented a straightforward network that can detect
crosswalks in real-time to address the issue of BVIP passing
through crosswalks. Their approach yielded a detection rate of
91% on a set of 10 video sequences, with a training set compris-
ing 12,450 images and a test set containing 791 images.
Furthermore, the authors employed RGBD sensors to estimate the
distance of obstacles from the downgrade, thereby aiding BVIP in
obstacle avoidance [79,127]. In article [79], the authors propose a method aimed at enhancing the precision of object detection: the detected ground area is first excluded from the depth data, the object’s contour is then obtained, and finally the detection box is merged with the contour.
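A simplified sketch of this box-refinement idea, with our own variable names and thresholds: ground pixels are masked out of the depth map, and the detection box is shrunk to the remaining near-range pixels. The published method merges the box with the object contour; here the contour is approximated by the bounding region of the surviving pixels.

```python
def refine_box(depth, ground_mask, det_box, max_range=3.0):
    """Tighten a detector box using depth data, in the spirit of [79]:
    drop pixels flagged as ground, keep near-range pixels, and shrink the
    box to the region that survives. depth and ground_mask are row-major
    2D lists; det_box is (x0, y0, x1, y1) in pixel coordinates."""
    x0, y0, x1, y1 = det_box
    kept = [
        (x, y)
        for y in range(y0, y1)
        for x in range(x0, x1)
        if not ground_mask[y][x] and 0.0 < depth[y][x] < max_range
    ]
    if not kept:
        return det_box  # nothing survives: fall back to the original box
    xs = [x for x, _ in kept]
    ys = [y for _, y in kept]
    return (min(xs), min(ys), max(xs) + 1, max(ys) + 1)
```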
Point cloud data refers to data acquired by emitting laser light
onto an object through the use of LIDAR or 3D laser scanning
equipment, followed by processing of the resulting reflections.
Within this field, Kaiser et al. [69] utilized an inertial measurement
unit (IMU) and a laser scanner to obtain sensor data for
Simultaneous Localization and Mapping (SLAM), thereby con-
structing an environmental map and selecting a safe and efficient
route for destination navigation. In a similar vein, Bai et al. [79]
relied on LIDAR measurements to determine the distance of
objects.
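A planar laser scan of this kind is typically delivered as a list of (angle, range) returns. A minimal sketch (our own illustration, not code from [69] or [79]) of extracting the nearest obstacle and converting the returns to Cartesian points in the sensor frame:

```python
import math

def nearest_obstacle(scan):
    """Given (angle_rad, range_m) pairs from a planar laser scan, return the
    bearing in degrees and the distance of the closest return."""
    angle, dist = min(scan, key=lambda p: p[1])
    return math.degrees(angle), dist

def to_cartesian(scan):
    """Convert polar returns to (x, y) points in the sensor frame, as used
    when feeding scan data into a SLAM front end."""
    return [(r * math.cos(a), r * math.sin(a)) for a, r in scan]
```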
In the published related papers, most of the navigation devices
are designed using multiple sensors for combined use, as outlined
in Table 3. The acquired information is from various sources, with
RGB and ultrasonic sensors being the most commonly used,
whereas the RGBD sensor category is the least frequently used.
This outcome is similar to that of navigation devices that solely
rely on single-source environmental information.
Results analysis
According to the survey data (Figure 7), the largest number of
guiding aids in the relevant studies used ultrasound to obtain
data, and the smallest number used depth vision sensors to
obtain information. The specific information about the sources
and types of environmental information data for blind navigation
is shown in Table 4 below, and the similarities and differences
between these data sources are described in detail.
According to the analysis, ultrasonic data sources are the most frequently used for navigating BVIP, accounting for 24.19% of the aids. However, ultrasonic sensors were also the first to appear, and the accuracy of their data is vulnerable to noise. Infrared and RFID data sources have developed over similarly long periods. However, RFID technology suffers from suboptimal positioning accuracy and high installation cost, which restrict its widespread usage. On the
other hand, the environment required by infrared sensors is not
suitable for BVIP travel, especially outdoor activities, resulting in
fewer outdoor-related studies. The numbers of studies involving point cloud and RGBD data are similar, with point cloud data exhibiting superior positioning accuracy. However, the
cost of obtaining point cloud data is relatively high, exceeding
the budget of most BVIP groups, which leads to a dearth of rele-
vant studies. However, it is reasonable to believe that with the
social and economic development, the living standard of the
majority of people continues to improve, related research will
become abundant. In the field of visual data processing, it is note-
worthy that while RGB sensors are relatively cheaper than RGBD
sensors, they lack the ability to capture depth information, which
indicates the distance between the camera and objects.
Nevertheless, the processing technologies associated with RGB
sensors have undergone significant development, particularly in
the case of monocular cameras. These cameras are capable of
estimating distance information through the application of the
Table 4. Comparison between different data sources.

Ultrasound
Data source: ultrasonic sensors.
Starting time: the definition of ultrasound was first proposed in 1922.
Data form: ultrasonic signals.
Positioning accuracy: 0.5 m.
Function: detect the distance and azimuth of an obstacle from the sensor, and calculate the user’s forward speed.
Advantages: high frequency, short wavelength, little diffraction, good directionality.
Disadvantages: trigonometric errors exist; susceptible to noise.

RGB
Data source: vision sensors.
Starting time: deep learning was proposed in 2006.
Data form: three-channel image.
Positioning accuracy: 0.1–0.5 m.
Function: detect and identify obstacles.
Advantages: moderate price; mature technology.
Disadvantages: cannot obtain 3D information; affected by ambient light and weather.

Point cloud
Data source: LIDAR, binocular camera, 3D scanner.
Starting time: 1990s.
Data form: laser returns.
Positioning accuracy: centimeter level.
Function: distance measurement, positioning, obstacle detection.
Advantages: high precision, small size, wide measurement range, good robustness.
Disadvantages: high cost.

RGBD
Data source: depth vision sensors.
Starting time: Kinect v1 was first announced in 2009.
Data form: four-channel image.
Positioning accuracy: 0.1–0.5 m.
Function: detect and identify obstacles and calculate the distance between obstacles and users.
Advantages: distance measurement and positioning.
Disadvantages: large amount of data; limited usage environment; higher price than RGB cameras.

Infrared
Data source: infrared sensors.
Starting time: the first infrared sensors appeared in the 1940s.
Data form: infrared signals.
Positioning accuracy: 0.5–1 m.
Function: temperature measurement, distance measurement, obstacle avoidance.
Advantages: low cost; operates at normal temperature.
Disadvantages: cannot be used in sunlight or dark rooms; low sensitivity and slow response.

RFID
Data source: electronic tags.
Starting time: RFID theory was born in 1940–1950.
Data form: radio waves.
Positioning accuracy: 1–10 m.
Function: identification, target tracking, and data acquisition.
Advantages: strong penetration; fast scanning and recognition; durable; small size.
Disadvantages: high cost; unresolved privacy issues; inconsistent frequencies; usage-environment requirements.
small-aperture imaging principle, the PNP method, or the similar
triangle method. Similarly, multilocular cameras can estimate
object distance from the lens using the binocular stereo parallax
principle. The utilization of RGBD-type sensors imposes high
requirements on the operational apparatus for data processing.
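The binocular stereo parallax relation mentioned above reduces to Z = f·B/d. A minimal sketch, with symbols as defined in the comment (function name is our own):

```python
def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from binocular stereo parallax: Z = f * B / d, where f is the
    focal length in pixels, B the baseline between the two lenses in meters,
    and d the horizontal disparity of the same point in the two images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

With a 700-pixel focal length and a 12 cm baseline, a 42-pixel disparity corresponds to an object about 2 m away; the same relation explains why distant objects (small disparities) are estimated far less precisely.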
While RGB and ultrasonic data sources are the most frequently
used sources of information in navigation environments, com-
puter vision research may surpass research on ultrasonic data
sources, indicating its potential as the next focal point in this
area. In the domain of computer vision, there exists a limitation
with respect to the precision and detection efficacy of the detec-
tion algorithm. Additionally, the efficacy and speed of processing
the data captured from the vision sensor are influenced by the
configuration of the computing equipment. The size and weight
of the high-performance computing equipment is, therefore, a
crucial factor affecting the acceptance of vision-based assistive devices for the blind.
Research on algorithmic aspects of blind navigation
Data analysis
In the domain of navigation for visually impaired individuals,
researchers have suggested a range of algorithms, including improvements to existing ones. These algorithms pertain to various stages of navigation, such as localization, obstacle
avoidance, recognition detection, and path planning. In our statistical survey (Table 5), 20% of the collected articles were published in Chinese journals, and 10% were published in journals from other countries.
Result analysis
Table 5 demonstrates that navigation for the blind has given rise
to a range of algorithmic developments. In contrast to other
nations, Chinese researchers focus on improving pre-existing algorithms in the field of guidance to augment their precision
and efficacy. Compared with the proposed new algorithms, the
improved algorithms exhibit a greater level of accuracy in detec-
tion. Conversely, the new algorithms offer innovative ideas for
research but may compromise accuracy and have an imperfect
structural system in comparison to existing algorithms. It is note-
worthy that object detection algorithms are extensively utilized in
the navigation system process for the blind, which is an integral
aspect of computer vision technology. This indicates that com-
puter vision technology has emerged as a predominant technol-
ogy in the domain of navigation for the visually impaired.
Optimization of navigation methods for the blind
Data analysis
In the field of navigation for the blind, researchers use a variety
of methods to broaden navigation to help BVIP travel better and
more safely. Meanwhile, with the development of technology,
Table 5. Introduction to the guide algorithms.

Location
Algorithms: synchronous positioning and mapping (SLAM) algorithm [161]; a computer vision algorithm for verifying the positioning ability of a blind-guide prototype [162].
Overview: BVIP have high requirements for positioning accuracy and real-time performance that traditional positioning methods cannot meet; newer, more accurate, and faster positioning methods are needed.

Identification detection
Algorithms: cellular automata model [163]; a new algorithm for sidewalk detection [164]; a recognition and detection algorithm based on a combination of three algorithms [165]; a detection algorithm based on depth and gray-image attributes [166]; a detection algorithm based on MATLAB and Python [151]; path and obstacle extraction and detection [167]; a particle filter algorithm for estimating the direction and position of obstacle-free ground; a stereo matching algorithm based on image convolution transformation, with recognition and processing of specular reflection areas [168]; an obstacle recognition algorithm based on outdoor blind-navigation sonar [169]; a new perceptual substitution algorithm [170].
Overview: recognition and detection use computer vision and deep learning methods to help BVIP perceive surrounding information (e.g., crosswalks, blind roads, and intersections).

Obstacle avoidance
Algorithms: a collision detection algorithm based on parallax images [171]; a collision detection algorithm based on stereo vision [172].
Overview: obstacle avoidance is unavoidable for BVIP in travel; high-quality research on positioning and target detection makes more effective, faster, and more accurate obstacle avoidance possible.

Route planning
Algorithms: a trajectory simplification algorithm [173]; route planning based on Internet of Things technology [174]; the Particle Swarm Optimization (PSO) algorithm [175]; a fast robot path planning algorithm based on bidirectional associative learning [176].
Overview: based on the scene map, path planning needs to reasonably plan a road between the destination and the BVIP through high-precision positioning and target detection, and complete guidance through hearing, touch, and other sensory channels.

Other
Algorithms: scene matching algorithms [177,178]; an algorithm for extracting Head Related Impulse Response data [179]; a global stereo matching algorithm based on a second-order prior assumption and a Census semi-global stereo matching algorithm based on cross aggregation [180]; an outdoor GPS navigation algorithm for BVIP [181]; image deblurring algorithms [182,183]; an integrated navigation algorithm [184]; a SIFT algorithm based on local feature point matching [185].
Overview: this field must also consider the quality of images in computer vision, the problem of obtaining the corresponding information, and other issues worth noting when guiding BVIP travel.
more effective methods are also being introduced. Among the articles in our collection, Chinese articles account for 5% and articles from other countries for 13%, as shown in Table 6.
Analysis of results
Optimized methods for navigation for the blind have emerged alongside the continuous advancement of technology. Among the methods listed in Table 6, computer vision is mostly used for information acquisition to help BVIP.
Navigation maps for the blind
Data analysis
For BVIP, it is particularly difficult to obtain information and spa-
tial relationships of the surrounding area when traveling outdoors.
At this stage, there is little research on electronic maps for the blind [212]. The production and use of navigation maps for the blind
consider more auditory and tactile approaches. Tactile maps
mostly refer to raised maps made with special materials for tactile
perception of BVIP. In contrast, auditory maps mostly use speech
conversion and electronic maps, and other related technologies
for blind navigation. Guided electronic maps intended for the
visually impaired are commonly accessed through assistive tech-
nologies and screen readers on mobile devices [213].
In the collected articles, there are few studies on navigation
maps for the blind. The article [214], published in 2001, presented
a classification of tactile maps along with an overview of the lat-
est research findings in this field. Subsequently, in 2010, a virtual
tactile map was introduced, which relied on the use of a Force
Display device for rendering [215]. In a subsequent study, Buzzi
et al. [213] approached the topic of map-based applications from
the standpoint of BVIP and discussed the relevance of interactive
devices in addressing accessibility issues for visually impaired indi-
viduals. Their aim was to develop visual maps that could be
“seen” by individuals with visual impairments, thus providing a
valuable reference point for future development in this area. Fu
Zhongliang et al. conducted a study [212] that examined the use
of geospatial cognition to develop electronic maps for the visually
impaired, which offer substantial assistive support for blind navi-
gation. The article [216] describes a design process for a tactile
map to support the OM (orientation and mobility) course at the
PIC (Paraná Institute of the Blind) by developing a symbol-based
code system to distinguish buildings. In contrast, Hamid et al.
[217] utilized landmarks instead of sound cues in their study and
conducted a survey to identify suitable sounds for each landmark,
which were then incorporated into an auditory-tactile map.
Analysis of results
Within our compilation of statistical articles, there exists a paucity
of research on navigation maps within the domain of blind navi-
gation. Specifically, an examination of our collection reveals that a
Table 6. Introduction to the optimization of navigation methods for the blind.

Location
Methods: a seamless mobile phone fusion positioning method [186]; a top-down design method [187]; multi-sensor fusion location based on a relative-entropy Kalman filter [188].
Overview: given the high requirements for positioning accuracy and real-time performance, more accurate and faster positioning methods need to be developed.

Target detection and tracking
Methods: a new method for crosswalk detection [189]; a method for pedestrian-marking detection at road intersections [190]; a method for real-time detection of obstacle-free paths [191,192]; a computer vision method for environment perception [193]; analysis of the influence of edge and depth information on pavement detection [194]; a navigation method based on deep learning [195]; a segmentation-based stereo vision method [196].
Overview: computer vision methods are used to help BVIP perceive environmental information (e.g., crosswalks, blind roads, and intersections).

Obstacle avoidance
Methods: a virtual semicircle method based on position movement [197].
Overview: achieving more effective, fast, and accurate obstacle avoidance is one of the research focuses in the future development of the field of blind guidance.

Path planning
Methods: a navigation method for mobile robots based on sensing unknown environments [198]; collision-avoidance path navigation for blind people based on deep learning [199].
Overview: on the basis of the scene map, path planning needs to guide along a reasonably planned road between the destination and the BVIP by means of hearing, touch, and other senses, through high-precision positioning and target detection.

Other
Methods: a method for extracting high-quality frames [200]; image deblurring [183]; a tactile rendering method based on a series of studies to improve interface display [201]; a comprehensive statistics-based method for VIP assistive technology [202]; navigation assistance based on the geomagnetic field effect [203]; a method for prompting blind people to move based on acoustic camera data [204]; a real-time map construction method based on a stereo algorithm and visual odometry [205]; three different sonification techniques for target rotation [206]; reducing the training cost of models [207]; a method for rough topographic mapping [208]; a vision-based outdoor navigation method [209]; a new navigation method [210]; a method for automatically translating hand-drawn maps into tactile maps [211].
Overview: in addition to the methods proposed for localization, target detection and tracking, and obstacle avoidance, methods for other aspects have also been proposed.
mere 2% of the articles are of Chinese origin, while related articles
from other countries account for only 3% of the total corpus. This
discrepancy in representation highlights the insufficiency of scholarly attention given to the topic at hand. Given the small number of articles collected, no generalization can be made. As far as the collected data on maps for the blind are concerned, lacking vision, BVIP can only use auditory and tactile senses to “see” a map and understand the surrounding environment. An auditory map is an interactive device that guides BVIP through a mental map by sounding cues for adjacent landmarks, thereby achieving the purpose of travel navigation. Tactile
maps, on the other hand, are cumbersome to produce and not
very practical, so they mostly use a combination of auditory and
tactile means to achieve the effect of guiding blind people on
their travels through the integration of devices.
Discussion
The statistical results show that, within the field of navigation for the blind, most studies concern guiding aids and devices, accounting for 68% of the collected articles. Among these, wearable devices are the most numerous, followed by handheld aids. Although a variety of wearable aids exist, they are unattractive, large, and heavy compared with handheld electronic guide canes, and have not been widely adopted by BVIP. Handheld aids have received more research attention because they were developed early and have been in use for a long time. Over time, guide aids have evolved from single-function to multi-function designs. With the continuous
advancement of technology and the steady improvement of peo-
ple’s living standards, society is constantly evolving. To expand
customer retention and increase the freedom and enjoyment of
BVIP during outdoor travel, guide aids should become more light-
weight and cost-effective. This will enable BVIP to carry out their
daily tasks without constraints and improve their quality of life.
According to the survey, computer vision technology occupies an important place in research on assistive devices, data sources, guidance algorithms, and related methods. The development
of vision sensors in the field related to navigation for the blind
has greatly improved the current situation of BVIP travel. At the same time, as navigation environments grow more complex, the processing of vision sensor data faces an important challenge. In the realm of computer
vision, both object detection and tracking necessitate a certain
level of detection accuracy and real-time performance. Despite
these requirements, erroneous detection and omission are still
prevalent, which poses a significant threat to the safety of BVIP
outdoor travelers. In addition, the visual data processing equip-
ment needs to meet the requirements of high performance and
light weight, which can meet the requirements of easy portability
and high availability of BVIP outdoor travel. Furthermore, the
issue of excessive illumination renders the vision sensor ineffect-
ive in daylight conditions, whereas passive vision sensors utilized
for nocturnal navigation fail to acquire pertinent object data, lead-
ing to challenges in object identification. How to carry out real-
time blind navigation 24 h a day will be one of the important
topics in the future.
Therefore, it is an arduous task for researchers to investigate
and develop guiding aids that are economical, efficient, and light-
weight, aimed at aiding BVIP to navigate and participate in their
daily lives like their sighted counterparts. This undertaking
necessitates a significant amount of dedication and diligence
from researchers.
Limitations of the research
This study provides an examination and assessment of the pre-
sent state of outdoor navigation technology for individuals with
visual impairments, while also predicting future research direc-
tions in this domain. Consequently, the paper is segmented into
five sections, namely system equipment, data sources, guidance
algorithms, optimization of related methods, and navigation map
for the blind.
To maintain the focus and specificity of our study, we exclusively gathered literature on outdoor navigation for the blind and refrained from examining indoor blind navigation in our investigation and analysis. Additionally, our inquiry solely evaluated the
interrelationship between blind navigation techniques and did
not furnish any statistical data regarding the implementation of
these technological products. Although the researchers tested
their research on the user side and used data to demonstrate the
feasibility of their research, there was no comprehensive evalu-
ation of these various guide products for the blind and related
research from the user’s perspective. Therefore, it is imperative to
carry out research in the living environment of BVIP and to
employ questionnaires as a means of statistical analysis to obtain
user feedback concerning these products.
In addition, it should be noted that the statistical analysis was
based on a limited sample of eight journal databases, and there-
fore, the number of articles reviewed may not be representative
of the entire corpus of literature on the subject matter.
Conclusion
This systematic review involved an exhaustive examination of various
data sources up to 2021, resulting in the identification of 227
pertinent articles. From this pool, 179 articles were scrutinized to
provide a concise overview of five key areas, namely system
equipment, data sources, guidance algorithms, optimization of
related techniques, and navigation maps for the visually impaired.
The findings indicate that the predominant focus of scholars
centers on the design and creation of guiding aids and devices,
comprising the largest proportion (68%) of the reviewed litera-
ture. Among these, wearable aids exhibit relatively higher utiliza-
tion, albeit not extensively implemented. Subsequently, handheld
aids emerge as the second most prevalent type of assistive devi-
ces studied. The trend towards utilizing vision sensors as the
primary data source for navigation environment information is
increasing. However, this approach is constrained by computing
performance, which requires further improvement to ensure real-time
and accurate perception of the surrounding environment. Furthermore,
owing to the intricacy of navigation map production, there has been
limited research on updating the geographical area covered by a map
in real time.
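As a minimal illustration of how the category shares reported in this review can be tallied, the sketch below reproduces the 68% share for guiding aids and devices from per-category counts; apart from the 179-article total and that device share, every count is a hypothetical placeholder, not data from this review.

```python
# Illustrative tally of the 179 reviewed articles by topic area.
# Only the total (179) and the ~68% share for guiding aids/devices
# are stated in this review; the remaining counts are hypothetical.
counts = {
    "system equipment (guiding aids/devices)": 122,  # 122/179 ~ 68%
    "data sources": 25,            # hypothetical
    "guidance algorithms": 15,     # hypothetical
    "method optimization": 10,     # hypothetical
    "navigation maps": 7,          # hypothetical
}

total = sum(counts.values())
assert total == 179  # matches the number of articles analyzed

for topic, n in counts.items():
    print(f"{topic}: {n} articles ({100 * n / total:.0f}%)")
```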
According to the findings, scholars ought to evaluate the con-
temporary state, merits and demerits, and advancements of per-
tinent technologies through the lens of end-users, to develop an
assistive instrument that can effectively fulfill the outdoor naviga-
tion requirements of visually impaired individuals and further
enhance their quality of life.
Computer vision will be a key breakthrough in the future
development of navigation for the blind.
12 L. YUE ET AL.
Disclosure statement
No potential conflict of interest was reported by the author(s).
Funding
This work was supported by the Natural Science Foundation of
Jiangxi Province: Dynamic Object Detection and Behavior
Perception for Blind Navigation Based on RGB-D Video Stream
(S2020ZRMSB0257) and National Natural Science Foundation of
China: Research on Real-time Geographic Scene Data Model of
Outdoor Navigation for Blind People Supported by Global Path
(42271434).
References
[1] World Health Organization. World report on vision. Vol. 214. World Health Organization; 2019. p. 1–160.
[2] Luobin P, Mingxia L. Research on urban blind road governance based on public governance theory. Legal Syst Soc. 2021;4:145–146.
[3] Chen C. Analysis of urban accessibility problems and improvement strategies in the context of population aging: the current situation of blind lane construction in Suzhou city as an example. Beauty Times. 2019;(5):65–66.
[4] Weijing L, Ying H, Ping W, et al. The optimal design and research of blind sidewalks in Changsha. Chinese Overseas Archit. 2017;11:129–131.
[5] Cunge G. Analysis of problems and suggestions for solving the current situation of urban blind design. Beauty Times. 2015;(9):46–47.
[6] Xuefeng H, Jing W, Xingping C, et al. Study on spatial mismatch and majorization of barrier-free facilities for urban community: a case study of Nanjing. Disabil Res. 2019;(3):63–70.
[7] Wang X. Blind way recognition system design based on RFID. Chengdu University of Technology; 2009:1–61.
[8] Zhu T, Wang B. On disability, accessibility and assistive devices. Disabil Res. 2016;3:37–42.
[9] Bo W. Research on barrier-free environment construction in Beijing and countermeasures. Beijing University of Architecture; 2017:1–69.
[10] Yue W, Ge G, Mengzhu W, et al. Survey and analysis of the application situation of urban barrier-free facilities (tactile ground surface indicator). Chinese J Tissue Eng Res. 2020;24(2):271–275.
[11] Xiangcheng Z, Yaxia L, Xiaoxiao Z, et al. Design of auxiliary clothing for the blind based on obstacle detection. Wool Text J. 2020;48(11):63–67.
[12] Gallo S, Chapuis D, Santos-Carreras L, et al. Augmented white cane with multimodal haptic feedback. 2010 3rd IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics; 2010. p. 149–155.
[13] Keli F. The history and current status of O&M for the blind. Chinese J Rehabil Theory Pract. 2003;9(2):125–126.
[14] Ruifeng L, Kun L, Jianing W, et al. A pilot study on the market and feasibility of travel products for the blind. Consum Electron Mag. 2020;2:36–81.
[15] Jiangang M. Research on high-accuracy blind navigation key issues based on haptic and spatial cognition. Xinjiang University; 2017:1–75.
[16] Brunet L, Darses F, Auvray M. Strategies and needs of blind pedestrians during urban navigation. Trav Hum. 2018;81(2):141–171. doi: 10.3917/th.812.0141.
[17] Syed Rizal Alfam Wan A, Mohamad Noh A. Survey on outdoor navigation system needs for blind people. 2013 IEEE Student Conference on Research and Development (SCOReD). IEEE; 2013. p. 144–148.
[18] Wu T. Research on object detection and tracking of outdoor blind sidewalk obstacles based on deep learning. Ganzhou: Jiangxi University of Science and Technology; 2020.
[19] Mack K, McDonnell E, Jain D, et al. What do we mean by "accessibility research"? 2021:1–18.
[20] Bhowmick A, Hazarika SM. An insight into assistive technology for the visually impaired and blind people: state-of-the-art and future trends. J Multimodal User Interfaces. 2017;11(2):149–172. doi: 10.1007/s12193-016-0235-6.
[21] Kuriakose B, Shrestha R, Sandnes FE. Tools and technologies for blind and visually impaired navigation support: a review. IETE Tech Rev. 2022;39(1):3–18. doi: 10.1080/02564602.2020.1819893.
[22] Fernandes H, Costa P, Filipe V, et al. A review of assistive spatial orientation and navigation technologies for the visually impaired. Univ Access Inf Soc. 2019;18(1):155–168. doi: 10.1007/s10209-017-0570-8.
[23] Wu Z, Rong X, Fan Y, et al. A review of the current state of research on guide robots for the blind. Comput Eng Appl. 2020;56(14):1–13.
[24] Hanlin C. Research on guiding aids for the blind based on deep learning image recognition. Peak Data Sci. 2020;9(7):242.
[25] Guansheng W, Jianghua Z, Halik W, et al. Overview on research and application of navigation/route guidance assistive devices for the blind. Comput Appl Softw. 2012;29(12):147–151.
[26] Dakopoulos D, Bourbakis NG. Wearable obstacle avoidance electronic travel aids for blind: a survey. IEEE Trans Syst Man Cybern C. 2010;40(1):25–35. doi: 10.1109/TSMCC.2009.2021255.
[27] Tapu R, Mocanu B, Tapu E. A survey on wearable devices used to assist the visual impaired user navigation in outdoor environments. 2014 11th International Symposium on Electronics and Telecommunications, ISETC; 2015.
[28] Anandan M, Manikandan M, Karthick T. Advanced indoor and outdoor navigation system for blind people using raspberry-PI. J Internet Technol. 2020;21(1):183–195.
[29] El-Taher FE, Zahraa Taha A, Courtney J, et al. A systematic review of urban navigation systems for visually impaired people. 2021:1–35.
[30] Khan S, Nazir S, Khan HU. Analysis of navigation assistants for blind and visually impaired people: a systematic review. IEEE Access. 2020:26712–26734.
[31] Horton EL, Renganathan R, Toth BN, et al. A review of principles in design and usability testing of tactile technology for individuals with visual impairments. Assist Technol. 2017;29(1):28–36. doi: 10.1080/10400435.2016.1176083.
[32] Khan I, Khusro S, Ullah I. Technology-assisted white cane: evaluation and future directions. PeerJ. 2018;6:e6058. doi: 10.7717/peerj.6058.
[33] Prandi C, Barricelli BR, Mirri S, et al. Accessible wayfinding and navigation: a systematic mapping study. Univ Access Inf Soc. 2021;22:185–212.
A SURVEY AND ANALYSIS OF NAVIGATION FOR THE BLIND 13
[34] Moher D, Liberati A, Tetzlaff J, et al. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLOS Med. 2009;6(7):e1000097. doi: 10.1371/journal.pmed.1000097.
[35] Fernando N, McMeekin DA, Murray I. Route planning methods in indoor navigation tools for vision impaired persons: a systematic review. Disabil Rehabil Assist Technol. 2021;1–20. doi: 10.1080/17483107.2021.1922522.
[36] Chen Xiaomeng LM. The current development of orientation and mobility aids for hearing-impaired persons. Chinese J Spec Educ. 2017;(9):12–20.
[37] Hersh MA, Johnson MA. A robotic guide for blind people. Part 1. A multi-national survey of the attitudes, requirements and preferences of potential end-users. Appl Bionics Biomech. 2010;7(4):277–288. doi: 10.1155/2010/252609.
[38] Lakshmanan A, Devi SG, Nisha MM, et al. Outdoor obstacle detection module to assist visually challenged. International Conference on Communication, Devices and Networking; 2019.
[39] Velázquez R, Pissaloux EE, Guinot JC, et al. Walking using touch: design and preliminary prototype of a non-invasive ETA for the visually impaired. Annual International Conference of the IEEE Engineering in Medicine and Biology – Proceedings; Vol. 7; 2005. p. 6821–6824.
[40] Karacs K, Lázár A, Wagner R, et al. Bionic eyeglass: the first prototype a personal navigation device for visually impaired – a review. 2008 First International Symposium on Applied Sciences on Biomedical and Communication Technologies; 2008. p. 1–5.
[41] Karacs K, Kusnyerik A, Radvanyi M, et al. Towards a mobile navigation device. In: Karacs K, Kusnyerik A, Radvanyi M, Roska T, Szuhaj M, editors. 2010 12th International Workshop on Cellular Nanoscale Networks and their Applications (CNNA 2010). IEEE; 2010. p. 1–4. doi: 10.1109/CNNA.2010.
[42] Mattoccia S, Macrì P. 3D glasses as mobility aid for visually impaired people. In: Agapito L, Bronstein MM, Rother C, editors. Computer Vision – ECCV 2014 Workshops [Internet] (Lecture Notes in Computer Science; Vol. 8927). Springer International Publishing; 2014. doi: 10.1007/978-3-319-16199-0.
[43] Hu J. Research and implementation of navigation glasses for the blind based on ultrasonic and image recognition. Hangzhou: Hangzhou Dianzi University; 2016.
[44] He T, Zhang R, Liu C, et al. Machine vision-based design of intelligent guide glasses for the blind. Appl Electron Tech. 2017;43(4):58–61.
[45] Liu X-Y, Yang H-X, Liu X-H, et al. The research of the blind navigation glasses. Coll Phys. 2017;36(6):522–555.
[46] Zhuhua C, Xiangyu C, Wang J, et al. The design of intelligent blind navigation glasses. J Fujian Comput. 2021;37(2):91–93.
[47] Kim JH, Kim SK, Lee TM, et al. Smart glasses using deep learning and stereo camera. 2019 IEEE 8th Global Conference on Consumer Electronics (GCCE); Vol. 2; 2019. p. 294–295.
[48] Son H, Krishnagiri D, Jeganathan VS, et al. Crosswalk guidance system for the blind. Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS 2020; 2020 Jul. p. 3327–3330.
[49] Saha S, Shakal FH, Saleque AM, Trisha JJ, et al. Vision maker: an audio visual and navigation aid for visually impaired person. Proceedings of International Conference on Intelligent Engineering and Management, ICIEM 2020; 2020. p. 266–271.
[50] Rajendran PS, Krishnan P, Aravindhar DJ. Design and implementation of voice assisted smart glasses for visually impaired people using google vision API. 2020 4th International Conference on Electronics, Communication and Aerospace Technology (ICECA). Institute of Electrical and Electronics Engineers Inc.; 2020. p. 1221–1224.
[51] Meliones A, Filios C. Blind helper: a pedestrian navigation system for blinds and visually impaired. ACM International Conference Proceeding Series; Vol. 20; 2016 Jun 29. p. 1–4.
[52] Maiti M, Mallick P, Bagchi M, et al. Intelligent electronic eye for visually impaired people. 2017:39–42.
[53] Nishajith A, Nivedha J, Nair SS, et al. Smart cap – wearable visual guidance system for blind. 2018 International Conference on Inventive Research in Computing Applications (ICIRCA); 2018. p. 275–278.
[54] Muneshwara MS, Lokesh A, Swetha MS, et al. Ultrasonic and image mapped path finder for the blind people in the real time system. IEEE International Conference on Power, Control, Signals and Instrumentation Engineering, ICPCSI 2017; 2018. p. 964–969.
[55] Kaul OB, Rohs M, Mogalle M, et al. Around-the-head tactile system for supporting micro navigation of people with visual impairments. ACM Trans Comput Hum Interact. 2021;28(4):1–35. doi: 10.1145/3458021.
[56] Aarthi V, Sre CS, Delfina MS, et al. Smart alert and intimation system for VIP (visually impaired person). Proceedings – 5th International Conference on Computing Methodologies and Communication, ICCMC 2021; 2021. p. 1284–1289.
[57] Raja L, Santhosh R. Experimental study on shoe based navigation system for the visually impaired. Mater Today Proc. 2021;45:1713–1716. doi: 10.1016/j.matpr.2020.08.615.
[58] Velázquez R, Pissaloux E, Rodrigo P, et al. An outdoor navigation system for blind pedestrians using GPS and tactile-foot feedback. Appl Sci. 2018;8(4):578. doi: 10.3390/app8040578.
[59] Aqeel K, Naveed U, Fatima F, et al. Skin stroking haptic feedback glove for assisting blinds in navigation. 2017 IEEE International Conference on Robotics and Biomimetics, ROBIO 2017; 2018 Jan. p. 177–182.
[60] Khairnar DP, Karad RB, Kapse A, et al. PARTHA: a visually impaired assistance system. 2020 3rd International Conference on Communication Systems, Computing and IT Applications (CSCITA); 2020. p. 32–37.
[61] Sedighi P, Norouzi MH, Delrobaei M. An RFID-based assistive glove to help the visually impaired. IEEE Trans Instrum Meas. 2021;70:1–9. doi: 10.1109/TIM.2021.3069834.
[62] Brock A, Kammoun S, Macé M, et al. Using wrist vibrations to guide hand movement and whole body navigation. I-Com. 2014;13(3):19–28. doi: 10.1515/icom.2014.0026.
[63] Petsiuk AL, Pearce JM. Low-cost open source ultrasound-sensing based navigational support for the visually impaired. Sensors. 2019;19(17):3783. doi: 10.3390/s19173783.
[64] Satpute SA, Canady JR, Klatzky RL, et al. FingerSight: a vibrotactile wearable ring for assistance with locating and reaching objects in peripersonal space. IEEE Trans Haptics. 2020;13(2):325–333. doi: 10.1109/TOH.2019.2945561.
[65] Xiao J, Ramdath K, Losilevish M, et al. A low cost outdoor assistive navigation system for blind people. Proceedings of the 2013 IEEE 8th Conference on Industrial Electronics and Applications (ICIEA); 2013. p. 828–833.
[66] Aymaz S, Çavdar T. Ultrasonic assistive headset for visually impaired people. 2016 39th International Conference on Telecommunications and Signal Processing, TSP 2016; 2016. p. 388–391.
[67] Shoval S, Borenstein J, Koren Y. Mobile robot obstacle avoidance in a computerized travel aid for the blind. Proc IEEE Int Conf Robot Autom; 1994. p. 2023–2028.
[68] Kim CG, Song BS. Design of a wearable walking-guide system for the blind. iCREATe 2007 – Proceedings of the 1st International Convention on Rehabilitation Engineering and Assistive Technology in Conjunction with 1st Tan Tock Seng Hospital Neurorehabilitation Meeting; 2007. p. 118–122. doi: 10.1145/1328491.1328523.
[69] Kaiser EB, Lawo M. Wearable navigation system for the visually impaired and blind people. IEEE/ACIS International Conference on Computer and Information Science; 2012.
[70] Aizawa T, Iizima H, Abe K, et al. Study on portable haptic guide device with omnidirectional driving gear. Adv Rob. 2021;35(5):320–336. doi: 10.1080/01691864.2021.1888796.
[71] Tachiquin R, Velázquez R, Del-Valle-Soto C, et al. Wearable urban mobility assistive device for visually impaired pedestrians using a smartphone and a tactile-foot interface. Sensors. 2021;21(16):5274. doi: 10.3390/s21165274.
[72] Alghamdi S, van Schyndel R, Khalil I. Accurate positioning using long range active RFID technology to assist visually impaired people. J Netw Comput Appl. 2014;41(1):135–147. doi: 10.1016/j.jnca.2013.10.015.
[73] Lin Q, Han Y. A context-aware-based audio guidance system for blind people using a multimodal profile model. Sensors. 2014;14(10):18670–18700. doi: 10.3390/s141018670.
[74] Kanwal N, Bostanci E, Currie K, et al. A navigation system for the visually impaired: a fusion of vision and depth sensor. Appl Bionics Biomech. 2015;2015:1–16. doi: 10.1155/2015/479857.
[75] Katz BFG, Kammoun S, Parseihian G, et al. NAVIG: augmented reality guidance system for the visually impaired: combining object localization, GNSS, and spatial audio. Virtual Real. 2012;16(4):253–269. doi: 10.1007/s10055-012-0213-6.
[76] Chitra P, Balamurugan V, Sumathi M, et al. Voice navigation based guiding device for visually impaired people. Proceedings – International Conference on Artificial Intelligence and Smart Systems, ICAIS 2021. Institute of Electrical and Electronics Engineers Inc.; 2021. p. 911–915.
[77] Zhang X, Zhang H, Zhang L, et al. Double-diamond model-based orientation guidance in wearable human-machine navigation systems for blind and visually impaired people. Sensors. 2019;19(21):4670. doi: 10.3390/s19214670.
[78] Huan Z, Chen X, Liang J. Design and implementation of blind-navigation system based on RFID and smartphones' inertial navigation. CAAI Trans Intell Syst. 2019;14(3):491–499.
[79] Bai J, Liu Z, Lin Y, et al. Wearable travel aid for environment perception and navigation of visually impaired people. Electronics. 2019;8(6):697. doi: 10.3390/electronics8060697.
[80] Meng-Li S, Yi-Chun C, Chia-Yu T, et al. DLWV2: a deep learning-based wearable vision-system with vibrotactile-feedback for visually impaired people to reach objects. 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE; 2018. p. 1–9.
[81] Abobeah R, Hussein M, Abdelwahab M, et al. Wearable RGB camera-based navigation system for the visually impaired. VISIGRAPP 2018 – Proceedings of the 13th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications. SciTePress; 2018. p. 555–562.
[82] Mocanu B, Tapu R, Zaharia T. When ultrasonic sensors and computer vision join forces for efficient obstacle detection and recognition. Sensors. 2016;16(11):1807. doi: 10.3390/s16111807.
[83] Huang H, Lin TC, Cai D. Non-visual traffic signal information: an investigation of the recognition performance of blind users using the wearable tactile traffic lights assist device. Int J Ind Ergon. 2017;57:1–9. doi: 10.1016/j.ergon.2016.11.002.
[84] Vera Yánez D, Marcillo D, Pereira A. Portable navigation device for blind people. 2017 12th Iberian Conference on Information Systems and Technologies (CISTI); 2017. p. 1–6.
[85] Yánez DV, Marcillo D, Fernandes H, et al. Blind guide: anytime, anywhere. ACM International Conference Proceeding Series. Association for Computing Machinery; 2016. p. 346–352.
[86] Scheggi S, Talarico A, Prattichizzo D. A remote guidance system for blind and visually impaired people via vibrotactile haptic feedback. 22nd Mediterranean Conference on Control and Automation (MED); 2014. p. 20–23.
[87] José J, Farrajota M, Rodrigues JM, et al. The smart vision local navigation aid for blind and visually impaired persons. Int J Digit Content Technol Appl. 2011;5(5):362–375. doi: 10.4156/jdcta.vol5.issue5.40.
[88] Nie M, Ren J, Li Z, et al. IEW: an auditory guidance system based on environment understanding for the visually impaired people. 2009 Annual International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE; 2009. p. 7240–7243.
[89] Balakrishnan G, Sainarayanan G, Nagarajan R, et al. On stereo processing procedure applied towards blind navigation aid – SVETA. Eighth International Symposium on Signal Processing & Its Applications; 2005. p. 567–570.
[90] Ran L, Helal S, Moore S. Drishti: an integrated indoor/outdoor blind navigation system and service. Proceedings – Second IEEE Annual Conference on Pervasive Computing and Communications, PerCom; 2004. p. 23–30.
[91] Farcy R, Damaschini RM. Guidance-assist system for the blind. Biomonit Endosc Technol. 2001;4158:209–214.
[92] GB 16930.1-2014. National standards full-text public system – tactile sticks – part 1: a sign of safety color. 2014.
[93] Poggi M, Mattoccia S. A wearable mobility aid for the visually impaired based on embedded 3D vision and deep learning. Proc IEEE Symp Comput Commun 2016; 2016 Aug. p. 208–213.
[94] Junior M, Silva D, Bertolino FJ, et al. Location guide DV. 2014 IEEE Brasil RFID [Internet]; 2014. Available from: http://www.iso.org
[95] Chai Y, Lingling M. Design and realization of a new blind lane and detector. 2018;41.
[96] Mehta U, Alim M, Kumar S. Smart path guidance mobile aid for visually disabled persons. Procedia Comput Sci. 2017;105:52–56. doi: 10.1016/j.procs.2017.01.190.
[97] Munir MU, Mahmood H, Zeb F, et al. The voice enabled stick. 2017 20th International Conference of Computer and Information Technology (ICCIT); 2017. p. 1–5.
[98] Sen A, Sen K, Das J. Ultrasonic blind stick for completely blind people to avoid any kind of obstacles. 2018 IEEE SENSORS; New Delhi, India. IEEE; 2018. p. 1–4.
[99] Meshram VV, Patil K, Meshram VA, et al. An astute assistive device for mobility and object recognition for visually impaired people. IEEE Trans Hum Mach Syst. 2019;49(5):449–460. doi: 10.1109/THMS.2019.2931745.
[100] Ambawane P, Bharatia D, Rane P. Smart e-stick for visually impaired using video intelligence API. 2019 IEEE Bombay Section Signature Conference (IBSSC); 2019. p. 1–6.
[101] Faria J, Lopes S, Fernandes H, et al. Electronic white cane for blind people navigation assistance. 2010 World Automation Congress; Kobe, Japan. IEEE; 2010. p. 1–7.
[102] Han N. Design and implementation of blind navigation system based on RFID. Jilin: School of Computer Science and Technology, Jilin University; 2012.
[103] Kassim AM, Jaafar HI, Azam MA, Abas N, Yasuno T, et al. Design and development of navigation system by using RFID technology. Proceedings – 2013 IEEE 3rd International Conference on System Engineering and Technology, ICSET 2013; 2013. p. 258–262.
[104] Fernandes H, Filipe V, Costa P, et al. Location based services for the blind supported by RFID technology. Procedia Comput Sci. 2014;27:2–8. doi: 10.1016/j.procs.2014.02.002.
[105] Bai R, Gong J, Hu M, et al. The design of outdoor guide system based on RFID technology. Video Eng. 2019;43(5):76–79.
[106] Du Buf JMH, Castells D, Rodrigues JMF, et al. Obstacle detection and avoidance on sidewalks [Internet]; 2010. Available from: https://www.researchgate.net/publication/216435190
[107] Hans du Buf JM, Barroso J, Rodrigues JMF, et al. The SmartVision navigation prototype for blind users. Int J Digit Content Technol Appl. 2011;5(5):351–361. doi: 10.4156/jdcta.vol5.issue5.39.
[108] Noorithaya A, Kumar MK, Sreedevi A. Voice assisted navigation system for the blind. Proceedings of International Conference on Circuits, Communication, Control and Computing, I4C 2014; 2014 Nov. p. 177–181.
[109] Dhod R, Singh G, Singh G, et al. Low cost GPS and GSM based navigational aid for visually impaired people. Wireless Pers Commun. 2017;92(4):1575–1589. doi: 10.1007/s11277-016-3622-0.
[110] Srinivasan S, Rajesh M. Smart walking stick. 2019 3rd International Conference on Trends in Electronics and Informatics (ICOEI); Tirunelveli, India. IEEE; 2019. p. 576–579.
[111] Kumar KNS, Sathish R, Vinayak S, et al. Braille assistance system for visually impaired, blind & deaf-mute people in indoor & outdoor application. 2019 4th International Conference on Recent Trends on Electronics, Information, Communication & Technology (RTEICT); Bangalore, India. IEEE; 2019. p. 1505–1509. doi: 10.1109/RTEICT46194.2019.9016765.
[112] Narayani L, Sivapalanirajan T, Keerthika M, et al. Design of smart cane with integrated camera module for visually impaired people. Proceedings – International Conference on Artificial Intelligence and Smart Systems, ICAIS 2021. Institute of Electrical and Electronics Engineers Inc.; 2021. p. 999–1004.
[113] Lin JP. Design and implementation of guidance system for blind people. Comput Knowl Technol. 2021;17(4):24–28. Available from: www.dnzs.net.cn
[114] Fernandes H, Faria J, Paredes H, et al. An integrated system for blind day-to-day life autonomy. In: Miesenberger K, Klaus J, Zagler WL, Karshmer AI, editors. The 13th International ACM SIGACCESS Conference on Computers and Accessibility [Internet] (Lecture Notes in Computer Science; Vol. 4061). Berlin; Heidelberg: Springer Berlin Heidelberg; 2011. doi: 10.1145/2049536.2049579.
[115] Jeon DH, Jeon JU, Beak HH, et al. Situation-awareness white cane using a mobile device. J Korea Soc Comput Inform. 2014;19(11):167–173. doi: 10.9708/jksci.2014.19.11.167.
[116] Intelligent mobility cane for people who are blind and deaf-blind: a multidisciplinary design project that assists people with disabilities. Proceedings of the ASME 2015 International Mechanical Engineering Congress and Exposition; 2015. p. 13–19.
[117] Hejun W, Peksi S, Seng G, et al. An affordable and attachable electronic device for the blind. 2015 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA); Hong Kong, China. IEEE; 2015. p. 559–562.
[118] Anandkumar KM, Krishnan A, Deepakraj G, et al. Remote controlled human navigational assistance for the blind using intelligent computing. ACM International Conference Proceeding Series. Association for Computing Machinery; 2017.
[119] Lifang H, Zhiling Z, Jiehua W, et al. Design and implementation of multi-functional intelligent blind crutch. Electronic Test [Internet]; 2019 Jul. p. 28–29.
[120] Lin Q, Hahn H, et al. Top-view-based guidance for blind people using directional ellipse model. 2013:1–13.
[121] Fan MY, Bao JT, Tang HR. A guide cane system for assisting the blind in travelling in outdoor environments. AMM. 2014;631–632:568–571. doi: 10.4028/www.scientific.net/AMM.631-632.568.
[122] Gupta S, Sharma I, Tiwari A, et al. Advanced guide cane for the visually impaired people. 2015 1st International Conference on Next Generation Computing Technologies (NGCT); Dehradun, India. IEEE; 2015. p. 452–455.
[123] Jeong GY, Yu KH. Multi-section sensing and vibrotactile perception for walking guide of visually impaired person. Sensors. 2016;16(7):1070. doi: 10.3390/s16071070.
[124] Nandini AV, Dwivedi A, Kumar NA, et al. Smart cane for assisting visually impaired people. TENCON 2019 – 2019 IEEE Region 10 Conference: Technology, Knowledge, and Society; 2019. p. 546–551.
[125] Zhu W, Liu H, Wang B, et al. An intelligent blind guidance system based on visual-touch cross-modal perception. CAAI Trans Intell Syst. 2020;15(1):39–46.
[126] Yuan Z. A raspberry Pi-based navigation cane for the blind with obstacle avoidance. J Ezhou Univ. 2020;27(4):98–100.
[127] Landa-Hernández A, Casarubias-Vargas H, Bayro-Corrochano E. Geometric fuzzy techniques for guidance of visually impaired people. Appl Bionics Biomech. 2013;10(4):139–157. doi: 10.1155/2013/539521.
[128] SathyaNarayanan E, Gokul Deepan D, Nithin BP, et al. IoT based smart walking cane for typhlotic with voice assistance. 2016 Online International Conference on Green Engineering and Technologies (IC-GET); Coimbatore, India. IEEE; 2016. p. 1–6.
[129] Kunta V, Tuniki C, Sairam U. Multi-functional blind stick
for visually impaired people. 2020 5th International
Conference on Communication and Electronics Systems
(ICCES); Coimbatore, India: IEEE; 2020. p. 895–899.
[130] Barathi Kanna S, Kumar G, Niranjan TR, et al. Low cost
smart navigation system for the blind. 2021 7th
International Conference on Advanced Computing and
Communication Systems, ICACCS 2021. Institute of
Electrical and Electronics Engineers Inc.; 2021. p. 466–471.
[131] Krishnan A, Deepakraj G, Nishanth N, et al. Autonomous
walking stick for the blind using echolocation and image
processing. 2016 2nd International Conference on
Contemporary Computing and Informatics (IC3I); 2016. p.
13–16.
[132] Zhao Y, Bennett CL, Benko H, et al. Enabling people with
visual impairments to navigate virtual reality with a haptic
and auditory cane simulation. Conference on Human
Factors in Computing Systems–Proceedings. Association
for Computing Machinery; 2018.
[133] Kee GM, Zain Zm, Salimin Rh. Design and development
PIC-based autonomous robot. 2008 IEEE International
Conference on Robotics, Automation and Mechatronics,
RAM 2008; 2008. p. 1–5.
[134] Lee MFR, Chiu FHS, Zhuo C. Novel design of a social
mobile robot for the blind disabilities. 2013 IEEE/SICE
International Symposium on System Integration, SII 2013;
2013. p. 161–166.
[135] Razali MF, Toha SF, Abidin ZZ. Intelligent path guidance
robot for visually impaired assistance. Procedia Comput
Sci. 2015;76:330–335. doi: 10.1016/j.procs.2015.12.303.
[136] Wu TF, Tsai PS, Hu NT, et al. Intelligent wheeled mobile
robots for blind navigation application. Eng Comput.
2017;34(2):214–238. doi: 10.1108/EC-08-2015-0256.
[137] Bao J, Yao X, Tang H, et al. Outdoor navigation of a
mobile robot by following GPS waypoints and local ped-
estrian lane. 8th Annual IEEE International Conference on
Cyber Technology in Automation, Control and Intelligent
Systems, CYBER 2018; 2019. p. 198–203.
[138] Megalingam RK, Vishnu S, Sasikumar V, et al. Autonomous
path guiding robot for visually impaired people. Vol. 768,
Advances in intelligent systems and computing.
Singapore: Springer Singapore; 2019. p. 257–266.
[139] Ming S, Wang L, Hong Z, et al. Research on guideline
navigation system based on deep learning. 2021 4th
International Conference on Intelligent Robotics and
Control Engineering (IRCE); 2021. p. 96–. .
[140] Kammath AG, Nair KS, Sudar S, et al. Design of an intelli-
gent electric vehicle for blind. 7th International
Conference on Intelligent Systems and Control, ISCO
2013; 2013. p. 244–249.
[141] Hosseini MSK, Mokhtari M, Atyabi M. The intelligent elec-
tronic system especially for quadriplegics wheelchair and
blinds bicycle (new method in paralympics). 2005 ICSC
Congress on Computational Intelligence Methods and
Applications; Istanbul, Turkey. IEEE; 2005.
[142] Jim
enez MF, Mello RC, Bastos T, et al. Assistive locomotion
device with haptic feedback for guiding visually impaired
people. Med Eng Phys. 2020;80:18–25. doi: 10.1016/j.
medengphy.2020.04.002.
[143] Liang Y, Zhu W, Wang H, et al. Design of blind navigator
based on ultrasonic tester. J Shanghai Univ Electric Power.
2010;26(6):601–604.
[144] Huang C, Wang D, Gao Y, et al. About the design and
research of intelligent walking aid. China Sci Technol
Inform. 2013;10:150–153.
[145] Chen Z, Ni Z, Wang Y, et al. Design and realization of the
intelligent blind navigation. New Technol New Process.
2015;3:36–38.
[146] Ni Z. Design and implementation of an intelligent naviga-
tion device for the blind. Shanxi: Xi’an University of
Technology; 2016.
[147] Zhai J, Wang M. Research of the guide based on the pro-
cess of the ultrasonic characteristic signal. J Ordnance
Equip Eng. 2018;39(9):178–182.
[148] Zhao J, Guo B, Li D. Research and design on intelligent
obstacle avoidance apparatus based on microcontroller
12C5A60. Microprocessors. 2015;36(1):79–83.
[149] Rosas-Flores B, Hern
andez-Zavala A, Huerta-Ruelas J.
Lightweight obstacle detector based on scattered IR and
Lock-In filtering. Infrared Phys Technol. 2020;105:103157.
doi: 10.1016/j.infrared.2019.103157.
[150] Iqbal A, Farooq U, Mahmood H, et al. A low cost artificial
vision system for visually impaired people. 2009
International Conference on Computer and Electrical
Engineering, ICCEE 2009; 2009. p. 474–479.
[151] Khade S, Dandawate YH. Hardware implementation of
obstacle detection for assisting visually impaired people
in an unfamiliar environment by using raspberry Pi. In:
Communications in Computer and Information Science.
Singapore: Springer; 2016. p. 889–895.
[152] Ranaweera PS, Madhuranga SHR, Fonseka HFAS, et al.
Electronic travel aid system for visually impaired people.
2017 5th International Conference on Information and
Communication Technology (ICoIC7); 2017.
[153] Tang W. Tactile guidance system for the blind based on
digital image processing. J Phys Conf Ser. 2020;1486(4):
042037. doi: 10.1088/1742-6596/1486/4/042037.
[154] Ma’muriyah N, Yulianto A, Lili. Design prototype of audio
guidance system for blind by using raspberry pi and fuzzy
logic controller. J Phys Conf Ser. 2019;1230(1):1–10.
[155] Rahman A, Sadi MS. IoT enabled automated object recog-
nition for the visually impaired. Comput Methods Prog
Biomed Update. 2021;1:100015. doi: 10.1016/j.cmpbup.
2021.100015.
[156] Sammouda R, Alrjoub A. Mobile blind navigation system
using RFID. GSCIT 2015–Global Summit on Computer and
Information Technology–Proceedings; 2015. p. 1–4.
[157] Kayama K, Yairi IE, Igi S. Semi-autonomous outdoor
mobility support system for elderly and disabled people. IEEE
International Conference on Intelligent Robots and
Systems; 2003. p. 2606–2611.
[158] Al-Fahoum AS, Al-Hmoud HB, Al-Fraihat AA. A smart infra-
red microcontroller-based blind guidance system. Act
Passive Electron Compon. 2013;2013:1–7. doi: 10.1155/
2013/726480.
[159] Sövény B, Kovács G, Kardkovács ZT. Blind guide: a virtual
eye for guiding indoor and outdoor movement. J
Multimodal User Interfaces. 2015;9(4):287–297. doi: 10.
1007/s12193-015-0191-6.
[160] You J. A vibrotactile feedback based cooperative naviga-
tion system for the blind. Southeast University; 2018:1–83.
A SURVEY AND ANALYSIS OF NAVIGATION FOR THE BLIND 17
[161] Joshi A, Agrawal H, Agrawal P. Simultaneous localization
and mapping for visually impaired people for outdoor
environment. Proceedings of the Second International
Conference on Computer and Communication
Technologies Advances in Intelligent Systems and
Computing. New Delhi: Springer India; 2016. p. 107–115.
[162] Bitonto D, Corriero P, Pesare N, et al. Integrating com-
puter vision object recognition with location based serv-
ices for the blind. International Conference on Universal
Access in Human–Computer Interaction. Springer; 2014. p.
228–237.
[163] Roska T, Bálya D, Lázár A, et al. System aspects of a bionic
eyeglass. Proceedings–IEEE International Symposium on
Circuits and Systems; 2006. p. 161–164.
[164] Costa P, Fernandes H, Vasconcelos V, et al. Fiducials marks
detection to assist visually impaired people navigation. Int
J Digit Content Technol Appl. 2011;5(5):342–350. doi: 10.
4156/jdcta.vol5.issue5.38.
[165] José J, Du Buf JMH, Rodrigues JMF. Visual navigation for
the blind: path and obstacle detection. International
Conference on Pattern Recognition Applications and
Methods; 2012. p. 515–519.
[166] Tsai DM, Hsu H, Chiu WY. 3-D vision-assist guidance for
robots or the visually impaired. Ind Robot. 2014;41(4):351–
364. doi: 10.1108/IR-12-2013-427.
[167] Malūkas U, Maskeliūnas R, Damaševičius R, et al. Real time
path finding for assisted living using deep learning. J Univ
Comput Sci. 2018;24(4):475–487.
[168] Chen Z. Research on stereo matching algorithm for blind
travel. Jiangxi University of Technology; 2018:1–63.
[169] Meliones A, Llorente JL. Study and development of a
sonar obstacle recognition algorithm for outdoor blind
navigation. ACM International Conference Proceeding
Series. Association for Computing Machinery; 2019. p.
129–137.
[170] Real S, Araujo A. VES: a mixed-reality system to assist mul-
tisensory spatial perception and cognition for blind and
visually impaired people. Appl Sci. 2020;10(2):523. doi: 10.
3390/app10020523.
[171] Costa P, Fernandes H, Martins P, et al. Obstacle detection
using stereo imaging to assist the navigation of visually
impaired people. Procedia Comput Sci. 2012;14:83–93.
doi: 10.1016/j.procs.2012.10.010.
[172] Costa P, Fernandes H, Barroso J, et al. Obstacle detection
and avoidance module for the blind. World Automation
Congress Proceedings. IEEE; 2016. p. 1–6.
[173] Ivanov R. Real-time GPS track simplification algorithm for
outdoor navigation of visually impaired. J Netw Comput
Appl. 2012;35(5):1559–1567. doi: 10.1016/j.jnca.2012.02.002.
[174] Wang Z, Fang M, Lin X, et al. Route planning in handheld
blind navigation system based on RFID. Comput Eng Des.
2012;33(5):2063–2067.
[175] Yusof TST, Toha SF, Yusof HM. Path planning for visually
impaired people in an unfamiliar environment using par-
ticle swarm optimization. Procedia Comput Sci. 2015;76:
80–86. doi: 10.1016/j.procs.2015.12.281.
[176] Zhao M, Lu H, Yang S, et al. A fast robot path planning
algorithm based on bidirectional associative learning.
Comput Ind Eng. 2021;155:1–16.
[177] Xiaoli W. Research on key technology of local blind image
navigation based on scene matching. Changchun
University of Science and Technology; 2011:1–53.
[178] Wang X, Wang L. The teaching assistance system of Java
programming based on SSH structure. J Changchun Univ
Nat Sci Ed. 2010;20(5):10–12.
[179] Haraszy Z, Cristea DG, Micut S, et al. Efficient algorithm
for extracting essential head related impulse response
data for acoustic virtual reality development. WSEAS
International Conference on Systems; WSEAS CSCC multi-
conference. World Scientific and Engineering Academy and
Society (WSEAS); 2011. p. 315–320.
[180] Li J. Research on binocular stereo matching algorithms for
blind guider system. Wuhan University; 2013:1–117.
[181] Lu Y, Jiang J. Navigation algorithm and implementation
for blind based on GPS trajectory. J Comput Appl. 2013;
33(4):1161–1164. doi: 10.3724/SP.J.1087.2013.01161.
[182] Rajakaruna N, Rathnayake C, Abhayasinghe N, et al.
Inertial data based deblurring for vision impaired naviga-
tion. International Conference on Indoor Positioning &
Indoor Navigation. IEEE; 2014. p. 416–420.
[183] Rajakaruna N, Rathnayake C, Chan KY, et al. Image
deblurring for navigation systems of vision impaired peo-
ple using sensor fusion data. IEEE Ninth International
Conference on Intelligent Sensors. IEEE; 2014. p. 21–24.
[184] Wenyuan J, Tong W. Integrated navigation method of the
blind walking based on computer vision/GPS/MG attitude
measurement. Navig Control. 2017;16(5):13–20.
[185] Hu Q. Research on blind navigation algorithm based on
machine vision. Jiangsu: Nanjing University of Aeronautics
and Astronautics; 2020.
[186] Zeng Q, Wang J, Meng Q, et al. Seamless pedestrian navi-
gation methodology optimized for indoor/outdoor detec-
tion. IEEE Sensors J. 2018;18(1):363–374. doi: 10.1109/
JSEN.2017.2764509.
[187] Elbes M, Al-Fuqaha A. Design of a social collaboration and
precise localization services for the blind and visually
impaired. Procedia Comput Sci. 2013;21:282–291. doi: 10.
1016/j.procs.2013.09.037.
[188] Hu E, Deng Z, Xu Q, et al. Relative entropy-based Kalman
filter for seamless indoor/outdoor multi-source fusion
positioning with INS/TC-OFDM/GNSS. Cluster Comput.
2019;22(S4):8351–8361. doi: 10.1007/s10586-018-1803-1.
[189] Romic K, Galic I, Leventic H, et al. Pedestrian crosswalk
detection using a column and row structure analysis in
assistance systems for the visually impaired. Acta Polytech
Hung. 2021;18(7):25–45. doi: 10.12700/APH.18.7.2021.7.2.
[190] Le MC, Phung SL, Bouzerdoum A. Pedestrian lane detec-
tion for assistive navigation of blind people. Proceedings–
International Conference on Pattern Recognition (ICPR);
2012. p. 2594–2597.
[191] Ortigosa N, Morillas S. Fuzzy free path detection from dis-
parity maps by using least-squares fitting to a plane. J
Intell Robot Syst. 2014;75(2):313–330. doi: 10.1007/s10846-
013-9997-1.
[192] Ortigosa N, Morillas S, Peris-Fajarnés G. Obstacle-free
pathway detection by means of depth maps. J Intell Robot
Syst. 2011;63(1):115–129. doi: 10.1007/s10846-010-9498-4.
[193] Adjouadi M. A man-machine vision interface for sensing
the environment. J Rehabil Res Dev. 1992;29(2):57–76. doi:
10.1682/jrrd.1992.04.0057.
[194] Ståhl A, Newman E, Dahlin-Ivanoff S, et al. Detection of
warning surfaces in pedestrian environments: the impor-
tance for blind people of kerbs, depth, and structure of
tactile surfaces. Disabil Rehabil. 2010;32(6):469–482. doi:
10.3109/09638280903171543.
18 L. YUE ET AL.
[195] Tang W. Research on object detection and tracking of out-
door blind sidewalk obstacles based on deep learning.
Jiangxi: Jiangxi University of Science and Technology; 2021.
[196] Tang H, Zhu Z. A segmentation-based stereovision
approach for assisting visually impaired people.
International Conference on Computers Helping People
with Special Needs. Springer-Verlag; 2012. p. 581–587.
[197] Farah RN, Irwan N, Zuraida RL, et al. Modified virtual
semi-circle approach for a reactive collision avoidance of
a mobile robot in an outdoor environment. Appl Mech
Mater. 2014;679:171–175. doi: 10.4028/www.scientific.net/
AMM.679.171.
[198] Faibish S, Abramovitz M. Perception and navigation of
mobile robots. Proceedings of the 1992 IEEE International
Symposium on Intelligent Control; 1992. p. 335–340.
[199] Zhang H, Cheng F. Deep learning-based navigation path
planning with collision avoidance for the blind. J Nanjing
Univ Inform Sci Technol Nat Sci. 2021;14(2):220–226.
[200] Tian L, Tian Y, Yi C. Detecting good quality frames in vid-
eos captured by a wearable camera for blind navigation.
Proceedings–2013 IEEE International Conference on
Bioinformatics and Biomedicine, BIBM 2013. IEEE; 2013. p.
334–337.
[201] Kim Y, Harders M, Gassert R. Identification of vibrotactile
patterns encoding obstacle distance information. IEEE
Trans Haptics. 2015;8(3):298–305. doi: 10.1109/TOH.2015.
2415213.
[202] Leiva KMR, Jaén-Vargas M, Codina B, et al. Inertial
measurement unit sensors in assistive technologies for visually
impaired people: a review. Sensors. 2021;21(14):1–26.
[203] Lee KM, Li M, Lin CY. Magnetic tensor sensor and way-
finding method based on geomagnetic field effects with
applications for visually impaired users. IEEE/ASME Trans
Mechatron. 2016;21(6):2694–2704. doi: 10.1109/TMECH.
2016.2582850.
[204] Ritterbusch S, Jaworek G. Camassia: monocular interactive
mobile way sonification. In: Miesenberger K,
Kouroupetroglou G, editors. BT–computers helping people
with special needs. Cham: Springer International
Publishing; 2018. p. 12–18.
[205] Agrawal M, Konolige K, Bolles RC. Localization and map-
ping for autonomous navigation in outdoor terrains: a
stereo vision approach. 2007.
[206] Ahmetovic D, Avanzini F, Baratè A, et al. Sonification of
rotation instructions to support navigation of people with
visual impairment. 2019 IEEE International Conference on
Pervasive Computing and Communications (PerCom);
2019. p. 1.
[207] Ying JC, Li CY, Wu GW, et al. A deep learning approach to
sensory navigation device for blind guidance.
Proceedings–20th International Conference on High
Performance Computing and Communications, 16th
International Conference on Smart City and 4th
International Conference on Data Science and Systems,
HPCC/SmartCity/DSS 2018; 2019. p. 1195–1200.
[208] Gupta A, Khandelwal S, Gandhi T. Blind navigation using
ambient crowd analysis. 2018 IEEE 8th International
Advance Computing Conference (IACC). IEEE; 2018. p.
131–135.
[209] Lin Q, Han Y, Hahn H. Vision-based navigation using top-
view transform and beam-ray model. 2011 International
Conference on Advanced Computer Science and
Information Systems (ICACSIS) [Internet]; 2011. p. 371–376.
Available from: https://www.researchgate.net/publication/
254048246
[210] Iakovidis DK, Diamantis D, Dimas G, et al. Digital enhance-
ment of cultural experience and accessibility for the visu-
ally impaired. In: Paiva S, editor. Technological trends in
improved mobility of the visually impaired. EAI/Springer
Innovations in Communication and Computing. Springer
Science and Business Media Deutschland GmbH. Cham:
Springer; 2020. p. 237–271.
[211] Chen J, Takagi N. A pattern recognition method for auto-
mating tactile graphics translation from hand-drawn
maps. 2013 IEEE International Conference on Systems,
Man, and Cybernetics. IEEE; 2013. p. 4173–4178.
[212] Zhongliang F, Baofeng W, Yulong H. Research of blind e-
map based on geo-space cognition. J Nanchang Univ Eng
Technol. 2015;37(3):295–299.
[213] Buzzi MC, Buzzi M, Leporini B, et al. Making visual maps
accessible to the blind. In: Lecture Notes in Computer
Science (Including Subseries Lecture Notes in Artificial
Intelligence and Lecture Notes in Bioinformatics); Vol.
6766; 2011. p. 271–280.
[214] Smelser NJ, Baltes PB. Tactile maps in geography. In:
International encyclopedia of the social & behavioral sci-
ences; 2001. p. 15435–15437.
[215] Satoi T, Koeda M, Yoshikawa T. Virtual haptic map using
force display device for visually impaired. IFAC Proc Vol.
2009;42(16):645–650. doi: 10.3182/20090909-4-JP-2010.
00109.
[216] de Oliveira ST, Suemitsu K, Okimoto MLLR. Design of a
tactile map: an assistive product for the visually impaired.
Adv Intell Syst Comput. 2016;485:711–719.
[217] Abd Hamid NN, Wan Adnan WA, Razak FHA. Identifying
sound cues of the outdoor environment by blind people
to represent landmarks on audio-tactile maps. Lecture
Notes in Computer Science (Including Subseries Lecture
Notes in Artificial Intelligence and Lecture Notes in
Bioinformatics); Vol. 10279; 2017. p. 279–290.