Fig 2 - uploaded by Dirk Wollherr
The software architecture of the navigation subsystem of ACE.

Source publication
Conference Paper
Full-text available
One of the greatest challenges nowadays in robotics is the advancement of robots from industrial tools to companions and helpers of humans, operating in natural, populated environments. In this respect, the Autonomous City Explorer (ACE) project aims to combine the research fields of autonomous mobile robot navigation and human robot interaction. A...

Contexts in source publication

Context 1
... software architecture of the navigation and behavior selection subsystem of the ACE robot is illustrated in Fig. 2. It is divided into three layers: (i) Processing Layer, (ii) Control Layer and (iii) Execution ...
Context 2
... navigating through urban environments, a robot is faced with different kinds of situations that require different kinds of navigation behaviors. It must be able to explore the environment, follow a sidewalk, safely cross a street, approach a person, or follow a certain direction. As can be seen in Fig. 2, the Control Layer is split into the Behavior Selection and the Behavior Control modules, which are described later in this section. Next, the behaviors of the ACE robot are ...
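The split described in the context, a selector that picks one navigation behavior from the perceived situation and a separate controller that turns it into commands, can be sketched roughly as follows. This is an illustrative sketch only: the trigger conditions, flag names, and command strings are assumptions, not the ACE implementation; only the behavior names come from the text.

```python
# Minimal sketch of a Behavior Selection / Behavior Control split.
# Flag names and command strings are illustrative assumptions; the
# behavior names follow those listed in the source context.

def select_behavior(situation):
    """Map perceived situation flags to one navigation behavior."""
    if situation.get("at_crossing"):
        return "cross_street"
    if situation.get("person_to_approach"):
        return "approach_person"
    if situation.get("on_sidewalk"):
        return "follow_sidewalk"
    if situation.get("target_direction") is not None:
        return "follow_direction"
    return "explore"

def behavior_control(behavior, situation):
    """Translate the selected behavior into a coarse motion command."""
    commands = {
        "cross_street": "stop_and_check_traffic",
        "approach_person": "drive_toward_person",
        "follow_sidewalk": "track_sidewalk_edge",
        "follow_direction": "head_{:.0f}_deg".format(
            situation.get("target_direction", 0.0)),
        "explore": "frontier_exploration",
    }
    return commands[behavior]

if __name__ == "__main__":
    sit = {"on_sidewalk": True}
    b = select_behavior(sit)
    print(b, "->", behavior_control(b, sit))
```

Keeping selection (what to do) separate from control (how to do it) lets each behavior's controller be developed and tested independently, which is the point of the two-module design described above.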
Context 3
... was pointing in the direction the ACE robot should follow in order to reach Marienplatz. The direction is perceived by the vision system with a detection rate of 80.2% and an accuracy of 6.8°, and sent to the Behavior Control module. There it is transformed and forwarded to the Path Planning module, which is active in this situation. In Fig. 6(b2) the Voronoi graph (blue lines), bounding boxes (blue rectangles), visibility graph (cyan) and resulting path (red) can be seen. The shown path illustrates the advantage of the dual path planning approach. The first two waypoints (red dots) correspond to nodes retrieved from the Voronoi graph. While the path would proceed straight to ...
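The dual idea described in the context, a coarse path from a precomputed roadmap refined by direct line-of-sight connections, can be sketched in a generic form. This is not the ACE planner: the grid representation, the BFS roadmap stage, and the greedy shortcut pass are illustrative assumptions standing in for the Voronoi-graph and visibility-graph combination.

```python
# Two-stage planning sketch: (1) coarse shortest path on a grid roadmap,
# (2) shortcut pass removing waypoints with free line of sight, loosely
# mirroring a Voronoi-graph + visibility-graph combination. The grid and
# both stages are illustrative assumptions, not the ACE implementation.

from collections import deque

def bfs_path(grid, start, goal):
    """Coarse shortest path on a 4-connected grid (0 = free, 1 = obstacle)."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    q = deque([start])
    while q:
        cur = q.popleft()
        if cur == goal:
            break
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cur
                q.append((nr, nc))
    if goal not in prev:
        return None
    path, node = [], goal
    while node is not None:
        path.append(node)
        node = prev[node]
    return path[::-1]

def line_of_sight(grid, a, b):
    """Check that the sampled straight segment a->b stays in free cells."""
    (r0, c0), (r1, c1) = a, b
    steps = max(abs(r1 - r0), abs(c1 - c0), 1)
    for i in range(steps + 1):
        r = round(r0 + (r1 - r0) * i / steps)
        c = round(c0 + (c1 - c0) * i / steps)
        if grid[r][c] == 1:
            return False
    return True

def shortcut(grid, path):
    """Greedy visibility shortcut: jump to the farthest visible waypoint."""
    out, i = [path[0]], 0
    while i < len(path) - 1:
        j = len(path) - 1
        while j > i + 1 and not line_of_sight(grid, path[i], path[j]):
            j -= 1
        out.append(path[j])
        i = j
    return out
```

The roadmap stage guarantees a connected path even in cluttered areas, while the visibility stage removes the detours a sparse roadmap introduces, which is the advantage the context attributes to the dual approach.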

Similar publications

Article
Full-text available
For robots navigating using only a camera, illumination changes in indoor environments can cause re-localization failures during autonomous navigation. In this paper, we present a multi-session visual SLAM approach to create a map made of multiple variations of the same locations in different illumination conditions. The multi-session map can then...
Conference Paper
Full-text available
This paper shows that the field of visual SLAM has matured enough to build a visual SLAM system from open source components. The system consists of feature detection, data association, and sparse bundle adjustment. For all three modules we evaluate different libraries w.r.t. ground truth. We also present an extension of the SLAM system to create de...
Article
Full-text available
In large-scale environments, scale drift is a crucial problem of monocular visual simultaneous localization and mapping (SLAM). A common solution is to utilize the camera height, which can be obtained using the reconstructed 3D ground points (3DGPs) from two successive frames, as prior knowledge. Increasing the number of 3DGPs by using more proceed...
Article
Full-text available
Detecting already-visited regions based on their visual appearance helps reduce drift and position uncertainties in robot navigation and mapping. Inspired from content-based image retrieval, an efficient approach is the use of visual vocabularies to measure similarities between images. This way, images corresponding to the same scene region can be...
Conference Paper
Full-text available
Self localization and mapping with vision is still an open research field. Since redundancy in the sensing suite is too expensive for consumer-level robots, we base on vision as the main sensing system for SLAM. We approach the problem with 3D data from a trinocular vision system. Past experience shows that problems arise as a consequence of inaccu...

Citations

... One fundamental capability for robots to be able to establish common ground and joint attention with humans is the ability to respond to and perform deictic, or pointing, gestures. These have been studied and evaluated in various contexts, including in collaborations between humans and robots [6], museum exhibits [7], and even robots asking humans for help [8]. Symbolic gestures, among others, can be used by robots to establish themselves as social agents and contributors to an interaction [9], for example by waving "hello" or "goodbye" in greeting. ...
Conference Paper
Full-text available
In this paper, we present our work in close-distance non-verbal communication with the tabletop robot Haru through hand gestural interaction. We implemented a novel hand gestural understanding system by training a machine-learning architecture for real-time hand gesture recognition with the Leap Motion. The proposed system is activated based on the velocity of a user's palm and index finger movement, and subsequently labels the detected movement segments under an early classification scheme. Our system is able to combine multiple gesture labels for recognition of consecutive gestures without clear movement boundaries. System evaluation is conducted on data simulating real human-robot interaction conditions, taking into account relevant performance variables such as movement style, timing and posture. Our results show robustness in hand gesture classification performance under varying conditions. We furthermore examine system behavior under sequential data input, paving the way towards seamless and natural real-time close-distance hand-gestural communication in the future.
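The velocity-based activation this abstract describes can be illustrated with a toy segmenter: a stream of palm speeds is cut into candidate gesture windows wherever the speed stays above a threshold. The threshold, minimum length, and sample data are assumptions for illustration, not values from the paper.

```python
# Toy velocity-threshold segmentation, illustrating the activation idea
# in the abstract. Threshold and minimum length are assumed values.

def segment_gestures(speeds, threshold=0.2, min_len=2):
    """Return (start, end) index pairs of runs where speed > threshold."""
    segments, start = [], None
    for i, v in enumerate(speeds):
        if v > threshold and start is None:
            start = i                      # movement onset detected
        elif v <= threshold and start is not None:
            if i - start >= min_len:       # keep only sustained movements
                segments.append((start, i))
            start = None
    if start is not None and len(speeds) - start >= min_len:
        segments.append((start, len(speeds)))
    return segments

if __name__ == "__main__":
    stream = [0.0, 0.0, 0.5, 0.6, 0.0, 0.0, 0.3, 0.4, 0.5, 0.0]
    print(segment_gestures(stream))  # two candidate gesture windows
```

Each resulting window would then be handed to a classifier; the early-classification scheme in the abstract labels windows before they fully close, which a sketch like this does not capture.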
... Said differently, perceiving and analyzing the environment around the user would make it possible to predict potential locomotion intentions several steps in advance. Intriguingly, comparable terrain detection systems have been widely developed for mobile robotic applications in various environments such as urban [16], off-road [17], and planetary [18] exploration. Typically, these systems also combine several proprioceptive and exteroceptive sensory modalities such as cameras, IMUs, or 2D and 3D lasers [19]. ...
Article
Terrain detection systems have been developed for a large body of applications. For instance, a bionic leg prosthesis would have to adapt its behavior as a function of the terrain, in order to restore sound lower-limb biomechanics to the amputee. Visually impaired people benefit from such systems in order to collect information about their locomotion environment and avoid obstacles. Finally, mobile robots use them for estimating terrain traversability and adjusting control algorithms as a function of the surface type. This diversity of applications led to a large repertoire of systems, regarding both hardware (sensors, processing unit) and software used for classification. This paper provides an extended review of these systems, with a specific focus on the assistance of disabled walkers. More precisely, it overviews the sensory systems and algorithms that were implemented to identify different locomotion terrains in indoor or urban environments (flat ground, stairs, slopes) such that they are or can be worn by a human user and run in real time. Contributions from mobile robotics are also included, provided that they could be adapted to a scenario of locomotion assistance. The systems are classified into two categories: those relying on proprioceptive sensors only and those further using exteroceptive sensors. Contributions from both categories are then compared according to their main specifications, such as accuracy and prediction time. The paper unambiguously shows that systems with exteroceptive sensors have higher prediction capability than systems with proprioceptive sensors only, and should thus be favored for assistive devices requiring prediction of transitions between locomotion tasks.
... Path planning in urban environments has received substantial attention in the robotics community. Several urban navigation robots such as Obelix [18] or the Autonomous City Explorer [19] use A* on topo-metric maps. ...
Conference Paper
Full-text available
Although most robots use probabilistic algorithms to solve state estimation problems, path planning is often performed without considering the uncertainty about the robot's position. Uncertainty, however, matters in planning, but considering it often leads to computationally expensive algorithms. In this paper, we investigate the problem of path planning considering the uncertainty in the robot's belief about the world, in its perceptions and in its action execution. We propose the use of an uncertainty-augmented Markov Decision Process to approximate the underlying Partially Observable Markov Decision Process, and we employ a localization prior to estimate how the belief about the robot's position propagates through the environment. This yields a planning approach that generates navigation policies able to make decisions according to the degree of uncertainty while being computationally tractable. We implemented our approach and thoroughly evaluated it on different navigation problems. Our experiments suggest that we are able to compute policies that are more effective than approaches that ignore the uncertainty, and also to outperform policies that always take the safest actions.
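The augmentation idea in this abstract, adding a discretized uncertainty level to each state so the planner can trade path length against localization quality, can be sketched on a toy problem. Everything here is an assumption for illustration (a 1-D corridor, hand-picked rewards and level counts); it only shows the state-augmentation mechanism, not the paper's method.

```python
# Toy "uncertainty-augmented" MDP: each state is (cell, uncertainty level).
# Moving raises uncertainty, designated cells reset it (good localization),
# and high uncertainty is penalized. Corridor layout, rewards, and level
# counts are illustrative assumptions.

def value_iteration(n_cells, goal, localize_cells, max_unc=2,
                    gamma=0.95, unc_penalty=0.5, iters=200):
    actions = (-1, +1)
    V = {(c, u): 0.0 for c in range(n_cells) for u in range(max_unc + 1)}
    policy = {}
    for _ in range(iters):
        for c in range(n_cells):
            for u in range(max_unc + 1):
                if c == goal:
                    V[(c, u)] = 0.0  # terminal state
                    continue
                best, best_a = float("-inf"), None
                for a in actions:
                    nc = min(max(c + a, 0), n_cells - 1)
                    # uncertainty resets at localization cells, else grows
                    nu = 0 if nc in localize_cells else min(u + 1, max_unc)
                    r = (10.0 if nc == goal else -1.0) - unc_penalty * nu
                    q = r + gamma * V[(nc, nu)]
                    if q > best:
                        best, best_a = q, a
                V[(c, u)] = best
                policy[(c, u)] = best_a
    return V, policy
```

Because the uncertainty level is part of the state, the resulting policy can choose detours through well-localized cells when the penalty makes that worthwhile, which is the qualitative behavior the abstract describes.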
... II. RELATED WORK Although planning under uncertainty has received substantial attention, most robotic systems such as Obelix [10] or the Autonomous City Explorer [11] still use A* to navigate in urban environments. Navigation in urban environments often exploits topological or topo-metric maps [9]. ...
... These robots are generally even more difficult to develop but they are free from constant human assistance. An autonomous explorer robot was presented in [3][4][5][6] which is capable of navigating between locations in an intra-urban environment. Mobile robots are also commonly used for measurement and reconnaissance operations in human-risky environments such as operations under toxic air conditions. ...
Chapter
The current work reports on an obstacle avoidance and object tracking algorithm integrated and tested on a robot. The system is designed with high-torque geared DC motors and has a payload capacity of up to 3 kg. The robot is fully autonomous and is designed to patrol high-security/hazardous zones and dynamically report any suspicious activity, which, when observed, initiates an alarm to security for further action while it continues to track and report the suspect. This has a major advantage over conventional CCTV systems in terms of cost and memory requirements and does not require constant human supervision. The proposed robot uses wheel encoders, an IC-based gyroscope, an IR line laser, and a spy camera as the basic sensing elements. It also has a smart charging feature which makes it energy efficient.
... A path planning method based on a hierarchical hybrid algorithm is proposed by [11], and [12] also focuses on the architecture that makes up the navigation subsystem of the Autonomous City Explorer. Meanwhile, many studies focus on algorithms for obstacle avoidance: an enhancement to the earlier developed vector field histogram method is presented for mobile robots [13], and a compound PRM method is presented for path planning of a tractor-trailer mobile robot [14]. A modified RRT algorithm is applied in [15], and an effective algorithm based on A* (D*) is proposed in [16] for an uncooperative environment. ...
Article
Full-text available
The Jump Point Search (JPS) algorithm is adopted for local path planning of a driverless car in an urban environment; it is a fast search method applied in path planning. Firstly, a vector Geographic Information System (GIS) map, including Global Positioning System (GPS) position, direction, and lane information, is built for global path planning. Secondly, the GIS map database is utilized in global path planning for the driverless car. Then, the JPS algorithm is adopted to avoid front obstacles and to find an optimal local path for the driverless car in the urban environment. Finally, 125 different simulation experiments in the urban environment demonstrate that JPS can successfully find an optimal and safe path while having lower time complexity compared with the Vector Field Histogram (VFH), the Rapidly Exploring Random Tree (RRT), A*, and the Probabilistic Roadmaps (PRM) algorithms. Furthermore, JPS is shown to be useful in the structured urban environment.
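For context on the comparison in this abstract, the baseline that JPS accelerates is plain A* on a grid: JPS prunes symmetric expansions by jumping along straight and diagonal runs, but returns the same optimal cost. The following is a minimal A* sketch on an 8-connected grid (not JPS itself); the grid and cost model are illustrative assumptions.

```python
# Minimal A* on an 8-connected grid with an octile-distance heuristic.
# This is the baseline search that Jump Point Search accelerates by
# pruning symmetric expansions; the grid and costs are illustrative.

import heapq
import math

def astar(grid, start, goal):
    rows, cols = len(grid), len(grid[0])

    def h(n):  # octile distance: admissible on 8-connected grids
        dr, dc = abs(n[0] - goal[0]), abs(n[1] - goal[1])
        return max(dr, dc) + (math.sqrt(2) - 1) * min(dr, dc)

    open_heap = [(h(start), 0.0, start)]
    g = {start: 0.0}
    prev = {}
    while open_heap:
        _, gc, cur = heapq.heappop(open_heap)
        if cur == goal:  # reconstruct path back to start
            path = [cur]
            while cur in prev:
                cur = prev[cur]
                path.append(cur)
            return path[::-1]
        if gc > g.get(cur, float("inf")):
            continue  # stale heap entry
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr == 0 and dc == 0:
                    continue
                nr, nc = cur[0] + dr, cur[1] + dc
                if not (0 <= nr < rows and 0 <= nc < cols) or grid[nr][nc]:
                    continue
                step = math.sqrt(2) if dr and dc else 1.0
                ng = gc + step
                if ng < g.get((nr, nc), float("inf")):
                    g[(nr, nc)] = ng
                    prev[(nr, nc)] = cur
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None
```

On uniform-cost grids with many equivalent paths, this expands far more nodes than JPS, which is where the abstract's reported time-complexity advantage comes from.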
... Lidoris et al. [15] conducted the ACE project for developing a robot that can reach a destination without an existing map or GPS sensors and move outdoors. The main contribution of their paper was that the robot could reach a destination by asking pedestrians. ...
... Therefore, the robot has to rely on different sensors and switch between different navigation algorithms to safely reach the target location. Lidoris et al. [6] present a robot navigation method in unknown urban environments that relies only on information extracted through interaction with passers-by and the robot's local perception capabilities. Thrapp et al. [7] present robust localization for robots in outdoor environments using GPS and an extended Kalman filter. ...
Conference Paper
Full-text available
Intelligent robot navigation in urban environments is still a challenge. In this paper we test whether it is possible to train neural networks to control a robot to reach a target location in urban dynamic environments. The robot has to rely on GPS and a compass sensor to navigate from the starting point to the goal location in an environment with moving obstacles. We compare the performance of three neural architectures in different environment settings. The results show that a neural controller with separated hidden neurons has a fast response to sensory input. The performance of the evolved neural controllers is also tested in real robot navigation. In addition to the neural network based navigation, the robot also has to switch between other navigation algorithms to reach the target location on the university campus.
... In the last several decades, major technological achievements have been accomplished with respect to the development of autonomous mobile robots for outdoor environments. Robust and reliable systems for navigation [6], obstacle avoidance [7], and localization [8] have been successfully integrated into several kinds of robotic platforms [9]. A number of methods have been developed to allow robots to navigate safely around people in specific, typically nongeneralizable tasks [10]. ...
Conference Paper
Full-text available
This study describes and evaluates two new methods for finding and following people in urban settings using a humanoid service robot: the Continuous Real-time POMCP method, and its improved extension called the Adaptive Highest Belief Continuous Real-time POMCP follower. They are able to run in real time, in large continuous environments. These methods make use of the online search algorithm Partially Observable Monte-Carlo Planning (POMCP), which in contrast to other previous approaches can plan under uncertainty on large state spaces. We compare our new methods with a heuristic person follower and demonstrate that they obtain better results by testing them extensively in both simulated and real-life experiments. More than two hours and over 3 km of autonomous navigation have been carried out during real-life experiments with a mobile humanoid robot in urban environments.
... These robots are generally even more difficult to develop but they are free from constant human assistance. An autonomous explorer robot was presented in [3] which is capable of navigating between locations in an intra-urban environment. Mobile robots are also commonly used for measurement and reconnaissance operations in human-risky environments such as operations under toxic air conditions. ...
Article
This paper presents the experimental application of an autonomous mobile robot for gas leak detection in indoor environments. The application is focused on automating a human-risky operation in indoor areas. The goal of the autonomous mobile robot is the localization of a toxic gas leak source. The mobile robot therefore has to explore the whole area and perform a self-localization procedure based on a SLAM method and a LIDAR sensor. The mobile robot measures gas concentration using a photoionization detector. The experiments were conducted in a large indoor environment in a university facility with a simulated gas leak source. The combination of the results from the self-localization procedure with the information from the sensors allows the estimation of the gas leak source location.
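The final step this abstract describes, combining robot poses with concentration readings to estimate the source location, can be illustrated in its simplest form: pick the pose where the (smoothed) concentration is highest. The data, smoothing window, and function name are assumptions for illustration; the paper's actual estimation procedure is not specified here.

```python
# Toy source-location estimate: pair each robot pose with its gas reading
# and return the pose with the highest windowed mean concentration.
# Data, window size, and function name are illustrative assumptions.

def estimate_source(poses, readings, window=1):
    """Return the pose whose windowed mean concentration is highest."""
    best_i, best_val = 0, float("-inf")
    for i in range(len(readings)):
        lo, hi = max(0, i - window), min(len(readings), i + window + 1)
        mean = sum(readings[lo:hi]) / (hi - lo)  # smooth out sensor noise
        if mean > best_val:
            best_i, best_val = i, mean
    return poses[best_i]

if __name__ == "__main__":
    poses = [(0, 0), (1, 0), (2, 0), (3, 0), (4, 0)]
    readings = [0.1, 0.5, 2.0, 0.3, 0.1]  # ppm-like values along the path
    print(estimate_source(poses, readings))
```

The windowed mean guards against a single noisy spike dominating the estimate; a real system would add a concentration model and the SLAM pose uncertainty.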