Obstacle-avoiding robot with IR and PIR motion sensors
R Ismail1*, Z Omar1 and S Suaibun2
1Electrical Engineering Department, Polytechnic Kota Bharu, Malaysia
2Mechanical Engineering Department, Polytechnic Kota Bharu, Malaysia
*roslindaismail.poli@1govuc.gov.my
Abstract. An obstacle-avoiding robot was designed, constructed and programmed for potential use in education and research. The developed robot moves in a particular direction once the infrared (IR) and passive infrared (PIR) sensors sense a signal, while avoiding the obstacles in its path. The robot can also perform desired tasks in unstructured environments without continuous human guidance. The hardware was integrated on one application board as an embedded system design. The software was developed in C++ and compiled with Arduino IDE 1.6.5. The main objective of this project is to provide simple guidelines to polytechnic students and beginners who are interested in this type of research. It is hoped that this robot could benefit students who wish to carry out research on IR and PIR sensors.
1. Introduction
Many robots for automation and navigation, such as wall-following, edge-following, human-following and obstacle-avoiding robots, have been developed in recent years. An obstacle-avoiding robot evades the obstacles it encounters on the path towards its operational goal. Owing to the reliability, accessibility and cost effectiveness of mobile robots in industrial and technical applications, obstacle-avoiding robots are very important on the factory floor. On the other hand, Unmanned Aerial Vehicles (UAVs) play a vital role in defence as well as civilian applications [14]. Military applications include reconnaissance, surveillance, battle damage assessment and communications, while civilian applications include disaster management, remote sensing and traffic monitoring. Many UAV applications need the capability to navigate in urban environments or unknown terrains containing obstacles of different types and sizes, so a basic requirement for an autonomous UAV is to detect obstacles in its path and avoid them. This paper proposes an example of an obstacle-avoiding robot algorithm and the design of a robot base using IR and PIR sensors. The developed robot can be used as a platform for several educational, research or industrial applications.
The behaviour of a mobile robot is dictated by the interaction between the program running on the robot (the ‘task’), the physical hardware of the robot (the way its sensors and motors work) and the terrain (environment) [4]. These three fundamental components are shown in Figure 1. The development in this paper aims to highlight the hardware and software implementation required to design an autonomous platform. It serves research purposes such as developing software modules that realize obstacle avoidance and investigating the interaction between the robot task and the terrain. The platform also serves educational purposes by revealing the latest technology for the different sensors and actuators used in mobile robot design [19]. Besides that, it provides the opportunity to study how to program the microcontroller chip to acquire sensor data and react autonomously with the environment in order to perform tasks.
AEROTECH VI - Innovation in Aerospace Engineering and Technology IOP Publishing
IOP Conf. Series: Materials Science and Engineering 152 (2016) 012064 doi:10.1088/1757-899X/152/1/012064
Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
Published under licence by IOP Publishing Ltd
Figure 1. Fundamental components of obstacle avoidance
2. Literature review
Robots need miscellaneous sensors to obtain information about the world around them. Sensors help detect the position, velocity, acceleration and range of objects in the robot's workspace. A variety of sensors are used to detect the range of an object; one of the most common range finders is the ultrasonic transducer. Vision systems are also used to greatly improve the robot's versatility, speed and accuracy in complex and difficult tasks. Varun et al. [13] developed an obstacle-avoiding robot equipped with a pan-tilt-mounted vision system. In this case, the robot uses histograms of images obtained from a monocular, monochrome camera to detect and avoid obstacles in a dynamic environment. Kim et al. [1] presented the detection of moving obstacles (particularly walking humans) using a single camera attached to a mobile robot, where objects moving near the robot are found by block-based motion estimation. Shoval et al. [8] described the use of a mobile robot obstacle avoidance system as a guidance device for blind and visually impaired people: electronic signals are sent to the mobile robot's motor controllers, and auditory signals guide the blind traveller around the obstacles. Kumar [3] proposed an alternative, cost-effective and simplified design of the obstacle avoidance robot using three ultrasonic sensors. Borenstein et al. [9] built a mobile robot system capable of performing various tasks for physically disabled people; the developed robot uses ultrasonic range finders for detection and mapping to avoid collision with unexpected obstacles. Sathiyanarayanan et al. [10] developed self-controlled robots for military purposes that use GPS and a magnetic compass and adjust their strategies based on the surroundings using path planning and obstacle detection algorithms. Bhanu et al. [12] built a system for obstacle detection during rotorcraft low-altitude flight, analysing in detail the requirements of an obstacle detection system based on various rotorcraft motion constraints. Hubballi et al. [11] discussed smart distance measurement using an IR sensor and found that a major drawback of IR-based sensors is that they can only detect objects at short distances.
3. Sensors selection
The objective of an obstacle-avoiding robot is to enable autonomous functioning without human supervision. Many sensors are available for obstacle detection, such as the ultrasonic sensor, infrared sensor, camera and LIDAR (a laser-based sensor system), the last of which has been considered one of the most accurate schemes for generating spatial information about the shape and surface characteristics of an object [18]. In this paper, infrared sensors are used to accomplish obstacle avoidance. IR and PIR sensors have been placed on the prototype for obstacle detection as shown in Figure 2.
Accuracy at low cost is the constraint of this research, hence the IR sensor was selected for the design. Stereo camera vision does not perform well under some environmental conditions, such as plain walls, glass surfaces or poor lighting, whereas the IR sensor can be used to improve the overall vision system of a mobile robot [17]. IR sensors are widely used for measuring distances, so they can be used in robotics for obstacle avoidance. IR sensors also have a faster response time than ultrasonic sensors [17]. In addition, the power consumption of an IR sensor is lower than that of an ultrasonic sensor [11]. An active infrared (IR) sensor combines an emitter and a detector operating at the same wavelength; it is also known as a photoelectric sensor working with reflective surfaces [15]. IR sensors can be categorized as retro-reflective sensors and diffuse reflection sensors. Retro-reflective sensors suit harsh environmental conditions and have a much larger detection range than diffuse reflection sensors [15]. IR sensors use a specific light sensor that detects a selective light wavelength in the IR spectrum. When an object is close to the sensor, the light from the LED bounces off the object and into the light sensor as shown in Figure 3.
Figure 2. Position of the IR and PIR sensors
Figure 3. IR sensor object detection
3.1. IR and PIR sensors operation
Infrared sensors detect an object's distance using infrared radiation. When the emitted beam hits an object, the light returns to the receiver at an angle after reflection; this method of triangulation is shown in Figure 4. PIR sensors, also known as pyroelectric infrared sensors, passive infrared sensors or IR motion sensors, detect differences in temperature, i.e. the thermal radiation of a human body or an animal [15]. A PIR sensor operates on radiated body heat, as shown in Figure 5. The hotter the detected object, the more radiation the PIR sensor receives.
Figure 4. IR sensor working principle [15]
Figure 5. PIR sensor object detection
Object temperature calculation is based on the Stefan-Boltzmann law, which states that the power radiated per unit area of a body is proportional to the fourth power of its absolute temperature (j = sigma T^4).
The PIR sensor has a relatively lower power consumption than the IR sensor. The PIR sensor also provides accurate detection in narrow areas and works well with microcontrollers [15]. Both IR and PIR sensors can act as transducers, since they take an infrared signal as input and convert it to an analogue electrical output signal [15].
3.2. Limitations of IR and PIR sensors
The performance of IR sensors is limited by their poor tolerance to light reflections, such as ambient light or bright object colours. There is no object recognition in the dead-zone area; for example, the Sharp GP2D12 IR distance sensor has a dead zone between 0 and 4 cm. IR sensors also give inaccurate detection results with transparent or brightly coloured materials. Detection results also depend on the weather conditions, and the sensing reliability of IR sensors decreases with moisture and humidity. Furthermore, IR sensors can pick up IR radiation from sunlight, which can cause correctable or non-correctable errors at the output. Besides that, if an analogue IR sensor is used, signal losses will occur at the amplifier circuit. Meanwhile, the PIR motion sensor needs a long calibration time and is sensitive to thermal radiation. The PIR sensor is also insensitive to very slow motions and to stationary objects.
4. Experiment set up
The circuit diagram of the obstacle-avoiding robot is given in Figure 6. The hardware developed consists of an Atmel ATmega32 microcontroller, an IR sensor, two DC motors as a differential driving system, a PIR (passive infrared) sensor and an L293D motor driver. The ATmega32 microcontroller is the central brain of the mobile robot. Figure 7 shows the chassis of the robot. The IR detector (rear) is connected to pin PD6 of the ATmega32. If any object is located at the rear of the robot frame, the IR sensor output alerts the microcontroller that an obstacle is detected. The IR sensor used is shown in Figure 8. The PIR sensor (front) is connected to I/O pin PD7. If any object is moving in front of the robot, the PIR sensor output alerts the microcontroller that an obstacle is detected. The motion sensor has a 90-degree field of view and is triggered when a warm object moves across the area it faces; it is very sensitive and will trigger with just a hand movement. The PIR motion sensor is shown in Figure 9.
Figure 6. Circuit diagram of the obstacle-avoiding robot
Figure 7. Chassis of obstacle avoiding robot
Figure 8. IR sensor
Figure 9. PIR sensor
The L293D motor driver (H-bridge) decides which motor is moved or stopped in accordance with the incoming signal from the ATmega32 microcontroller. The selected dimensions of the mobile base are suitable for installing sensors to interact with the environment. The wheels shown in Figure 7 provide sufficient traction and stability for the robot to cover all of the desired terrain. A standard wheel and a caster wheel were used in this design: the standard wheel rotates around the motorized axle and around the contact point, while the caster wheel rotates around an offset steering joint and around the contact point. The software is written using the Arduino Integrated Development Environment, or Arduino Software (IDE). It contains a text editor for writing code, a message area, a text console, a toolbar with buttons for common functions and a series of menus. It connects to the Arduino hardware to upload programs and communicate with them. Once the compiled file (hex code) is obtained, it can be downloaded into the microcontroller.
5. Result
A brief overview of the software functions and the system architecture is shown in Figure 10. The flowchart visualizes the calling sequence and the relationships between the functions.
Figure 10. Flow chart of the obstacle avoidance robot movement
The working principle of the robot is to transmit the sensed signals to the microcontroller, which controls the DC motors for obstacle avoidance. The L293D H-bridge drives the motors either clockwise or anti-clockwise as directed by the microcontroller. Figure 10 shows that if the PIR sensor detects a moving object while the IR sensor does not detect any object, the robot moves backward (motor 1 and motor 2 counter-clockwise). On the other hand, if both the PIR and IR sensors detect an object, the robot stops (motor 1 and motor 2 OFF). After 50 ms, motor 1 moves clockwise and the robot turns left; after 500 ms, the robot moves forward (motor 1 and motor 2 clockwise); and after 1000 ms, both motors stop. The flow chart in Figure 10 shows that the IR and PIR sensors are very effective at sensing signals, allowing the obstacle-avoiding robot to evade obstacles in its path.
6. Discussion and conclusion
The developed robot platform was not designed for a specific task but as a general wheeled autonomous platform. It can therefore be used for educational, research or industrial implementations. Students can use it to learn microcontroller programming in C++ with the Arduino IDE 1.6.5 compiler, IR and PIR sensor characteristics, motor driving circuits and signal conditioning circuit design. Research on obstacle avoidance robots at the polytechnic level can help students develop communication, technical skills and teamwork. The design of such a robot is very flexible, and various methods can be adapted for other implementations. This paper shows that PIR sensors are more sensitive than IR sensors when detecting human beings.
7. Limitation and further improvement
The robot has been successful in avoiding obstacles, but it has certain limitations. Further improvement can be achieved by adding sensors on the left and right sides of the robot. Besides that, computer vision with camera features could be implemented for monitoring applications. To implement obstacle avoidance in aerospace, well-suited sensors should be used to gather accurate information about the environment and obstacles. The laser-based (LIDAR) sensor system is robust, especially in off-road outdoor environments [18]. LIDAR-based mapping gives one of the most accurate schemes for generating information about the shape and surface characteristics of any object, and recent advancements in LIDAR technologies allow researchers and practitioners to examine the environment with higher precision, accuracy and flexibility than before. The LIDAR sensor is considered an effective solution to the problem of obstacle detection and recognition. However, obstacle avoidance poses challenges to the image processing that uses LIDAR sensor data.
References
[1] Jeongdae K and Yongtae D 2012 Procedia Engineering 41 911-6
[2] Kumar R C, Saddam Khan M, Kumar D, Birua R, Mondal S and Parai M K 2013 International
Journal of Advanced Research in Electrical, Electronics and Instrumentation Engineering 2
1430-4
[3] Dey M K 2016 International Journal of Soft Computing and Engineering 6 66-9
[4] Ali W G 2011 Journal of King Saud University-Engineering Sciences 23 131-8
[5] Medina-Santiago A, Camas-Anzueto J L, Vazquez-Feijoo J A, Hernandez-de Leon H R and
Mota-Grajales R 2014 Journal of Applied Research and Technology 12 104-10
[6] Syed Ali A, Zanziger Z, Debose D and Stephens B 2016 Building and Environment 100 114-26
[7] Candelas F A, Garcia G J, Puente S, Pomares J, Jara C A, Perez J, Mira D and Torres F 2015
IFAC Workshop on Internet Based Control Education
[8] Shoval S, Borenstein J and Koren Y 1994 Proceeding of 1994 IEEE Robotics and Automation
Conference
[9] Borenstein J and Koren Y 1988 Journal of Robotics and Automation 4 213-8
[10] Sathiyanarayanan M, Azharuddin S, Santhosh K and Khan G 2014 International Journal For
Technological Research in Engineering 1 1075-7
[11] Hubballi P, Bannikoppa S, Bhairi V and Jahagirdar R G Indian J. Sci. Res. 12 155-8
[12] Bhanu B, Das S, Roberts B and Duncan D 1996 IEEE Transaction on Aerospace and Electronic
Systems 32 875-97
[13] Raj K V, Bidargaddi S V, Krisnanand K N and Ghose D 2007 13th National Conference on
Mechanisms and Machines
[14] Gade M M, Hangal S, Krishnan D and Arya H 2016 International Federation of Automatic
Control
[15] Saracoglu B 2009 A Guide to IR/PIR Sensor Set-up and Testing Michigan State University
[16] Gwee J Electronic Review 20 16-8
[17] Mohamed T 2009 International Journal of Mechanical, Aerospace, Industrial, Mechatronic and
Manufacturing Engineering 3 267-72
[18] Yaghi M, Onder Efe M and Dincer K 2015 Ankara International Aerospace Conference
[19] Nehmzow U 2001 Proc. Trends in Robotics