HELVIS - A SMALL-SCALE AGRICULTURAL MOBILE
ROBOT PROTOTYPE FOR PRECISION AGRICULTURE
Velasquez A. E. B+, Higuti V. A. H+, Guerrero H. B+, Milori D. M*, Magalhães D. V+,
Becker M+.
+ São Carlos Engineering School, São Paulo University, São Paulo, Brazil.
*Embrapa Instrumentation, São Carlos, São Paulo, Brazil
A paper from the Proceedings of the
13th International Conference on Precision Agriculture
July 31 – August 4, 2016
St. Louis, Missouri, USA
Abstract. The use of agricultural robots is emerging in a complex scenario in which it is necessary to produce more food for a growing population, decrease production costs, fight pests and diseases, and preserve nature. Around the world, many research institutes and companies are trying to apply mobile robotics techniques in agricultural fields. Mostly, large prototypes are being used, with shapes and dimensions very similar to those of tractors and trucks. In the present study, a small-scale prototype was designed to facilitate the controller development phase and the execution of experiments at the university in a farm-like scenario (before validating the controllers in a real one). It is important to highlight that all control parameters were parameterized to allow porting the controllers to other prototypes. Helvis is an electric small-scale car-like platform whose traction and steering systems are powered by Maxon motors and driven by EPOS2 boards. Its navigation system uses two LiDAR sensors (UTM-30LX) to scan the environment (one in the front and the other in the back) and an Inertial Navigation System (IG500N) to estimate its orientation. In this paper we present experiments carried out in the rows of a corn crop field. As previously mentioned, before carrying out experiments on a real farm, a farm-like scenario was constructed in the lab to calibrate the controller parameters. Since each corn row itself constitutes a discontinuous wall, a filter based on the LiDAR data was developed in order to create continuous virtual walls. These virtual walls are used as references for a wall-follower control system. When it is possible to create walls on both sides of the robot, the navigation problem can be simplified to moving the robot along a virtual aisle. The filter calculates the distances between the robot and the virtual walls, which are used as input data for the fuzzy controller responsible for keeping the robot on the path between corn rows. Its output signal acts on Helvis’ steering system. Experiments in a real environment allowed adjusting the fuzzy rule set and improving the robot’s performance.
Keywords. Mobile Robot, Precision Agriculture, Perception, Navigation.
The authors are solely responsible for the content of this paper, which is not a refereed publication. Citation of this work should state that it is from the Proceedings of the 13th International Conference on Precision Agriculture. EXAMPLE: Lastname, A. B. & Coauthor, C. D. (2016). Title of paper. In Proceedings of the 13th International Conference on Precision Agriculture (unpaginated, online). Monticello, IL: International Society of Precision Agriculture.
INTRODUCTION
Nowadays agriculture is facing unprecedented demands. The United Nations estimates that the world population will reach 10 billion people by 2050 (UNFPA, 2015) and that about 795 million people worldwide were undernourished in 2014-16 (FAO, 2015). In this scenario, agricultural production must grow substantially while confronting the need to reduce its environmental footprint (Foley et al., 2011). In this context, robotics technology has become increasingly present in many areas of agriculture over the last few years (Reina et al., 2015).
Cheein and Carelli (2013) summarize the most important abilities of automatic agricultural vehicles into four categories: guidance, detection, action, and mapping. For these activities, devices such as camera modules, Real Time Kinematic Global Positioning System (RTK-GPS) receivers and Light Detection and Ranging (LiDAR) sensors are needed to retrieve information from the environment. An example of a vision-based system can be found in English et al. (2014), which describes a crop-row tracking method using a vision-based texture model. The automatic and accurate guidance of a four-wheel-steering mobile robot using RTK-GPS is described in Cariou et al. (2009). Weiss and Biber (2011) used a 3D LiDAR sensor to detect single plants in crop rows in real time.
Auto-guided agricultural vehicles with GPS-based navigation systems started appearing around 1997. They were able to follow a predefined map, bringing benefits such as increased accuracy and reduced driver fatigue (Heraud and Lange, 2009). But when dealing with unmanned vehicles, which do not have an operator to intervene manually when the environment changes, a GPS-based system is not enough to guarantee that the vehicle will not ride over the crop or over unmapped elements (such as other vehicles, animals, humans or recent changes), thus raising a major safety issue (Reina et al., 2015). Besides that, high-accuracy GPS systems, such as differential GPS and RTK-GPS, are expensive and, even so, remain subject to interruptions or poor signal availability, which leads to large position errors and, possibly, navigation failure (Hiremath et al., 2014). Although great improvement can be seen in English et al. (2014), vision-based methods are sensitive to variations in ambient light and to atmospheric effects. Outdoor environments, such as agricultural ones, would require frequent calibration procedures (Hiremath et al., 2014).
Regarding LiDAR-based navigation systems, Hiremath et al. (2014) proposed a LiDAR-based autonomous navigation system built on a particle filter and showed its robustness. Barawid et al. (2007) highlighted two advantages of such systems: as they take into account real-time local information, a pre-surveying task to generate a map can be omitted (thus saving time); and they can be used when GPS becomes non-functional. In addition, Weiss and Biber (2011) state that, in contrast to most stereo vision cameras, the ranging information is calculated on the LiDAR sensor itself, so no further time-consuming calculation has to be carried out on the computer, and the precision of the distance values is also independent of the measured distances.
For the aforementioned reasons, this study focuses on the development of a LiDAR-based autonomous navigation system for a car-like mobile robot moving through a cornfield without relying on a planned path. For this purpose, a fuzzy controller is proposed and experimentally tuned for crop row following. Appropriate row exit and entrance maneuvering routines based on LiDAR readings and the robot’s heading were also implemented. Although high-level decision making would require additional resources, such as landmarks, Radio-Frequency IDentification (RFID) tags or even a GPS-based map, the developed navigation system is able to work in standalone mode, ensuring the crop’s integrity.
HELVIS PROTOTYPE
Helvis (Fig 1) is an electric small-scale car-like platform (length × width × height: 0.65 m × 0.23 m × 0.45 m) whose traction set consists of an EPOS2 50/5 board, an automotive-like differential mechanism and an EC-4pole 30 Maxon motor (reference 309756), which has two shafts (rear and front). The rear shaft is coupled to an incremental encoder (reference 255778) and the front shaft is coupled to a GP 32 planetary gearhead (reference 326661) giving a 23:1 reduction. Its steering set has an EPOS2 24/5 board, an Ackermann mechanism, a power screw and a RE-max 29 DC Maxon motor (reference 226802). The rear shaft of the DC Maxon motor is coupled to an incremental encoder (reference 225805) and its front shaft is coupled to a GP 32 planetary gearhead (reference 166938) giving a 33:1 reduction. In order to scan the environment, the robot has two embedded LiDAR sensors (UTM-30LX) manufactured by Hokuyo (Hokuyo Automatic, 2012): the first one is located at the front of the robot and the other at the rear. The robot also has an Inertial Navigation System (IG500N) manufactured by SBG Systems (SBG Systems, 2013) to estimate its orientation and position (the position estimate will be used in future works). In this work, orientation refers to the yaw angle. Its main control unit is a Raspberry Pi 2 running the Raspbian operating system. For monitoring variables during tests, Helvis has a router that allows remote communication between the Raspberry Pi 2 and an external computer. The interaction between its embedded devices is presented in Fig 3. Finally, the characterization of the sensors and the description of each part of the Helvis structure are detailed in (Velasquez, 2015).
Fig 1. Helvis in the middle of the corn crop rows.
The steering system described in (Velasquez, 2015) was changed in order to increase its robustness and steering angle range. The new steering and traction systems are shown in Fig 2.
Fig 2. Steering and traction sets of Helvis.
CONTROL SYSTEM OF HELVIS
Helvis’ control system is divided into two levels: a high-level and a low-level control system (Fig 3). The high-level system (also called the wall-follower control system) consists of a fuzzy controller that keeps the robot on the path between the corn rows. Its input is the difference between the distances from the robot to each wall of the corn rows and its output is the desired angle of the robot’s front wheels. As previously mentioned in the abstract, each corn row itself constitutes a discontinuous wall, so a filter was developed to generate two continuous virtual walls based on the sensor readings (the vehicle’s orientation measured by the IG500N and the distances measured by the UTM-30LX). The controllers and filters are programmed in C++.
Fig 3. Interaction of Helvis’ embedded devices.
The high-level control system output is the desired steering angle of the front wheels. This output and the vehicle’s desired speed, whose maximum value is fixed at 0.2 m/s, are sent to the low-level control system through serial communication. The low-level control system consists of two EPOS2 boards in charge of controlling the speed and position of the traction and steering motors. For this purpose, the EPOS2 boards have embedded Proportional-Integral (PI) controllers (their description is available in Maxon Motors, 2013). While the EPOS2 24/5 receives the desired steering angle and performs position control of the DC Maxon motor, the EPOS2 50/5 receives the desired robot speed and performs velocity control of the EC Maxon motor.
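As an illustration of this split, the short C++ sketch below packages the high-level outputs into a command for the low-level boards. The message format, the struct and the function names are assumptions made only for illustration, since the paper does not specify the serial protocol used on Helvis.

#include <cstdio>

// Hypothetical command sent from the high-level (Raspberry Pi) controller to the
// low-level EPOS2 boards. The actual message format used on Helvis is not given
// in the paper; this only illustrates the split between the two control levels.
struct LowLevelCommand {
    double steering_deg;   // desired front-wheel angle from the fuzzy controller
    double speed_mps;      // desired robot speed, saturated at 0.2 m/s
};

// Saturates the speed and writes one command as a text line on a serial port.
inline void sendCommand(std::FILE* serial, LowLevelCommand cmd) {
    if (cmd.speed_mps > 0.2) cmd.speed_mps = 0.2;   // maximum speed used in the paper
    std::fprintf(serial, "S:%.2f V:%.2f\n", cmd.steering_deg, cmd.speed_mps);
    std::fflush(serial);
}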
Filter to create the virtual walls
As previously mentioned in the abstract, all experiments presented in this paper were carried out in a farm-like scenario representing the rows of a corn crop field. Each row has about 40 corn plants organized in a straight line (Fig 4). The separation between corn plants is 0.2 m and the distance between crop rows is approximately 1 m.
Fig 4. Farm-like scenario created in the lab.
The robot uses a LiDAR sensor to scan the environment. The LiDAR’s scan range is 270° and its angular resolution is 0.25° (270°/1080 steps). Every 0.1 s, the sensor updates a 1080-position vector with the distances (in mm) between the sensor and any detected object for each step of its scan range.
Regarding the filter, three situations are defined: the first is when the robot is moving without orientation error (eψ) (Fig 5a); the second is when it is moving with a positive orientation error (Fig 5b); and the last is when it is moving with a negative orientation error (Fig 5c). The orientation error (eψ) is the angular difference between the path orientation (red arrow) and the actual robot orientation (blue arrow). The path orientation is the orientation obtained when the robot enters the crop row.
Fig 5. a- The robot is running without orientation error. b- The robot is running with positive orientation error. c- The robot is
running with negative orientation error.
The filter’s main goal is to generate virtual walls based on the information of two interest regions. The first interest region, hereafter referred to as the right-side interest region, comprises the readings of the steps located between the two orange arrows shown in Fig 5a, 5b, and 5c. The second region, hereafter referred to as the left-side interest region, contains the readings of the steps located between the two black arrows shown in Fig 5a, 5b, and 5c. The angular separation between the black arrows and between the orange arrows is 60° (240 steps) in both cases. For all cases presented in Fig 5, an initial and a final step must be defined for each interest region. For case A (right-side interest region, Fig 5a), the initial step is 180 (0°) and the final step is given by equation 1.

final_step_right-side_region = initial_step_right-side_region + (60°/0.25°)   (1)

For the left-side interest region (Fig 5a), the initial and final steps are defined by equations 3 and 2, respectively.

final_step_left-side_region = initial_step_left-side_region + (60°/0.25°)   (2)

initial_step_left-side_region = final_step_right-side_region + (60°/0.25°)   (3)
For cases B and C (Fig 5b and Fig 5c), the initial and final steps of the right-side interest region are defined by equations 4 and 5, respectively, while the initial and final steps of the left-side interest region are defined by equations 6 and 7, respectively.

initial_step_right-side_region = 180 + (eψ/0.25°)   (4)

final_step_right-side_region = initial_step_right-side_region + (60°/0.25°)   (5)

initial_step_left-side_region = final_step_right-side_region + (60°/0.25°)   (6)

final_step_left-side_region = initial_step_left-side_region + (60°/0.25°)   (7)
Each region thus contains readings from 240 steps. Two vectors (each with 240 positions) were created to store the information of the interest regions. The first vector is called points_right_region and contains the readings of the right-side interest region. The second one is called points_left_region and contains the readings of the left-side interest region.
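As an illustration, the following C++ sketch extracts the two interest regions from a 1080-reading scan using the step relations of equations (4)-(7) as reconstructed above; the type and function names are illustrative and not taken from the Helvis code.

#include <cmath>
#include <vector>

// One UTM-30LX scan: 1080 range readings in millimetres, 0.25 deg per step.
using Scan = std::vector<long>;

struct InterestRegions {
    std::vector<long> points_right_region;  // 240 readings on the right side
    std::vector<long> points_left_region;   // 240 readings on the left side
};

// e_psi_deg is the orientation error (deg). Each region is 60 deg (240 steps)
// wide and is shifted by e_psi/0.25 steps when the robot is not aligned with
// the row, following equations (4)-(7).
InterestRegions extractRegions(const Scan& scan, double e_psi_deg) {
    const int width = static_cast<int>(60.0 / 0.25);                      // 240 steps
    int initial_right = 180 + static_cast<int>(std::lround(e_psi_deg / 0.25));
    int final_right   = initial_right + width;
    int initial_left  = final_right + width;
    int final_left    = initial_left + width;

    InterestRegions r;
    for (int s = initial_right; s < final_right; ++s)
        if (s >= 0 && s < static_cast<int>(scan.size()))
            r.points_right_region.push_back(scan[s]);
    for (int s = initial_left; s < final_left; ++s)
        if (s >= 0 && s < static_cast<int>(scan.size()))
            r.points_left_region.push_back(scan[s]);
    return r;
}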
Fig 6. Global and Local Coordinate frames.
The next task with the information saved in the points_right_region and points_left_region vectors is obtaining their x and y coordinates (in meters) relative to coordinate frame {2}, shown by the X2 and Y2 axes in Fig 6. This coordinate frame is a translation of the global coordinate frame {G} (XG, YG axes in Fig 6) to the origin of the robot’s local coordinate frame {R} (XR, YR in Fig 6). While the origin of {G} is fixed at one corner of the corn field, {R} is attached to the robot, with its origin at the center of the front LiDAR sensor and the XR axis coincident with the robot’s longitudinal axis.
As can be seen in Fig 6, the orientation error (eψ) is the angle between X2 and XR. Thus, in order to obtain projections in frame {2} from the points_right_region and points_left_region vectors, equations 8, 9, 10 and 11 use eψ to compensate the robot’s orientation deviations. Using these equations, four vectors are generated containing the x and y coordinates of the points_right_region and points_left_region vectors. Their names are X_right_region, Y_right_region, X_left_region and Y_left_region.

X_right_region(pos_rw) = points_right_region(pos_rw) · sin(step_rw · 0.25° - 45° + eψ) / 1000   (8)

Y_right_region(pos_rw) = points_right_region(pos_rw) · cos(step_rw · 0.25° - 45° + eψ) / 1000   (9)

X_left_region(pos_lw) = points_left_region(pos_lw) · sin(step_lw · 0.25° - 45° + eψ) / 1000   (10)

Y_left_region(pos_lw) = points_left_region(pos_lw) · cos(step_lw · 0.25° - 45° + eψ) / 1000   (11)

where step_rw is an incremental variable (with increments equal to 1) that starts at initial_step_right-side_region and ends at final_step_right-side_region, and step_lw is another incremental variable (also with increments equal to 1) that starts at initial_step_left-side_region and ends at final_step_left-side_region. Finally, pos_rw and pos_lw are described by equations 12 and 13, respectively.

pos_rw = step_rw - initial_step_right-side_region   (12)

pos_lw = step_lw - initial_step_left-side_region   (13)
These projection vectors are filtered to generate the virtual walls. The filter for the virtual left wall is described by the routine presented in equation 14.

if ((Y_left_region(stepfilter) > -1.2) && (Y_left_region(stepfilter) < -0.2))
{
    Yvirtual_left_wall(stepfilter) = Y_left_region(stepfilter);
    Xvirtual_left_wall(stepfilter) = X_left_region(stepfilter);
}
(14)

And the filter for the virtual right wall is presented in equation 15.

if ((Y_right_region(stepfilter) < 1.2) && (Y_right_region(stepfilter) > 0.2))
{
    Yvirtual_right_wall(stepfilter) = Y_right_region(stepfilter);
    Xvirtual_right_wall(stepfilter) = X_right_region(stepfilter);
}
(15)

where stepfilter is an incremental variable (with increments equal to 1) that starts at 0 and ends at 239. The filtered vectors contain the X and Y coordinates of the virtual walls.
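For reference, a minimal C++ sketch of the projection of equations (8)-(13) is given below; the step-to-angle offset and the sign of eψ follow the reconstruction above and should be read as assumptions, and the resulting vectors are the ones filtered by equations (14) and (15).

#include <cmath>
#include <cstddef>
#include <vector>

struct PointVectors {
    std::vector<double> x;   // X_*_region, in metres, frame {2}
    std::vector<double> y;   // Y_*_region, in metres, frame {2}
};

// Projects one interest region into frame {2}: readings are in mm, each step
// spans 0.25 deg, step 180 is taken as 0 deg, and e_psi compensates the
// orientation deviation (both the -45 deg offset and the e_psi sign are assumed).
PointVectors projectRegion(const std::vector<long>& points,   // 240 range readings
                           int initial_step,                   // first LiDAR step of the region
                           double e_psi_deg) {
    const double kDegToRad = 3.14159265358979323846 / 180.0;
    PointVectors out;
    for (std::size_t pos = 0; pos < points.size(); ++pos) {
        int step = initial_step + static_cast<int>(pos);               // eq. (12)/(13)
        double angle = (step * 0.25 - 45.0 + e_psi_deg) * kDegToRad;
        out.x.push_back(points[pos] * std::sin(angle) / 1000.0);       // eq. (8)/(10)
        out.y.push_back(points[pos] * std::cos(angle) / 1000.0);       // eq. (9)/(11)
    }
    return out;
}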
Fig 7. a- Map obtained without filter. b- Map with the virtual walls obtained with the filter.
In order to calculate the distances between the robot and the virtual walls, the mean values of Yvirtual_left_wall and Yvirtual_right_wall are computed. The mean value of Yvirtual_left_wall is the distance of the robot to the left virtual wall (dlw), while the mean value of Yvirtual_right_wall is the distance of the robot to the right virtual wall (drw). Fig 7a shows the map obtained from the LiDAR readings without the filter and Fig 7b shows the map with the virtual walls obtained with the filter.
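A small sketch of this distance computation is shown below; the helper names are illustrative, and the sign convention (negative Y for the left wall, positive Y for the right wall) follows equations (14) and (15).

#include <numeric>
#include <vector>

// Mean of the Y coordinates kept by the virtual-wall filter. With the sign
// convention of equations (14)-(15), the left wall has negative Y and the right
// wall positive Y, so d_lw < 0 and d_rw > 0 when the robot is inside a row.
inline double meanOf(const std::vector<double>& v) {
    return v.empty() ? 0.0 : std::accumulate(v.begin(), v.end(), 0.0) / v.size();
}

// d_a = d_rw + d_lw is the lateral offset from the row centre (zero when the
// robot is exactly in the middle), which feeds the wall-follower controller.
inline double lateralOffset(const std::vector<double>& y_virtual_right_wall,
                            const std::vector<double>& y_virtual_left_wall) {
    return meanOf(y_virtual_right_wall) + meanOf(y_virtual_left_wall);
}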
Wall-follower control system
As mentioned before, the wall-follower control system has a fuzzy controller whose main objective is to keep the robot on the path between the corn rows. The block diagram of the wall-follower control system is shown in Fig 8.
Fig 8. Block diagram of the wall-follower system control.
In this subsection, distance denotes the lateral offset between Helvis’ longitudinal axis and the middle of the path between the crop rows. The wall-follower control system’s input is the desired distance (dd) and its output is the actual distance (da). The latter is constantly obtained using the LiDAR sensor and the filter described in the previous section, and it is fed back negatively to the fuzzy controller through the variable ed (distance error), which is the difference between dd and da. The desired distance is always zero and the actual distance is the sum of drw and dlw (described in the filter section). The fuzzy controller’s output is the steering angle (u) for Helvis. The implementation of the fuzzy controller was based on the fuzzy controllers developed in (Guerrero et al., 2014; Higuti et al., 2015). It has an input fuzzy set with five triangular membership functions (Fig 9a) and an output fuzzy set with three triangular membership functions (Fig 9b). The five membership functions of the input fuzzy set are named: VFLW (Very_Far_Left_Wall), FLW (Far_Left_Wall), MP (Middle_Path), FRW (Far_Right_Wall) and VFRW (Very_Far_Right_Wall). The three membership functions of the output fuzzy set are named: LS (Left_Steering), NS (No_Steering) and RS (Right_Steering).
Fig 9. a- Input fuzzy sets. b- Output fuzzy sets.
A value of the distance error (ed) generates two membership values (µ1 and µ2) in two different input fuzzy sets at the same time. These membership values are used to cut the membership functions of the output fuzzy set according to the fuzzy rules described in equation 16.
If ed = VFLW or if ed = FLW, then µ = LS
If ed = MP, then µ = NS (16)
If ed = FRW or if ed = VFRW, then µ = RS
When the membership values cut the membership functions of the output fuzzy set, output fuzzy values (Z) are obtained. These values are used to find the defuzzified output value (Z*), which is obtained with the centroid method (Guerrero et al., 2014). Z* is the fuzzy controller’s output (u), a steering angle for the robot.
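The following self-contained C++ sketch reproduces this structure (five triangular input sets, three triangular output sets, the rules of equation 16, clipping and centroid defuzzification). The membership-function breakpoints, the ±25° steering range and the sign conventions are assumed values chosen for illustration; they are not the tuned parameters used on Helvis.

#include <algorithm>
#include <array>

// Triangular membership function with feet at a and c and peak at b.
struct Tri {
    double a, b, c;
    double operator()(double x) const {
        if (x <= a || x >= c) return 0.0;
        return (x < b) ? (x - a) / (b - a) : (c - x) / (c - b);
    }
};

// Wall-follower fuzzy controller in the style described above: five input sets
// over the distance error (m), three output sets over the steering angle (deg),
// the rules of equation (16), clipping and centroid defuzzification.
class WallFollowerFuzzy {
public:
    double steeringFor(double e_d) const {
        // Input sets: VFLW, FLW, MP, FRW, VFRW (placed from negative to positive e_d).
        const std::array<Tri, 5> in = {{
            {-0.40, -0.25, -0.10},   // VFLW
            {-0.25, -0.10,  0.00},   // FLW
            {-0.10,  0.00,  0.10},   // MP
            { 0.00,  0.10,  0.25},   // FRW
            { 0.10,  0.25,  0.40}    // VFRW
        }};
        // Output sets: LS, NS, RS over the steering angle u (deg).
        const std::array<Tri, 3> out = {{
            {-25.0, -12.5,  0.0},    // LS
            {-12.5,   0.0, 12.5},    // NS
            {  0.0,  12.5, 25.0}     // RS
        }};
        // Rules of equation (16): VFLW or FLW -> LS, MP -> NS, FRW or VFRW -> RS.
        const std::array<double, 3> cut = {
            std::max(in[0](e_d), in[1](e_d)),   // LS
            in[2](e_d),                          // NS
            std::max(in[3](e_d), in[4](e_d))    // RS
        };
        // Centroid of the clipped and aggregated output sets (numerical integration).
        double num = 0.0, den = 0.0;
        for (double u = -25.0; u <= 25.0; u += 0.1) {
            double mu = 0.0;
            for (int i = 0; i < 3; ++i) mu = std::max(mu, std::min(cut[i], out[i](u)));
            num += mu * u;
            den += mu;
        }
        return (den > 0.0) ? num / den : 0.0;   // 0 deg if no rule fires
    }
};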
SWITCHING BETWEEN TWO CROP ROWS
This section describes the maneuver process used to change crop rows. There are two maneuvers. The first maneuver is used when the robot changes to another row located on its right side (Fig 10); the second one is a mirrored version, used when the following row is located on the left side. The maneuver process is divided into four steps (Fig 10).
1. End of the first crop row.
2. Finding the next crop row.
3. Entrance into the next crop row.
4. On the second crop row.
The first step comprises: detection of the row end, measurement of the actual robot orientation (ψfirst_step), and calculation of, and subsequent movement to, the desired orientation for the second step (equation 17).

ψsecond_step = ψfirst_step - 90°   (17)
In order to find the row end, the information in the Yvirtual_left_wall and Yvirtual_right_wall vectors is used. When the total number of elements in these two vectors is less than 220, the row end has been found. At this point, the mean value of 100 consecutive readings of the orientation angle is taken and set as the first-step orientation (ψfirst_step). Then, ψsecond_step is calculated by equation 17 and the robot continues moving forward with its front wheels fully steered to the right. Regarding the orientation signal, when the robot steers to the right the measured orientation decreases, and for the opposite movement it increases. When the actual robot orientation is near (±10°) ψsecond_step, the robot stops and the front wheels are steered back to 0° (end of the first step).
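The two tests that drive this first step can be written compactly as below; the 220-element and ±10° thresholds come from the text, while the function names and the omission of angle wrap-around handling are simplifications.

#include <cmath>
#include <vector>

// Row-end test used to trigger the first maneuver step: the two virtual walls
// together would normally hold close to 2 x 240 points, so fewer than 220 kept
// points indicates that the robot is leaving the crop row.
inline bool rowEndReached(const std::vector<double>& y_virtual_left_wall,
                          const std::vector<double>& y_virtual_right_wall) {
    return (y_virtual_left_wall.size() + y_virtual_right_wall.size()) < 220;
}

// End-of-step test: the turn (forward motion with wheels fully steered) finishes
// when the measured yaw is within +/-10 deg of the desired orientation.
// Angle wrap-around is not handled in this sketch.
inline bool orientationReached(double psi_actual_deg, double psi_desired_deg) {
    return std::fabs(psi_actual_deg - psi_desired_deg) <= 10.0;
}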
Fig 10. Maneuver to change the crop row.
At the beginning of the second step, the desired orientation (ψsecond_step) is updated. Reverse motion, the crop row detection filter and the fuzzy controller for reverse motion are activated. The condition to finish the second step is the counting of two crop rows by the robot. When this condition is satisfied, the robot motion, the fuzzy controller and the filter are stopped.
The third step consists of the entrance into the next row, which involves a 90° right or left turn. Thus, the desired orientation for the third step (ψthird_step) is calculated by equation 18.
ψthird_step = ψsecond_step - 90°   (18)
Subsequently, the robot motion is activated and its front wheels are steered to the maximum steering value towards the right side. When the robot orientation is near (±10°) ψthird_step, the robot motion is stopped and the front wheels return to 0° (end of the third step). In the final step, ψfourth_step is given by equation 19, since the best estimate for the new crop row is that its path orientation is parallel to, but in the opposite direction of, the previous row. Then the robot motion is activated. From this point until the next turning maneuver, ψfourth_step is used in the filter (used to create the virtual walls) as the path orientation. Finally, the wall-follower control system is activated.

ψfourth_step = ψfirst_step - 180°   (19)
The second maneuver is used when the next crop row is located on the left side. This maneuver is divided into the same steps as the first one. The orientations for the second step (calculated in the first step) and for the fourth step (calculated in the fourth step) are different and are defined by equations 20 and 21, respectively. In the first and third steps of this maneuver, the robot steers its front wheels to the left side. The second and fourth steps are otherwise the same.

ψsecond_step = ψfirst_step + 90°   (20)

ψfourth_step = ψfirst_step + 180°   (21)
Fuzzy Controller for the Reverse Motion
In order to reduce deviations while moving backwards (second step of the maneuver process), another version of the fuzzy controller was implemented, also following (Guerrero et al., 2014; Higuti et al., 2015). As for the wall-follower control system, a block diagram representing this control system can be seen in Fig 11.
Fig 11. Block diagram of Reverse Motion Control System
The reverse motion control system’s input is the desired orientation (ψd) and its output is the actual orientation (ψa). The latter is constantly obtained from the IG500N unit, and it is fed back negatively to the fuzzy controller through the variable eψ (orientation error), which is the difference between ψd and ψa. As mentioned, the value of the desired orientation is updated at the beginning of the second step and should be around 90 degrees off the orientation with which Helvis left the crop row. The fuzzy controller’s output is the steering angle (u) for Helvis. This fuzzy controller also has an input fuzzy set with five triangular membership functions (Fig 12a) and the same output fuzzy set (Fig 12b), as it handles the steering just as in the wall-follower. Here, the five membership functions of the input fuzzy set are named: VNOE (Very_Negative_Orientation_Error), NOE (Negative_Orientation_Error), COE (Central_Orientation_Error), POE (Positive_Orientation_Error) and VPOE (Very_Positive_Orientation_Error).
Fig 12. a- Input fuzzy sets. b- Output fuzzy sets.
Analogous to the fuzzy controller of the wall-follower control system, a value of the orientation error (eψ) generates two membership values (µ1 and µ2) in two different input fuzzy sets at the same time. These membership values are used to cut the membership functions of the output fuzzy set according to the fuzzy rules described in equation 22.

If eψ = VNOE or if eψ = NOE, then µ = LS
If eψ = COE, then µ = NS   (22)
If eψ = POE or if eψ = VPOE, then µ = RS
A better understanding of equation 22 can be achieved by analyzing Fig 6: in reverse motion, a negative eψ requires steering to the left while a positive eψ requires steering to the right in order to correct the robot’s orientation. With the output fuzzy values obtained when the membership values cut the membership functions of the output fuzzy set, the defuzzified output value (Z*) is found using the centroid method (Guerrero et al., 2014). Again, Z* is the fuzzy controller’s output (u), a steering angle for the robot.
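Since this controller shares its structure with the wall-follower, the sketch from the previous section could in principle be reused by feeding it the orientation error instead of the distance error; the input scaling below (±30° mapped onto the assumed ±0.4 input universe) is purely illustrative and not taken from the paper.

// Reuse of the wall-follower sketch: the same five-input/three-output fuzzy
// machinery is fed with the orientation error e_psi = psi_d - psi_a (deg).
// The scaling of the angular error onto the controller's assumed input universe
// is an illustration only; the real controller has its own tuned input sets.
double reverseSteering(const WallFollowerFuzzy& fuzzy,
                       double psi_desired_deg, double psi_actual_deg) {
    double e_psi = psi_desired_deg - psi_actual_deg;
    return fuzzy.steeringFor(e_psi / 30.0 * 0.4);   // assumed +/-30 deg full scale
}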
Detection of Crop Rows in Reverse Motion
This subsection describes how crop rows are detected during reverse motion. For this purpose, the rear LiDAR readings are used. As in the filter used to create the virtual walls, an interest region of LiDAR readings is defined. The location of the interest region depends on the maneuver used to change the crop row. For the first maneuver, the interest region comprises the readings of the steps located between the two red arrows shown in Fig 13a.
Fig 13. Interest region used to detect the crop rows.
Fig 13b shows the interest region (the readings of the steps located between the two blue arrows) for the second maneuver. In both cases, the angular separation between the arrows is 5° (20 steps) and the black arrow indicates the direction of the robot’s reverse motion. The initial and final steps of the interest region for the first maneuver are defined by equations 23 and 24, respectively.

initial_step_interest_region = 900 + (eψ/0.25°)   (23)

final_step_interest_region = initial_step_interest_region - (5°/0.25°)   (24)

The initial and final steps of the interest region for the second maneuver are defined by equations 25 and 26, respectively.

initial_step_interest_region = 180 + (eψ/0.25°)   (25)

final_step_interest_region = initial_step_interest_region + (5°/0.25°)   (26)
In order to obtain the vectors with the projections of the LiDAR readings for the first maneuver (X_left_region and Y_left_region), equations 10, 11, 23 and 24 are used. For the second maneuver, the vectors with the projections of the LiDAR readings (X_right_region and Y_right_region) are obtained using equations 8, 9, 25 and 26. To detect a crop row during the first maneuver, a filter (equation 27) is applied to the X_left_region and Y_left_region vectors.
if ((X_left_region (stepfilter) > 0) && (X_left_region (stepfilter) < 0.02))
{
if ((Y_left_region (stepfilter) > -1.7) && (Y_left_region (stepfilter) < -0.6))
{
interest_points = interest_points + 1;
}
}
(27)
where interest_points is the count of filtered points. If the value of interest_points is greater than 5, a possible crop row has been detected. In order to consider that a real crop row has been detected, three consecutive possible crop rows must be detected. For the second maneuver, the filter used is described by equation 28.
if ((X_right_region (stepfilter) > 0) && (X_right_region (stepfilter) < 0.02))
{
if ((Y_right_region (stepfilter) > 0.6) && (Y_right_region (stepfilter) < 1.7))
{
interest_points = interest_points + 1;
}
}
(28)
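The row counting that ends the second maneuver step could be organized as in the sketch below; the thresholds (more than 5 interest points per scan, three consecutive positive scans, two rows counted) come from the text, while the class name and the handling of the gap between rows are assumptions.

// Confirms crop rows while reversing: a scan with more than 5 interest points is
// a "possible" row, three consecutive possible rows confirm a real row, and the
// reverse motion (second maneuver step) ends after two rows have been counted.
class ReverseRowCounter {
public:
    void update(int interest_points) {
        if (interest_points > 5) {
            ++consecutive_;
            if (consecutive_ == 3 && !counted_current_) {
                ++rows_counted_;            // row confirmed
                counted_current_ = true;    // do not count the same row twice
            }
        } else {
            consecutive_ = 0;
            counted_current_ = false;       // the row has left the interest region
        }
    }
    bool secondStepFinished() const { return rows_counted_ >= 2; }

private:
    int  consecutive_ = 0;
    int  rows_counted_ = 0;
    bool counted_current_ = false;
};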
Fig 14 shows the moment when a crop row was detected in one of the tests performed. The green lines represent the interest region of the LiDAR readings, the blue points are the Cartesian projections of the rear LiDAR readings and the red crosses are the filtered points used to determine whether a crop row was detected. Finally, the position of the rear LiDAR is represented by a filled orange square.
Fig 14. A crop row is detected.
EXPERIMENTAL RESULTS
In the aforementioned farm-like scenario, several tests were carried out to improve the overall system algorithms, define parameters, and evaluate the filters and control systems described in the previous sections. Except for rainy days, when tests were cancelled because the robot is not waterproof, environmental factors such as temperature and luminosity did not interfere with Helvis’ performance in a perceptible way.
A successful test is one in which the robot drives through the five crop rows with a bounded distance error and a response to disturbances quick enough to avoid crashing into the surroundings. When the robot arrives at the end of a crop row, it needs to turn, alternately to the right and to the left, and enter the following crop row.
Both Fig 15 and Fig 16 represent a successful complete course through the five crop rows of the scenario and show the fuzzy controller output for the steering system. The first depicts the distance error when the robot is moving along the crop rows (sections A, C, E, G and I), thus trying to follow the virtual walls. The second shows the orientation error during the maneuver process (sections B, D, F and H).
Fig 15. Distance error (blue plot) and fuzzy controller output (red plot) over time.
Fig 16. Orientation error (blue plot) and fuzzy controller output (red plot) over time.
As one may observe in Fig 15, the output is proportional to the distance error when this value is smaller than 0.1 m. There is a substantial increase in the output when the distance error is greater than 0.1 m, which can be observed around time 281 s in Fig 17a, a zoomed view of section I. This behavior is expected as a result of the chosen membership functions of the input fuzzy set for the wall-follower, which already treats an absolute distance error of 0.1 m as Far_Right_Wall or Far_Left_Wall and begins to consider it Very_Far_Right_Wall or Very_Far_Left_Wall.
The beginnings of sections C, E, G, and I present a high distance error because the maneuvering process does not guarantee that the robot will start the row exactly in the middle between the crop rows. Rather, because of the maneuver, it is only expected that the robot starts parallel to the rows. At the end of these sections, an increased oscillatory behavior is expected, as less information is available to create the virtual walls.
Sections B, D, F, and H in Fig 16 represent steps 1, 2 and 3 of the maneuvering process. A sawtooth shape represents steps 1 and 3, as the robot turns from its actual orientation to the desired one (90° off the initial one). Between the sawtooth shapes, the reverse motion (step 2) is adjusted using the fuzzy controller with the actual orientation error as input. This part can be seen in more detail in Fig 17b.
Fig 17. Zoomed views of: a- Fig 15. b- Fig 16.
The initial experiments were carried out without the fuzzy controller for reverse motion: the robot was set to go backwards with zero steering until it counted two rows. Because of the uneven surface, there was always a deviation and the robot either diverged, ending up too far away to consider the LiDAR readings of the rows as valid, or converged, riding over the row. Although not optimized to quickly diminish the orientation error, the controller is able to counteract soil disturbances and keep the robot approximately orthogonal to the crop rows when going backwards, greatly improving the chances of a successful maneuver. In this experiment, the orientation error was always less than 5°.
CONCLUSIONS AND FUTURE WORKS
This study focused on developing controllers and filters for an application of Helvis in an agricultural context. It is a local navigation system that can later be combined with a global one based on GPS data. Its purpose is, therefore, to guarantee crop safety at all times, since GPS-based systems can suffer from signal loss. The wall-follower fuzzy controller presented an overall good response to soil disturbances, but at the borders of the rows improvements can be made. First, a more reliable orientation estimate can be achieved using LiDAR readings and Simultaneous Localization and Mapping (SLAM) algorithms. As these take the surroundings into account to define the path orientation, the possibility of wrong estimates caused by disturbances to the robot’s orientation is mitigated. In addition, for a range of a few meters, the UTM-30LX LiDAR sensor proved to be reliable under different luminosity conditions. Second, the presented maneuver process is a starting point for the row-switching task. As one of the first maneuvers implemented on Helvis, it relies on few sensor measurements and simple calculations. However, it has the major drawback of not being able to cope with environmental changes, such as uneven soil, which leads to orientation deviations, and dense foliage, which complicates the correct counting of rows.
Acknowledgements
The authors would like to thank FINEP, CNPq, CAPES, EMBRAPA and EESC-USP for their support of this work.
References
Barawid Jr, O. C., Mizushima, A., Ishii, K., & Noguchi, N. (2007). Development of an Autonomous Navigation System using a
Two-dimensional Laser Scanner in an Orchard Application. Biosystems Engineering, doi:
10.1016/j.biosystemseng.2006.10.012.
Cariou, C., Lenain, R., Thuilot, B., & Berducat, M. (2009). Automatic guidance of a four-wheel-steering mobile robot for
accurate field operations. Journal of Field Robotics, doi: 10.1002/rob.20282.
Cheein, F. A., & Carelli, R. (2013). Agricultural robotics: Unmanned robotic service units in agricultural tasks. IEEE Industrial
Electronics Magazine, doi: 10.1109/MIE.2013.2252957.
English, A., Ross, P., Ball, D., & Corke, P. (2014). Vision based guidance for robot navigation in agriculture. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA) (website). Hong Kong, China: IEEE. doi: 10.1109/ICRA.2014.6907079.
FAO (2015). The State of Food Insecurity in the World. Online Document. FAO. http://www.fao.org/3/a4ef2d16-70a7-460a-
a9ac-2a65a533269a/i4646e.pdf. Accessed 21 April 2016.
Foley, J. A., Ramankutty, N., Brauman, K. A., Cassidy, E. S., Gerber, J. S., Johnston, M., Mueller, N. D., O’Connell, C., Ray, D. K., West, P. C., Balzer, C., Bennett, E. M., Carpenter, S. R., Hill, J., Monfreda, C., Polasky, S., Rockstrom, J., Sheehan, J., Siebert, S., Tilman, D., & Zaks, D. P. M. (2011). Solutions for a cultivated planet. Nature, doi: 10.1038/nature10452.
Guerrero, H. B., Velasquez, A. E. B., Barrero, J. F., Côco, D. Z., Risardi, J. C., Magalhães, D. V., & Becker, M. (2014).
Orientation (Yaw) Fuzzy Controller Applied to a Car-Like Mobile Robot Prototype. In Proceedings of the IEEE 5th
Colombian Workshop on Circuits and Systems - CWCAS2014 (website). Bogota, Colombia: IEEE. doi:
10.1109/CWCAS.2014.6994603.
Heraud, J. A., & Lange, A. F. (2009). Agricultural Automatic Vehicle Guidance from Horses to GPS: How We Got Here, and
Where We Are Going. Michigan: American Society of Agricultural and Biological Engineers.
Higuti, V. A. H., Guerrero, H. B., Velasquez, A. E. B., Pinto, R. M., Tinelli, L. M., Magalhães, D. V., Becker, M., Barrero, J. F., &
Milori, D. M. B. P. (2015). Low-cost embedded computer for mobile robot platform based on raspberry board. In
Proceedings of 23rd ABCM International Congress of Mechanical Engineering - Cobem2015 (website). Rio de Janeiro,
Brazil: ABCM. doi: 10.20906/CPS/COB-2015-0336.
Hiremath, S. A., van der Heijden, G. W. A. M., van Evert, F. K., Stein, A., & ter Braak, C. J. F. (2014). Laser range finder model for autonomous navigation of a robot in a maize field using a particle filter. Computers and Electronics in Agriculture, doi: 10.1016/j.compag.2013.10.005.
Hokuyo Automatic (2012). Scanning Laser Range Finder UTM-30LX/LN Specifications. Product Specifications. Hokuyo
Automatic co. http://www.hokuyo-aut.jp/02sensor/07scanner/download/pdf/UTM-30LX_spec_en.pdf. Accessed 20 April
2016.
Maxon Motors AG (2013). EPOS2 Positioning Controllers. Application Notes Collection. Maxon Motors AG.
http://www.maxonmotor.com/medias/sys_master/8811457937438/EPOS2-Application-Notes-Collection-En.pdf. Accessed
21 April 2016.
Reina, G., Milella, A., Rouveure, R., Nielsen, M., Worst, R., & Blas, M. R. (2015). Ambient awareness for agricultural robotic
vehicles. Biosystems Engineering. doi:10.1016/j.
SBG Systems (2013). IG-500N. Product Specifications. SBG Systems. http://www.sbg-systems.com/docs/IG-500N-Leaflet.pdf.
Accessed 20 April 2016.
UNFPA (2015). State of World Population. Online Document. UNFPA. http://www.unfpa.org/sites/default/files/pub-pdf/State_of_World_Population_2015_EN.pdf. Accessed 20 April 2016.
Velasquez, A. E. B. (2015). Helvis III - Desenvolvimento e Caracterização da Plataforma Robótica. Master’s dissertation. University of São Paulo. http://www.teses.usp.br/teses/disponiveis/18/18149/tde-05052015-101452/es.php. Accessed 21 April 2016.
Weiss, U., & Biber, P. (2011). Plant detection and mapping for agricultural robots using a 3D LiDAR sensor. Robotics and
Autonomous Systems, doi: 10.1016/j.robot.2011.02.011.
... ARMbased computing units are nowadays achieving computing capabilities that are suitable to accomplish several tasks, as demonstrated in [70,178]. Common ARM-based solution are Raspberry Pi and UDOO models, exploited in [62,111,131,179], respectively. As the size and weight of the computer are crucial factors in the design of mobile robots, Mini ITX motherboards, used in [43,60,133], are often an excellent alternative to ARM architectures. ...
... Once the position of the crop edges on the left and right sides of the robot is found, a center point identifying the location of the row center is calculated. The error between the determined center point and the predefined setpoint in front of the robot is fed through standard control loops in all of the test cases described in this section (with the exception of case [131], which uses a fuzzy logic controller) to determine the wheel velocities required to move the robot to the center, thereby reducing the error to zero. ...
... The system combines different sensors, including GNSS, compass, and gyroscope, to accurately guide the machinery. In [12], the authors developed a small-scale agricultural mobile robot prototype named Helvis for precision agriculture. The robot is designed to navigate autonomously in agricultural fields, collect data, and perform targeted agricultural tasks such as spraying or fertilizing. ...
Article
Full-text available
In this paper, we investigated the idea of including mobile robots as complementary machinery to tractors in an agricultural context. The main idea is not to replace the human farmer, but to augment his/her capabilities by deploying mobile robots as assistants in field operations. The scheme is based on a leader–follower approach. The manned tractor is used as a leader, which will be taken as a reference point for a follower. The follower then takes the position of the leader as a target, and follows it in an autonomous manner. This will allow the farmer to multiply the working width by the number of mobile robots deployed during field operations. In this paper, we present a detailed description of the system, the theoretical aspect that allows the robot to autonomously follow the tractor, in addition to the different experimental steps that allowed us to test the system in the field to assess the robustness of the proposed scheme.
... Another shortcoming is the ineffective weed control achieved in areas outside of ducks' active moving areas. We consider that robotics technologies, especially for small robots [18,19], provide the potential to solve these diverse shortcomings. The aim of this study was to develop an autonomous mobile robot that guides mallards to realize highly efficient rice-duck farming. ...
Article
Full-text available
This study was conducted to develop robot prototypes of three models that navigate mallards to achieve high-efficiency rice-duck farming. We examined two robotics navigation approaches based on imprinting and feeding. As the first approach, we used imprinting applied to baby mallards. They exhibited follow behavior to our first prototype after imprinting. Experimentally obtained observation results revealed the importance of providing imprinting immediately up to one week after hatching. As another approach, we used feed placed on the top of our second prototype. Experimentally obtained results showed that adult mallards exhibited wariness not only against the robot, but also against the feeder. After relieving wariness with provision of more than one week time to become accustomed, adult mallards ate feed in the box on the robot. However, they ran away immediately at a slight movement. Based on this confirmation, we developed the third prototype as an autonomous mobile robot aimed for mallard navigation in a paddy field. The body width is less than the length between rice stalks. After checking the waterproof capability of a body waterproof box, we conducted an indoor driving test for manual operation. Moreover, we conducted outdoor evaluation tests to assess running on an actual paddy field. We developed indoor and outdoor image datasets using an onboard monocular camera. For the outdoor image datasets, our segmentation method based on SegNet achieved semantic segmentation for three semantic categories. For the indoor image datasets, our prediction method based on CNN and LSTM achieved visual prediction for three motion categories.
... Therefore, the concept design of vehicles became a crucial step in vehicle development [1]. One alternative to performing an early evaluation of the new vehicle concept is the use of small-scale prototypes, aiming to facilitate the controller development, identify possible issues and avoiding the high cost of the real scale prototype [3]. Integrated system laboratory-LabSIn chassis dynamometer [10] Besides the cost advantages of producing small-scale vehicle prototypes, also its performance needs to be evaluated to ensure that it represents a similar vehicle on a real scale. ...
Chapter
In the transport industry scenario, evaluate vehicle performance is essential to development. This assessment is frequently made in roller dynamometers, which enables reliable and controlled experiments. To simplify the development process, a small-scale dynamometer has been developed and allows reduced cost tests and less time spent. Therefore, this work analyses the roller behavior, one of its essential pieces, approaching it by a Laval rotor and evaluating the displacements of its shaft centers. The results presented that, under the working conditions, the values of the center displacements are very low and can be disregarded and the rollers do not reach the resonance.
... In the last decades, automation techniques have been developed to benefit several industrial, military and civilian activities. In agricultural production, for example, mobile robots have been employed to operate in crop monitoring, harvesting, maintenance of nurseries and greenhouses, among others, see Velasquez et al. (2016); Cheein and Carelli (2013). Unmanned aerial vehicles (UAVs) are commonly used in these tasks, as in Schmale III et al. (2008), but in many cases a ground viewpoint is necessary and Unmanned Ground Vehicles (UGVs) are used, see Lenain et al. (2006). ...
... HelvisIII is an electric small-scale car-like platform developed by the Laboratory of Mobile Robotics (LabRoM) of the University of Sao Paulo. Its main characteristics can be found in Higuti et al. (2016); Velasquez et al. (2016). RAM is a mobile agriculture robot developed by the University of Sao Paulo and a detailed description of its main characteristics can be found in Sousa (2016). ...
... To scan the environment, Helvis3 has a 2D LiDAR (model: UTM-30LX, Hokuyo, Osaka, Japan) sensor positioned at the front. More detail about mechanical and electric characteristics of Helvis3 can be found in Velasquez et al. (2016) and Higuti et al. (2016). ...
Article
Full-text available
Severe working conditions, the cost reduction need and production growth justify the adoption of technological techniques in agricultural production. Currently, Global Navigation Satellite System (GNSS) based navigation systems are very popular in agriculture. However, relying only on GNSS data and the availability of crop maps that also need to be updated may become a problem. One may face lack of GNSS satellite signal during the path and the navigation system may fail. As an alternative, this paper presents the modeling, development and validation of a reactive navigation system for agricultural fields based on a frontal light detection and ranging (LiDAR) sensor and a H∞ robust controller. A small-scale mobile robot (Helvis3) was used to validate the controllers and carry out experiments in both controlled and farm scenarios. According to the experimental results, the proposed navigation system is capable of controlling the robot displacement between crop rows, keeping it in the middle of the corresponding path with minimum error, despite environmental disturbances.
... (a) First version of Frey robot. (b) Third version of Helvis robot[8] ...
Conference Paper
Full-text available
Due to the increasingly growth in population, it is important to better use natural resources for food production and efficiency, driving the use of sensors each time more to monitor several aspects of the soil and of the crops in the field. However, it is known that the harsh conditions of the field environment demands more robust and energy efficient sensor devices. One example is soil water monitoring for irrigation: Brazil, for example, consumes 69% of its freshwater only for irrigation purposes, which shows the need of using adequate water moisture sensors. Based on that, this work proposes a modular architecture that integrates several sensor technologies, including battery-less sensors and low power sensors for soil moisture measurements, but not limited to them. The proposed system relies on a mobile robot that can locate each deployed sensor autonomously, collect its data and make it available on-line using cloud services. As proof of concept, a low-cost mobile robot is built using a centimeter level accuracy location system, that allows the robot to travel to each sensor and collect their data. The robot is equipped with an UHF antenna to provide power to RF powered battery-less sensors, a Bluetooth low energy data collector and a Zigbee data collector. An experimental evaluation compares reading distance and successful rate of sensor location and reading.
Preprint
Full-text available
Abstract In recent years, the role of robotics in managing poultry conditions and increasing livestock welfare, which has led to increased production and reduced costs, has received much attention. The purpose of the present study was to build a small robot to measure and plot the distribution of temperature, humidity, ammonia and oxygen parameters in the poultry farm and to apply the necessary controls to optimize the conditions. The speed of the robot motors was automatically adjusted by calculating the orientation error, which was the difference between the directional angle and the actual orientation of the robot. The average error of the robot was 1.34 and 4.5 cm, respectively. To evaluate the robot's performance in adjusting the environmental parameters of the poultry farm, an experiment was performed using 700 laying hens from 20 days of age for 20 days. In order to control each cage locally, due to the variability of the waste, the activity of the poultry, as well as the proximity to the poultry ventilation, a control circuit was provided to optimize the conditions of each cage, which was considered as a backup circuit. It included two high speed fans, a cooling / heating element, a generator, a water level sensor and a water pump. Based on the distribution charts of the temperature, humidity, ammonia and oxygen parameters, the different points of danger were determined and necessary action was taken.
Article
Accurate path scouting control of an autonomous agricultural robot is substantially influenced by terrain variability, field patterns, and uncertainties in sensed information. Based on conventional farming techniques, the targeted test crop of strawberries grows in semistructured environments. Thus in this study, the proposed scouting control architecture comprises of three distinct portions and in each portion different sensors are used. Based on range finder (RF) information, the first region uses a proportional-integral-derivative (PID) controller with logic steps to account for undesirable pop-up events. In the other two portions, vision-based robust controllers are developed, in which a new bound is derived for the focal length uncertainty in vision. Stabilities of the controllers are proven and the reachabilities are analyzed to guarantee that the final state of each portion is within a desired initial region of the next portion controller. The proposed multiphase scouting control is successfully validated for our custom-designed robot in a commercial strawberry farm.
Thesis
Full-text available
The main objective of this work is the development and characterization of a robotic vehicle ℏelvis III in order to use it in the development of researches focused on the fields of mobile robotics control and navigation. Initially the propulsion system was characterized in order to determine the real velocity of vehicle in real conditions (four different kinds of grounds were used). In addition to this, the steering system was also characterized by applying the well-known bicycle kinematic model. During these experimental tests we could find the relation between the position of the servo-motor and the value of steering angle of the bicycle model. The real values of CEP (Circular Error Probability) and SEP (Spherical Error Probability) errors of the vehicle embedded GPS (Global Positioning System) were determined based on two experiments: the first one was carried out in São Carlos – SP (Brazil) and the second one in Villavicencio – Meta (Colombia). During the GPS experiments we could also characterize the vehicle embedded IMU (Inertial Measurement Unit). Then we could observe and measure the effect of solar light on the LIDAR sensor (Laser Imaging Detection and Ranging) performance. Finally, the forward vehicle dynamics is described, with the determination of the center of mass of the vehicle and the observation of the normal forces behavior in the vehicle wheels when it is stopped or moved on an inclined floor.
Conference Paper
Full-text available
Nowadays embedded computers are becoming more popular and available for everybody. Due to this, computational costs may be reduced. Based on this, we decided to evaluate the use of a Raspberry Pi B+ as the main control unit for an experimental mobile platform. In order to carry out the tests we developed a controller for a differential drive mobile robot to achieve wall-following navigation which will keep it at a desired distance to the wall nearby. The locomotion of the mobile robot is supported on the actuation of two DC Maxon motors which are managed by two EPOS2 boards. A LIDAR sensor (Laser Imaging Detection and Ranging) has been implemented with the aim of measuring the distance to wall. First, all device drivers were configured for this work under Raspbian operational system. Then, a fuzzy controller was implemented in C++ for the wall-following strategy. Finally, we conducted tests in indoor environment. The results enlightened the influence of the parameters in the fuzzy sets with regards to overshoot and settling time. Also, they demonstrated the potential of the mobile robot for further use in control strategy learning.
Article
Full-text available
In the last few years, robotic technology has been increasingly employed in agriculture to develop intelligent vehicles that can improve productivity and competitiveness. Accurate and robust environmental perception is a critical requirement to address unsolved issues including safe interaction with field workers and animals, obstacle detection in controlled traffic applications, crop row guidance, surveying for variable rate applications, and situation awareness, in general, towards increased process automation. Given the variety of conditions that may be encountered in the field, no single sensor exists that can guarantee reliable results in every scenario. The development of a multi-sensory perception system to increase the ambient awareness of an agricultural vehicle operating in crop fields is the objective of the Ambient Awareness for Autonomous Agricultural Vehicles (QUAD-AV) project. Different onboard sensor technologies, namely stereovision, LIDAR, radar, and thermography, are considered. Novel methods for their combination are proposed to automatically detect obstacles and discern traversable from non-traversable areas. Experimental results, obtained in agricultural contexts, are presented showing the effectiveness of the proposed methods.
Conference Paper
This paper describes results of real experiments carried out with a small-scale mobile robot to evaluate a fuzzy controller designed to keep a desired heading orientation of the vehicle. The vehicle heading was measured with an Inertial Measurement Unit (IMU), and a Global Positioning System (GPS) receiver was used to monitor the vehicle's localization. Sensor data were preprocessed and sent via serial communication to an embedded computer running the High Level Controller, whose outputs were sent to a Low Level Controller responsible for acting directly on the mobile robot's actuators (driving and steering motors).
Conference Paper
This paper describes a novel vision based texture tracking method to guide autonomous vehicles in agricultural fields where the crop rows are challenging to detect. Existing methods require sufficient visual difference between the crop and soil for segmentation, or explicit knowledge of the structure of the crop rows. This method works by extracting and tracking the direction and lateral offset of the dominant parallel texture in a simulated overhead view of the scene and hence abstracts away crop-specific details such as colour, spacing and periodicity. The results demonstrate that the method is able to track crop rows across fields with extremely varied appearance during day and night. We demonstrate this method can autonomously guide a robot along the crop rows.
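As a rough illustration of extracting a dominant parallel-texture direction from an overhead view, the sketch below uses a gradient-orientation histogram: crop rows appear as edges whose gradients are perpendicular to the row direction. This is an assumed, simplified alternative for illustration only, not the tracking method described in that paper.

import numpy as np

# Estimate the dominant orientation of parallel texture (e.g., crop rows) in a
# grayscale overhead-view image from a magnitude-weighted histogram of
# gradient orientations; the row direction is the dominant gradient
# orientation rotated by 90 degrees.
def dominant_row_direction(img, n_bins=180):
    gy, gx = np.gradient(img.astype(float))               # image gradients
    mag = np.hypot(gx, gy)
    ang = np.mod(np.degrees(np.arctan2(gy, gx)), 180.0)   # orientation in [0, 180)
    hist, edges = np.histogram(ang, bins=n_bins, range=(0, 180), weights=mag)
    grad_dir = edges[np.argmax(hist)]                      # dominant gradient orientation
    return (grad_dir + 90.0) % 180.0                       # row direction [deg]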
Article
The application of agricultural machinery in precision agriculture has experienced an increase in investment and research due to the use of robotics in machinery design and task execution. Precision autonomous farming is the operation, guidance, and control of autonomous machines to carry out agricultural tasks, and it motivates agricultural robotics. It is expected that, in the near future, autonomous vehicles will be at the heart of all precision agriculture applications [1]. The goal of agricultural robotics is more than just the application of robotics technologies to agriculture. Currently, most of the automatic agricultural vehicles used for weed detection, agrochemical dispersal, terrain leveling, irrigation, etc. are manned. Autonomous operation of such vehicles will allow continuous supervision of the field, since information about the environment can be acquired autonomously and the vehicle can then perform its task accordingly.
Article
Autonomous navigation of robots in an agricultural environment is a difficult task due to the inherent uncertainty in the environment. Many existing agricultural robots use computer vision and other sensors to supplement Global Positioning System (GPS) data when navigating. Vision based methods are sensitive to ambient lighting conditions. This is a major disadvantage in an outdoor environment. The current study presents a novel probabilistic sensor model for a 2D range finder (LIDAR) from first principles. Using this sensor model, a particle filter based navigation algorithm (PF) for autonomous navigation in a maize field was developed. The algorithm was tested in various field conditions with varying plant sizes, different row patterns and at several scanning frequencies. Results showed that the Root Mean Squared Error of the robot heading and lateral deviation were equal to 2.4 degrees and 0.04 m, respectively. It was concluded that the performance of the proposed navigation method is robust in a semi-structured agricultural environment.
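The sketch below shows the measurement-update step of a generic particle filter with a Gaussian range-measurement likelihood. The cited work derives its own probabilistic LIDAR sensor model from first principles, so this Gaussian model and its noise parameter are only an assumed stand-in to illustrate where such a sensor model plugs into the filter.

import numpy as np

# One particle-filter measurement update: each particle's weight is multiplied
# by the likelihood of the observed LIDAR ranges given the ranges that particle
# predicts (e.g., by ray-casting against a row map). Independent Gaussian noise
# with standard deviation sigma is assumed for each beam.
def update_weights(weights, z_observed, z_predicted, sigma=0.05):
    # z_predicted: (n_particles, n_beams) predicted ranges per particle
    # z_observed:  (n_beams,) measured ranges
    err = z_predicted - z_observed[None, :]
    log_lik = -0.5 * np.sum((err / sigma) ** 2, axis=1)
    log_w = np.log(weights + 1e-300) + log_lik
    log_w -= log_w.max()                       # subtract max for numerical stability
    w = np.exp(log_w)
    return w / w.sum()                         # normalized posterior weights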
Article
The objective of this study was to develop an automatic guidance system capable of navigating an autonomous vehicle travelling between tree rows in a real-time application. The study focused solely on straight-line recognition of the tree rows using a laser scanner as a navigation sensor. A 52 kW agricultural tractor was used as the platform on which the laser scanner was mounted. A Hough transform was used as the algorithm to recognise the tree row. An auto-regression method eliminated the white Gaussian noise in the laser scanner data. A calibration method was used to select the offset position of the laser scanner and to correct the heading and lateral error evaluation. An appropriate speed for the tractor was also determined. By obtaining an accuracy of 0.11 m lateral error and 1.5° heading error, it was possible to navigate the robot tractor autonomously between the orchard row crops.
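For reference, the sketch below shows a standard (rho, theta) Hough transform applied to 2D laser points to recover a single row line, as a minimal illustration of the technique named above. The discretization resolutions and range are illustrative assumptions, not the values used in that study.

import numpy as np

# Standard (rho, theta) Hough transform over 2D laser points (x, y) in metres.
# Each point votes for all lines rho = x*cos(theta) + y*sin(theta) passing
# through it; the most-voted accumulator cell gives the row-line parameters.
def hough_line(points, rho_res=0.05, theta_res=np.deg2rad(1.0), rho_max=10.0):
    thetas = np.arange(0, np.pi, theta_res)
    rhos = np.arange(-rho_max, rho_max, rho_res)
    acc = np.zeros((len(rhos), len(thetas)), dtype=int)
    for x, y in points:
        r = x * np.cos(thetas) + y * np.sin(thetas)        # rho for every theta
        idx = np.round((r + rho_max) / rho_res).astype(int)
        ok = (idx >= 0) & (idx < len(rhos))                 # keep votes inside the grid
        acc[idx[ok], np.arange(len(thetas))[ok]] += 1
    i, j = np.unravel_index(np.argmax(acc), acc.shape)
    return rhos[i], thetas[j]                               # best (rho, theta)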
Article
As world population growth requires an increasing level of farm production at the same time that environmental preservation is a priority, the development of new agricultural tools and methods is required. In this framework, the development of robotic devices can provide an attractive solution, particularly in the field of autonomous vehicles. Accurate automatic guidance of mobile robots in farming constitutes a challenging problem for researchers, mainly due to the low grip conditions usually found in such a context. From assisted-steering systems to agricultural robotics, numerous control algorithms have been studied to achieve high-precision path tracking and have reached an accuracy within ±10 cm, whatever the ground configuration and the path to be followed. However, most existing approaches consider classical two-wheel-steering vehicles. Unfortunately, by using such a steering system, only the lateral deviation with respect to the path to be followed can be satisfactorily controlled. Indeed, the heading of the vehicle remains dependent on the grip conditions, and crabwise motions, for example, are systematically observed on a slippery slope, leading to inaccurate field operations. To tackle this drawback, a four-wheel-steering (4WS) mobile robot is considered, enabling servo of both lateral and angular deviations with respect to a desired trajectory. The path tracking control is designed using an extended kinematic representation, allowing account to be taken online of wheel skidding, while a backstepping approach permits management of the 4WS structure. The result is an approach taking advantage of both rear and front steering actuations to fully compensate for sliding effects during path tracking. Moreover, a predictive algorithm is developed in order to address delays induced by steering actuators, compensating for transient overshoots in curves. Experimental results demonstrate that despite sliding phenomena, the mobile robot is able to automatically and accurately achieve a desired path, with lateral and angular errors, respectively, within ±10 cm and ±2 deg, whatever its shape and whatever the terrain conditions. This constitutes a promising result in efforts to define efficient tools with which to tackle tomorrow's agriculture challenge. © 2009 Wiley Periodicals, Inc.
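To clarify what "lateral and angular deviations with respect to a desired trajectory" means in such path-tracking formulations, the sketch below propagates the classical curvilinear deviation kinematics for a car-like vehicle. It deliberately omits the sliding angles and the 4WS backstepping law developed in that paper; the wheelbase and time step are illustrative.

import math

# Kinematic path-tracking deviations for a car-like vehicle following a path
# of curvature c: s is the curvilinear abscissa along the path, y the lateral
# deviation, psi the angular deviation, v the speed and delta the steering
# angle. Sliding effects and the 4WS control law are not modelled here.
def deviation_step(s, y, psi, v, delta, L=1.2, c=0.0, dt=0.02):
    s_dot = v * math.cos(psi) / (1.0 - c * y)    # progress along the reference path
    y_dot = v * math.sin(psi)                    # lateral deviation rate
    psi_dot = (v / L) * math.tan(delta) - c * s_dot
    return s + s_dot * dt, y + y_dot * dt, psi + psi_dot * dt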