A Robotic Framework for the
Robot@Factory 4.0 Competition
Ricardo B. Sousa∗†, Cláudia D. Rocha†‡, João G. Martins∗†, João Pedro Costa,
João Tomás Padrão, José Maria Sarmento†‡, José Pedro Carvalho∗§, Maria S. Lopes∗†,
Paulo G. Costa∗†, António Paulo Moreira∗†
∗Faculty of Engineering, University of Porto (FEUP), 4200-465 Porto, Portugal
Email: up{201503004,201806222,201806431,202108766}@edu.fe.up.pt
{jpcarvalho,mslopes,paco,amoreira}@fe.up.pt
†CRIIS Centre for Robotics in Industry and Intelligent Systems,
INESC TEC Institute for Systems and Computer Engineering, Technology and Science, 4200-465 Porto, Portugal
Email: {claudia.d.rocha,jose.m.sarmento}@inesctec.pt
‡University of Trás-os-Montes and Alto Douro (UTAD), 5000-801 Vila Real, Portugal
§C2SR Cyber-Physical Control Systems and Robotics,
SYSTEC Research Center for Systems and Technologies, 4200-465 Porto, Portugal
Abstract—Robotic competitions stand as platforms to propel
the forefront of robotics research while nurturing STEM edu-
cation, serving as hubs of both applied research and scientific
innovation. In Portugal, the Portuguese Robotics Open (FNR)
is an event with several robotic competitions, including the
Robot@Factory 4.0 competition. This competition presents an
example of deploying autonomous robots on a factory shop floor.
Although the literature has works proposing frameworks for
the original version of the Robot@Factory competition, none of
them proposes a system framework for the Robot@Factory 4.0
version that presents the hardware, firmware, and software to
complete the competition and achieve autonomous navigation.
This paper proposes a complete robotic framework for the
Robot@Factory 4.0 competition that is modular and open-access,
enabling future participants to use and improve it in future edi-
tions. This work is the culmination of all the knowledge acquired
by winning the 2022 and 2023 editions of the competition.
Index Terms—autonomous mobile robots, robot competitions,
robot at factory 4.0
I. INTRODUCTION
Since 1997, the RoboCup1 federation has organised interna-
tional robot soccer competitions, including Small and Middle
Size League robots and humanoids, and more recently, com-
petitions focused on service (RoboCup@Home) and industrial
(RoboCup@Work) autonomous robots. Another example is
the Formula Student2 competition, which has been held in
Europe since 1999. This event includes several categories
This work is financed by National Funds through the Portuguese fund-
ing agency, FCT Fundação para a Ciência e a Tecnologia, within
project LA/P/0063/2020 (DOI: 10.54499/LA/P/0063/2020). This work is
also financially supported by national funds through the FCT/MCTES
(PIDDAC), under the RELIABLE project PTDC/EEI-AUT/3522/2020 (DOI:
10.54499/PTDC/EEI-AUT/3522/2020), the Associate Laboratory Advanced
Production and Intelligent Systems ARISE LA/P/0112/2020 (DOI:
10.54499/LA/P/0112/2020), and the Base Funding (UIDB/00147/2020) and
Programmatic Funding (UIDP/00147/2020) of the R&D Unit Center for
Systems and Technologies SYSTEC.
1https://www.robocup.org/
2https://www.imeche.org/events/formula-student
(combustion, electric, hybrid, and driverless autonomous sys-
tems) while being an educational engineering platform in
several fields, such as robotics, computer science, mechanical,
and electrical and computer engineering. More recently, the
F1TENTH [1] and the Indy Autonomous Challenge [2] were
created to push further the research on autonomous systems
and also serve as STEM (Science, Technology, Engineering
and Mathematics) education platforms for students.
In Portugal, the Portuguese Robotics Open (FNR)3 is
an event hosting several competitions divided into major
and junior categories. These competitions include qualifi-
cation events to RoboCup, international championships, au-
tonomous driving, dragster, and the Robot@Factory Lite and
Robot@Factory 4.0 competitions, more oriented to industrial
cases. In particular, the Robot@Factory 4.0 competition aims
to represent the deployment of autonomous mobile robots on
a factory shop floor. The robots should be able to transport
materials between warehouses or machines that process those
materials, considering the robots’ ability to self-localise and
navigate in order to avoid collisions with the walls.
The current literature has works oriented toward the original
and present versions of the Robot@Factory competition hosted
at FNR (Robot@Factory and Robot@Factory 4.0, respec-
tively). Ribeiro et al. [3] presented their design and develop-
ment path towards the robot used to participate in the original
version of the competition. The robot was a three-wheeled
omnidirectional robot with five processing units (including a
Raspberry Pi 2 Model B and two Arduino Mini), a 2D laser
scanner, and an Inertial Measurement Unit (IMU) used to ob-
tain a pose estimation. Similarly, Pinto et al. [4] also proposed
a three-wheeled omnidirectional robot to participate in the
original competition. The proposed framework used wheeled
odometry and 2D laser data to estimate the robot pose with
3https://www.festivalnacionalrobotica.pt/
the PerfectMatch [5] algorithm with some modifications [6],
while only using a single processing unit, namely, a Raspberry
Pi 3. Finally, Braun et al. [7] presented the development of a
four-wheeled mecanum platform specifically designed for the
Robot@Factory 4.0 competition, with a Raspberry Pi 4 and an
Arduino Mega 2560 as processing units. This work focused on
the mechanical design and electronics of the robot, including
the description of the sensory perception system composed of
a camera and a 2D laser scanner. In conclusion, the works
of Ribeiro et al. [3] and Pinto et al. [4] are not intended
for the present version of the Robot@Factory competition,
while the work of Braun et al. [7] does not cover the software
architecture and implementation part of the robot.
Thus, this paper proposes a modular and open-access system
framework for an autonomous robot to be able to participate
in future editions of the Robot@Factory 4.0 competition. The
description of the proposed framework includes the hardware,
firmware, and software implemented by the 5dpo Robotics
Team4 that led the team to win two consecutive editions of
the competition. All the code relative to firmware and software
developed within the context of the competition, 3D models
of the robot parts, and additional documentation are available
in a public GitHub repository5. This repository contains the
implementation of the proposed framework that may be used
by future participants in the competition to bootstrap their
development and push it even further to foster their STEM
education knowledge. To the best of the authors’ knowledge,
and according to the related works information found in the
literature, no framework encompasses hardware, software, and
system integration to allow building an entire robot system for
the Robot@Factory 4.0 competition.
The remainder of this paper is organised as follows. Sec-
tion II presents a brief summary of the current version of
the Robot@Factory 4.0 competition. Section III describes
the architecture of the proposed system framework. Then,
Sections IV, V, and VI characterise the hardware, firmware,
and software implemented and used in the competition, re-
spectively. Section VII presents the results originated from the
participation in two editions of Robot@Factory 4.0. Lastly,
Section VIII formulates the final conclusions and future work.
II. ROBOT@FACTORY 4.0 COMPETITION
The competition presents the problem of deploying au-
tonomous mobile robots on a factory shop floor (illustrated
in Fig. 1). One or more robots should transport materials
between warehouses or machines, depending on the type of
materials received in the input warehouse. The robots will
transport three different parts from the incoming warehouse:
blue, green, and red boxes. The first is considered a final part;
thus, it is sent directly to the output warehouse. The second
is an intermediate part that must be transformed into a
blue box at either machine A or B. Lastly, the red box must
undergo two transformations, first on machine A into a green part,
4https://5dpo.github.io/
5https://github.com/5dpo/5dpo_ratf_2023
Fig. 1: Shop floor of the Robot@Factory 4.0 2023 edition composed
by warehouses (input on the top-left and output on the bottom-right
of the floor), machines (middle wall markings, where the machines
A are on the left and B on the right side), and ArUco markers [8].
and then on B to a blue box. Furthermore, the competition is
divided into three rounds. The rounds are distinguished by the
type of parts received in the incoming warehouse: only blue
parts are considered in the first round, while in the second,
green parts are introduced, usually up to two boxes. Finally,
the third round may have both green and red parts, usually,
two and one boxes, respectively. The robot must communicate
with a Wi-Fi server to get which part types are available for
each slot in the incoming warehouse at the start of each run.
As for the robot design, each robot must fit into a 30×30 cm
rectangle with a maximum height of 30 cm. The robot must
be completely autonomous and cannot communicate with
any type of external system that is not provided by the or-
ganisation. Unlike the original version of the Robot@Factory
competition, the shop floor has no guiding lines nor outer
walls delimiting the shop floor area. The teams can use the
ArUco markers on the floor for localisation (the guidelines
of the competition available in [8] present the ground-truth
position of those). Additionally, the teams can place their
own markers (e.g., reflectors or poles) near the corners of the
factory, provided that these do not exceed an occupied area
of 10 × 10 cm and a maximum height of 50 cm.
In terms of the final classification, the team with the highest
total number of parts correctly transported to the outgoing
warehouse or to a machine is the winner. In case of teams
with the same total number of points, the first criterion is the
number of correctly placed parts in the outgoing warehouse,
and the second criterion is the shortest time to finish the round.
III. SYSTEM ARCHITECTURE
The system framework proposed in this paper and illustrated
in Fig. 2 is composed of three elements: all the components
and mechanics related to the robot (motors, sensors, actua-
tors), the firmware layer to interact with low-level devices
(including motors speed reading and control) running in an
Arduino Mega 2560, and the software executing in a Rasp-
berry Pi 4B 4G (drivers to interact with the firmware and
other devices, localisation and navigation modules).
Fig. 2: System architecture proposed in this paper for the
Robot@Factory 4.0 competition.
First, the
robot is a four-wheeled omnidirectional platform designed in
the scope of the Robot@Factory 4.0 competition, using a 3D
printer to make the parts of the platform. The robot includes
a switch and a solenoid to sense and transport the boxes.
Next, the firmware layer controls the motors’ angular speed,
counts the encoder quadrature pulses, reads the switch state,
and actuates the solenoid according to the software. Then,
the software executes Robot Operating System (ROS) nodes
implemented in C++ to interact with the firmware and other
devices (including the 2D laser scanner), estimate the robot
pose, provide a Human-Machine Interface (HMI)
based on rviz, and enable the autonomous navigation of the
robot to execute the desired tasks. The following sections
explain each element of the system framework in depth.
IV. HARDWARE
A. Robot motion
In terms of the robot motion, the four-wheeled omnidirec-
tional platform designed in the scope of the competition is
illustrated in Fig. 3. This platform uses 60 mm aluminium
mecanum wheels, where the distance between the front-back
and left-right wheels is 0.152 m and 0.193 m, respectively.
As a consequence of using four mecanum wheels, the robot
has holonomic kinematics, meaning that the robot's linear
and angular velocities are decoupled from each other (e.g.,
the robot can manoeuvre around obstacles without affecting
its orientation) [9]. Indeed, this design choice results in
increased flexibility in terms of path planning and trajectory
control, since the robot may rotate along the trajectory
without requiring on-the-spot rotations.
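For reference, the sketch below shows the standard inverse kinematics of a four-wheeled mecanum platform using the wheel dimensions given above. It is an illustrative example only: the sign convention depends on the roller orientation, and the actual implementation follows the kinematic models referenced in [9], [11].

```cpp
// Standard inverse kinematics of a four-wheeled mecanum platform (illustrative only).
// Wheel radius and spacing follow the values given in the text; the sign convention
// depends on the roller orientation and may differ from the actual implementation.
struct WheelSpeeds { double fl, fr, bl, br; };   // wheel angular speeds [rad/s]

WheelSpeeds inverseKinematics(double vx, double vy, double omega) {
  const double r  = 0.030;          // wheel radius: 60 mm diameter mecanum wheels
  const double lx = 0.152 / 2.0;    // half of the front-back wheel distance [m]
  const double ly = 0.193 / 2.0;    // half of the left-right wheel distance [m]
  const double k  = lx + ly;
  return { (vx - vy - k * omega) / r,    // front-left
           (vx + vy + k * omega) / r,    // front-right
           (vx + vy - k * omega) / r,    // back-left
           (vx - vy + k * omega) / r };  // back-right
}
```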
Furthermore, the motors chosen are 12V DC motors with
251 rpm at no load from DFRobot. The motors include
a 43.7:1 metal gearbox and integrated quadrature encoders
based on the Hall effect. These encoders have a resolution
of 16 counts per revolution of the motor shaft, resulting in
approximately 699 counts per revolution of the gearbox's output
shaft. If the firmware can process the pulses, this setup results in a
resolution of approximately 2800 pulses per revolution at the robot's wheels.
Fig. 3: Four-wheeled omnidirectional platform designed in the scope
of the competition: (a) the robot; (b) the bottom with the suspension.
An additional design constraint must be accounted for
because the robot has four wheels. Conventionally, the static
stability of a robot requires a minimum of three wheels, where
the triangle formed by the ground contact points must contain
the centre of gravity. The latter is ensured in the proposed
platform by most of the components being in the volume
between the front and back wheels. However, once the number
of contact points exceeds three, the geometry requires some
form of flexible suspension on uneven terrain [9]. Even though
the floor of the competition should be flat and even, Fig. 3b
shows the suspension added to the robot to cope with
terrain with minor imperfections in its flatness.
B. Processing unit
Next, an Arduino Mega 2560 executes the firmware de-
scribed in this paper. This microcontroller board is compatible
with several open-source libraries, including the SPI6 (com-
munication with SPI devices) and TimerOne & TimerThree7
(implementation of timers with PWM pins) libraries used
in this work. Moreover, the 5V operating voltage of the
6https://www.arduino.cc/reference/en/language/functions/communication/spi/
7https://www.pjrc.com/teensy/td_libs_TimerOne.html
Arduino Mega is directly compatible with the voltage of the
encoders of the motors and the electromagnet actuation.
As for the software, the Raspberry Pi 4B 4GB executes all
the ROS nodes implemented in C++. Given that this paper
uses ROS 1 Noetic, the Operating System (OS) running in
the Raspberry Pi was the Raspberry Pi OS (Legacy) based on
Debian Buster. However, the OS Ubuntu 20.04 may also be
used instead of Raspbian, in case there is no need for graphical
output. Also, the advantages of using Raspberry Pi are its
compatibility with the Raspberry Pi Camera module (e.g.,
for ArUco marker detection) and its 40 General-Purpose In-
put/Output (GPIO) pins compatible with SPI and I2C devices.
C. Sensors
In terms of the sensory system on the robot, in addition
to the wheel odometry data obtained from the encoders, the
robot is equipped with a 2D laser scanner and an IMU. The
first is a YDLIDAR X4 based on the principle of triangulation.
This laser has a 360º Field-of-View (FoV), a range between
0.12 m and 10 m, and a typical scanning frequency and angle
resolution of 7 Hz (using the device interface to connect
and power up the laser through a USB connection) and 0.5º,
respectively. Considering the maximum area of 1.7 × 1.2 m for
the shop floor defined in the competition rules, the sensor is
well suited to this application. However, other al-
ternatives based on Time-of-Flight (ToF) technologies, such as
the DTOF LD19 and RPLIDAR S2 lasers, may provide more
precise distance data. As for the IMU, the sensor employed
on the robot is the Waveshare 10 Degrees-of-Freedom (DoF) IMU.
This IMU is a low-cost sensor with an accelerometer, mag-
netometer, gyroscope, and temperature sensor. However, only
the accelerometer and gyroscope data are used in the scope
of this paper. The data communication between IMU and
Raspberry Pi is through an I2C bus.
Even though camera data is not used in this work, the four-
wheeled omnidirectional platform illustrated in Fig. 3 is also
equipped with an RGB Raspberry Pi Camera Board v2. This
camera has a sensor with 8MP and a resolution of up to 1080p.
The main advantages of using this camera are its ribbon cable
compatibility with the CSI port on the Raspberry Pi, native
drivers for the camera in the Raspberry Pi, and compatibility
with the OpenCV computer vision library.
D. Electronics
As for powering up the robot, the platform has a Gens
Ace 5000 mAh battery with a 3S cell configuration and
an 11.1 V nominal voltage. The battery directly powers the
wheels’ motors and the Arduino Mega. Furthermore, a DC/DC
buck converter, able to convert a 6–40 V input into a
1.2–36 V output (maximum of 20 A),
creates a second 5 V voltage rail. This rail powers up
the Raspberry Pi and the electromagnet. Although it would
be advisable to have a voltage regulator between the battery
and the other components, the approach taken in the platform
simplifies the electronics of the system as much as possible.
Fig. 4: Encoder signals example (channels A and B).
Moreover, the Raspberry Pi powers up the 2D laser scanner
and the IMU through the USB connection and the GPIO
interface, respectively. In order to control the motors’ angular
speed, the system must have an electronic board able to
generate PWM signals for the motors. Thus, the Arduino Mega
used in the platform also has the Adafruit Motor Shield. This
shield allows driving up to four DC motors, communicating
with the microcontroller over I2C.
E. 3D modelling
Lastly, all the parts composing the structure of the omni-
directional platform are 3D printed in PLA. These parts are
available as STEP files in the public repository5. The leading
software used to design the parts was originally the Open-
SCAD open-source modeller, and then, the team transitioned
to AutoDesk Fusion 360. The latter has both free personal
and academic licenses available for use. Even so, STEP files
are compatible with other 3D modelling software such as
FreeCAD and SolidWorks, allowing other programs to be used
to modify the original parts.
V. FIRMWARE
The robot’s firmware is implemented in the Arduino Mega
2560 as a PlatformIO project (the latter has extensions for both
CLion and Visual Studio Code). This firmware includes count-
ing the encoders’ pulses, angular speed control of the motors,
the serial communication with the Raspberry Pi through a
UART/USB connection, and the simple logic associated with
reading the switch state and actuating the solenoid.
A. Wheel encoders
First, relative to counting the encoders' pulses, Fig. 4
illustrates the quadrature inherently related to a two-channel
(A and B) encoder. Considering that the logical values of the
two channels are known for the current time instant k and
the previous instant k−1, there are 16 possible combinations
of the last and current states of the two channels (presented
in Table I). Thus, the firmware developed for the framework
proposed in this paper employs a lookup table to speed up the
encoders' reading, attaching an interrupt to read the channels'
logic values at 50 kHz. This periodic interrupt is implemented using the
TimerOne library7. Even though other boards, such as the
Raspberry Pi Pico and the Teensy ones, have programmable
input/output machines that may improve even further the
frequency of checking the channels and counting the pulses
(avoiding skipping pulses due to higher angular speeds of the
motors), the Arduino Mega does not offer that possibility.
TABLE I: Lookup Table of the Possible Values for the Previous
(k−1) and Current (k) State of each Channel A and B, and the
Corresponding Increment on the Pulse Counters (Δk).

A(k−1)  B(k−1)  A(k)  B(k)   Δk
  0       0      0     0      0
  0       0      0     1     -1
  0       0      1     0      1
  0       0      1     1      0
  0       1      0     0      1
  0       1      0     1      0
  0       1      1     0      0
  0       1      1     1     -1
  1       0      0     0     -1
  1       0      0     1      0
  1       0      1     0      0
  1       0      1     1      1
  1       1      0     0      0
  1       1      0     1      1
  1       1      1     0     -1
  1       1      1     1      0
Furthermore, there are two situations in Table I where the
delta of the pulse counters is 0. One of them is when the
previous state is maintained in the current instant, meaning
that the wheel did not move. The other situation is when
a transition in the quadrature of the encoders is skipped,
e.g., having the previous state {A, B}(k−1) = {0, 0} and the
current one {A, B}(k) = {1, 1}. This situation does not reveal the
wheel's direction, thus setting the delta to 0 to avoid counting
erroneous pulses. Indeed, this situation should not happen,
since the firmware should be able to correctly count the
encoder pulses even at the maximum speed of the motors.
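As an illustration of this decoding scheme, the following Arduino/C++ sketch applies the lookup table of Table I inside a 50 kHz periodic interrupt attached with the TimerOne library. Pin numbers and variable names are assumptions for a single encoder; the actual firmware in the repository handles four encoders and would typically read the ports directly for speed.

```cpp
// Arduino/C++ sketch of the lookup-table quadrature decoding (one encoder only).
// Pin numbers and names are assumptions; the real firmware handles four encoders
// and would typically read the ports directly instead of using digitalRead().
#include <Arduino.h>
#include <TimerOne.h>

// Increment per transition, indexed by (A_{k-1} B_{k-1} A_k B_k) as in Table I.
static const int8_t kQuadTable[16] = {
   0, -1,  1,  0,
   1,  0,  0, -1,
  -1,  0,  0,  1,
   0,  1, -1,  0 };

static const uint8_t kPinA = 2, kPinB = 3;   // encoder channels (assumed pins)
volatile int32_t pulseCount = 0;             // shared with the main loop
volatile uint8_t prevState = 0;              // previous A/B state

void sampleEncoder() {                       // periodic interrupt at 50 kHz
  uint8_t curState = (digitalRead(kPinA) << 1) | digitalRead(kPinB);
  pulseCount += kQuadTable[(prevState << 2) | curState];
  prevState = curState;
}

void setup() {
  pinMode(kPinA, INPUT);
  pinMode(kPinB, INPUT);
  Timer1.initialize(20);                     // 20 us period -> 50 kHz
  Timer1.attachInterrupt(sampleEncoder);
}

void loop() { /* report pulseCount to the Raspberry Pi over the channels protocol */ }
```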
Regarding the resolution of counting the encoders’ pulses,
as stated in Section IV, the encoders in the omnidirectional
platform have a resolution of 16 counts per revolution on
the motors’ shaft. The approach implemented in this paper
can process the quadrature of the encoders, quadrupling the
resolution. Consequently, the final resolution obtained in the
firmware is 64 and 2796.8 pulses per revolution on the motors’
shaft and at the wheels, respectively (the latter considering the
43.7 reduction ratio in the motors’ gears). This final resolution
corresponds to 0.128º/pulse at the wheel.
B. Wheel angular speed control
The control of the motors’ angular speed is similar to the
methodology presented in Sousa et al. [10]. Proportional-
Integral (PI) controllers set the motors' input voltage.
Considering that the motors have a dead zone, a Hammerstein
nonlinear block is used to compensate for the existence of that
zone. Then, the windup effect is compensated by limiting the
voltage computed from the PI controllers and the Hammerstein
block to the maximum value supported by the motors. When
the voltage exceeds these limits, the integration part of the PI
controller remains unchanged. Given that PWM signals can
be used to control the DC motors, these signals are directly
handled by the Adafruit Motor Shield with a frequency of
1.6 kHz (the maximum value allowed by the shield). The
communication between the shield and the Arduino Mega is
through an SPI connection.
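A minimal sketch of one wheel's speed controller, combining the PI law, the Hammerstein dead-zone compensation, and the anti-windup behaviour described above, could look as follows. Gains, limits, and names are illustrative assumptions and not the exact firmware implementation.

```cpp
// Sketch of one wheel's speed controller: PI law, Hammerstein dead-zone compensation,
// and anti-windup by freezing the integral term when the output voltage saturates.
// Gains, limits, and names are illustrative assumptions, not the exact firmware.
struct WheelSpeedPI {
  float kp, ki, dt;        // PI gains and control period [s]
  float deadZone;          // dead-zone voltage added by the Hammerstein block [V]
  float vMax;              // maximum voltage supported by the motors [V]
  float integral = 0.0f;

  float update(float wRef, float wMeas) {
    float error = wRef - wMeas;
    float v = kp * error + ki * (integral + error * dt);
    if (v > 0.0f) v += deadZone;           // compensate the motor dead zone
    else if (v < 0.0f) v -= deadZone;
    if (v > vMax || v < -vMax) {
      v = (v > 0.0f) ? vMax : -vMax;       // clamp to the supply limits
      // anti-windup: the integral term is left unchanged while saturated
    } else {
      integral += error * dt;              // integrate only when not saturated
    }
    return v;                              // converted elsewhere into a PWM duty cycle
  }
};
```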
C. Serial communication
As for exchanging data between the firmware and the
software modules, this data transmission is handled over
a serial communication between the Raspberry Pi and the
Arduino Mega. The serial port’s baud rate is 115200 bps,
which is considered a safe setup in terms of timing errors.
The channels library8 manages the data protocol for the
communication. This paper employs the version that puts the
hexadecimal representation of the data as ASCII characters
in the communication stream. Indeed, the data frame of this
version is as follows: the first byte is a plain letter in ASCII
format representing a channel, and the following 8 bytes
are the hexadecimal representation of 4-byte data packets as
ASCII characters (0–9, A–F). The channel’s letter can be any
letter between a–z and G–Z. Even though the channels library
has a binary form that compacts the data part of the frame from 8
to 4 bytes, being more efficient, plain ASCII greatly facilitates
debugging the firmware data streams by using a simple
serial monitor. Thus, the channels configuration used in the
firmware is the following one:
Arduino Mega → Raspberry Pi:
g: pulse count of the back-right wheel (ticks)
h: pulse count of the back-left wheel (ticks)
i: pulse count of the front-right wheel (ticks)
j: pulse count of the front-left wheel (ticks)
k: interval time between control cycles (us)
r: reset signal
s: switch signal
Raspberry Pi → Arduino Mega:
G: reference speed of the back-right wheel (rad/s)
H: reference speed of the back-left wheel (rad/s)
I: reference speed of the front-right wheel (rad/s)
J: reference speed of the front-left wheel (rad/s)
K: PWM value of the motors:
(value >> 24) & 0x03: motor index (0..3)
(value) & 0xFFFF: PWM value (0..max)
L: solenoid actuation
This configuration allows exchanging, between the two processing
units of the omnidirectional platform, the encoders' pulse counters,
the desired angular speeds for the motors, and the switch and
solenoid signals required to interact with and transport the boxes
between locations. The interval time is used to monitor the angular
speed control loop, and the reset signal reinitialises the firmware
driver in the software. Additionally, channel K is used to actuate
the PWM directly (and thus set the voltage of the motors), allowing
a possible automatic calibration of the PI parameters by using the
software to set the motors' voltage and logging the angular speed
computed from the rate of pulses per second read from the encoders.
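To make the framing concrete, the sketch below encodes a frame in the ASCII variant described above (one channel letter followed by eight hexadecimal characters for a 4-byte payload) and packs the PWM command layout listed for channel K. The function names are assumptions used only for illustration.

```cpp
// Sketch of the ASCII framing: one channel letter plus eight hexadecimal characters
// encoding a 4-byte payload, and the bit layout listed above for channel K.
// Function names are assumptions used only for illustration.
#include <cstdint>
#include <cstdio>

// Build a frame such as "K020005DC" (channel letter + 8 hex characters).
void encodeFrame(char channel, uint32_t value, char out[10]) {
  std::snprintf(out, 10, "%c%08lX", channel, static_cast<unsigned long>(value));
}

// Pack the PWM command: motor index in bits 24..25, PWM value in the lower 16 bits.
uint32_t packPwmCommand(uint8_t motorIndex, uint16_t pwm) {
  return (static_cast<uint32_t>(motorIndex & 0x03) << 24) | pwm;
}

int main() {
  char frame[10];
  encodeFrame('K', packPwmCommand(2, 1500), frame);
  std::printf("%s\n", frame);   // prints "K020005DC"
  return 0;
}
```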
VI. SOFTWARE
In terms of the software components of the system frame-
work, all the software is developed in C++ and ROS 1 Noetic
8https://github.com/P33a/MeArmControlVarSpeed
and is executed in the Raspberry Pi 4B 4GB. The software
includes the drivers developed in the scope of the competition,
the robot localisation estimation and its integration with the
laser scanner, wheeled and inertial odometry, autonomous
navigation through the shop floor, and the HMI.
A. Drivers
As previously mentioned in Section V, the communication
between the firmware and software is made through a serial
port (USB/UART) using the channels protocol. Consequently,
from the software side, a driver was specifically developed to
handle the serial communication with the microcontroller and
make the bridge between ROS messages and data processed
by the firmware. Similarly, the communication between the
2D laser scanner and the Raspberry Pi is also through a
serial connection with a USB cable. In this case, another
driver was developed to interpret the data received from the
YDLIDAR X4 and build a point cloud ROS message after a
full 360º scan is processed by the driver. These two drivers are
based on an open-source asynchronous implementation over
Boost.Asio9. This implementation had to be modified to be
compatible with the laser baud rate (128000 bps). The latter
is incompatible with the original Boost.Asio implementation
of serial port communication, requiring interaction with the
OS termios interface to set the custom baud rate.
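As an illustration of the termios workaround mentioned above, the following Linux-specific sketch forces the non-standard 128000 bps baud rate through the termios2/BOTHER interface before handing the file descriptor to the asynchronous reader. Error handling is omitted and the function name is an assumption.

```cpp
// Linux-specific sketch of forcing the non-standard 128000 bps baud rate through the
// termios2/BOTHER interface before handing the descriptor to the asynchronous reader.
// Error handling is omitted and the function name is an assumption.
#include <fcntl.h>
#include <sys/ioctl.h>
#include <asm/termbits.h>   // termios2, TCGETS2/TCSETS2, BOTHER (Linux only)

int openLaserPort(const char* device) {
  int fd = open(device, O_RDWR | O_NOCTTY);
  struct termios2 tio;
  ioctl(fd, TCGETS2, &tio);
  tio.c_cflag &= ~CBAUD;     // clear the standard baud-rate bits
  tio.c_cflag |= BOTHER;     // allow an arbitrary baud rate
  tio.c_ispeed = 128000;
  tio.c_ospeed = 128000;
  ioctl(fd, TCSETS2, &tio);
  return fd;                 // e.g., wrapped afterwards by the Boost.Asio-based driver
}
```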
B. Robot localisation
The system framework proposed in this paper consid-
ers three sensor sources to estimate the robot pose: wheel
encoders, IMU, and the 2D laser scanner. First, wheeled
odometry uses the encoders’ data to estimate the robot’s pose
relative to a previous instant. This estimation follows the same
equations presented in Sousa et al. [11] for the four-wheeled
omnidirectional steering geometry to estimate the linear and
angular displacements of the robot. In the case of the IMU, this
sensor was added later in the preparation phase for the 2023
edition to improve the odometry estimation of the robot pose,
especially at linear and angular velocities higher than 0.5 m/s
and 2 rad/s, respectively (even when scaling the velocities
when one of the motors is saturated). The robot_pose_ekf
ROS node10 is responsible for fusing the wheeled odometry and
IMU data estimations, where the 3D orientation data from the
IMU improves the 2D pose estimate of the wheeled odometry. Then, the
laser data is used to detect beacons in the form of cylindrical
poles with a 9 cm diameter to update the robot pose with
an Extended Kalman Filter (EKF) [9]. The methodology to
extract the centre of the perceived poles is to use the current
pose of the robot and cluster the laser points near the points
expected to have poles. The centroid of each cluster adjusted
by the radius of the pole (the observed points by the 2D
laser are on the outer perimeter of the pole) is assumed as
the position of the pole perceived at each instant.
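A simple sketch of this pole-centre extraction, assuming the clustered points are already expressed in the robot frame, is shown below; the types and function are illustrative and not the exact implementation in the repository.

```cpp
// Sketch of the pole-centre extraction: the centroid of a cluster of laser points
// (expressed in the robot frame) is pushed outwards by the pole radius, since the
// 2D laser only observes the outer perimeter of the pole. Types are assumptions.
#include <cmath>
#include <vector>

struct Point2D { double x, y; };

Point2D poleCentre(const std::vector<Point2D>& cluster, double poleRadius) {
  if (cluster.empty()) return {0.0, 0.0};
  Point2D c{0.0, 0.0};
  for (const auto& p : cluster) { c.x += p.x; c.y += p.y; }
  c.x /= cluster.size();
  c.y /= cluster.size();
  const double d = std::hypot(c.x, c.y);       // distance from the robot to the centroid
  const double scale = (d + poleRadius) / d;   // move outwards along the viewing ray
  return { c.x * scale, c.y * scale };
}
```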
As for the EKF framework, the prediction step follows
the kinematic equations in [11]. These equations update the
9https://github.com/fedetft/serial-port/tree/master/4_callback
10https://wiki.ros.org/robot_pose_ekf
Fig. 5: Beacon-based Extended Kalman Filter (EKF).
current robot pose given the linear and angular displacements
between two consecutive instants. So, the prediction step
becomes independent of the odometry source used for the
filter (wheeled-odometry only or fusing with IMU data). In
terms of the update step, Fig. 5 illustrates the robot when
observing a beacon $B_i$. The state of the filter ($X$) is the
transformation of the robot frame ($\{X_R, Y_R\}$) w.r.t. the world
coordinate frame ($\{X_W, Y_W\}$), where $X = [R\,|\,t] \in SE(2)$.
The Euclidean representation of this state is $X = (x, y, \theta)^T$,
where $x$ and $y$ are the 2D translation and $\theta$ is the heading
of the robot. Furthermore, the ground-truth position of the
beacon $B_i$ w.r.t. the world ($p_{b,i}$) is known, considering the
centre of the coordinate frame being centred in the shop floor
and that the position of the beacons could be measured with a
metric tape. Consequently, Equation 1 represents the predicted
beacon position w.r.t. the robot frame ($\tilde{p}_{b,i}$). Considering the
measurement $z$ being the distance ($d_{b,i}$) and heading ($\theta_{b,i}$)
of the beacon w.r.t. the robot (illustrated in Equation 2), the
predicted measurement ($\tilde{z}$) used in the update step of the EKF
framework is formulated in Equation 3.

$$\tilde{p}_{b,i} = R^T \left( p_{b,i} - t \right) \qquad (1)$$

$$z = \begin{bmatrix} d_{b,i} \\ \theta_{b,i} \end{bmatrix} \qquad (2)$$

$$\tilde{z} = \begin{bmatrix} \lVert \tilde{p}_{b,i} \rVert \\ \mathrm{atan2}\left( \tilde{p}_{b,i}.y,\ \tilde{p}_{b,i}.x \right) \end{bmatrix} \qquad (3)$$
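For clarity, the following sketch computes the predicted beacon position and measurement of Equations 1–3 for a given state and beacon, using illustrative types rather than the actual EKF code.

```cpp
// Sketch of the predicted beacon position and measurement of Equations 1-3 for a
// given filter state and beacon, using illustrative types rather than the actual EKF code.
#include <cmath>

struct Pose2D  { double x, y, theta; };   // filter state X = (x, y, theta)^T
struct Point2D { double x, y; };

// Equation (1): beacon position in the robot frame, p~_{b,i} = R^T (p_{b,i} - t).
Point2D beaconInRobotFrame(const Pose2D& X, const Point2D& pb) {
  const double dx = pb.x - X.x, dy = pb.y - X.y;
  const double c = std::cos(X.theta), s = std::sin(X.theta);
  return { c * dx + s * dy, -s * dx + c * dy };
}

// Equation (3): predicted measurement z~ = (||p~_{b,i}||, atan2(p~_{b,i}.y, p~_{b,i}.x))^T.
void predictedMeasurement(const Pose2D& X, const Point2D& pb,
                          double& range, double& bearing) {
  const Point2D pr = beaconInRobotFrame(X, pb);
  range   = std::hypot(pr.x, pr.y);
  bearing = std::atan2(pr.y, pr.x);
}
```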
C. Autonomous navigation
In the context of the Robot@Factory competition, the au-
tonomous navigation problem can be decomposed into path
planning and motion control. Let the map representation be
defined by a bi-directional graph Mwith a set of vertices
v={v0, . . . , vj, . . . , v65}and a set of bi-directional edges
e={e0,1, . . . , e1,2, . . . , e26,65}. All the vertices are defined
by a set of Cartesian coordinates in the global reference
frame and an ID, such that vj= (xj, yj)T, where jis
the vertex ID. For the competition, the criteria for choosing
the vertices was to assign a vertex to the centre of every
ArUco and keep its ID. This option synergises with a vision-
based state estimation scheme due to the fact that vertices
and ArUcos share IDs, in the case of the system also using
camera information for the pose estimation of the robot. The
Fig. 6: Graph-based map representation of the shop floor.
edges were selected manually by connecting all the vertices
close to each other, also taking into account the walls of the
machines and warehouses on the shop floor. A visualisation
of the graph concerning the map can be seen in Fig. 6.
The path planning problem is solved by using the Dijkstra
graph-search algorithm [12] to find the shortest path. The
starting point of the trajectory considered in the algorithm is
always the vertex closest to the robot’s current pose. Then,
the path planner generates the array of edges that the robot
must traverse to reach its desired goal. As for the motion
control, this work implements a line follower algorithm based
on a Proportional-Integral-Derivative (PID) control scheme
to set the linear and angular reference velocities of the
omnidirectional platform. The edges from the path planner
are represented as straight lines between the vertices connected
by each edge. Moreover, the error of the PID controller to
set the linear velocities is formulated as the distance of the
robot’s current pose to the closest point in the line. In the
case of the orientation, the error is the difference between the
current heading of the robot and the orientation of the last
line in the trajectory. This formulation is possible due to the
decoupling between the linear and angular velocities of
omnidirectional robots, which allows controlling them
independently.
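As an illustrative sketch of the path-planning step, the snippet below runs Dijkstra's algorithm over an adjacency-list representation of the graph and returns the sequence of vertex IDs from the start vertex to the goal; the data structures are assumptions and the repository implementation may differ.

```cpp
// Sketch of the path-planning step: Dijkstra's algorithm over an adjacency-list graph,
// returning the sequence of vertex IDs from the start vertex to the goal.
// Data structures are assumptions; the repository implementation may differ.
#include <functional>
#include <limits>
#include <queue>
#include <utility>
#include <vector>

using Graph = std::vector<std::vector<std::pair<int, double>>>;  // per vertex: (neighbour, edge cost)

std::vector<int> dijkstraPath(const Graph& g, int start, int goal) {
  const double inf = std::numeric_limits<double>::infinity();
  std::vector<double> dist(g.size(), inf);
  std::vector<int> prev(g.size(), -1);
  using Item = std::pair<double, int>;                            // (distance, vertex)
  std::priority_queue<Item, std::vector<Item>, std::greater<Item>> pq;
  dist[start] = 0.0;
  pq.push({0.0, start});
  while (!pq.empty()) {
    auto [d, u] = pq.top();
    pq.pop();
    if (d > dist[u]) continue;                                    // stale queue entry
    for (const auto& [v, w] : g[u]) {
      if (dist[u] + w < dist[v]) {
        dist[v] = dist[u] + w;
        prev[v] = u;
        pq.push({dist[v], v});
      }
    }
  }
  std::vector<int> path;                                          // rebuild start -> goal
  for (int v = goal; v != -1; v = prev[v]) path.insert(path.begin(), v);
  return path;
}
```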
D. Human-Machine Interface (HMI)
Finally, the HMI is based on the rviz ROS visualisation
tool. First, a pre-configuration of rviz is automatically
loaded upon launching the system framework to allow the
visualisation of the ground-truth position of the beacons, the
point cloud of the 2D laser scanner, and the robot and world
coordinate frames. Additionally, an rviz plugin was developed
to add a panel to rviz, allowing the user to force the
start of the round without requiring communication with
the UDP server handled by the competition organisers. Lastly,
the system framework also added the possibility of launching
TABLE II: Final Classification Results of the 5dpo Robotics Team
in the Robot@Factory 4.0 2022 and 2023 Editions.

Edition  Round  Place  #total  #outgoing  Time (s)
2022     1st    1st    4       4          124
         2nd    1st    6       4          126
         3rd    1st    8       4          134
2023     1st    1st    4       4          40.7
         2nd    1st    6       4          47.3
         3rd    1st    8       4          60.9
the map_server ROS node to visualise the same shop floor
shown in Fig. 1 in rviz with the appropriate scale, to match
the remaining visual data already present in rviz.
VII. TESTS AND RESULTS
The main result of this paper is the open-source framework
available in a public GitHub repository5. This framework
includes documentation, designs, and code implementation
on all the hardware, firmware, and software that led the
5dpo Robotics Team to win two consecutive editions of the
Robot@Factory 4.0 competition, namely, the 2022 and 2023
editions. The framework can be used by any future participant
in the competition as a development base to allow the teams
to develop their own systems even further and study and re-
search more advanced techniques in sensor fusion, navigation,
localisation, motion planning, and task scheduling. In addition
to that, educators can also use this framework to foster their
students’ STEM education by using project-oriented learning
to introduce topics related to autonomous mobile robots.
Table II presents the final classification results for the
2022 and 2023 editions. The table includes the three cri-
teria mentioned in Section II to rank the teams in the
Robot@Factory 4.0 competition. Although the team accom-
plished the same number of points in both editions, there is a noticeable
improvement in the time taken to finish each of the three
rounds of the competition, more specifically, less than half the
time. This improvement is primarily due to improved motion
planning and control algorithms in the 2023 edition and the
consideration of IMU data to improve the odometry estimation
of the robot and, consequently, the robot's pose estimation. The
improved motion control and planning stems from the consideration of
a graph structure in the 2023 edition, instead of a pre-defined
path, and also from the fact that the robot leverages the decoupling
of the linear and angular velocities to rotate along the way
instead of performing on-the-spot rotations. As for the IMU,
the sensor was not present in 2022, which prevented increasing
the maximum linear and angular velocities of the robot to
perform the rounds faster. Note that higher velocities increase
the probability of wheel slippage occurring, worsening the
wheeled odometry estimation of the robot [11].
Another improvement in the 2023 edition is the use of ROS. In
2022, the software was based on a monolithic application
written in Lazarus to provide a graphical user interface, even
though it implemented methodologies similar to the ones pre-
sented in this paper. The transition to C++ and ROS-based
software in the 2023 participation brought the modularity
inherent to separating the different modules (e.g., the EKF
localiser, odometry estimation, and motion control and plan-
ning) into independent nodes, improving the maintainability
and reusability of the framework by the community.
Lastly, there are videos publicly available on YouTube
showing the execution of the system framework proposed
in this paper. The videos include the participation in the
Robot@Factory 4.0 202211 and 202312 editions.
VIII. CONCLUSIONS AND FUTURE WORK
In conclusion, the system framework proposed in this paper
is modular and open-source to let future participants in the
Robot@Factory 4.0 competition use it as their development
base. The framework modularity allows the users to modify
individual modules without interfering with other ones. The
public GitHub repository5has the hardware documentation
and designs (including the 3D models of the robot’s parts),
the firmware implementation for the Arduino Mega 2560,
and the ROS-based software developed in the scope of the
competition, facilitating the reuse of the system framework.
Furthermore, the framework proposed in this paper may be
used for academic purposes to promote the STEM education
of students with a project-oriented learning approach to in-
troduce basic robotics concepts. As future work, ROS 2
compatibility will be implemented for the system framework,
camera support will be added to detect the ArUco markers and use
them in the update step of the EKF localiser, and trajectory
definition and control will be further researched. Moreover, the elaboration of
a STEM education activity will also be considered with even
further documentation and more class-oriented contents.
ACKNOWLEDGMENT
In the name of the 5dpo Robotics team, the authors thank
the Electrical and Computer Engineering Department (DEEC)
of the Faculty of Engineering, University of Porto (FEUP),
the INESC TEC Institute for Systems and Computer
Engineering, Technology and Science, and the SYSTEC
Research Center for Systems and Technologies for all their
support and assistance in the development of this work.
REFERENCES
[1] M. O’Kelly, H. Zheng, D. Karthik, and R. Mangharam.
“F1TENTH: an open-source evaluation environment
for continuous control and reinforcement learning”. In:
NeurIPS 2019 Competition and Demonstration Track.
Ed. by H.J. Escalante and R. Hadsell. Vol. 123. Proceed-
ings of Machine Learning Research. 2020, pp. 77–89.
URL: https://proceedings.mlr.press/v123/o-kelly20a.html.
[2] A. Wischnewski, M. Geisslinger, J. Betz, et al. “Indy
autonomous challenge - autonomous race cars at the
handling limits”. In: 12th International Munich Chassis
Symposium 2021. Ed. by P. Pfeffer. 2022, pp. 163–182.
DOI: 10.1007/978-3-662-64550-5_10.
11https://www.youtube.com/playlist?list=PLvp8fJUEPxYR13wTYem5suJ
p8--02 mJ
12https://www.youtube.com/playlist?list=PLvp8fJUEPxYT8sAByXptjetaS
gl28VxxH
[3] T. Ribeiro, I. Garcia, D. Pereira, J. Ribeiro, G. Lopes,
and A.F. Ribeiro. “Development of a prototype robot for
transportation within industrial environments”. In: 2017
IEEE International Conference on Autonomous Robot
Systems and Competitions (ICARSC). 2017, pp. 192–
197. DOI: 10.1109/ICARSC.2017.7964074.
[4] V. Pinto, F. Ribeiro, T. Brito, A. Pereira, J. Lima, and
P. Costa. “FLOOR - Forklift Laser OmnidirectiOnal
Robot”. In: 2023 IEEE International Conference on Au-
tonomous Robot Systems and Competitions (ICARSC).
2023, pp. 245–250. DOI: 10.1109/ICARSC58346.2023.
10129586.
[5] M. Lauer, S. Lange, and M. Riedmiller. “Calculating
the Perfect Match: an efficient and accurate approach
for robot self-localization”. In: RoboCup 2005: Robot
Soccer World Cup IX. Ed. by A. Bredenfeld, A. Jacoff,
I. Noda, and Y. Takahashi. 2006, pp. 142–153. DOI:
10.1007/11780519_13.
[6] H. Sobreira, M. Pinto, A.P. Moreira, P.G. Costa, and
J. Lima. “Robust robot localization based on the perfect
match algorithm”. In: CONTROLO’2014 - Proceedings
of the 11th Portuguese Conference on Automatic Con-
trol. Lecture Notes in Electrical Engineering. Ed. by
A. Moreira, A. Matos, and G. Veiga. Vol. 321. 2015,
pp. 607–616. DOI: 10.1007/978-3-319-10380-8_58.
[7] J. Braun, K. Beide, L. Bonzatto, et al. “Design and
development of an omnidirectional mecanum platform
for the RobotAtFactory 4.0 competition”. In: Synergetic
Cooperation between Robots and Humans. CLAWAR
2023. Lecture Notes in Networks and Systems. Ed. by
E.S.E. Youssef, M.O. Tokhi, M.F. Silva, and L.M.
Rincon. Vol. 811. 2024, pp. 63–74.
[8] P. Costa. P33a/RobotAtFactory: RobotAtFactory Com-
petition files. Last Accessed: 2024/01/31. 2021. URL:
https://github.com/P33a/RobotAtFactory/.
[9] R. Siegwart, I. Nourbakhsh, and D. Scaramuzza. Intro-
duction to autonomous mobile robots. 2nd ed. The MIT
Press, 2004.
[10] R. Sousa, P. Costa, and A. Moreira. “A pose control
algorithm for omnidirectional robots”. In: 2021 IEEE
International Conference on Autonomous Robot Sys-
tems and Competitions (ICARSC). 2021, pp. 91–96.
DOI: 10.1109/ICARSC52212.2021.9429803.
[11] R. Sousa, M. Petry, P. Costa, and A. Moreira. “Op-
tiOdom: a generic approach for odometry calibration
of wheeled mobile robots”. In: Journal of Intelligent &
Robotic Systems 105.39 (2022). DOI: 10.1007/s10846-
022-01630-3.
[12] K. Mehlhorn and P. Sanders. “Shortest paths”. In: Algo-
rithms and data structures: the basic toolbox. Springer
Berlin Heidelberg, 2008, pp. 191–215. DOI: 10.1007/
978-3-540-77978-0_10.