Case Study on Human-Free Water Heaters
Production for Industry 4.0
Oleg I. Borisov, Vladislav S. Gromov, Sergey A. Kolyubin, Anton A. Pyrkin, Nikolay Y. Dema,
Vladimir I. Salikhov, Igor V. Petranevsky, Alexey O. Klyunin, Sergey V. Shavetov, Alexey A. Bobtsov
Faculty of Control Systems and Industrial Robotics
ITMO University
St. Petersburg, Russia
borisov@corp.ifmo.ru
Abstract—This paper focuses on the design of an industrial cyber-physical system for workpiece production and processing using three articulated robots (Mitsubishi MELFA RV-3SDB, KUKA youBot and Kawasaki FS06N). This equipment is integrated with a central control software based on MATLAB. The main goal is to design a unified system composed of heterogeneous technical subsystems. The control strategy has a three-level structure (orchestration, tactic and local levels). The range of issues addressed within the study covers spatial motion planning for articulated robots, navigation, computer vision, force control and the sensory system.
Index Terms—Industry 4.0, smart factories, industrial robots,
motion planning, navigation, force control.
I. INTRODUCTION
Cyber-physical systems have become a key element of the fourth industrial revolution (since 2010), often referred to as the “Smart Society”. The latter implies such concepts as Industry 4.0, the Internet of Things and the Internet of Services, and the Smart Factory [1], [2].
A cyber-physical system (CPS) is a network of interacting physical and computational components; it implies the intersection of physical and computational processes [3]. One of the main features of such systems is their ability to adapt to changes in the environment [4], [5]. Industrial CPS can be represented by smart plants, robotized lines, computer vision systems, automated quality checking, distributed sensor networks, human-free production, flexible manufacturing and integration with high-level systems.
In this paper we present some results on the topic of human-free robotic automation of industrial operations. Our strategy is to research and develop technologies that eliminate the human factor from production lines. Thus, a workpiece manufacturing scenario is considered within this work. The corresponding industrial operations are set up on the basis of the laboratory of the Department of Control Systems and Informatics. Three robots are used for these tasks: Mitsubishi MELFA RV-3SDB, KUKA youBot and Kawasaki FS06N.
The design of a CPS is a highly sophisticated process, as a number of strict requirements are imposed on the resulting performance of such systems. In this section we consider the challenges confronting CPS scientists and developers.
The main challenge of this study is the advanced software integration of low-level control of robotic systems from different vendors using their communication protocols and limited sets of control instructions. The latter is also constrained by the limited memory space of the robot controllers, which leads to a requirement of reference data optimization: large sequences of input points are not suitable in practice. Wireless connection between robots (especially ones mounted on a movable platform) results in network delays. This issue is addressed in detail in [6], [7]. Another challenge is balancing between safety restrictions and low latency; a stable real-time connection should be achieved. All robotic agents should be integrated into a unified system and coordinated between operations for production maximization and capacity optimization. Since all calculations in the CPS are assumed to be moved from the cloud to the edge, all these control algorithms should be lightweight.
II. BASIC APPROACH AND FRAMEWORK
The task of human-free robotized production by means of a heterogeneous multi-agent robotic system is considered. Three articulated robots are supposed to carry out the following industrial operations:
1) welding and quality control — Mitsubishi MELFA RV-3SDB;
2) product delivery — KUKA youBot;
3) polishing — Kawasaki FS06N.
All these robots and their motions should be coordinated with each other to form a unified smart CPS. A typical control structure of a CPS consists of local, tactic and orchestration levels. Let us consider each of them as applied to the heterogeneous multi-agent robotic system for manufacturing.
A. Local Level
The local level is the lowest one. Typically this means controlling a single actuator or getting data from a sensor. The low-level control system is implemented in the internal controllers, which generate signals sent directly to the actuators. Classical controllers (such as PD or PID) are widely used for such tasks. They are usually closed source, implemented by the robot manufacturer, and cannot be changed.
Another objective of the local level is acquiring raw data from sensors to close the feedback loop. This can be done using standard sensors installed on the robot by the manufacturer
(e.g. motor encoders).

Fig. 1. Scheme of interconnections between the orchestration level (code generation in MATLAB), the tactic level (robot controllers and on-board computers connected over the network) and the local level (motors and encoders of the arms and the mobile platform, camera, laser scanner, force/torque sensor and the quality control vision system) of the Mitsubishi MELFA RV-3SDB, KUKA youBot and Kawasaki FS06N.

Also, various additional sensors can
be integrated by engineers. A quality control vision system can be implemented on the basis of cameras, laser scanners, parameter testers, etc. Multiple cameras can be used to capture visual signals of the same object from different angles. Combining the data provided by these cameras allows one to decrease the errors caused by object recognition failures; sensor fusion algorithms are used for such purposes. A force/torque sensor can be installed on the robot end-effector to control the interaction forces between the robot and the manipulated object. The collected data feed the next, tactic, level.
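As an illustration of this fusion step, the following MATLAB sketch combines two camera-based position estimates of the same workpiece by inverse-variance weighting; the measurement values and variances are hypothetical placeholders, not data from the actual line.

% Minimal sensor fusion sketch: combine two camera-based position
% estimates of the same workpiece by inverse-variance weighting.
% All numerical values below are hypothetical placeholders.
p_cam1 = [0.52; 0.31; 0.10];        % position estimate from camera 1, m
p_cam2 = [0.50; 0.33; 0.11];        % position estimate from camera 2, m
var1   = 4e-4;                      % measurement variance of camera 1, m^2
var2   = 9e-4;                      % measurement variance of camera 2, m^2
w1 = (1/var1) / (1/var1 + 1/var2);  % weight of the more reliable camera
w2 = 1 - w1;
p_fused   = w1*p_cam1 + w2*p_cam2;  % fused position estimate
var_fused = 1 / (1/var1 + 1/var2);  % variance of the fused estimate
fprintf('Fused position: [%.3f %.3f %.3f] m, variance %.1e m^2\n', ...
        p_fused, var_fused);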
B. Tactic Level
The tactic level is focused on control of the whole robot to execute a given operation. In this study the standard software of each robot is used to execute elementary motions (e.g. using a standard interpolation procedure given a set of reference points). Note that the robots are programmed in different languages. For example, KUKA youBot is programmed in C++ with special libraries provided by the manufacturer, while Mitsubishi MELFA RV-3SDB is programmed in a special dialect of Basic. Thus, one of the challenges of this research is heterogeneous software integration.
C. Orchestration Level
The orchestration level is of particular interest for this research, since it focuses on the coordination between the robots and the industrial operations they execute; it is what forms the CPS as a unified smart system. The automated production scenario describes a sequence of operations to be carried out by the robots. In this study the algorithms of these operations are programmed in MATLAB, which generates tactic-level control instructions coded in the corresponding programming languages. In general, the orchestration level produces a reference for the tactic level on the basis of the information collected from the feedback.
MATLAB was chosen as the basis for the orchestration level as one of the most convenient and powerful academic tools. MATLAB has a number of advantages in this sense: it allows one to process image signals from cameras, make sophisticated calculations offline or online, monitor various state signals, visualize the computer vision system, etc. An auxiliary program coded in the C language can be used to receive signals from the high-level control software (e.g. MATLAB), process them and transmit them to the internal low-level control firmware of the particular robot. Specific format requirements should be taken into account while processing high-level control commands and generating low-level signals. The scheme of all interconnections is illustrated in Fig. 1.
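As a hedged illustration of this interconnection, the MATLAB sketch below sends a textual tactic-level command to such an auxiliary program over TCP/IP and reads back an acknowledgement; the IP address, port and command syntax are placeholders, since the actual instruction set depends on the particular robot controller.

% Orchestration-level sketch: send a tactic-level command string to the
% auxiliary program of a robot controller over TCP/IP.
% The IP address, port and command syntax are hypothetical placeholders.
robotIP   = '192.168.0.20';
robotPort = 10001;
t = tcpclient(robotIP, robotPort);               % open connection to the robot
cmd = sprintf('MOVE_TO %.1f %.1f %.1f\n', 350.0, 120.0, 200.0);
write(t, uint8(cmd));                            % transmit the command bytes
pause(0.1);                                      % give the controller time to reply
reply = read(t);                                 % read whatever bytes are available
disp(char(reply));
clear t                                          % clearing the object closes the socket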
III. WELDING
Since welding is dangerous and sophisticated, its automation is common in many factories. Following a master/slave model, the use of paired robots helps protect personnel health and increases the welding quality. However, the preparatory work is hard and time consuming. Simplifying the human-machine interface (HMI) and making this operation easily reconfigurable is addressed in this section. An algorithm for planning planar paths specified by reference images is described in [8]. Following this procedure, code for the robot controller is generated automatically on the basis of an input bitmap image containing the desired contours. However, such movements are restricted to a plane. Since spatial paths are more suitable for welding, the goal is to extend the previous result to the 3D case.
Thus, this section focuses on automatic code generation to perform the spatial movements needed for the welding operation, given a set of points specified in Cartesian space. Such coordinates can be extracted from a 3D model and processed in MATLAB. The robot used for this study is the Mitsubishi RV-3SDB.
Note that the initial points from a 3D model (see Fig. 2) can also be used without processing, with just trivial point-to-point motion, but this might lead to some issues. The robot controller can be overloaded by the large amount of input data. The velocity of the robot, which has to reconfigure its position and orientation at each given reference point, might be decreased. These issues can be resolved by the arc approximation algorithm introduced in [8] for the 2D case. This approach is based on the ability of the standard software to move the end-effector along an arc specified by only three points. This basic motion provided by the internal software is more natural than complex combinations of multiple linear point-to-point movements. As a result, the robot reconfigures only three times, at the reference points making up the arc. Such a solution allows reducing the code size and increasing the velocity. Following the idea described in [8], all initial points are replaced by a series of arcs, each specified by three points.
In comparison to [8], since a three-dimensional space is now considered, the points might have different $z$-coordinates. So the distinction with respect to the previous result is that we have to check not only whether all intermediate points belong to the arc within some $\delta_{arc}$-region, but also whether they belong to the plane within some $\delta_{plane}$-region.
Consider three points that do not lie on the same line.
Coordinates of vectors specified in the Cartesian space are
defined as
$$p_1^0 = \begin{bmatrix} x_1^0 \\ y_1^0 \\ z_1^0 \end{bmatrix}, \quad p_2^0 = \begin{bmatrix} x_2^0 \\ y_2^0 \\ z_2^0 \end{bmatrix}, \quad p_3^0 = \begin{bmatrix} x_3^0 \\ y_3^0 \\ z_3^0 \end{bmatrix}. \qquad (1)$$
It is well known that there is a unique circle (or arc) passing through three points that do not lie on the same line. First of all, we need to calculate the coordinates of the center of the arc formed by these points. Consider two coordinate systems. The first one, denoted $x_0 y_0 z_0 o_0$, is the absolute coordinate system. The second one, denoted $x_1 y_1 z_1 o_1$, is a coordinate system attached to these three points, which form the plane $x_1 y_1 o_1$. Derive a normal to the plane $x_1 y_1 o_1$ through the cross product
$$n = \begin{bmatrix} n_x \\ n_y \\ n_z \end{bmatrix} = (p_2^0 - p_1^0) \times (p_3^0 - p_1^0). \qquad (2)$$
Fig. 2. Welding operation
Then calculate the unit vector
$$z = \begin{bmatrix} z_x \\ z_y \\ z_z \end{bmatrix} = \frac{n}{\sqrt{n_x^2 + n_y^2 + n_z^2}}. \qquad (3)$$
Following [9], compute the rotational transformation as
$$R_1^0 = R_{z,\alpha} R_{y,\beta}, \qquad (4)$$
where the angles $\alpha$ and $\beta$ can be calculated as
$$\alpha = \operatorname{atan2}\!\left(\frac{z_y}{\sqrt{z_x^2 + z_y^2}},\; \frac{z_x}{\sqrt{z_x^2 + z_y^2}}\right), \qquad (5)$$
$$\beta = \operatorname{atan2}\!\left(\sqrt{z_x^2 + z_y^2},\; z_z\right). \qquad (6)$$
Thus, substituting the angles $\alpha$ and $\beta$ into the basic rotation matrices [9] around the $z$- and $y$-axes gives
$$R_1^0 = \begin{bmatrix} \cos\alpha & -\sin\alpha & 0 \\ \sin\alpha & \cos\alpha & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} \cos\beta & 0 & \sin\beta \\ 0 & 1 & 0 \\ -\sin\beta & 0 & \cos\beta \end{bmatrix}. \qquad (7)$$
Calculate the coordinates of the reference points with respect to the local coordinate system using the rotation matrix:
$$p_1^1 = R_0^1 p_1^0, \quad p_2^1 = R_0^1 p_2^0, \quad p_3^1 = R_0^1 p_3^0, \qquad (8)$$
where $R_0^1 = [R_1^0]^T$.
Denote the coordinates as
$$p_1^1 = \begin{bmatrix} x_1^1 \\ y_1^1 \\ z_1^1 \end{bmatrix}, \quad p_2^1 = \begin{bmatrix} x_2^1 \\ y_2^1 \\ z_2^1 \end{bmatrix}, \quad p_3^1 = \begin{bmatrix} x_3^1 \\ y_3^1 \\ z_3^1 \end{bmatrix}. \qquad (9)$$
In order to find the coordinates of the arc center
$$c^1 = \begin{bmatrix} x_c^1 \\ y_c^1 \\ z_c^1 \end{bmatrix},$$
consider three cases. If $x_2^1 = x_3^1$ and $x_1^1 \neq x_2^1$, then the coordinates of the arc center can be computed as
$$y_c^1 = \frac{y_2^1 + y_3^1}{2}, \qquad (10)$$
$$x_c^1 = -k_1\!\left(y_c^1 - \frac{y_1^1 + y_2^1}{2}\right) + \frac{x_1^1 + x_2^1}{2}, \qquad (11)$$
where $k_1$ is the slope coefficient of the line through $(x_1^1; y_1^1)$ and $(x_2^1; y_2^1)$, given by
$$k_1 = \frac{y_2^1 - y_1^1}{x_2^1 - x_1^1}. \qquad (12)$$
If $x_1^1 = x_2^1$ and $x_2^1 \neq x_3^1$, then the coordinates of the arc center can be computed as
$$y_c^1 = \frac{y_1^1 + y_2^1}{2}, \qquad (13)$$
$$x_c^1 = -k_2\!\left(y_c^1 - \frac{y_2^1 + y_3^1}{2}\right) + \frac{x_2^1 + x_3^1}{2}, \qquad (14)$$
where $k_2$ is the slope coefficient of the line through $(x_2^1; y_2^1)$ and $(x_3^1; y_3^1)$, given by
$$k_2 = \frac{y_3^1 - y_2^1}{x_3^1 - x_2^1}. \qquad (15)$$
Finally, if all $x$-coordinates are distinct, then the coordinates of the arc center can be computed as
$$x_c^1 = \frac{k_1 k_2 (y_1^1 - y_3^1) + k_2 (x_1^1 + x_2^1) - k_1 (x_2^1 + x_3^1)}{2(k_2 - k_1)}, \qquad (16)$$
$$y_c^1 = -\frac{1}{k_1}\!\left(x_c^1 - \frac{x_1^1 + x_2^1}{2}\right) + \frac{y_1^1 + y_2^1}{2}, \qquad (17)$$
where $k_1$ and $k_2$ are given by (12) and (15).
The $z$-coordinate of the center can be derived trivially as
$$z_c^1 = z_1^1 = z_2^1 = z_3^1. \qquad (18)$$
Express the coordinates of the center with respect to the base coordinate system:
$$c^0 = \begin{bmatrix} x_c^0 \\ y_c^0 \\ z_c^0 \end{bmatrix} = R_1^0 c^1. \qquad (19)$$
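A compact MATLAB sketch of the computation (2)-(19) is given below; it returns the arc center in the base frame, the arc radius and the plane normal for three non-collinear points. The function name, the collinearity tolerance and the restriction to the general case (16)-(17) are our own simplifications.

function [c0, r, n] = arcThroughThreePoints(p1, p2, p3)
% Sketch of (2)-(19): center c0 (base frame), radius r and plane normal n
% of the arc passing through three non-collinear 3-D points p1, p2, p3.
n = cross(p2 - p1, p3 - p1);                    % plane normal, eq. (2)
assert(norm(n) > 1e-9, 'The points are (almost) collinear');
z = n / norm(n);                                % unit normal, eq. (3)
% Angles (5), (6); the common normalization of z_x and z_y cancels in atan2.
alpha = atan2(z(2), z(1));
beta  = atan2(sqrt(z(1)^2 + z(2)^2), z(3));
Rz  = [cos(alpha) -sin(alpha) 0; sin(alpha) cos(alpha) 0; 0 0 1];
Ry  = [cos(beta) 0 sin(beta); 0 1 0; -sin(beta) 0 cos(beta)];
R01 = Rz * Ry;                                  % rotation matrix, eq. (7)
q1 = R01' * p1;  q2 = R01' * p2;  q3 = R01' * p3;   % local coordinates, eq. (8)
% Circumcenter in the local x1-y1 plane, general case (16)-(17);
% the degenerate cases (10)-(15) with vertical chords are omitted here.
k1 = (q2(2) - q1(2)) / (q2(1) - q1(1));         % slope, eq. (12)
k2 = (q3(2) - q2(2)) / (q3(1) - q2(1));         % slope, eq. (15)
xc = (k1*k2*(q1(2) - q3(2)) + k2*(q1(1) + q2(1)) - k1*(q2(1) + q3(1))) ...
     / (2*(k2 - k1));                           % eq. (16)
yc = -(xc - (q1(1) + q2(1))/2)/k1 + (q1(2) + q2(2))/2;   % eq. (17)
zc = q1(3);                                     % eq. (18)
c0 = R01 * [xc; yc; zc];                        % center in the base frame, eq. (19)
r  = norm(p1 - c0);                             % radius of the arc
end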
Then we need to calculate the distances $d_{plane}$ and $d_{arc}$ from each intermediate point lying between $p_1$, $p_2$ and $p_3$ to the plane and to the arc formed by these three points.
The equation of the plane is given as
$$n_x x + n_y y + n_z z + n_0 = 0, \qquad (20)$$
where $n_0 = -(n_x x_3^0 + n_y y_3^0 + n_z z_3^0)$.
The distances from a fourth point $p_4 = \begin{bmatrix} x_4^0 & y_4^0 & z_4^0 \end{bmatrix}^T$ to the plane ($d_{plane}$) and to the arc formed by $p_1$, $p_2$ and $p_3$ ($d_{arc}$) can be computed as
$$d_{plane} = \frac{|n_x x_4^0 + n_y y_4^0 + n_z z_4^0 + n_0|}{\sqrt{n_x^2 + n_y^2 + n_z^2}}, \qquad (21)$$
$$d_{arc} = \left| \sqrt{(x_c^0 - x_4^0)^2 + (y_c^0 - y_4^0)^2 + (z_c^0 - z_4^0)^2} - r \right|, \qquad (22)$$
where $r = \sqrt{(x_1^0 - x_c^0)^2 + (y_1^0 - y_c^0)^2 + (z_1^0 - z_c^0)^2}$ is the radius of the arc.
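The checks (20)-(22) can be sketched in MATLAB as follows; the function assumes the normal n, center c0 and radius r produced by the previous sketch, and the tolerances delta_plane and delta_arc are design parameters.

function ok = pointFitsArc(p4, p3, n, c0, r, deltaPlane, deltaArc)
% Sketch of (20)-(22): check whether an intermediate point p4 lies within
% the delta_plane- and delta_arc-regions of the arc given by its plane
% normal n (the plane passes through p3), center c0 and radius r.
n0     = -(n(1)*p3(1) + n(2)*p3(2) + n(3)*p3(3));                % eq. (20)
dPlane = abs(n(1)*p4(1) + n(2)*p4(2) + n(3)*p4(3) + n0)/norm(n); % eq. (21)
dArc   = abs(norm(p4 - c0) - r);                                 % eq. (22)
ok     = (dPlane <= deltaPlane) && (dArc <= deltaArc);
end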
Then all points should be processed and checked for belonging to a particular plane and arc within the specified $\delta_{plane}$- and $\delta_{arc}$-regions, respectively. As a result of this procedure, a sequence of three-point sets, each specifying a particular arc, is obtained (see Fig. 3). A special operator of the standard software, MVR P1 P2 P3, allows the end-effector to be moved along an arc specified by three reference points. A list of points and movement instructions composed of MVR commands can be uploaded to the robot controller.
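A minimal sketch of this final code-generation step is shown below: for every three-point set it writes a position list and one MVR instruction to a text file that can then be uploaded to the controller. The position-definition format is only indicative, since the exact Melfa Basic syntax depends on the controller configuration and the required tool orientation.

% Sketch: generate MVR motion instructions from a sequence of arcs.
% arcs{i} is a 3x3 matrix whose columns are the three points of the i-th arc;
% the orientation components and the file format are assumptions.
arcs = { [0 50 100; 300 320 330; 200 210 205] };   % hypothetical single arc, mm
fid  = fopen('welding_path.prg', 'w');
for i = 1:numel(arcs)
    P = arcs{i};
    for j = 1:3
        fprintf(fid, 'P%d=(%.2f,%.2f,%.2f,180.00,0.00,180.00)\n', ...
                3*(i-1)+j, P(1,j), P(2,j), P(3,j));
    end
    fprintf(fid, 'MVR P%d,P%d,P%d\n', 3*(i-1)+1, 3*(i-1)+2, 3*(i-1)+3);
end
fclose(fid);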
IV. TRANSPORTING
The second important issue is the way the industrial operations are connected, i.e. how workpieces are moved from one operation to another. Nowadays, a conveyor is commonly used for this purpose. Sometimes this task is even carried out manually by plant employees, which creates a bottleneck in the whole automation concept. Fast reconfiguration of conveyors is impossible: most of the time a conveyor moves with constant speed in a single direction. The path of product movement is often quite simple and cannot be changed within the assembly line. The locations of the industrial operation workspaces should be well thought out and planned with respect to the conveyor.
The transportation of products between industrial operations can instead be automated by robots equipped with mobile platforms. The path of their movement can be sophisticated and can be changed during the process. Rejected products can be sent back and corrected immediately. Such robots can serve multiple operations. For example, if some industrial operation takes significant time, the assigned transport robot can be used to help in other parts of the line.
The transportation robot has the following three tasks:
1) to move autonomously within the industrial premises between key locations (workspace areas where industrial operations such as welding and polishing are performed);
2) to recognize the relative position of workpieces using the computer vision system;
3) to manipulate a workpiece, load it aboard and unload it.
ROS [10] has been chosen as the main software development tool. In order to execute the stated tasks, the robot has been equipped with a Logitech C920 digital camera and a Hokuyo URG-04LX-UG01 laser scanner. It might also be equipped with a depth camera, which combines the functionality of both sensors.
The structure of the navigation system is depicted in Fig. 3. The unit “Global task planner” ensures interaction with the orchestration level. This unit receives and decomposes commands from the high level, forms a sequence of elementary actions to execute them and sends back information on the operation execution. The unit “YouBot Driver ROS Interface” connects the programs of the tactic level with the youBot driver.
Fig. 3. KUKA YouBot software architecture.
Fig. 4. An example of the simulated environment.
The unit “Navigation” is needed to construct a map, localize, plan a motion path and control the mobile platform. The unit “Manipulation task planner” defines a sequence of actions to execute a command given by the unit “Global task planner”, provides interaction between the manipulation and computer vision units and keeps information on the workpieces located aboard.
The main objective of the navigation unit is to drive the mobile platform to a given point within the workspace. The unit is able to operate in two modes: a map constructing/modifying mode and a default (regular) mode.
The first mode allows constructing a new map of the workspace or modifying an existing one while controlling the robot manually. It also allows specifying targets on the map. The software implementation is based on the grid-based SLAM algorithm with Rao-Blackwellized particle filters [11]. In the default mode, the unit input is fed by a given reference point and the robot reaches it autonomously. The global dynamic window approach [12] is used to plan a path, move and avoid dynamic obstacles. The V-REP simulator is used to test the subsystems of the considered unit. The simulated workspace is shown in Fig. 4.
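In the default mode the reference point could, for instance, be dispatched from MATLAB as a standard ROS goal message. The sketch below uses the MATLAB ROS Toolbox and assumes a running ROS master and the common /move_base_simple/goal topic; these are illustrative assumptions rather than details of the actual implementation.

% Sketch: dispatch a navigation goal to the mobile platform from MATLAB.
% Master URI, topic name and the target pose are illustrative assumptions.
rosinit('http://192.168.0.10:11311');                 % connect to the ROS master
goalPub = rospublisher('/move_base_simple/goal', 'geometry_msgs/PoseStamped');
goal = rosmessage(goalPub);
goal.Header.FrameId     = 'map';                      % goal is given in the map frame
goal.Pose.Position.X    = 2.5;                        % target x, m
goal.Pose.Position.Y    = 1.0;                        % target y, m
goal.Pose.Orientation.W = 1.0;                        % keep the current heading
send(goalPub, goal);                                  % the robot drives autonomously
rosshutdown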
V. POLISHING
The next operation to be automated by means of robotic systems is the polishing of complex parts. On the one hand, the interaction force between the robot and the part should be regulated; on the other hand, the desired quality of polishing should be achieved. The robot should be equipped with a special force/torque (F/T) sensor, which allows closing the feedback loop by the interaction force.

Fig. 5. Transporting operation
The Kawasaki FS06N robot with an ATI Delta IP60 F/T transducer is used for this task; it is shown in Fig. 6. The TCP/IP protocol is used for data transmission between the robot controller, the F/T sensor and the orchestration level.
All calculations of the force feedback control (FFC) are carried out on a PC. The robot controller receives already transformed and processed values of the $X$, $Y$ and $Z$ components of the force with respect to the base coordinate system, as well as the moments $M_x$, $M_y$ and $M_z$ around the same base axes.
The sensor provides the vectors of forces and torques with respect to its own coordinate system. At the same time, the reference poses are given to the robot in Cartesian coordinates. Thus, we need to transform the vectors of forces and torques to the base coordinate system as follows.
The transformation between the sensor coordinate system and the end-effector one is given by [9]
$$v_e = T_s^e v_s, \quad T_s^e = \begin{bmatrix} c_\phi & -s_\phi & 0 & 0 \\ s_\phi & c_\phi & 0 & 0 \\ 0 & 0 & 1 & d \\ 0 & 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} 0 & -1 & 0 & 0 \\ 1 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0.05 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \qquad (23)$$
where $v_e$, $v_s$ are vectors with respect to the end-effector and sensor coordinate systems, respectively, $T_s^e$ is the matrix of the homogeneous transformation between them, and $d = 0.05$ m and $\phi = \frac{\pi}{2}$ are the offset along and the angle around the $Z$-axis between the coordinate systems.
The transformation between the end-effector coordinate system and the base one is given by [9]
$$v_b = T_e^b v_e, \quad T_e^b = \begin{bmatrix} R_e^b & \begin{bmatrix} X \\ Y \\ Z \end{bmatrix} \\ 0_{1\times3} & 1 \end{bmatrix}, \qquad (24)$$
where
$$R_e^b = R_{Z,O} R_{Y,A} R_{Z,T} = \begin{bmatrix} c_O & -s_O & 0 \\ s_O & c_O & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} c_A & 0 & s_A \\ 0 & 1 & 0 \\ -s_A & 0 & c_A \end{bmatrix} \begin{bmatrix} c_T & -s_T & 0 \\ s_T & c_T & 0 \\ 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} c_O c_A c_T - s_O s_T & -c_O c_A s_T - s_O c_T & c_O s_A \\ s_O c_A c_T + c_O s_T & -s_O c_A s_T + c_O c_T & s_O s_A \\ -s_A c_T & s_A s_T & c_A \end{bmatrix}, \qquad (25)$$
and $v_b$ is the vector with respect to the base coordinate system.
As the polishing tool is mounted on the sensor, we need to take its weight into account and compensate for it during calibration.
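The chain of transformations (23)-(25) can be sketched in MATLAB as follows; the end-effector pose values X, Y, Z and the Euler angles O, A, T are hypothetical placeholders that would normally be read from the robot controller, and the last homogeneous coordinate distinguishes points (1) from free vectors such as forces (0).

% Sketch of (23)-(25): express a quantity measured in the sensor frame
% in the base frame. All pose values below are hypothetical placeholders.
d   = 0.05;       phi = pi/2;                  % sensor mounting offset and angle
X = 0.45; Y = 0.10; Z = 0.30;                  % end-effector position, m
O = deg2rad(30); A = deg2rad(90); T = deg2rad(0);   % Euler angles O, A, T
Rz = @(a) [cos(a) -sin(a) 0; sin(a) cos(a) 0; 0 0 1];
Ry = @(a) [cos(a) 0 sin(a); 0 1 0; -sin(a) 0 cos(a)];
T_es = [Rz(phi) [0; 0; d]; 0 0 0 1];           % sensor -> end-effector, eq. (23)
R_be = Rz(O) * Ry(A) * Rz(T);                  % end-effector orientation, eq. (25)
T_be = [R_be [X; Y; Z]; 0 0 0 1];              % end-effector -> base, eq. (24)
f_s = [0; 0; -12.0; 0];                        % measured force, homogeneous coord. 0
f_b = T_be * T_es * f_s;                       % the same force in the base frame
disp(f_b(1:3).');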
Note that the interaction force should be regulated along the $Z$-axis when polishing welding seams. As the only available signal in this case is the error between the specified interaction force and the actual one (their derivatives are assumed to be unmeasurable), the robust output-feedback approach “consecutive compensator”, described and proved for linear and nonlinear plants in [13]–[15], is suitable here. This approach has been applied to various robotic applications, such as a robotized boat [16], [17], quadcopters [18], [19] and mobile robots [20].

Fig. 6. Polishing operation
The controller has the following structure:
$$u(t) = -\mu\, \alpha(p)\, \hat{e}(t), \qquad (26)$$
$$\dot{\xi}(t) = \sigma\left(\Gamma \xi(t) + d\, k_1 e(t)\right), \qquad (27)$$
$$\hat{e}(t) = h^T \xi(t), \qquad (28)$$
where $\alpha(p)$ is a Hurwitz polynomial of degree $(\rho - 1)$, and $\Gamma$, $d$, $h$ are a matrix and vectors of the corresponding dimensions:
$$\Gamma = \begin{bmatrix} 0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & 1 \\ -k_1 & -k_2 & -k_3 & \cdots & -k_{\rho-1} \end{bmatrix}, \quad d = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \\ 1 \end{bmatrix}, \quad h = \begin{bmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{bmatrix}, \qquad (29)$$
and the coefficients $k = \{k_1, k_2, \ldots, k_{\rho-1}\}$ are chosen so that the estimation model (27), (28) is stable.
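A minimal discrete-time sketch of (26)-(28) for the simplest case $\rho = 2$ is given below; in this case $\Gamma = -k_1$, $d = h = 1$ and $\alpha(p) = p + a_0$. The gains $\sigma$, $\mu$, $k_1$, $a_0$ and the sampling time are placeholders that have to be tuned following [13]–[15].

function [u, xi] = consecutiveCompensatorStep(e, xi, dt)
% Sketch of the output controller (26)-(28) for relative degree rho = 2:
% a single observer state xi and alpha(p) = p + a0.  Gains are placeholders.
sigma = 50;  mu = 20;  k1 = 10;  a0 = 5;     % hypothetical tuning
xi_dot = sigma*(-k1*xi + k1*e);              % eq. (27) with Gamma = -k1, d = 1
e_hat  = xi;                                 % eq. (28) with h = 1
u      = -mu*(xi_dot + a0*e_hat);            % eq. (26): u = -mu*alpha(p)*e_hat
xi     = xi + dt*xi_dot;                     % explicit Euler update of xi
end

At every sampling instant the error e between the specified and measured interaction force along the Z-axis would be fed to this function, and u applied as the force correction.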
VI. CONCLUSION
This study focuses on the design of an industrial cyber-physical system on the basis of heterogeneous articulated robots integrated within unified control software. The paper presents an extension of the previous work [8]. Welding, transporting and polishing are proposed to be automated by means of robotic systems; these three scenarios were simulated in laboratory conditions. The obtained robotic solutions give the opportunity to design flexible production lines that exclude the human factor. Such issues as spatial motion planning, navigation, computer vision, force control and sensing are addressed within this study. The interaction force between the robot and the processed workpiece is regulated using the robust output controller. Experimental results are provided in the paper.
VII. ACKNOWLEDGEMENT
This article is supported by the Government of the Rus-
sian Federation (grant 074-U01), the Ministry of Educa-
tion and Science of the Russian Federation (goszadanie no
2.8878.2017/8.9) and Russian Foundation for Basic Research
(project 17-58-53129).
REFERENCES
[1] F. Almada-Lobo, “The industry 4.0 revolution and the future of manu-
facturing execution systems (MES),” Journal of Innovation Management,
vol. 3, no. 4, pp. 16–21, 2016.
[2] R. Harrison, D. Vera, and B. Ahmad, “Engineering methods and tools
for cyber–physical automation systems,” 2016.
[3] E. A. Lee and S. A. Seshia, Introduction to embedded systems: A cyber-
physical systems approach. Lee & Seshia, 2011.
[4] A. Pyrkin, A. Bobtsov, V. Nikiforov, S. Kolyubin, A. Vedyakov,
O. Borisov, and V. Gromov, “Compensation of polyharmonic disturbance
of state and output of a linear plant with delay in the control channel,”
Automation and Remote Control, vol. 76, no. 12, pp. 2124–2142, 2015.
[5] O. Borisov, V. Gromov, A. Pyrkin, A. Bobtsov, and N. Nikolaev, “Output
robust control with anti-windup compensation for quadcopters,” IFAC-
PapersOnLine, vol. 49, no. 13, pp. 287–292, 2016.
[6] V. Gromov, O. Borisov, A. Vedyakov, A. Pyrkin, S. Shavetov,
A. Bobtsov, V. Salikhov, and S. Aranovskiy, “Adaptive multisinusoidal
signal tracking system with input delay,” IFAC-PapersOnLine, vol. 49,
no. 13, pp. 105–110, 2016.
[7] A. Pyrkin, A. Bobtsov, S. Aranovskiy, S. Kolyubin, and V. Gromov,
“Adaptive controller for linear plant with parametric uncertainties, input
delay and unknown disturbance,” IFAC Proceedings Volumes (IFAC-
PapersOnline), vol. 19, pp. 11 294–11 298, 2014.
[8] O. Borisov, V. Gromov, S. Kolyubin, A. Pyrkin, A. Bobtsov, V. Sa-
likhov, A. Klyunin, and I. Petranevsky, “Human-free robotic automation
of industrial operations,” IECON Proceedings (Industrial Electronics
Conference), pp. 6867–6872, 2016.
[9] M. W. Spong, S. Hutchinson, and M. Vidyasagar, Robot modeling and
control. John Wiley & Sons, 2006.
[10] M. Quigley, K. Conley, B. Gerkey, J. Faust, T. Foote, J. Leibs,
R. Wheeler, and A. Y. Ng, “Ros: an open-source robot operating system,”
vol. 3, no. 3.2, p. 5, 2009.
[11] G. Grisetti, C. Stachniss, and W. Burgard, “Improved techniques for grid
mapping with rao-blackwellized particle filters,” IEEE Transactions on
Robotics, vol. 23, no. 1, pp. 34–46, 2007.
[12] O. Brock and O. Khatib, “High-speed navigation using the global dy-
namic window approach,” Proceedings - IEEE International Conference
on Robotics and Automation, vol. 1, pp. 341–346, 1999.
[13] A. A. Bobtsov, “Robust output-control for a linear system with uncertain
coefficients,” Automation and Remote Control, vol. 63, no. 11, pp. 1794–
1802, 2002.
[14] A. Bobtsov, N. Nikolaev, and O. Slita, “Adaptive control of libration
angle of a satellite,” Mechatronics, vol. 17, no. 4, pp. 271–276, 2007.
[15] A. A. Bobtsov, A. S. Kremlev, and A. Pyrkin, “Compensation of
harmonic disturbances in nonlinear plants with parametric and functional
uncertainty,” Automation and Remote Control, vol. 72, no. 1, pp. 111–
118, 2011.
[16] J. Wang, A. A. Pyrkin, A. A. Bobtsov, O. I. Borisov, V. S. Gromov,
S. A. Kolyubin, and S. M. Vlasov, “Output control algorithms of
dynamic positioning and disturbance rejection for robotic vessel,” IFAC-
PapersOnLine, vol. 48, no. 11, pp. 295–300, 2015.
[17] J. Wang, O. Borisov, V. Gromov, A. Pyrkin, and A. Bobtsov, “Adap-
tive controller implementation for surface robotic vessel,” in Control
Conference (CCC), 2015 34th Chinese. IEEE, 2015, pp. 3230–3235.
[18] A. Pyrkin, A. Bobtsov, S. Kolyubin, O. Borisov, V. S. Gromov et al.,
“Output controller for quadcopters based on mathematical model decom-
position,” in Control and Automation (MED), 2014 22nd Mediterranean
Conference of. IEEE, 2014, pp. 1281–1286.
[19] A. Pyrkin, A. Bobtsov, S. Kolyubin, O. Borisov, V. Gromov, and
S. Aranovskiy, “Output controller for quadcopters with wind disturbance
cancellation,” in Control Applications (CCA), 2014 IEEE Conference on.
IEEE, 2014, pp. 166–170.
[20] A. A. Pyrkin, A. A. Bobtsov, S. A. Kolyubin, M. V. Faronov, O. I.
Borisov, V. S. Gromov, S. M. Vlasov, and N. A. Nikolaev, “Simple robust
and adaptive tracking control for mobile robots,” IFAC-PapersOnLine,
vol. 48, no. 11, pp. 143–149, 2015.