Earthshaker: A Mobile Rescue Robot for Emergencies and Disasters through Teleoperation and Autonomous Navigation

Yu Zhang1,3, Yuxiang Li2,3, Hefei Zhang1, Yu Wang2, Zhihao Wang2, Yinong Ye1, Yongming Yue1, Ning Guo1, Wei Gao1, Haoyao Chen2, and Shiwu Zhang1

1 Department of Precision Machinery and Precision Instruments, University of Science and Technology of China, Hefei 230027, China;
2 College of Mechanical Engineering and Automation, Harbin Institute of Technology Shenzhen, Shenzhen, Guangdong 518055, China;
3 Co-first authors

Correspondence: Wei Gao, Email: weigao@ustc.edu.cn; Haoyao Chen, Email: hychen5@hit.edu.cn; Shiwu Zhang, Email: swzhang@ustc.edu.cn
©2022TheAuthor(s).ThisisanopenaccessarticleundertheCCBY-NC-ND4.0license(http://creativecommons.org/licenses/by-nc-nd/4.0/).
Graphical abstract
Overview of the rescue robot Earthshaker, first-place winner of the Advanced Technology & Engineering Challenge (A-TEC) championships.
Public summary
■Earthshaker,amobilerescuerobotthatcombinesatrackedchassis,aroboticarmandgripper,andvarioussensorsand
controllers,hasbeencreatedtodealwithvariousemergenciesanddisasters.
■Earthshaker’smultimodalteleoperationsystemcan adapt to differenttransmissionconditions,anditcan achieve both
semi-autonomousmanipulationwithitsarmandgripperandautonomousnavigationinunknownareas.
■EarthshakerwonthefirstA-TECchampionships,standingoutof40robotsfromtheworld,showingtheefficacyofthe
systemintegrationandthecontrolphilosophybehindit.
http://justc.ustc.edu.cn
Citation: ZhangY,LiYX,ZhangHF,etal.Earthshaker:AMobileRescueRobotforEmergenciesandDisastersthroughTeleoperationandAutonom-
ousNavigation.JUSTC,2022,52(0):.DOI:10.52396/JUSTC-2022-0066
Just Accepted
CiteThis:JUSTC,2022,52(X):(12pp) ReadOnline SupportingInformation
Abstract: Todealwithemergenciesanddisasterswithoutrescueworkersbeingexposedtodangerousenvironments,this
paperpresentsamobilerescuerobot,Earthshaker.Asacombinationofatrackedchassisandasix-degree-of-freedomro-
boticarm,as wellasmiscellaneoussensorsandcontrollers,Earthshaker iscapableoftraversingdiverseterrains andful-
fillingdexterousmanipulation.Specifically,Earthshakerhasauniqueswingarm–dozerbladestructurethatcanhelpclear
upcumbersomeobstaclesandstabilizetherobotonstairs,amultimodal teleoperation system that can adapt to different
transmissionconditions, adepthcameraaidedroboticarm andgripperthatcanrealizesemi-autonomous manipulation,a
LiDARaidedbasethatcanachieveautonomousnavigationinunknownareas.Itwasthesespecialsystemsthatsupported
EarthshakertowinthefirstAdvancedTechnology&EngineeringChallenge(A-TEC)championships,standingoutof40
robotsfromtheworldandshowingtheefficacyofsystemintegrationandtheadvancedcontrolphilosophybehindit.
Keywords: Rescue robot; Autonomous navigation; Semi-autonomous manipulation; Multimodal Teleoperation; System
integration
CLC number: TP242Document code: A
1 Introduction
Rescueworkers’livesareoftenunder threatduringtheirres-
cueworkin and after emergencies anddisasters. Sometimes
even casualties have to be suffered unfortunately. With the
development of robotics in general, robots have seen their
prosperityinreplacing human beings tofulfillmiscellaneous
tasksinthosedangerousscenarios[14].
Duringrescuework,itisoftenrequiredtotraverseunstruc-
tured and complicated terrains, even climb up and down
stairs, while carrying miscellaneous equipment and sensors
fordealingwithdangeroussituationsandclearingupcumber-
someobstacles[5].Therefore,mostrescuerobotshavebeende-
velopedbaseduponleggedortrackedroboticplatformstoen-
suremobility.Todate,plentyofleggedrobotshavebeende-
veloped by different research organizations and industrial
companies[6,7], and some of them have shown up in various
competitions like DARPA Robotics Challenge[8], DARPA
SubterraneanChallenge and so on[9]. To further improve the
mobilityofleggedrobots,therehavealsobeenhybridlegged
robotsthathave wheels or tracksattachedattheendoftheir
legs to replace the feet, e.g. RoboSimian[10], Momaro[11],
CHIMP[12],etc. However,theserobotshaveverycomplicated
structuresandlow-levelcontrolsthatconsume a lot of com-
putationpowerandcontroltime,thusresultinginarelatively
fragilesystemunder consistent large workloadduringrescue
work.Sofar, only thequadrupedal robot ANYmal hasbeen
successfully deployed in real rescue scenarios[13]. As with
tracked robots, popular ones are often equipped with swing
armsthat can help cross diverse obstacles, afford high pay-
load,and perform stable locomotion. After the 2016 earth-
quakeinItaly,a such tracked robot TRADR wasusedtoin-
spect damaged buildings[14]. In the ARGOS Challenge[15],
TeamArgonautsalsousedatrackedrobotwithswingarmsto
win the championship[16]. Other groups have even realized
autonomousnavigationfor such trackedrobots when climb-
ingstairs[17] andslipperyslopes[18].However,due totheexist-
ence of the swing arms, those robots lose the capability of
clearingupcumbersomeobstacles.Toovercomethosedraw-
backs in this work, we have designed a unique structure to
provide the tracked system the capability of both climbing
stairsandclearingobstacles.
Besidesthoughtful structural designs, autonomous opera-
tioncanalsogreatlyimproverescuerobots’efficiencyindif-
ferentrescueworks.A typical application scenario would be
exploringsignalblockedareasafteremergenciesordisasters
havehappened. To overcome the loss of telecommunication
betweentherobotsandtheoperators,autonomousnavigation
is potentially desired for rescue robots[19,20]. Miscellaneous
sensors can be taken advantage of to conduct simultaneous
localization of the robot and mapping of the unknown
area[2123].WehaveintegratedaLight Detection and Ranging
http://justc.ustc.edu.cn
1 DOI:10.52396/JUSTC-2022-0066
JUSTC,2022,52(X):
Just Accepted
(LiDAR) and an Inertia Measurement Unit (IMU) to build
gridmapsofthesurrounding environment, evaluate position
and pose of the robot and realize autonomous navigation.
Note that no Global Positioning System (GPS) is needed in
thisprocess,makingitparticularlysuitableforsignalblocked
areas.
Evenwithautonomousnavigation,therobotstillneedsthe
operators’ help when it comes to dexterous manipulation
taskslikeopeningdoors.Therewasevenacasewhereseven
operatorswereneededtocooperateoncontrollingarobot[24].
To release the operation burden, we have developed depth
cameraaided semi-autonomousmanipulationforroboticarm
indooropeningtasks that can quickly locatetheposition of
thedoorhandle. Thewholeoperationprocessjustneedstwo
operators’ cooperationto control the base and the arm, re-
spectively,largelyreducingthe operationcomplexityandin-
creasingoperationefficiency.
Consequently, the teleoperation system on a rescue robot
becomesquite critical for successful rescue works. The ef-
fectivityand reliability of the teleoperation system determ-
ines the lower boundary of the rescue robot’s performance.
Therefore, a multimodal teleoperation system to provide
enoughredundancy and deal with different conditions be-
comenecessary.
Basedupontheabovearticulation,wepresentinthispaper
ournewlydesignedrescuerobot,whichhassuccessfullyad-
dressed the aforementioned four points of functionality. We
havenamedit Earthshaker, not onlybecause it “shakes” the
ground when it moves around, but also because we hope it
canbringearthshakingimprovementontheroleofrobotsin
real rescue work. An overview of Earthshaker is shown in
Figure1.Theremainderofthepaperfirstintroducesthevari-
oussystemsofEarthshakerinSection2,includingthetracked
chassis, the robotic arm and gripper, the perception system,
theteleoperationsystem,andtheirmechatronicintegration.In
Section3, control frameworks of the multimodal teleopera-
tion,thedepth camera aided semi-autonomousmanipulation,
andtheLiDARaidedautonomousnavigationarepresentedin
detail.Section4 summarizestheperformanceofEarthshaker
inthefinalsofthe2020A-TECchampionshipsastheexperi-
mentalvalidationofthesystemintegrationandcontrolphilo-
sophy. The experience obtained from the competition and
possiblefuturedirectionsarediscussedinSection5.
2 Earthshaker
2.1 Tracked chassis for robust locomotion
The tracked chassis supports all the other systems onboard
with corresponding mechanical and electronic interfaces to
formtherobotasawhole.Itdeterminestheupperlimitofthe
wholesystem’smobility[25].Thetrackedchassisismadeofal-
loysteelthroughcastingandwelding.Itcombinesthedesign
ofChristiesuspensionandMatildasuspensiontoachieveex-
cellent traversing capability. The vibration and impact from
rough terrains can be effectively absorbed by the chassis to
maintainastable operation environment for the systems on-
board.Thechassisisdrivenbytwo1000wattsbrushlessmo-
tors,whichcansupportamaximumrunningspeedof1.6m/s
and a maximum climbing inclination of 40 degrees. Four
packsof LiPo batteries inside the chassis can power Earth-
shakerto continuously work for 3 hours at medium work-
loads.Eachbatterypacksupportsanindividualsystemtoen-
sure power isolation and security, namely, one 48 volts 60
ampere-hourpackfor the chassis, three24 volts 16 ampere-
hourpacksfor the manipulation system, the perception sys-
temandtheteleoperationsystem,respectively.Intheend,the
Earthshakeris 0.72 m wide and1.22 m high, and itslength
canvaryfrom1.33mto1.49m,withatotalweightofabout
250kg.
Topromotethe robot’s capability ofclearingcumbersome
obstacles and climbing up and down stairs, a swing arm
dozer blade structure has been designed and attached to the
rearend of the chassis. The structure consists of an electric
linear actuator, two tracked swing arms, and a dozer blade.
Theelectriclinearactuatorcanbecontrolledunderteleopera-
tionto rotate theswing arms, thusadjusting the poseof the
dozerbladefrom65degreesto-45degreeswithrespecttothe
horizontaldirection.Onflatterrains,thestructureisfoldedto
reducemotionresistance and increaseagility,whileonstairs
Fig. 1.OverviewoftherescuerobotEarthshaker.
Earthshaker:AMobileRescueRobotforEmergenciesandDisastersthroughTeleoperationandAutonomous
Navigation Zhangetal.
2 DOI:10.52396/JUSTC-2022-0066
JUSTC,2022,52(X):
Just Accepted
itcan be used to adjust pitch angle of the robot to improve
stability,asshowninFigure 2. When there are cumbersome
obstaclesintheway,thedozerbladecanbeputverticaltothe
ground to push them away obstacles efficiently, as long as
theyareunder75kg.
2.2 Robotic arm and gripper for dexterous manipulation
Withoutdexterous manipulation, taskslike pressing buttons,
opening doors, turning off valves, picking up small objects,
movingaround wounded victims, etc., cannot be accom-
plished.EarthshakerhasbeenequippedwithaUR5erobotic
armandanAG95two-fingergripperforthosepurposes.The
armcanrealizedexterousmanipulationwithinaradiusof750
mm,withamaximumpayloadof5kg[26].Thearmisinstalled
atthefrontend of the robot toguaranteeenough workspace
andbalancetheextra weight introduced bytheswingarm
dozerbladeat the rear end. The Original Equipment Manu-
facturer(OEM)control box of the armhas been customized
tosavespace on the robotand can work under24volts DC
powerinsteadof220voltsACpower.Thevelocitycontrolof
eachjointonthearmandthegripperismappedtotheremote
controller, thus precise impedance control can be achieved.
Also, to facilitate semi-autonomous manipulation, an Intel
D435iRGBD camera has been mounted on the gripper, the
useofwhichisdiscussedlaterinSection3.2.
2.3 Sensors for diverse perception
Earthshakerhasaplatformforsensorinstallationbetweenthe
roboticarm andtheswingarm.Foursides oftherectangular
platform have four wide-angle cameras, which are headed
slightlydownwards to provide a panorama of the environ-
ment surrounding the robot. Thus, the remote operator can
plan paths and avoid obstacles accordingly. There are also
two infrared cameras on the sensor platform that can help
identifyobjectsinsmokyenvironment.Thetwoinfraredcam-
erasareplacedoppositetoeachother,withone pointingfor-
wardandonepointing backward[27]. At the frontpanelofthe
chassis,thereisanothermicrocamerathatcanprovideawide
view of the environment in front of the robot. With further
helpfrom the lasers onboth sides ofthe robot, the operator
canpreciselydriveEarthshakertopassthroughnarrowdoors
orcorridorswithoutanyproblem.
2.4 Teleoperation and communication
EarthshakeristeleoperatedbytwooperatorsusingtwoAT9S
remotecontrollers,oneforthetrackedchassisandoneforthe
roboticarm and gripper. Each controller has 12 channels to
transmitdigital commands via 2.4 GHz communication fre-
quencytothereceiverontherobot.ASTM32F091basedmi-
crocontrolleris then utilizedtodecodethesignalsto achieve
closed-loopcontrolofthechassis, as well as otherperipher-
alsliketheswingarm–dozerblade,thelasers,theLEDsetc.
Meanwhile,thesignalstothereceiverfortheroboticarmand
gripper are translated into specific actions by an Intel NUC
minicomputer,which hasaRAMof 16GBandanIntelCore
i7-1165G7 CPU with a maximum clock frequency of 4.7
GHz.Ontheotherhand,thevideoimagestransmittedbackto
theoperatorsconsistofimagesfromthewide-anglecameras,
theinfraredcameras,themicrocameraandtheoperatingsys-
temscreenoftheNUCminicomputer.Theseeightimagesare
selectedandcombinedintoonesingleimagefortransmission
tosavebandwidth.
Besidesthe2.4 GHz direct communication,there are also
tworedundant communication pathson Earthshaker, the 1.8
GHz MIMO-mesh radios[28] andthe 4G/5G mobile telecom-
munication.Theseadditionalpathscanovercometherelative
shortcommunicationdistanceofthe2.4GHzsignalsanden-
suretherobustnessofteleoperationforEarthshaker.
2.5 Mechatronic integration
Figure 3 summarizes the major mechatronic components of
Earthshaker,aswellasthesignalpathsformultimodalteleop-
eration.NotethattheNUCminicomputercanalsocontrolthe
chassis, depending on its priority comparing with the
STM32F091microcontrollerontheCANbus.Consequently,
theswitchbetween teleoperation and autonomousnavigation
canbeorganized.Toaccomplishautonomousnavigationand
dexterous manipulation, the NUC is also connected to the
Fig. 2.Demonstrationoftheswingarm–dozerbladestructure.Thestructurecanbefoldedorextendedbasedonneeds.
Zhangetal.
3 DOI:10.52396/JUSTC-2022-0066
JUSTC,2022,52(X):
Just Accepted
LiDAR,the RGBD cameraand the grippervia USB cables,
andtotheroboticarmviaaswitch.The same switch is also
connectedtotheMIMO-meshradioandthe4G/5Grouter.As
aresult,theswitchbuildsa100Mbpsnetworkwiththeoper-
ators’ computer, the signal quality of which affects the
latencyofteleoperation.
3 Control
3.1 Control logic of the base
Inrealrescuework,itisinevitabletofaceenvironmentswith
detrimentalmagneticfieldsorpoorsignaltransmissioncondi-
tions,where theregular2.4GHzteleoperationsignals would
decaygreatly with reduced signal-to-noise ratio and in-
creaseddata package loss. To maintain robust signal trans-
missionbetween the operators andEarthshaker for real-time
teleoperation,a framework for multimodal teleoperation has
beendeveloped to ensure the communication path is un-
blocked,asshowninFigure4(a).
Withintheframework,whenEarthshakerisclosetotheop-
eratorsuch that theAT9S remote controllers can talkto the
receivers on the robot directly, the 2.4 GHz communication
frequencyisused. Once thedistanceinbetweenincreasesor
forsomereasonthesignals get blocked to a point where the
directcommunicationfails,the1.8GHzcommunication fre-
quency would be adopted and the signals are transmitted
throughtheMIMO-meshradios.Whennecessary,therobotic
armcanevendropanextrarelayradioonboardtofurtherin-
crease the communication distance and quality. Multiple
MIMO-meshradioscanformadistributednetworkwithvari-
ous forms, e.g., a line, a star, a net, and even a mixture of
those.Thenetworkcanflexiblyadapttofastnodemovement
andnode-to-nodesignalqualityvariation,realizinghighqual-
itysignaltransmissionconsistently.To ensure the teleopera-
tioncommunicationincaseeventheMIMO-meshradiosfail,
one more redundant communication path realized by the
4G/5GrouterhasbeenaddedtoEarthshaker.Theroutercan
eitherconnecttonearbybasestationsfromtheselectedInter-
netServiceProviderorberelayedbynearbyUnmannedAeri-
alVehicles,tobuildanetworkwithapresetcloudserver.The
operators can then access the server, monitor the real-time
datafromtherobotandgivecorrespondingcommands.
Earthshaker checks control signals from these three paths according to their priority levels and signal quality. If effective data are not received within a prescribed time, the path with the next lower priority level is checked. If all three paths fail, the program determines whether to enter the autonomous navigation mode or an emergency stop mode.
Onceanycommunicationpathissuccessfullyestablished,the
remotecontroller inthe baseoperator’s handscan drivethe
robottomoveforward,backwardandrotatearounditscenter
point.Themicrocontroller on therobot first unifies the joy-
stickvaluesobtainedfromtheremotecontrollerto ,
Jnorm =2Jinput Jmax Jmin
Jmax Jmin
(1)
Jinput =[linput ,rin put ]T
Jnorm =[linorm,rnor m]T
Jmax
Jmin
v
ω
where , representing the values from each
joystick, ,representing the unified joystick
commands,and and denotethemaximumandminim-
um values of the joysticks, respectively. When the value is
zero,therobotisstill.Theunifiedvaluesaretheninterpreted
intothebase’slinearvelocity andangularvelocity as
v=0,i f lnorm
3<0.001,
lnorm
3vmax ,otherwise,(2)
and
ω=0,i f |rnorm
3|<0.001,
rnorm
3ωmax ,otherwise,(3)
Fig. 3.ElectricalschematicsofEarthshaker.
Earthshaker:AMobileRescueRobotforEmergenciesandDisastersthroughTeleoperationandAutonomous
Navigation Zhangetal.
4 DOI:10.52396/JUSTC-2022-0066
JUSTC,2022,52(X):
Just Accepted
where $v_{max}$ and $\omega_{max}$ are the maximum linear and angular velocities supported by the base. Through the kinematic model of differential drive, the angular velocities of the left and right motors of the base, $\omega_l$ and $\omega_r$, can be calculated as

$$\omega_l = \frac{2v - l\omega}{2r}, \qquad \omega_r = \frac{2v + l\omega}{2r}, \quad (4)$$

where $l$ is the distance between the tracks and $r$ represents the radius of the drive wheel. The calculated $\omega_l$ and $\omega_r$ are then sent to the motors as control commands.
3.2 Control logic of the arm and gripper
Theprograms for teleoperation of the arm and gripper con-
sist of an operation assisting module for door opening task
andseveral interface modules for maintaining communica-
tionbetweentheNUCminicomputer and the other compon-
ents,includingthesignal receiver, the UR5e arm,theAG95
gripperandtheD435iRGBDcamera.Insidetheseprograms,
anetworksocketisfirstcreatedaccordingtothearmcontrol-
ler’s IP address and port number, such that the built-in
input/output functions can be called to read or write to the
sockettointeract with the armcontroller. At the sametime,
theserialports connected tothesignalreceiverandthe grip-
perareinitialized in the programs throughRS485protocols.
Once the NUC receives remote control instructions through
thesignalreceiver,itparsesthemintothepositionsandvelo-
citiesforeachjointofthearm,aswellastheopeningangle
andholdingforceforthegripper.
Tofacilitate semi-autonomous door opening for Earth-
shaker,theoperationassistingmoduleisdevelopedusingthe
depth camera in an eye-to-hand manner. This module, as
showninFigure4(b),cangreatlysimplifytheprocessofdoor
opening,avoidingthepotential mistakes that could beintro-
duced by the operator through teleoperation. In the module,
thecoordinatesofthecameraandthearmarefirstcalibrated
toobtain thetransformationrelationshipbetweenthem.With
theintrinsicparameter matrix ,thepixelsonthedepthim-
agesobtainedfromtheRGBDcameracan be converted into
three-dimensionalpointcloudas
Fig. 4. Flow charts of the control algorithms for Earthshaker: (a) control logic of the multimodal teleoperation; (b) the semi-autonomous door-opening algorithm; (c) the autonomous navigation algorithm.
Zhangetal.
5 DOI:10.52396/JUSTC-2022-0066
JUSTC,2022,52(X):
Just Accepted
$$P = D\,K\,p, \qquad K = \begin{bmatrix} \dfrac{1}{f_x} & 0 & 0 \\ 0 & \dfrac{1}{f_y} & 0 \\ 0 & 0 & 1 \end{bmatrix}, \quad (5)$$

where $P$ is the coordinates of the 3D point, $D$ denotes the depth measured on the ray of the pixel, and $p$ is the coordinates of the point in the image. Next, objects can be identified within the point cloud converted from the depth image. Specifically, in the task of door opening, the position and orientation of the door and the handle should be estimated to serve as the goal for path planning of the arm and gripper. The position of the door is determined through fitting planes to the point cloud.
Before the operation assisting module is started, the robot
needstobeinfrontofthedoorsuchthatthedoorisinsidethe
FieldofView(FOV)oftheRGBDcamera.Consequently,the
pointcloud corresponding to the doorcan be recognized by
planarsegmentation.Tofigureouttheorientationofthedoor,
the Principal Component Analysis (PCA) method[29] is ex-
ploitedtocalculatethenormalvectorofthedoorplaneinthe
point cloud. Then the axis-angle of the door’s normal
directioncanbecalculatedas
θ=acos(a·x),
n=a×x,(6)
θ
n
x
x
n
a
x
q
where denotestheanglebetweenthenormalvector ofthe
doorplaneand the unitvector of -axis, and represents
therotationaxisfromthe vector to the vector . Thus, the
rotationmatrix can befurther calculated with Rodrigues’s
Formula[30],
q=cosθI+(1cosθ)nnT+sinθn.(7)
I
p
q
p
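Eqs. (6)-(7) can be sketched as follows, where `skew(n)` plays the role of $\hat{n}$, the cross-product (skew-symmetric) matrix of the axis $n$. The guard for an already-aligned normal is an added edge case, not part of the paper's formulation.

```python
import numpy as np

def rotation_from_normal(a, x=np.array([1.0, 0.0, 0.0])):
    """Rotation matrix aligning the door normal `a` with the axis `x`.

    theta = acos(a . x) and n = a x x per Eq. (6); the matrix is then
    assembled with Rodrigues' formula, Eq. (7).
    """
    a = a / np.linalg.norm(a)
    theta = np.arccos(np.clip(np.dot(a, x), -1.0, 1.0))
    n = np.cross(a, x)
    norm = np.linalg.norm(n)
    if norm < 1e-9:                      # normal already parallel to x
        return np.eye(3)
    n = n / norm
    skew = np.array([[0.0, -n[2], n[1]],
                     [n[2], 0.0, -n[0]],
                     [-n[1], n[0], 0.0]])
    return (np.cos(theta) * np.eye(3)
            + (1 - np.cos(theta)) * np.outer(n, n)
            + np.sin(theta) * skew)
```

Applying the result to the door normal itself rotates it onto the $x$-axis, which is what the gripper-approach pose needs.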
where isthe identity matrix. Subsequently, DBSCAN al-
gorithm[31]is usedtoclusterthecloud pointsthatarecloseto
thedoor plane.Theclusterwith apropersizeisidentifiedas
thepointcloudofthedoorhandle,andtheclustercenter is
calculatedas themeanofallthecluster pointsandsetasthe
targetpositionfor arm to grip.As a result, theorientation
andposition serveasthetargetposewhenapproachingthe
handle.However,duetotheobservationmodeloftheRGBD
sensor, the depth measurement error is proportional to the
squareofdistance.The eye-to-hand method leadsto a relat-
ivelylongseparationbetweenthetargetandthesensor,inev-
itablycausingobservationerrorsforthegrippingpose.Addi-
tionally, the vibration introduced by the movement of the
basealsomakesit difficult to realizevisuallyfeedback con-
trolofgripping.Hence,atthispoint,thealgorithmisonlyad-
optedtoprovideaninitialposeforthedooropeningtask.The
remainingoperationstillneedstobecompletedbytheoperat-
ors. Even with this level of semi-autonomy, the operating
steps have been greatly simplified and the operation burden
ontheoperatorsissufficientlyreleased.
3.3 Autonomous navigation
Whenautonomousnavigation is desired, the control author-
ityofEarthshakercouldbegiventotheNUC.Thishelpsthe
robotexploreunknownandsignalblockedareasactivelyand
searchforanexittowardsadesireddirection,thealgorithmof
whichcanbefoundinFigure4(c).Oncetheautonomousnav-
igationisstarted,theNUCanalyzesthedatascanned by the
XL
c
LiDARtobuildagridmapofcurrentenvironmentandestim-
ate its ego-motion simultaneously. Feature matching based
methodsuchas LOAM[32] isa popularposeestimationmeth-
odthat demonstratesrobustnessandefficacy incomplexoff-
roadenvironment.Therefore,scanmatchingisalsoincorpor-
ated into Earthshaker’s autonomous navigation algorithms.
Featuresareextracted from each frameoftheLiDARsweep
forthesmoothness as
$$c = \frac{1}{|S| \cdot \left\| X^L_{(i)} \right\|} \left\| \sum_{j \in S,\, j \neq i} \left( X^L_{(i)} - X^L_{(j)} \right) \right\|, \quad (8)$$

where $X^L_{(i)}$ denotes the $i$-th point within the sweep, and $S$ defines a set of consecutive points obtained by the same laser beam near point $i$. The point number within $S$ is empirically set to 10. By setting a threshold for the smoothness, points can be classified as edge features (greater smoothness) or planar features (less smoothness). Then the edge and planar features of consecutive frames can be registered separately to restore the motion between frames using the Iterative Closest Point (ICP) algorithm[33]. The objective function of the ICP algorithm is set to minimize the cost with respect to the estimated transformation $T$ as

$$f(T) = \min_T \left( \sum_{i=1}^{n} d_i(T) + \sum_{j=1}^{m} d_{H_j}(T) \right), \quad (9)$$

where $n$ and $m$ denote the numbers of matched edge features and planar features, respectively, $d_i$ represents the distance between two matched edge features, and $d_{H_j}$ represents the distance between two matched planar features.
Due to the vibration of the base caused by tracks and
ruggedterrains,IMUpre-integration[34]is introduced into the
system to further improve the robustness of the localization
results. As shown in Figure 4(c), the Extended Kalman
Filter[35]isusedtoinferthestateoftherobot,fusingthescan-
matchingresultsandtheIMUpre-integrationresultsinatight
couplingmanner.
Withthehigh precision Laser-Inertial odometryestimated
fromEKFfusion,thelaserscansarethenmergedintotheoc-
cupancygridmap.Ingeneral,theexplorationtaskistomax-
imize the covered area on the grid map. Herein, a frontier
basedmethod[36]isusedtoguidetherobottoexplorealongthe
boundary between unknown area and free area on the grid
map.In the method, random tree incrementally expands to-
wardboundariesduringthe exploration process bysampling
viewpointsasnewnodes.Thenewlyaddednodesin theran-
domtree are then evaluated with information gain and tra-
versingcostas
$$s(x) = g(x) \cdot \exp(-\lambda \cdot c(x)), \quad (10)$$

where $g(x)$ is the expected information gain at position $x$, $c(x)$ is the distance cost between the robot and position $x$, and $\lambda$ denotes a coefficient that controls the penalty on the distance cost. After selecting the branch with the maximum score, the first edge of this node is set as the next best view to navigate to. The move_base navigation module provided by the Robot Operating System (ROS)[37] is employed to calculate the shortest path based on the Dijkstra algorithm[38]. The robot follows the generated path to explore the environment gradually. Once the target point is reached, the next round of exploration planning continues. The whole process is repeated until the robot covers the whole area or finds the exit.
4 Experimental Validation
To examine the functionality of Earthshaker and demonstrate its superiority, it was sent to attend the first A-TEC championships in 2020 as experimental validation. The competition was held by the government of Shenzhen in Guangdong, China, to further enhance robotic techniques and seek industrial opportunities[39]. In the finals of the championships, the competition was divided into five sessions, and all the teams were ranked based on their performance in those sessions, including task difficulty, task completion, and time consumption. The five sessions, in turn, were traversing rough terrains, clearing cumbersome obstacles and opening doors, climbing up and down regular stairs, passing through signal-blocked areas, and searching and rescuing in smoky indoor environments, as illustrated in Figure 5. Specifically, passing through signal-blocked areas required the robots to autonomously navigate inside a maze and search for the only exit, while in the other sessions the robots were remotely controlled by operators in a first-person point of view from hundreds of meters away. These diverse sessions examined the capabilities of the participating robots in locomotion, manipulation, perception, telecommunication, etc.[40-42] Robust and consistent performance in all sessions became more important than outstanding performance in any single session[43].
During the intense championships, 15 teams globally
entered the finals in total. Out of those teams, Earthshaker
tookthefirstplacewithascoreof109points,whereasthero-
bots from Tsinghua University and Chongqing University
tookthe second and thirdplaces with scoresof 79 and 70.5
points,respectively.
Compared to the Seeker robot from Tsinghua University,
theMIST-RobotfromChongqingUniversity,andmanyoth-
errobotsfromtherestteams,Earthshakerrealizedtransform-
ationofthetracked system for climbing stairsand improve-
mentoncumbersomeobstacleclearingcapabilityinthemost
economical way, the swing arm – dozer blade structure.
Earthshakerwonthecompetitionalsobythediversesensors
integratedintotherobotthatallowedtherobottoberobustly
teleoperated and even achieve autonomous navigation. The
followingsub-sections describe the performance of Earth-
shakerineachsessionofthefinals.
4.1 Traversing miscellaneous terrains

This session required the robot to first traverse a 30 m-by-3 m rough terrain that could be covered by rubble, bricks, or irregular concrete debris, depending on the difficulty selected by each team. Following that, the robot needed to pass through an area covered by large immobile obstacles, climb up and down slopes of at most 36 degrees, and travel on a bridge tilted to the side by 27 degrees. Even though these tasks were relatively easy, they relied heavily on the robots' speed and agility. Because the robotic arm did not need to be operated during this session, the corresponding operator was able to fly a DJI Mavic unmanned aerial vehicle (UAV) to provide a global view of the field from above, which allowed the base operator to plan the operation beforehand and greatly reduced the time consumption. Benefiting from the great horsepower and well-designed suspension of the chassis, Earthshaker performed excellently in these tasks and was ranked first among all the robots.
Besidesthe aforementioned regular tasks, there were also
challengetasksinthissession,wheretherobotsneededtotra-
versemuddyterrainswithpotholes,flatterrainswithtrenches
of various widths, and pools filled with water of different
heights. Earthshaker accomplished these challenging tasks
successfully,asshown in Figure 6. Specifically,whenfaced
Fig. 5.OverviewofthesessionsofA-TECchampionships.
Zhangetal.
7 DOI:10.52396/JUSTC-2022-0066
JUSTC,2022,52(X):
Just Accepted
with the trenches, Earthshaker put down the swing arm
dozerblade to increasethe body lengthof the chassis. As a
result,itcrossedthetrenchwithawidthof600mm.Aswith
thewaterpools,becausethewholebodyofEarthshakerwas
waterproofofthelevelIP64andthechassiswaseven water-
proofof the level IP66, Earthshaker was capable of dealing
withthepoolwithwaterdepthof500mm.Itisworthnoting
thatEarthshakerprepared for thepossiblerainyweather dur-
ingthe competition,whereasmanyotherrobots didnothave
thispreparation.Consequently,somerobotssufferedfromthe
rainy weather with their naked electronic interfaces, and
endedupnotbeingabletofinishthecompetition.
4.2 Approaching buildings

In this session, the robots were required to first clear up a 10 m-by-20 m area by moving obstacles to designated places, and then open and enter a door with an automatic closer. The obstacles included hollow steel tubes as light as 5 kg, and steel beams and concrete blocks as heavy as 50 kg. Earthshaker successfully utilized the dozer blade to push all the obstacles to the target positions.
Thereweremultipledifficultylevelsfordooropening,with
differenttypes of doors and door handles. Options are uni-
foldorbifold doors withspherical handles, L-shape handles
orvalves.Themostchallengingcombination,a unifold door
witha spherical handle, was selected for Earthshaker in the
competition.Becauseofthedoorcloser,therobotneededto
rotatethehandleandmaintain therotationwhileopeningthe
door.Asaresult,thetwooperatorsneededtocooperateinthe
process. One operator needed to first align the 0.8 m wide
Earthshakerwiththe1m wide door frame under the help of
the equipped laser pointers, and then keep commanding the
baseto moveforwardslowlyas thedoorhandlewas rotated,
untilthefrontendofthechassiswaspushedagainstthedoor
andthehandlecouldbereleasedbythegripper.Theotherop-
erator needed to fine tune the robotic arm and the gripper
aftertheinitial semi-autonomous manipulation, grip and ro-
tatethedoor handle as thechassiswasapproachingthedoor
untilthehandlecouldbereleasedfromthegripper.Figure7(a-
e)showssomesnapshotsofthewholeprocessofthissession.
Earthshaker finished this session within 31 minutes and 12
seconds.
4.3 Manipulation inside buildings

Robots in this session needed to climb up to and down from the platform shown in Figure 7(f-h). The optional ways were through vertical ladders or regular stairs. The tracked chassis determined that Earthshaker could only pick the regular stairs, which was the common choice among all the robots in the competition. The stairs had 24 steps one-way, every step of which had a depth of 300 mm and a height of 175 mm; thus, the inclination angle was about 30 degrees. There was a turning platform between the two sections of stairs.
When Earthshaker was climbing up the stairs, the swing
arm– dozer bladestructure was adjustedto provide enough
contact length for the chassis and help the robot move
smoothly.However,theswingarmwasnotputfullyflatdue
to the detrimental friction generated by the passive arm
tracks,whichwouldhindertherobot from thrusting upward.
Theangleoftheswingarmwasempiricallysettojustenough
tosupporttherobottoclimbupthestairs.Ontheotherhand,
whentherobot was climbing down the stairs,theswingarm
couldbe putfullyflatto takeadvantageofits lengthandthe
passive friction generated to increase stability. Earthshaker
was able to finish this session within 6 minutes and 13
seconds,wheretheclimbingupprocesstookthemajorityof
theconsumedtime.Comparedtotheothersmall tracked ro-
botsinthecompetition,Earthshakerwasslowerduetoitsrel-
ativelycumbersomebodyonthestairway.
Fig. 6. Snapshots of Earthshaker in Session 1. (a-c) Earthshaker passing through a pool filled with water 50 cm deep. (d) Earthshaker traversing muddy terrains. (e) Earthshaker crossing a trench 60 cm in width.

4.4 Autonomous navigation

The autonomous navigation session tested the robot's intelligence in building maps and finding exits within unknown
areaswithout human’shelp.Tosimulatethesituation ofsig-
nallossinreality,during the competition, the refereeturned
on the signal blocker once the robot entered the maze. The
operators inside the control room were also not allowed to
touchtheremotecontrollersduringthisperiod.Themazehad
threepossibleentrancesandthreepossibleexits.Whenthero-
botarrivedatthemaze,onlyoneentrancewouldbeopen,and
also only one exit would be usable. To be fair, inside the
maze,thereweremoveabledoorsthatwereadjustedforeach
robottoformadifferentunknownstructure.Earthshakerwas
abletoshowupattheexitwithin 41.13 seconds in this ses-
sion,rankedthesecondfastestamongalltherobots.Tocheck
thebuiltmapforthemaze,thepointcloudstoredintheNUC
hasbeenextractedafterthecompetition,asshowninFigure8.
Theleft halfof thefigure demonstrates the map built when
therobotjustenteredthemazefromthebottomleftentrance,
wherethewhitelineconnectsthe robot to its target position
on the far side of the maze. The right half of the figure
presentsthe map builtwhentherobotsuccessfully found the
exitonthetopleftandthepathitfollowedinthemazeindic-
atedbyaredsolidline.
4.5 Search and rescue in smoky environment

The last session of the competition involved indoor rescue work. The robot was supposed to enter rooms filled with dense black smoke and search for a fire source and a wounded person. The smoke was real, spread by smoke generators. However, the fire source was represented by an electric oven, and the wounded person was actually a sandbag in human shape. The dummy weighed about 50 kg. To simulate a real person, the dummy was dressed in clothes that could generate heat for a period of time. There were in total eight similar rooms. The fire source and the wounded person were randomly distributed among them. There were also other common items like tables, chairs, cabinets, etc., inside those rooms, just like the regular rooms people can find in their daily life. The robot needed to find the fire source and turn it off, and also needed to find the wounded person and carry it out of the room to a designated area. The smoke was quite dense, and the visible distance was less than half a meter inside the rooms. Earthshaker had to search every room under teleoperation to locate the wounded person and the oven with its two infrared cameras, then use the gripper to turn the oven off and carry the wounded person out. This again required cooperation between the two operators. To carry the wounded person out of the room, a customized lasso was installed onto Earthshaker before setting off. Once the wounded person was located, the robotic arm and gripper picked up the lasso using preset control trajectories and put the lasso around the wounded person's arm through teleoperation. The lasso then automatically locked once the robot started to drag the wounded person. Figure 9 shows scenes from this session. Earthshaker finally spent 11 minutes and 36 seconds finishing all the tasks.
4.6 Summary

Fig. 7. Snapshots of Earthshaker in Sessions 2 & 3. (a-b) Earthshaker clearing a light obstacle on the left and a heavy one on the right. (c-e) Earthshaker opening a unifold door with a spherical door handle. (f-h) Earthshaker climbing up and down the stairs.

Earthshaker performed reasonably well in each session of the competition, and even ranked first in two of the five sessions. That eventually allowed Earthshaker to take the first place among all the robots. The overall score table is shown in Table 1. As a demonstration of dominance, Earthshaker got 109 points in the finals, whereas the runner-up only got 79 points. Earthshaker stood out by its multimodal teleoperation, its modular and waterproof mechatronic design, and sufficient experiments and practice before the ultimate test. The competition required a complete and robust rescue robot as a whole, not just any advanced individual module of it. However, some of Earthshaker's shortcomings were also exposed in the competition. Its excessive size limited its flexibility of movement, making it hard to pass through certain narrow spaces in actual use. At the same time, the payload of the manipulator is limited; thus, it cannot complete dexterous manipulation tasks with large loads. Even though Earthshaker still had a lot of room to improve, it was the excellent mechatronic integration and the advanced control philosophy that made it the winner of the A-TEC championships 2020.
Fig. 8. The map built for the maze during the competition. The red slim line represents the path that the robot followed. The gray area indicates the accessible part of the map, while the cyan areas with dark red boundaries indicate the inaccessible parts.

Fig. 9. Snapshots of Earthshaker in Session 5. (a) The infrared thermal image of the smoky environment; (b-e) the scene and the strategy used to rescue the dummy.

Table 1. Final score table of the A-TEC championships 2020

Team/Robot name | Traversing miscellaneous terrains | Approaching buildings | Manipulation inside buildings | Autonomous navigation | Search and rescue in smoky environment | Additional challenge | Time score | Re-challenge | Total score
EarthShaker | 25 | 12 | 12 | 18 | 25 | 7 | 10 | 0 | 109
Seeker | 15 | 18 | 1 | 12 | 18 | 8 | 7 | 0 | 79
MIST-Robot | 2 | 10 | 25 | 10 | 8 | 6.5 | 9 | 0 | 70.5
Team of Jingpin | 10 | 25 | 10 | 18 | 12 | 8 | 6 | -20 | 69
Team of Dream | 8 | 25 | 8 | 0 | 15 | 8 | 8 | -20 | 52
Team of Shentuo Tech | 6 | 4 | 4 | 4 | 4 | 6.5 | 5 | 0 | 33.5
Ferocious Lion of Tsinghua | 4 | 18 | 15 | 12 | 0 | 0 | 0 | -20 | 29
Team of Walkers | 0 | 0 | 6 | 8 | 1 | 0 | 0 | 0 | 15

5 Conclusions

This paper introduces the rescue robot Earthshaker, including
the system integration and the control algorithms of it. The
performanceof the robothas been evaluatedto be excellent
duringtheA-TECroboticchampionshipsin2020.Theunique
swing arm – dozerblade structure of Earthshaker helps ex-
tendsthecapabilityofconventionaltrackedchassis,improv-
ingitsperformanceoncumbersomeobstacleclearingandreg-
ular stair climbing. The multimodal teleoperation system
providestherobotredundancyandrobustnesswhentheoper-
atorscannotshowuponsite.Thefiniteautonomyintheoper-
ationofthe roboticarmandgripperhelps releasetheoperat-
ors’workburdentoasuitableextent.Whenteleoperationsig-
nalsarelost,therobotcouldalsoentertheautonomousnavig-
ationmode to search for anexit by itself and giveback the
controlauthoritytotheoperators.Overall,the championship
that Earthshaker earned has shown the efficacy of all the
aforementionedefforts.Itcan play a huge role insearchand
rescue in disaster scenarios such as nuclear accidents, toxic
gasleaks, and fires, where human workers cannot be de-
ployedduetoradiation, dangeroftoxiccontaminationorar-
chitecturecollapse. Future efforts canbe put into improving
therobot’sautonomyinmanyforeseeabletasksforemergen-
ciesanddisasterstofurtherincreaseitsefficiencyandrobust-
ness. More earthshaking endeavors in helping the human
communitycanbeexpectedfromEarthshaker.
Acknowledgments

This work is supported by the National Natural Science Foundation of China (No. U21A20119, No. 62103395) and the championship prize funded by Shenzhen Leaguer Co., Ltd. The research of WG is also supported in part by the Fundamental Research Funds for the Central Universities.

Conflict of interest

The authors declare that they have no conflict of interest.
Biographies
Yu ZhangYuZhang received the BEdegreeinengineering from the
UniversityofScienceandTechnologyofChina(USTC)andiscurrently
aMasterstudentintheBio-Inspired Robotics Laboratory in the USTC.
Hiscurrentresearchinterestsincludesystemdesignforspecialrobotsand
leggedrobots.
Yuxiang LiYuxiang Li received the MS degree in Instrument and
MeterEngineeringfromZhengzhouUniversity.He iscurrently pursuing
hisPhD degreeinHarbinInstituteof TechnologyShenzhen.His current
researchinterestsincluderoboticsandartificialintelligence.
Wei GaoiscurrentlyanassociateresearchfellowintheDepartmentof
PrecisionMachinery and Precision Instrumentationat the University of
Scienceand Technology of China(USTC).Hereceivedhis Bachelor of
Engineeringdegree from the Departmentof Mechanical Engineering at
NorthwesternPolytechnicalUniversityin2011, andhis DoctorofPhilo-
sophydegreefromtheDepartmentofMechanicalEngineeringatFlorida
StateUniversity(FSU)in2019.Hewasapostdoctoralresearchfellowin
FSUfrom2019to2020,andinUSTCfrom2020to2022.Hiscurrentre-
searchfocusesondynamiccontrolofmobilerobots.
Haoyao Cheniscurrentlya Professor in Harbin Institute of Techno-
logyShenzhen andtheState KeyLaboratoryof RoboticsandSystem of
China.He receivedtheBachelor’ sdegreeinMechatronics andAutoma-
tion from the University of Science and Technology of China in 2004,
andthe PhDdegreeintheRobotics fromboththe UniversityofScience
andTechnologyofChinaandtheCityUniversityofHongKongin2009.
Hewasworking asavisitingscholarintheAutonomousSystemsLabin
ETHz,Switzerland.Hisresearchinterestsincludeaerialmanipulationand
transportation,roboticperceptionandcognition,multi-robotsystems.
Shiwu Zhangis currently a professor in the Department of Precision
MachineryandPrecisionInstrumentation,USTC.HereceivedhisB.Eng.
degree in Mechanical and Electronic Engineering from USTC in 1997,
and his Ph.D. degree in the Precision Instrumentation and Machinery
from USTC in 2003. He has been a visiting scholar in University of
Wollongong,Australia in2016andin theOhiostate university,USAin
2012,respectively.His researchinterestsincludeamphibiousrobots,soft
robots,leggedrobots,liquidmetalrobotsandrescuerobots.
References

[1] De Greeff J, Mioch T, van Vught W, et al. Persistent robot-assisted disaster response[C]. Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, 2018: 99-100.
[2] Matsuno F, Sato N, Kon K, et al. Utilization of robot systems in disaster sites of the Great Eastern Japan Earthquake[C]. Field and Service Robotics, 2014: 1-17.
[3] Queralta J P, Taipalmaa J, Pullinen B C, et al. Collaborative multi-robot search and rescue: Planning, coordination, perception, and active vision. IEEE Access, 2020, 8: 191617-191643.
[4] Nagatani K, Kiribayashi S, Okada Y, et al. Emergency response to the nuclear accident at the Fukushima Daiichi Nuclear Power Plants using mobile rescue robots. Journal of Field Robotics, 2013, 30(1): 44-63.
[5] Delmerico J, Mintchev S, Giusti A, et al. The current state and future outlook of rescue robotics. Journal of Field Robotics, 2019, 36(7): 1171-1191.
[6] Atkeson C G, Babu B P W, Banerjee N, et al. No falls, no resets: Reliable humanoid behavior in the DARPA robotics challenge[C]. 2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids), 2015: 623-630.
[7] Feng S, Whitman E, Xinjilefu X, et al. Optimization-based full body control for the DARPA robotics challenge. Journal of Field Robotics, 2015, 32(2): 293-312.
[8] Spenko M, Buerger S, Iagnemma K. The DARPA Robotics Challenge Finals: Humanoid Robots to the Rescue[M]. 121. Springer, 2018.
[9] Sheh R, Schwertfeger S, Visser A. 16 years of RoboCup Rescue. KI-Künstliche Intelligenz, 2016, 30(3): 267-277.
[10] Karumanchi S, Edelberg K, Baldwin I, et al. Team RoboSimian: Semiautonomous mobile manipulation at the 2015 DARPA robotics challenge finals. Journal of Field Robotics, 2017, 34(2): 305-332.
[11] Schwarz M, Beul M, Droeschel D, et al. DRC team NimbRo Rescue: Perception and control for centaur-like mobile manipulation robot Momaro[M]. The DARPA Robotics Challenge Finals: Humanoid Robots to the Rescue. Springer, 2018: 145-190.
[12] Stentz A, Herman H, Kelly A, et al. CHIMP, the CMU highly intelligent mobile platform. Journal of Field Robotics, 2015, 32(2): 209-228.
[13] Hutter M, Gehring C, Lauber A, et al. ANYmal: Toward legged robots for harsh environments. Advanced Robotics, 2017, 31(17): 918-931.
[14] Kruijff-Korbayová I, Freda L, Gianni M, et al. Deployment of ground and aerial robots in earthquake-struck Amatrice in Italy (brief report)[C]. 2016 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), 2016: 278-279.
[15] Autonomous robot for gas and oil sites challenge [EB/OL]. http://www.argoschallenge.com/en/challenge.
[16] TAUROB ARGOS winner [EB/OL]. http://taurob.com/textargosgewinner/.
[17] Endo D, Nagatani K. Assessment of a tracked vehicle's ability to traverse stairs. ROBOMECH Journal, 2016, 3(1): 1-13.
[18] Yamauchi G, Nagatani K, Hashimoto T, et al. Slip-compensated odometry for tracked vehicle on loose and weak slope. ROBOMECH Journal, 2017, 4(1): 1-11.
[19] Rouček T, Pecka M, Čížek P, et al. System for multi-robotic exploration of underground environments CTU-CRAS-NORLAB in the DARPA Subterranean Challenge. arXiv preprint arXiv:2110.05911, 2021.
[20] Agha A, Otsu K, Morrell B, et al. NeBula: Quest for robotic autonomy in challenging environments; Team CoSTAR at the DARPA Subterranean Challenge. arXiv preprint arXiv:2103.11470, 2021.
[21] Chen X, Zhang H, Lu H, et al. Robust SLAM system based on monocular vision and LiDAR for robotic urban search and rescue[C]. 2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR), 2017: 41-47.
[22] Rouček T, Pecka M, Čížek P, et al. DARPA Subterranean Challenge: Multi-robotic exploration of underground environments[C]. International Conference on Modelling and Simulation for Autonomous Systems, 2019: 274-290.
[23] Tranzatto M, Mascarich F, Bernreiter L, et al. CERBERUS: Autonomous legged and aerial robotic exploration in the tunnel and urban circuits of the DARPA Subterranean Challenge. arXiv preprint arXiv:2201.07067, 2022.
[24] Schwarz M, Rodehutskors T, Droeschel D, et al. NimbRo Rescue: Solving disaster response tasks with the mobile manipulation robot Momaro. Journal of Field Robotics, 2017, 34(2): 400-425.
[25] Li Y, Li M, Zhu H, et al. Development and applications of rescue robots for explosion accidents in coal mines. Journal of Field Robotics, 2020, 37(3): 466-489.
[26] Lösch R, Grehl S, Donner M, et al. Design of an autonomous robot for mapping, navigation, and manipulation in underground mines[C]. 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2018: 1407-1412.
[27] Szrek J, Zimroz R, Wodecki J, et al. Application of the infrared thermography and unmanned ground vehicle for rescue action support in underground mine: The AMICOS project. Remote Sensing, 2020, 13(1): 69.
[28] Bhatia R, Li L. Throughput optimization of wireless mesh networks with MIMO links[C]. IEEE INFOCOM 2007 - 26th IEEE International Conference on Computer Communications, 2007: 2326-2330.
[29] Pearson K. LIII. On lines and planes of closest fit to systems of points in space. The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science, 1901, 2(11): 559-572.
[30] Murray R M, Li Z, Sastry S S. A Mathematical Introduction to Robotic Manipulation[M]. CRC Press, 2017.
[31] Kriegel H P, Kröger P, Sander J, et al. Density-based clustering. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 2011, 1(3): 231-240.
[32] Zhang J, Singh S. LOAM: Lidar odometry and mapping in real-time[C]. Robotics: Science and Systems, 2014: 1-9.
[33] Besl P J, McKay N D. A method for registration of 3-D shapes. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1992, 14(2): 239-256.
[34] Forster C, Carlone L, Dellaert F, et al. IMU preintegration on manifold for efficient visual-inertial maximum-a-posteriori estimation[C]. Robotics: Science and Systems, 2015.
[35] Simanek J, Reinstein M, Kubelka V. Evaluation of the EKF-based estimation architectures for data fusion in mobile robots. IEEE/ASME Transactions on Mechatronics, 2014, 20(2): 985-990.
[36] Bircher A, Kamel M, Alexis K, et al. Receding horizon "next-best-view" planner for 3D exploration[C]. 2016 IEEE International Conference on Robotics and Automation (ICRA), 2016: 1462-1468.
[37] move_base - ROS Wiki [EB/OL]. (2020). http://wiki.ros.org/move_base.
[38] Misa T J, Dijkstra E W. An interview with Edsger W. Dijkstra. Communications of the ACM, 2010, 53(8): 41-47.
[39] A-TEC Official Website [EB/OL]. (2020). https://atec.leaguer.com.cn/index/index/championshipsjj.
[40] De Petris P, Nguyen H, Dharmadhikari M, et al. RMF-Owl: A collision-tolerant flying robot for autonomous subterranean exploration. arXiv preprint arXiv:2202.11055, 2022.
[41] Hudson N, Talbot F, Cox M, et al. Heterogeneous ground and air platforms, homogeneous sensing: Team CSIRO Data61's approach to the DARPA Subterranean Challenge. arXiv preprint arXiv:2104.09053, 2021.
[42] Otsu K, Tepsuporn S, Thakker R, et al. Supervised autonomy for communication-degraded subterranean exploration by a robot team[C]. 2020 IEEE Aerospace Conference, 2020: 1-9.
[43] Ohradzansky M T, Rush E R, Riley D G, et al. Multi-agent autonomy: Advancements and challenges in subterranean exploration. arXiv preprint arXiv:2110.04390, 2021.
Earthshaker:AMobileRescueRobotforEmergenciesandDisastersthroughTeleoperationandAutonomous
Navigation Zhangetal.
12 DOI:10.52396/JUSTC-2022-0066
JUSTC,2022,52(X):
Just Accepted
... That becomes important if the user cannot see the robot directly. Several studies have displayed visualizations of robot teleoperation using computer displays including Zhang et al. [55]who created a rescue robot and then, Senft et al. [56] makes telemanipulation on the robotic arm. Apart from using computers, some studies use smartphones for camera visualization including Ainasoja et al. [57], and using augmented reality also virtual reality headset for camera visualization including Dardona et al. [58], Gonzalez et al. [59], Kot et al. [60], Wibowo et al. [61], Stotko et al. [62], Solanes et al. [63], and Doki et al. [64]. ...
Article
Full-text available
The industrial revolution 4.0 and the rapid advancement of technology during pandemic era has made significant progress in robot industry. There are three types of robots based on the level of independent control such as autonomous mobile robots, semi-autonomous mobile robots, and controller mobile robots. Semi-autonomous mobile robot and Controlled mobile still require human intervention in carrying out the tasks. In this study, teleoperation on mobile robots is displayed on various platforms such as Smartphone, Virtual reality headset as a virtual reality platform, and Computer (SVC) device. A camera is used as a visual sensor that dispatches the information of the surroundings to each platform. The controller used for the teleoperation varies depending on the platforms. The computer platform uses the arrow key on the keyboard. The smartphone platform uses touchscreen. Virtual reality uses Oculus Touch that has been integrated with Oculus Quest 2. ROS# as a robot API framework with Unity engine is used for the communication process between the robot and each platform. Teleoperation Experiments on SVC devices refer to four parameters including task completion rates, task time, satisfaction, and error counts. all these parameters will be combined into a single usability metric (SUM). The SUM results from SVC devices show 54.9% on Smartphones, 79.5% on VR devices, and 90.4% on Computers.
Article
Full-text available
We present a field report of the CTU-CRAS-NORLAB team from the Subterranean Challenge (SubT) organized by the Defense Advanced Research Projects Agency (DARPA). The contest seeks to advance technologies that would improve the safety and efficiency of search-andrescue operations in GPS-denied environments. During the contest rounds, teams of mobile robots have to find specific objects while operating in environments with limited radio communication, e.g., mining tunnels, underground stations or natural caverns. We present a heterogeneous exploration robotic system of the CTU-CRAS-NORLAB team, which achieved the third rank at the SubT Tunnel and Urban Circuit rounds and surpassed the performance of all other non-DARPA-funded teams. The field report describes the team’s hardware, sensors, algorithms and strategies, and discusses the lessons learned by participating at the DARPA SubT contest.
Article
Full-text available
Artificial intelligence has undergone immense growth and maturation in recent years, though autonomous systems have traditionally struggled when fielded in diverse and previously unknown environments. DARPA is seeking to change that with the Subterranean Challenge, by providing roboticists the opportunity to support civilian and military first responders in complex and high-risk underground scenarios. The subterranean domain presents a handful of challenges, such as limited communication, diverse topology and terrain, and degraded sensing. Team MARBLE proposes a solution for autonomous exploration of unknown subterranean environments in which coordinated agents search for artifacts of interest. The team presents two navigation algorithms in the form of a metric-topological graph-based planner and a continuous frontier-based planner. To facilitate multi-agent coordination, agents share and merge new map information and candidate goal points. Agents deploy communication beacons at different points in the environment, extending the range at which maps and other information can be shared. Onboard autonomy reduces the load on human supervisors, allowing agents to detect and localize artifacts and explore autonomously outside established communication networks. Given the scale, complexity, and tempo of this challenge, a range of lessons was learned, most importantly, that frequent and comprehensive field testing in representative environments is key to rapidly refining system performance.
Article
Full-text available
Heterogeneous teams of robots, leveraging a balance between autonomy and human interaction, bring powerful capabilities to the problem of exploring dangerous, unstructured subterranean environments. Here we describe the solution developed by Team CSIRO Data61, consisting of CSIRO, Emesent, and Georgia Tech, during the DARPA Subterranean Challenge. These presented systems were fielded in the Tunnel Circuit in August 2019, the Urban Circuit in February 2020, and in our own Cave event, conducted in September 2020. A unique capability of the fielded team is the homogeneous sensing of the platforms utilized, which is used to obtain a decentralized multi-agent SLAM solution on each platform (both ground agents and UAVs) using peer-to-peer communications. This approach enabled a shift in focus from constructing a pervasive communications network to relying on multi-agent autonomy, motivated by experiences in early circuit events. These experiences also showed the surprising capability of rugged tracked platforms for challenging terrain, which in turn led to the heterogeneous team structure based on a BIA5 OzBot Titan ground robot and an Emesent Hovermap UAV, supplemented by smaller tracked or legged ground robots. The ground agents use a common CatPack perception module, which allowed reuse of the perception and autonomy stack across all ground agents with minimal adaptation.
Article
Full-text available
Autonomous exploration of subterranean environments constitutes a major frontier for robotic systems, as underground settings present key challenges that can render robot autonomy hard to achieve. This problem has motivated the DARPA Subterranean Challenge, where teams of robots search for objects of interest in various underground environments. In response, we present the CERBERUS system-of-systems, as a unified strategy for subterranean exploration using legged and flying robots. Our proposed approach relies on ANYmal quadruped robots as the primary platforms, exploiting their endurance and ability to traverse challenging terrain. For aerial robots, we use both conventional and collision-tolerant multirotors to explore spaces too narrow or otherwise unreachable by ground systems. Anticipating degraded sensing conditions, we developed a complementary multimodal sensor-fusion approach, utilizing camera, LiDAR, and inertial data for resilient robot pose estimation. Individual robot pose estimates are refined by a centralized multi-robot map-optimization approach to improve the reported location accuracy of detected objects of interest in the DARPA-defined coordinate frame. Furthermore, a unified exploration path-planning policy is presented to facilitate the autonomous operation of both legged and aerial robots in complex underground networks. Finally, to enable communication among team agents and the base station, CERBERUS utilizes a ground rover with a high-gain antenna and an optical fiber connection to the base station, along with wireless “breadcrumb” nodes deployed by the legged robots. We report results from the CERBERUS system-of-systems deployment at the DARPA Subterranean Challenge’s Tunnel and Urban Circuit events, along with the current limitations and the lessons learned for the benefit of the community.
Article
Full-text available
Extraction of raw materials, especially in extremely harsh underground mine conditions, is irrevocably associated with a high risk and probability of accidents. Natural hazards, the use of heavy-duty machines, and other technologies, even if perfectly organized, may result in an accident. In such critical situations, rescue actions may require advanced technologies such as autonomous mobile robots and various sensory systems, including gas detectors, infrared thermography, image acquisition, and advanced analytics. In the paper, we describe several scenarios related to rescue actions in underground mines, under the assumption that searching for victims must account for potential hazards such as seismic activity, gas, and high temperature. Rescue team activities in such areas may therefore be highly risky. This work reports the results of testing a UGV robotic system in an underground mine, developed within the framework of the AMICOS project. The system consists of a UGV with a sensory system and an image processing module based on adaptations of the You Only Look Once (YOLO) and Histogram of Oriented Gradients (HOG) algorithms. The experiment was very successful; human detection efficiency was very promising. Future work will involve testing the AMICOS technology in deep copper ore mines.
Article
Full-text available
Search and rescue (SAR) operations can take significant advantage from supporting autonomous or teleoperated robots and multi-robot systems. These can aid in mapping and situational assessment, monitoring and surveillance, establishing communication networks, or searching for victims. This paper provides a review of multi-robot systems supporting SAR operations, with system-level considerations and focusing on the algorithmic perspectives for multi-robot coordination and perception. This is, to the best of our knowledge, the first survey paper to cover (i) heterogeneous SAR robots in different environments, (ii) active perception in multi-robot systems, while (iii) giving two complementary points of view from the multi-agent perception and control perspectives. We also discuss the most significant open research questions: shared autonomy, sim-to-real transferability of existing methods, awareness of victims' conditions, coordination and interoperability in heterogeneous multi-robot systems, and active perception. The different topics in the survey are put in the context of the different challenges and constraints that various types of robots (ground, aerial, surface, or underwater) encounter in different SAR environments (maritime, urban, wilderness, or other post-disaster scenarios). The objective of this survey is to serve as an entry point to the various aspects of multi-robot SAR systems to researchers in both the machine learning and control fields by giving a global overview of the main approaches being taken in the SAR robotics area.
Chapter
Full-text available
The Subterranean Challenge (SubT) is a contest organised by the Defense Advanced Research Projects Agency (DARPA). The contest reflects the need to increase the safety and efficiency of underground search-and-rescue missions. In the SubT challenge, teams of mobile robots have to detect, localise, and report the positions of specific objects in an underground environment. This paper describes the heterogeneous multi-robot exploration system of our CTU-CRAS team, which scored third place in the Tunnel Circuit round, surpassing the performance of all other non-DARPA-funded competitors. In addition to describing the platforms, algorithms, and strategies used, we also discuss the lessons learned from participating in the contest.
Book
The DARPA Robotics Challenge was a robotics competition that took place in Pomona, California USA in June 2015. The competition was the culmination of 33 months of demanding work by 23 teams and required humanoid robots to perform challenging locomotion and manipulation tasks in a mock disaster site. The challenge was conceived as a response to the Japanese Fukushima nuclear disaster of March 2011. The Fukushima disaster was seen as an ideal candidate for robotic intervention since the risk of exposure to radiation prevented human responders from accessing the site. This volume, edited by Matthew Spenko, Stephen Buerger, and Karl Iagnemma, includes commentary by the organizers, overall analysis of the results, and documentation of the technical efforts of 15 competing teams. The book provides an important record of the successes and failures involved in the DARPA Robotics Challenge and provides guidance for future needs to be addressed by policy makers, funding agencies, and the robotics research community. Many of the papers in this volume were initially published in a series of special issues of the Journal of Field Robotics. We have proudly collected versions of those papers in this STAR volume.