Integrating Building Information Model and Augmented Reality for Drone-Based Building Inspection

Donghai LIU1, Xietian XIA2, Junjie CHEN3,*, and Shuai LI4

1 Professor, State Key Laboratory of Hydraulic Engineering Simulation and Safety, Tianjin University, 135 Yaguan Road, Tianjin 300350, China. Email: liudh@tju.edu.cn.
2 Ph.D. Student, State Key Laboratory of Hydraulic Engineering Simulation and Safety, Tianjin University, 135 Yaguan Road, Tianjin 300350, China. Email: xiaxietian@tju.edu.cn.
3 Ph.D., State Key Laboratory of Hydraulic Engineering Simulation and Safety, Tianjin University, 135 Yaguan Road, Tianjin 300350, China. Email: chenjj@tju.edu.cn.
4 Assistant Professor, Department of Civil and Environmental Engineering, the University of Tennessee, Knoxville, 851 Neyland Dr., Knoxville, TN 37902, the United States. Email: sli48@utk.edu.
ABSTRACT

Unmanned aerial vehicles (UAVs) have been widely accepted for building inspection in recent years. However, the advantages of UAV inspection are not fully leveraged because of the absence of related information to assist decision-making during the inspection process. To address the problem, this study proposes an augmented reality (AR) solution by integrating the UAV inspection workflow with the building information model (BIM) of the building of interest, which is driven to navigate simultaneously with the aerial video during inspection. The integration enables easy and straightforward retrieval of useful information from BIM to help better understand the risk issues detected from the aerial video. An overall algorithm pipeline is proposed to drive the connective animation of BIM and the aerial video. The pipeline includes a fast coordinate transformation algorithm (F-Trans) that simplifies the process of converting the WGS-84 coordinates collected by UAVs to BIM project coordinates. An AR system prototype is developed to demonstrate the specific ways of applying the above algorithm pipeline to support UAV-based building inspection. The efficacy of the proposed solution has been validated by experiments. The results demonstrate that F-Trans can achieve sub-meter precision, and the average matching degree between the aerial video and BIM was 74.1%. The developed AR solution has great potential to enable more efficient, comprehensive, and unbiased UAV-based building inspection.
Keywords: Augmented reality (AR); Unmanned aerial vehicle (UAV); Building information modeling (BIM); Building inspection; Decision support; Fast coordinate transformation.
INTRODUCTION

Visual inspection has long been used to detect structural defects to guide the rehabilitation of buildings. Traditional visual inspection relies on field workers to manually check the status and condition of the building of interest. This practice is not only laborious and time-consuming but also potentially dangerous in cases such as the exterior inspection of high-rise buildings. In recent years, unmanned aerial vehicles (UAVs), commonly known as drones, have been used to replace field workers for visual inspection (Duque et al. 2018a; O'Riordan-Adjah and MacKenzie 2019). By reviewing the video recorded by an onboard camera installed on the UAV, inspectors can fulfill their tasks in the office rather than onsite, which improves efficiency and reduces fall-related fatalities in building inspection.

However, the practice of reviewing the collected video for structural defect detection can be error-prone if no related information is provided to support decision-making. Without easy access to information about the building of interest, such as material properties, object geometry, and component locations, it is difficult for inspectors to comprehensively evaluate the safety issues detected from the video. This might lead to either an overestimation or an underestimation of the detected issues: while the former wastes resources on unnecessary restoration, the latter can result in fatal hazards by allowing a serious defect to continuously deteriorate. As a result, it is necessary to develop a means to augment the aerial inspection video with easily accessible building-relevant information to enable an unbiased and comprehensive condition assessment. Otherwise, inspectors will continue to make decisions without sufficient information support, leading to biased judgments that could jeopardize the safety of the building.
Recent developments of building information modeling (BIM) and augmented reality (AR) provide an opportunity to address the above research need. BIM is an innovative approach to modeling building information in a 3D object-based information system (Azhar 2011). As a 3D model that integrates information of a building throughout its entire life cycle, BIM serves as a perfect source from which readily available information can be retrieved to support decision-making. AR, as a technology that superimposes virtual objects onto the physical world (Milgram et al. 1995), can enable interaction between the virtual context and the real scene, and hence can potentially be used to augment the aerial videos captured by UAVs with the virtual BIM model. In the authors' previous research (Liu et al. 2019), a novel AR framework was proposed to integrate BIM and UAV to assist the inspection of water diversion projects. In the framework, the real-life scene captured by the UAV-borne camera is augmented by a simultaneously navigated BIM. With such virtual-real interaction, the AR framework provides a visualized and integrative environment to support decision-making for the safety inspection of water pipelines.
Enlightened by our previous research, this study intends to develop an AR solution for building inspection by adopting the concept of connective animation of BIM and aerial video. Due to differences in the structures of interest regarding scale, area, height, etc., and the different inspection requirements, further development of the original method is required in the following aspects. First, the original coordinate transformation workflow was too complicated and tedious to implement; hence, it is important to investigate whether the original workflow can be simplified at the scale of a building. Second, since a building has a much smaller scale and presents a much more complicated appearance than a water diversion project, it is unclear whether the matching between the animated BIM and the aerial video in the building inspection scenario can still achieve performance comparable to that observed in the previous study. Third, building inspection has different focuses from the inspection of water diversion projects, and thus the last challenge rests upon how the original framework can be customized to develop an AR system that provides functionalities compliant with the specific demands of building inspection.
Addressing the above three concerns forms our primary contributions. First, a fast coordinate transformation algorithm (F-Trans) has been proposed to simplify the procedure of converting the WGS-84 coordinates collected by UAVs to the BIM project coordinates used by the 3D rendering engine. The simplified algorithm is straightforward to deploy in different building projects. Second, the feasibility of the virtual-real interaction algorithm originally proposed for water pipeline inspection (Liu et al. 2019) has been experimentally verified in simultaneously matching a BIM model of the inspected building with an aerial video, which extends the algorithm to the area of building inspection. Third, an AR system prototype is developed to demonstrate the specific ways in which the integration of UAV and BIM can support condition assessment and decision-making for building inspection. By combining the mobility of UAVs with the enormous, easily accessible information provided by BIM, the developed AR system has great potential to improve building inspection efficiency and promote the understanding of detected distress.
REVIEW OF RELATED TECHNOLOGIES IN BUILDING INSPECTION

Unmanned aerial vehicle (UAV)

UAVs are effective inspection platforms due to their high mobility and wide coverage for video recording or image capturing. They have been widely applied in the inspection of bridges (Khan et al. 2015; Duque et al. 2018b; Seo et al. 2019), buildings (Roca et al. 2013; Morgenthal and Hallermann 2014; O'Riordan-Adjah and MacKenzie 2019), water diversion projects (Liu et al. 2019), dams (Henriques and Roque 2015), chimneys (Hallermann and Morgenthal 2013), photovoltaic panels (Aghaei et al. 2015), and high mast luminaires (HML) (Otero 2015). Image processing techniques and artificial intelligence are being used to detect cracks and damage from the visual assets collected by UAVs. For instance, Choi and Kim (2015) developed a UAV-based image collection system for building exterior crack detection. Gopalakrishnan et al. (2018) integrated UAV inspection with a pre-trained deep learning model to detect cracks from the collected aerial images. Morgenthal and Hallermann (2014) outlined a principal approach to assessing the detection quality of UAVs based on automated damage recognition using computer vision methods. Kang and Cha (2018) integrated ultrasonic beacons, a deep convolutional neural network (CNN), and a geotagging method into a novel UAV inspection system, which can effectively detect and localize structural defects in environments where the GPS signal is occluded. Despite the progress made in recent years, existing studies fail to provide effective solutions to augment the aerial video with abundant and intuitive information to support condition assessment and decision-making. In fact, while automated defect detection is important, it is equally essential to provide relevant information, such as geometric dimensions and material composition, to empower the decision-making of engineers and professionals when the videos are reviewed for further evaluation of the detected issues. As a 3D object-based information system of the built environment, BIM serves as a potential source from which readily available information of the structure of interest can be retrieved.
Building information modeling (BIM)

BIM contains multi-source information of a building project throughout its life cycle, encompassing the phases of planning, design, construction, operation and maintenance, and even demolition (Azhar 2011). While it was mainly used in the design and construction phases in the past, BIM is gradually being adopted nowadays for the operation and maintenance of building projects (Parsanezhad 2014). In (Chen et al. 2020), BIM was used for the inspection and maintenance of fire safety equipment; visual inspection and a pictorial survey were combined with BIM to assess the water ponding defect on a flat roof in (Ani et al. 2015); BIM was used as a knowledge repository to store historical inspection records for the improvement of the inspection-repair process by (Zhan et al. 2019); McGuire et al. (2016) used BIM to link and analyze data related to the inspection, evaluation, and management of bridges for tracking and assessing the structural condition; Shim et al. (2017) developed a BIM-based bridge maintenance system for cable-stayed bridges; Hsiao and Lin (2018) developed a web-based BIM model inspection and modification management system. Existing studies demonstrate the potential of using the life-cycle information from BIM to augment UAV-based visual inspection.
Augmented reality (AR)

AR is a promising technique to improve the efficiency of visual inspection and to support decision-making. By superimposing related information from the virtual context onto the real scene, inspectors can gain comprehensive situational awareness of the environment. He et al. (2007) investigated the application of AR in visualizing underground utilities in real-life contexts. Kopsida and Brilakis (2016) proposed a markerless mobile-based AR framework for building inspection in interior environments. Li et al. (2015; 2016) and Yuan et al. (2018) used ground penetrating radars to map and characterize buried pipelines, which lays the foundation for visualizing invisible underground utilities in AR. Li et al. (2018) developed a mobile-based AR system for underground pipeline inspection and management. Kumar et al. (2019) proposed a sensor-fusion-based AR system for pipeline inspection and retrofitting. Dang and Shim (2020) proposed a BIM-based innovative bridge maintenance system and used AR devices to conduct automated inspection.

In the literature (Milgram et al. 1995), AR is defined as a technique that "augments natural feedback to the operator with simulated cues". In this sense, systems, methods, and technologies that facilitate human understanding of the physical world through interaction with a virtual context can be considered AR. Existing works mainly focus on overlaying virtual information onto the real scene captured by handheld devices (e.g. smartphones) or wearable equipment (e.g. AR helmets); thus, it is uncertain whether they are suitable for augmenting the aerial video with virtual content retrieved from BIM. In addition, it is necessary to develop an AR system customized to the specific requirements of UAV-based building inspection. Liu et al. (2019) presented a dynamic BIM-augmented UAV inspection method for large-scale infrastructure, which successfully addressed the information augmentation issue by simultaneously navigating the virtual BIM model and the real-life inspection video. However, buildings are smaller in scale than such infrastructure (Liu et al. 2018), which means the UAV flies closer to the building and the inspected target presents a more complex geometric outline in the video frame. Hence it remains unclear whether the method proposed for infrastructure is suitable for building inspection. In addition, the coordinate transformation procedure in the original method is too complicated to perform without the involvement of domain experts.
PIPELINE OF UAV-BIM CONNECTIVE ANIMATION

Fig. 1 illustrates the overall pipeline of the proposed UAV-BIM connective animation algorithm. A UAV is used to perform building inspection. During the inspection, an aerial video and the flight status of the UAV (position, posture, field of view, etc.) are recorded. When the aerial inspection video is played back and reviewed afterwards, a BIM model of the building of interest is driven to navigate simultaneously with the aerial video by utilizing the corresponding flight status data. The specific procedure is as follows.

(1) Coordinate transformation

Since the position coordinates collected by the UAV and the BIM model use different coordinate systems, the WGS-84 coordinates collected by the UAV need to be converted to BIM project coordinates before they can be used to guide the BIM model navigation.

(2) Camera parameter matching

The rendering of the BIM scenario is determined by the parameters of a virtual camera in the 3D engine. The specific forms of the virtual camera's parameters differ from the flight status data collected by the UAV. For instance, the yaw, pitch, and roll angles are used to describe the posture of the UAV camera, while a camera-up vector and a line-of-sight vector are used for the virtual camera. Therefore, the collected data describing the UAV location, posture, and field of view need to be matched to the parameters used by the virtual camera.

(3) Connective animation

The flight status data after coordinate transformation and parameter matching is retrieved to drive the BIM model to navigate continuously and smoothly along with the aerial video.
F-Trans: Converting WGS-84 to BIM project coordinates

Unlike a water diversion project that usually spans a long distance and a large area (Liu et al. 2018, 2019), the floor space of a building is usually quite limited. For example, the New Century Global Center in Chengdu, one of the single buildings with the largest floor area (Wikipedia 2020), "only" covers an area of 500×400 square meters, which can be seen as a single spot compared with water diversion projects that cover thousands of square miles. Due to this characteristic, the selection of the reference ellipsoid will not have a significant influence on the results of coordinate transformation in a building project. Hence, it is feasible to simplify the transformation process by omitting the 3D transformation in the original method proposed by Chen et al. (2019).

A fast coordinate transformation algorithm (F-Trans) is proposed to convert the WGS-84 coordinates collected by the UAV to BIM project coordinates. As shown in Fig. 2, F-Trans includes two steps, i.e., coordinate projection and plane transformation. In the first step, the geographic coordinates (typically denoted by longitude lon and latitude lat) in the WGS-84 system are projected to a plane coordinate system, of which the coordinates are usually denoted by x and y. With determined ellipsoidal parameters, projection principle, and central meridian, this projection operation can be easily conducted by GIS (geographic information system) software.

The second step transforms the plane coordinates in the WGS-84 system to BIM project coordinates. This step can be geometrically represented by a transformation between two different plane Cartesian coordinate systems, as shown on the right-hand side of Fig. 2. The BIM project coordinate system (O' as origin, and X_BIM and Y_BIM as two axes) is the result of translating and rotating the original coordinate system (O as origin, and X and Y as two axes), respectively by a vector (ΔX, ΔY) and an angle θ. As a result, the coordinates in the BIM project coordinate system can be calculated by Eq. (1).
$$
(x_{bim},\, y_{bim})^T = \begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix} (x,\, y)^T + (\Delta X,\, \Delta Y)^T \tag{1}
$$

where (x, y) and (x_bim, y_bim) are, respectively, the plane coordinates in the WGS-84 system and the BIM project coordinates. The translation vector (ΔX, ΔY) and the rotation angle θ vary according to the specific application scenario. Hence, these three unknown parameters need to be calculated based on at least two reference points whose coordinates are known in both the BIM project system and the original system. In the experiment validation section, the calculation process is elaborated through a real-world case.
The height of a point in the BIM project system (h_bim) can be calculated by subtracting a value ΔH from the elevation value (h) collected by the UAV, as illustrated by Eq. (2).

$$
h_{bim} = h - \Delta H \tag{2}
$$

where the value ΔH is calculated based on at least one reference point.
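As a concrete illustration, the plane transformation of Eqs. (1)-(2) can be sketched in a few lines of JavaScript. The function names are ours, not part of the system's code, and the parameter-solving step assumes exactly two reference points, as in the minimal case described above.

```javascript
// Estimate theta, dX, dY (Eq. 1) from two reference points known in both
// the projected WGS-84 system (x, y) and the BIM project system (xBim, yBim).
function solveParams(p1, p2) {
  const dx = p2.x - p1.x, dy = p2.y - p1.y;                 // WGS-84 deltas
  const dXb = p2.xBim - p1.xBim, dYb = p2.yBim - p1.yBim;   // BIM deltas
  const d2 = dx * dx + dy * dy;
  const c = (dx * dXb + dy * dYb) / d2;                     // cos(theta)
  const s = (dy * dXb - dx * dYb) / d2;                     // sin(theta)
  const theta = Math.atan2(s, c);
  const dX = p1.xBim - (p1.x * c + p1.y * s);
  const dY = p1.yBim - (-p1.x * s + p1.y * c);
  return { theta, dX, dY };
}

// Apply Eq. (1): rotate by theta, then translate by (dX, dY).
function toBim(x, y, { theta, dX, dY }) {
  const c = Math.cos(theta), s = Math.sin(theta);
  return { xBim: x * c + y * s + dX, yBim: -x * s + y * c + dY };
}

// Eq. (2): heights only need a constant offset dH.
const toBimHeight = (h, dH) => h - dH;
```

Because the transform is rigid, two reference points over-determine the three unknowns; in practice the redundancy can be used to sanity-check the reference coordinates.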
Virtual-real matching algorithms

As shown in Fig. 3, a virtual camera in a BIM rendering engine uses two groups of parameters: physical parameters and optical parameters. The physical parameters describe the location and the posture of the camera, which include a position vector p_eye, a camera-up vector v_up, and a line-of-sight vector v_at. The optical parameters define the projection system and the scope of rendering. A perspective projection system is used, and it includes four parameters, i.e., fovV, aspect, near, and far.
Using the recorded data about the position and posture of the UAV, the physical parameters of the virtual camera can be calculated by Eq. (3) ~ Eq. (5).

$$
p_{eye} = (x_{bim},\, y_{bim},\, h_{bim})^T \tag{3}
$$

$$
v_{up} = \begin{bmatrix} -\sin(\alpha - \frac{\pi}{2})\sin\gamma - \cos(\alpha - \frac{\pi}{2})\sin\beta\cos\gamma \\ \cos(\alpha - \frac{\pi}{2})\sin\gamma - \sin(\alpha - \frac{\pi}{2})\sin\beta\cos\gamma \\ \cos\beta\cos\gamma \end{bmatrix} \tag{4}
$$

$$
v_{at} = \left(\cos\beta\cos(\alpha - \tfrac{\pi}{2}),\ \cos\beta\sin(\alpha - \tfrac{\pi}{2}),\ \sin\beta\right)^T \tag{5}
$$

where (x_bim, y_bim, h_bim) are the position coordinates of the UAV after transformation; α, β, and γ describe the posture of the UAV-borne camera and are, respectively, the yaw, pitch, and roll angles.

The optical parameters of the virtual camera are calculated by Eq. (6).

$$
\begin{bmatrix} aspect \\ fov_V \\ near \\ far \end{bmatrix} = \begin{bmatrix} w_R / h_R \\ fov_R \\ \min \\ +\infty \end{bmatrix} \tag{6}
$$

where w_R and h_R are, respectively, the width and the height of the imaging plane of the real camera; fov_R is the field of view of the real camera; min is a minimal constant, and +∞ represents infinity.
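To make the matching step concrete, the sketch below implements Eqs. (3)-(6) as given above (yaw α, pitch β, roll γ in radians). The function names are illustrative rather than taken from the system's code; a useful sanity check is that the resulting v_up and v_at are unit vectors and mutually orthogonal, as any valid camera frame requires.

```javascript
// Physical parameters of the virtual camera (Eq. 3-5) from UAV pose data.
function matchPhysicalParams(xBim, yBim, hBim, a, b, g) {
  const A = a - Math.PI / 2;                             // shifted yaw
  const pEye = [xBim, yBim, hBim];                       // Eq. (3)
  const vUp = [                                          // Eq. (4)
    -Math.sin(A) * Math.sin(g) - Math.cos(A) * Math.sin(b) * Math.cos(g),
     Math.cos(A) * Math.sin(g) - Math.sin(A) * Math.sin(b) * Math.cos(g),
     Math.cos(b) * Math.cos(g),
  ];
  const vAt = [                                          // Eq. (5)
    Math.cos(b) * Math.cos(A),
    Math.cos(b) * Math.sin(A),
    Math.sin(b),
  ];
  return { pEye, vUp, vAt };
}

// Optical parameters (Eq. 6): copied from the real camera; "near" stands in
// for the minimal constant, "Infinity" for the unbounded far plane.
function matchOpticalParams(wR, hR, fovR) {
  return { aspect: wR / hR, fovV: fovR, near: 1e-3, far: Infinity };
}
```

In a Three.js-style engine, pEye, vAt, and vUp would map onto the camera position, look-at target direction, and up vector, while the optical parameters configure the perspective projection.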
Connective animation

Fig. 4 illustrates an algorithm flowchart to drive the connective animation between the aerial video and the BIM model. Once the process starts, a listening event is activated to detect whether the aerial video is being played. If it is, the time interval Δt is set to 2 s; otherwise, Δt is set to 0. Next, the global sampling time t of the current frame of the video is calculated by adding the video start-recording time t0 to the current playing progress tc. The flight status data, i.e., latitude, longitude, height, yaw, pitch, and roll, at time t + Δt is then retrieved, which is denoted by d_uav^(t+Δt) = (lat, lon, h, α, β, γ). Note that optical parameters such as fovV and aspect, which are usually fixed values for a certain kind of UAV, have already been pre-programmed into the system. Next, F-Trans and the matching algorithm are executed to obtain the target camera status S_(t+Δt) = (p_eye, v_up, v_at). Then, Tween.js is used to realize the smooth transition of the BIM camera from the current state to the target state S_(t+Δt). When the Tween object is instantiated, the transition duration should be set equal to Δt. At last, once the transition process is finished, the program automatically executes the next cycle of the aforementioned process unless a 'stop' event is activated. With this method, the connective animation between the aerial video and the BIM model is realized.
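The loop above can be paraphrased in JavaScript. The real system delegates the transition to Tween.js; here a plain linear interpolation stands in for the tween so the sketch stays self-contained, and all names are illustrative.

```javascript
// One cycle of the connective-animation loop (simplified sketch).
// flightLog(t) must return the flight record for global time t, and
// toCameraState stands for the F-Trans + virtual-real matching pipeline.
function nextTarget(flightLog, t0, tc, dt, toCameraState) {
  const t = t0 + tc;                        // global sampling time of the frame
  return toCameraState(flightLog(t + dt));  // target state S_(t+dt)
}

// Stand-in for the Tween.js transition: interpolate the camera state with
// u in [0, 1]. A real implementation would re-normalize vUp/vAt after mixing.
function lerpState(from, to, u) {
  const mix = (a, b) => a.map((v, i) => v + (b[i] - v) * u);
  return { pEye: mix(from.pEye, to.pEye),
           vUp:  mix(from.vUp,  to.vUp),
           vAt:  mix(from.vAt,  to.vAt) };
}
```

With Tween.js, lerpState would be replaced by a Tween whose duration equals Δt and whose completion callback schedules the next cycle, as the flowchart describes.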
AR SYSTEM PROTOTYPE FOR BUILDING INSPECTION

Based on the aforementioned algorithms, an AR system prototype has been developed to support UAV-based building inspection.

System architecture

Fig. 5 shows the architecture of the system prototype. The AR system prototype adopts the B/S (browser/server) mode, which comprises a database server and client-end web browsers. The database server stores inspection-related data to support the AR inspection on the web browser. Through the web browser, inspectors can log in to the system to view the aerial inspection video and implement the AR inspection.

System functionalities

The functionalities of the system prototype can be divided into four modules, i.e., the data uploading module, AR inspection module, query module, and system setting module.

(1) Data uploading. Via this module, users can upload the inspection-related data onto the database server, including inspection specifications, the BIM model of the building to be inspected, UAV inspection videos, and flight status data.

(2) AR inspection. Via this module, users can view the aerial inspection video with the corresponding BIM model being simultaneously navigated, and check properties of the inspected building from BIM, e.g. object geometry, material properties, and component locations. Users can also assess structural conditions, make comprehensive decisions, and generate inspection reports with the support of the information retrieved from BIM.

(3) Query. Users can check the historical inspection reports in this module.

(4) System setting. With this module, users can specify basic system parameters such as the field of view of the UAV camera and coordinate transformation parameters.
System development

Fig. 6 illustrates the technical details of the development of the system prototype. The database server is programmed with PHP to realize the functionality of data storage and retrieval. The page on the client-end web browser is developed with HTML, JavaScript, Ajax, and JSON. The data communication between the database server and the web browser (e.g. inspection specifications and reports, flight records, and algorithm parameters) is realized by Ajax. The BIM model of the building of interest has been created in advance. By calling the Autodesk Forge cloud-end Model Derivative API, the original BIM model is compressed and converted into a light-weight format so that it can be displayed on web browsers. The Autodesk Forge Viewer API, which is an encapsulation of Three.js, is called to render and display the lightweight BIM model on the web browser. The Aliplayer API is used to upload and play aerial inspection videos through the web browser. Tween.js is utilized to drive the connective animation between the aerial video and BIM. In this procedure, the flight status data is retrieved from the database and processed by Ftma.js to conduct the aforementioned F-Trans and virtual-real matching.
EXPERIMENT VALIDATION

Experiments have been performed to validate the proposed solution for building inspection. The building of interest is the student union at the University of Tennessee, Knoxville, which has a floor space of 110×131 m², as shown in Fig. 7. A DJI Phantom 4 Pro was used for inspection, whose onboard camera has a field of view of 84°, an equivalent focal length of 24 mm, and a video resolution of 1280×720. In this section, the precision of F-Trans and the virtual-real matching accuracy are first evaluated; then the application of the developed AR system is introduced; the application scope of F-Trans and factors that affect the virtual-real matching are discussed in the last part.
Coordinate transformation

To evaluate the precision of F-Trans, eight reference points were selected on the perimeter of the student union (as shown in Fig. 8). The WGS-84 coordinates of these reference points were obtained via Google Map, while their BIM project coordinates were retrieved from Autodesk Forge Viewer. Points #1 and #2 form a calculation set (as listed in Table 1), which is used to calculate the transformation parameters in Eq. (1), and the other points form a test set.

By inputting the coordinates of points #1 and #2 into Eq. (1), an equation set was obtained, as shown by Eq. (7). Its solution can be easily calculated as θ = 117.77°, ΔX = −3164643.093, and ΔY = 2544865.713.
$$
\begin{cases}
x_{bim,1} = 777193.47\cos\theta + 3985020.21\sin\theta + \Delta X \\
y_{bim,1} = -777193.47\sin\theta + 3985020.21\cos\theta + \Delta Y \\
x_{bim,2} = 777176.80\cos\theta + 3985013.24\sin\theta + \Delta X \\
y_{bim,2} = -777176.80\sin\theta + 3985013.24\cos\theta + \Delta Y
\end{cases} \tag{7}
$$

where (x_bim,i, y_bim,i) are the BIM project coordinates of reference point #i, as listed in Table 1.
Using the obtained transformation parameters, the transformation results of the test set are listed in Table 2. The average errors for the X and Y components are −0.20 m and 0.40 m, respectively, which meets the requirement for the subsequent UAV-BIM connective animation.
Evaluation of virtual-real matching accuracy

The drone, a DJI Phantom 4 Pro, was operated to fly along the perimeter of the building of interest to record an aerial video. During the process, the drone was programmed to automatically record its flight status in a .txt file. After the flight, the aerial video and the flight status data can be accessed with a micro SD card and uploaded onto the AR system, with which the connective animation of BIM and the aerial video was realized by the proposed method. A video demo can be found at https://youtu.be/ogULi6IyLvw.
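The internal layout of the flight-status file is not detailed here. Assuming a simple comma-separated layout (time, latitude, longitude, height, yaw, pitch, roll per line — a hypothetical format for illustration, not DJI's actual record format), a minimal loader could look like:

```javascript
// Hypothetical flight-log parser; the column layout is an assumption.
function parseFlightLog(txt) {
  return txt.trim().split('\n').map(line => {
    const [t, lat, lon, h, yaw, pitch, roll] = line.split(',').map(Number);
    return { t, lat, lon, h, yaw, pitch, roll };
  });
}

// Return the record whose timestamp is closest to the sampling time t,
// as needed when looking up the flight status at t + dt.
function statusAt(records, t) {
  return records.reduce((best, r) =>
    Math.abs(r.t - t) < Math.abs(best.t - t) ? r : best);
}
```

Nearest-timestamp lookup is the simplest policy; interpolating between the two bracketing records would give smoother camera targets at the cost of a little extra code.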
Fig. 9 shows 24 pairs of screenshots of the simultaneously navigating BIM and aerial video at 5-second intervals. In each image pair, the playing progress of the video (the time) is marked on the top left; the first row is the BIM model, and the second row is the image frame of the video. To quantitatively evaluate the matching accuracy between the aerial video and the BIM model, an index called Intersection over Union (IoU) was adopted to measure the level of alignment between the real building in the video frames and the virtual building in the BIM model (Chen et al. 2019). The extracted structure of interest (SOI) from the aerial video frames is denoted by S_uav, while the SOI from the BIM images is denoted by S_bim. The IoU is defined as the ratio of the area of S_uav ∩ S_bim to the area of S_uav ∪ S_bim, as shown in Eq. (8).

$$
IoU = \frac{A(S_{uav} \cap S_{bim})}{A(S_{uav} \cup S_{bim})} \tag{8}
$$

where A(x) is the area of region x, which can be reflected by the quantity of pixels in the region.
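On binary masks, Eq. (8) reduces to counting pixels. A sketch (our own helper, not the paper's implementation), where both masks are flat arrays of equal length and a truthy pixel means "belongs to the structure of interest":

```javascript
// IoU (Eq. 8) over two equal-length binary masks.
function iou(maskUav, maskBim) {
  let inter = 0, union = 0;
  for (let i = 0; i < maskUav.length; i++) {
    const a = maskUav[i] ? 1 : 0, b = maskBim[i] ? 1 : 0;
    inter += a & b;   // pixel in both SOIs
    union += a | b;   // pixel in at least one SOI
  }
  return union === 0 ? 0 : inter / union;
}
```

A perfect overlap yields 1, disjoint regions yield 0, and the 50% threshold cited below corresponds to the intersection covering at least half of the union.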
Fig. 10 visualizes the level of alignment by overlaying the building of interest extracted from the BIM images (the transparent red area) onto that extracted from the aerial images (the ground truth), with the IoU values annotated in the bottom-right corner. According to (Ferguson et al. 2019), an IoU value larger than 50% is regarded as a satisfactory result. The average IoU value in this study is 74.10%, indicating that the BIM model and the recorded aerial video have been matched well.
Application of the AR system prototype

Using the developed system prototype, users should upload and specify the BIM model, inspection specifications, and relevant algorithm parameters in advance, since these are basic data that usually remain unchanged for a certain building project and a certain type of UAV. Each time the UAV finishes onsite data collection, inspectors should upload the flight status files and aerial videos into the system before implementing the AR building inspection.
Fig. 11-13 present three screenshots of the developed AR system prototype. As shown in Fig. 11, inspectors can refer to the inspection specifications during the review of the aerial video, which is augmented by a simultaneously navigating BIM model. Once a safety issue (e.g. a crack, spalling, water accumulation, or a window crack) is detected from the video, the inspector can intervene to pause the navigation and retrieve useful information from the BIM model, e.g. check the properties of certain elements or conduct distance measurements. Based on the retrieved information, mechanical calculations of the structure can be performed to assess the safety condition of the building.
In addition, when a potential safety issue is detected, the inspectors can also use the issue screenshot
function (Fig. 12) to capture a screenshot of the displayed video and BIM model to record the detected issue.
Alongside the captured screenshot, the inspectors are asked to fill in and submit related information such as the
issue category, details, and location. The detected issues are gathered and displayed in the inspection reports area
(Fig. 13). On this page, the inspector can make a condition assessment and give solution instructions for the detected
safety issues, based on the visual information from the aerial video and the relevant information retrieved from BIM.
The inspection report of the current inspection can be submitted to the system. Historical inspection reports can be
queried from the query module to support a more comprehensive evaluation of the building's current condition.

The above case study indicates that the developed AR system prototype can effectively improve inspection
efficiency, help inspectors better understand the detected safety issues, and has the potential to lead to a more
comprehensive and unbiased condition assessment based on UAV-enabled visual inspection.
Discussion

Application scope of F-Trans

As mentioned before, the simplification adopted by F-Trans is feasible because of the relatively small scale of
building projects. However, the specific range within which F-Trans is applicable remains unclear; hence, the
application scope of F-Trans is discussed in this section. A long-distance water diversion project (Fig. 14) in
northwestern China was employed to investigate the building scale within which the transformation error of F-Trans
is acceptable. Fifteen reference points were selected from the project (Fig. 15). The WGS-84 plane coordinates and
the BIM project coordinates of these reference points are listed in Table 3. The distances between point #1 and
the other reference points, which range from 1.73 km to 20.76 km, are also listed in Table 3.
Then, point #1 and one of the other points form a calculation set, which is used to calculate the transformation
parameters in Eq. (1). The points within the distance range of this calculation set form a test set. Fig. 16 shows the
average absolute errors within different ranges for the 13 calculation sets. As shown in the figure, the
error between Xbim (Ybim) and Xbim' (Ybim') increases with the distance between the reference points
(i.e., the project scale), and the error between Ybim and Ybim' is nearly 2-3 times larger than the error between Xbim
and Xbim'. The precision of F-Trans remains at a sub-meter level within a range of 7.56 km, so it is
recommended to use F-Trans when the scale of a project is within 7 km.
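Eq. (1) is not reproduced in this section. Assuming the plane transformation step of F-Trans is a 2D similarity transformation (rotation, scale, and translation) whose parameters are solved from a two-point calculation set, the procedure can be sketched as follows; the function names and the similarity-transform assumption are illustrative, not the paper's exact formulation:

```python
import math

def solve_plane_transform(p1_wgs, p1_bim, p2_wgs, p2_bim):
    """Solve a 2D similarity transform (scale, rotation, translation)
    from two reference point pairs: WGS-84 plane -> BIM project coordinates."""
    dxw, dyw = p2_wgs[0] - p1_wgs[0], p2_wgs[1] - p1_wgs[1]
    dxb, dyb = p2_bim[0] - p1_bim[0], p2_bim[1] - p1_bim[1]
    scale = math.hypot(dxb, dyb) / math.hypot(dxw, dyw)
    theta = math.atan2(dyb, dxb) - math.atan2(dyw, dxw)
    # Choose the translation so that point #1 maps exactly onto its BIM coordinates.
    c, s = math.cos(theta), math.sin(theta)
    tx = p1_bim[0] - scale * (c * p1_wgs[0] - s * p1_wgs[1])
    ty = p1_bim[1] - scale * (s * p1_wgs[0] + c * p1_wgs[1])
    return scale, theta, tx, ty

def to_bim(p_wgs, scale, theta, tx, ty):
    """Apply the solved transform to a WGS-84 plane coordinate pair."""
    c, s = math.cos(theta), math.sin(theta)
    x = scale * (c * p_wgs[0] - s * p_wgs[1]) + tx
    y = scale * (s * p_wgs[0] + c * p_wgs[1]) + ty
    return x, y
```

Under this formulation both reference points of the calculation set map exactly, and the transformation error grows for test points farther from the calculation set, consistent with the trend in Fig. 16.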
Influencing factors of the UAV-BIM matching results

Statistics of the UAV-BIM matching results show that the average IoU is 74.10%, with a maximum
of 84.08% at 100 s and a minimum of 52.13% at 75 s. The minimum occurred at the moment
when the camera view angle changed around the building corner. This might be caused by the manual adjustment
of the camera attitude, during which the angular speed of the camera changed erratically. Additionally, the
roaming time from the previous BIM camera state to the next is fixed (e.g., 2000 ms), and the roaming action is
continuous, during which small changes caused by manual operation are not reflected. These two
factors, i.e., the sudden adjustment of the camera state and the time interval for roaming, resulted in the low IoU values.
A smaller roaming time interval between two adjacent states of the BIM camera has the potential to solve this issue,
but the efficiency of computation and data transmission should also be considered.
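The roaming behavior discussed above can be illustrated by linearly interpolating the BIM camera pose between two sampled states: a shorter interval tracks manual camera adjustments more closely but requires more frequent updates. The state layout and names below are illustrative only, not the prototype's actual implementation:

```python
def lerp(a, b, t):
    """Linear interpolation between scalars a and b for t in [0, 1]."""
    return a + (b - a) * t

def interpolate_camera(state0, state1, dt_ms, interval_ms=2000):
    """Interpolate a BIM camera state dt_ms milliseconds into a roaming
    segment of length interval_ms (e.g., the fixed 2000 ms interval).

    Each state is a dict with 'eye' (camera position) and 'target'
    (look-at point) as 3D tuples; this layout is an assumption.
    """
    t = min(max(dt_ms / interval_ms, 0.0), 1.0)  # clamp progress to [0, 1]
    return {
        key: tuple(lerp(a, b, t) for a, b in zip(state0[key], state1[key]))
        for key in ("eye", "target")
    }
```

Halving `interval_ms` doubles how often a new target state is fetched, which is where the computation and data-transmission cost mentioned above comes in.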
CONCLUSIONS

To expedite the visual inspection of buildings, this paper presented an AR solution based on the integration of BIM
and UAV. This integration allows seamless information retrieval from BIM to augment the aerial video
captured by an inspection UAV. The overall workflow of the proposed algorithm was illustrated, which includes
coordinate transformation by F-Trans, camera parameter matching, and connective animation between the aerial
video and BIM. Based on this method, an AR system prototype was developed to demonstrate
specific ways to support UAV-based visual inspection of buildings. Experiments were carried out for
validation. The F-Trans algorithm achieved a sub-meter transformation precision. The
matching accuracy between the simultaneously navigating aerial video and the BIM model was quantitatively
evaluated, with an average IoU of 74.10%. The effectiveness of the developed AR system prototype was also
demonstrated; it has great potential to enable a more efficient, comprehensive, and unbiased UAV-based
building inspection.
DATA AVAILABILITY STATEMENTS

Select data and code generated during the study are available from the corresponding author by request, i.e.,
the flight status data and aerial videos used in this study, and the code developed to implement the proposed algorithm.
Some models are confidential in nature and may only be provided with restrictions as defined within the
department's approval for data collection, e.g., the BIM models used in this study.
ACKNOWLEDGEMENT

This research was supported by the National Key Research and Development Program of China (No.
2017YFC0405105, No. 2018YFC0406903) and the Innovative Research Groups of the National Natural Science
Foundation of China (No. 51621092). The authors acknowledge the support of the China Scholarship Council (CSC)
and thank Dr. Bingye Han from Beijing University of Civil Engineering and Architecture for his technical support.
REFERENCES

Aghaei, M., Grimaccia, F., Gonano, C. A., and Leva, S. (2015). "Innovative Automated Control System for PV Fields Inspection and Remote Control." IEEE Transactions on Industrial Electronics, 62(11), 7287-7296. https://doi.org/10.1109/TIE.2015.2475235.
Ani, A. I. C., Johar, S., Tawil, N. M., Razak, M. Z. A., and Hamzah, N. (2015). "Building information modeling (BIM)-based building condition assessment: A survey of water ponding defect on a flat roof." Jurnal Teknologi, 75(9), 25-31. https://doi.org/10.11113/jt.v75.5222.
Azhar, S. (2011). "Building Information Modeling (BIM): Trends, Benefits, Risks, and Challenges for the AEC Industry." Leadership and Management in Engineering, 11(3), 241-252. https://doi.org/10.1061/(ASCE)LM.1943-5630.0000127.
Chen, J., Liu, D., Li, S., and Hu, D. (2019). "Registering georeferenced photos to a building information model to extract structures of interest." Advanced Engineering Informatics, 42, 100937. https://doi.org/10.1016/j.aei.2019.100937.
Chen, Y. J., Lai, Y. S., and Lin, Y. H. (2020). "BIM-based augmented reality inspection and maintenance of fire safety equipment." Automation in Construction, 110, 103041. https://doi.org/10.1016/j.autcon.2019.103041.
Choi, S. S., and Kim, E. K. (2015). "Building crack inspection using small UAV." In Proc., Int. Conf. on Advanced Communication Technology, 235-238. PyeongChang, South Korea: IEEE. https://doi.org/10.1109/ICACT.2015.7224792.
Dang, N. S., and Shim, C. S. (2020). "BIM-based innovative bridge maintenance system using augmented reality technology." Lecture Notes in Civil Engineering, 54, 1217-1222. https://doi.org/10.1007/978-981-15-0802-8_195.
Duque, L., Seo, J., and Wacker, J. (2018a). "Synthesis of Unmanned Aerial Vehicle Applications for Infrastructures." Journal of Performance of Constructed Facilities, 32(4), 1-10. https://doi.org/10.1061/(ASCE)CF.1943-5509.0001185.
Duque, L., Seo, J., and Wacker, J. (2018b). "Timber Bridge Inspection Using UAV." In Proc., 2018 ASCE Structures Congress, 186-196. Reston, VA: ASCE. https://doi.org/10.1061/9780784481332.017.
Ferguson, M., Jeong, S., and Law, K. H. (2019). "Worksite Object Characterization for Automatically Updating Building Information Models." In Proc., Computing in Civil Engineering 2019: Visualization, Information Modeling, and Simulation, 303-311. Atlanta, GA: ASCE. https://doi.org/10.1061/9780784482421.039.
Gopalakrishnan, K., Gholami, H., Vidyadharan, A., and Agrawal, A. (2018). "Crack Damage Detection in Unmanned Aerial Vehicle Images of Civil Infrastructure Using Pre-Trained Deep Learning Model." International Journal for Traffic and Transport Engineering, 8(1), 1-14. https://doi.org/10.7708/ijtte.2018.8(1).01.
Hallermann, N., and Morgenthal, G. (2013). "Unmanned aerial vehicles (UAV) for the assessment of existing structures." In Proc., Int. Association for Bridge and Structural Engineering, 1-8. Zurich, Switzerland: International Association for Bridge and Structural Engineering. https://doi.org/10.2749/222137813808627172.
He, Z., Liu, Y., Chang, Y., and Hu, S. (2007). "Designing and implementing outdoor augmented reality system based on ARToolKit." In Vol. 6754 of Proc., Geoinformatics 2007: Geospatial Information Technology and Applications, 675425. Washington, DC: SPIE. https://doi.org/10.1117/12.764952.
Henriques, M. J., and Roque, D. (2015). "Unmanned Aerial Vehicles (UAV) As A Support To Visual Inspections of Concrete Dams." In Proc., 2nd International Dam World Conf., 1-12. Lisbon, Portugal: Laboratório Nacional de Engenharia Civil.
Hsiao, P. T., and Lin, Y. C. (2018). "Development of web-based BIM models inspection and modification management system for supervisions." In Proc., 35th International Symposium on Automation and Robotics in Construction. Edinburgh, UK: IAARC Publications. https://doi.org/10.22260/isarc2018/0023.
Kang, D., and Cha, Y. J. (2018). "Autonomous UAVs for Structural Health Monitoring Using Deep Learning and an Ultrasonic Beacon System with Geo-Tagging." Computer-Aided Civil and Infrastructure Engineering, 33(10), 885-902. https://doi.org/10.1111/mice.12375.
Khan, F., Ellenberg, A., Mazzotti, M., Kontsos, A., Moon, F., Pradhan, A., and Bartoli, I. (2015). "Investigation on bridge assessment using unmanned aerial systems." In Proc., 2015 ASCE Structures Congress, 404-413. Portland, OR: ASCE. https://doi.org/10.1061/9780784479117.035.
Kopsida, M., and Brilakis, I. (2016). "Markerless BIM Registration for Mobile Augmented Reality Based Inspection." In Proc., Int. Conf. on Smart Infrastructure and Construction, 1631-1636. Cambridge, UK: CSIC.
Kumar, G. A., Patil, A. K., Kang, T. W., and Chai, Y. H. (2019). "Sensor fusion based pipeline inspection for the augmented reality system." Symmetry, 11(10), 1325. https://doi.org/10.3390/sym11101325.
Li, S., Cai, H., Abraham, D. M., and Mao, P. (2016). "Estimating Features of Underground Utilities: Hybrid GPR/GPS Approach." Journal of Computing in Civil Engineering, 30(1), 1-12. https://doi.org/10.1061/(ASCE)CP.1943-5487.0000443.
Li, S., Cai, H., and Kamat, V. R. (2015). "Uncertainty-aware geospatial system for mapping and visualizing underground utilities." Automation in Construction, 53, 105-119. https://doi.org/10.1016/j.autcon.2015.03.011.
Li, W., Han, Y., Liu, Y., Zhu, C., Ren, Y., Wang, Y., and Chen, G. (2018). "Real-time location-based rendering of urban underground pipelines." ISPRS International Journal of Geo-Information, 7(1), 32. https://doi.org/10.3390/ijgi7010032.
Liu, D., Chen, J., Hu, D., and Zhang, Z. (2019). "Dynamic BIM-augmented UAV safety inspection for water diversion project." Computers in Industry, 108, 163-177. https://doi.org/10.1016/j.compind.2019.03.004.
Liu, D., Chen, J., Li, S., and Cui, W. (2018). "An integrated visualization framework to support whole-process management of water pipeline safety." Automation in Construction, 89, 24-37. https://doi.org/10.1016/j.autcon.2018.01.010.
McGuire, B., Atadero, R., Clevenger, C., and Ozbek, M. (2016). "Bridge Information Modeling for Inspection and Evaluation." Journal of Bridge Engineering, 21(4), 1-9. https://doi.org/10.1061/(ASCE)BE.1943-5592.0000850.
Milgram, P., Takemura, H., Utsumi, A., and Kishino, F. (1995). "Augmented Reality: A class of displays on the reality-virtuality continuum." Telemanipulator and Telepresence Technologies, 282-292. https://doi.org/10.1117/12.197321.
Morgenthal, G., and Hallermann, N. (2014). "Quality assessment of Unmanned Aerial Vehicle (UAV) based visual inspection of structures." Advances in Structural Engineering, 17(3), 289-302. https://doi.org/10.1260/1369-4332.17.3.289.
O'Riordan-Adjah, C. A., and MacKenzie, A. W. (2019). "Investigating Unmanned Aerial Vehicles (UAVs) for Vertical Structural Inspection and Analysis." In Proc., Int. Conf. on Sustainable Infrastructure 2019, 193-204. Los Angeles, CA: ASCE. https://doi.org/10.1061/9780784482650.020.
Otero, L. D. (2015). Proof of Concept for using Unmanned Aerial Vehicles for High Mast Pole and Bridge Inspections. Florida Department of Transportation, Research Center.
Parsanezhad, P. (2014). "Effective Facility Management and Operations via a BIM-based Integrated Information System." In Proc., CIB Facilities Management Conf. 2014, 442-453. Copenhagen, Denmark: CIB. https://doi.org/10.13140/2.1.2298.9764.
Roca, D., Lagüela, S., Díaz-Vilariño, L., Armesto, J., and Arias, P. (2013). "Low-cost aerial unit for outdoor inspection of building façades." Automation in Construction, 36, 128-135. https://doi.org/10.1016/j.autcon.2013.08.020.
Seo, J., Duque, L., and Wacker, J. (2019). "Glued-Laminated Timber Arch Bridge Inspection Using UAV." In Proc., Computing in Civil Engineering 2019, 105-113. Atlanta, GA: ASCE. https://doi.org/10.1061/9780784482445.043.
Shim, C. S., Kang, H., Dang, N. S., and Lee, D. (2017). "Development of BIM-based bridge maintenance system for cable-stayed bridges." Smart Structures and Systems, 20(6), 697-708. https://doi.org/10.12989/sss.2017.20.6.697.
Wikipedia. (2020). "List of largest buildings." <https://en.m.wikipedia.org/wiki/List_of_largest_buildings> (Jul. 10, 2020).
Yuan, C., Li, S., Cai, H., and Kamat, V. R. (2018). "GPR Signature Detection and Decomposition for Mapping Buried Utilities with Complex Spatial Configuration." Journal of Computing in Civil Engineering, 32(4), 1-15. https://doi.org/10.1061/(ASCE)CP.1943-5487.0000764.
Zhan, J., Ge, X. J., Huang, S., Zhao, L., Wong, J. K. W., and He, S. X. J. (2019). "Improvement of the inspection-repair process with building information modelling and image classification." Facilities, 37(7/8), 395-414. https://doi.org/10.1108/F-01-2018-0005.
TABLE 1. Coordinates of reference points in the calculation set

No.   WGS-84 (Geographic coordinates)       WGS-84 (Plane coordinates)      BIM project coordinates
      Longitude (°)    Latitude (°)         X (m)          Y (m)            Xbim (m)       Ybim (m)
#1    -83.927833       35.955944            777193.473     3985020.21       -72.365555     30.072178
#2    -83.92802        35.955886            777176.7983    3985013.24       -70.777151     47.996734

TABLE 2. F-Trans algorithm precision analysis

No.   Ground truth                Calculated value              Error
      Xbim (m)     Ybim (m)       Xbim' (m)     Ybim' (m)       (Xbim'-Xbim) (m)   (Ybim'-Ybim) (m)
#3    1.999        45.588         1.906         46.342          -0.092             0.754
#4    -50.973      -20.770        -50.630       -20.694         0.343              0.076
#5    -82.600      78.264         -83.609       78.542          -1.009             0.279
#6    -38.417      77.159         -39.071       77.648          -0.654             0.488
#7    -56.760      -34.284        -56.500       -34.009         0.260              0.275
#8    1.299        28.230         1.256         28.777          -0.043             0.548
Average error (m)                                               -0.20              0.40

TABLE 3. Coordinates of reference points in a long-distance water diversion project

No.   WGS-84 (Plane coordinates)      BIM project coordinates       Distance (km)
      X (m)        Y (m)              Xbim (m)       Ybim (m)
#1    595423.8     5217119            8998.991       6933.071       -
#2    593932.2     5216239            7502.195       6062.23        1.73
#3    591783.8     5215723            5348.41        5565.786       3.90
#4    591309.2     5215680            4873.852       5524.557       4.36
#5    588707.3     5215830            2279.112       5662.401       6.84
#6    588030.7     5215557            1602.587       5387.102       7.56
#7    586954.7     5213586            517.1564       3431.321       9.18
#8    583139.2     5213393            -3287.83       3214.373       12.84
#9    582750       5212772            -3677.84       2595.26        13.40
#10   580473.3     5210889            -5958.04       719.332        16.20
#11   579362.4     5210973            -7076.23       823.8809       17.20
#12   578648.9     5209644            -7792.94       -502.691       18.36
#13   578731.1     5204959            -7717.47       -5193.5        20.65
#14   579116.7     5204274            -7316.79       -5900.74       20.76
#15   581114       5203040            -5320.11       -7136.04       20.07
Fig. 1. Overall pipeline of UAV-BIM connective animation
Fig. 2. F-Trans algorithm procedure
Fig. 3. Virtual camera parameters: (a) Physical parameters; (b) Optical parameters
Fig. 4. Flowchart of connective animation
Fig. 5. The architecture of the AR system prototype
Fig. 6. Technical framework for the development of the AR system prototype
Fig. 7. BIM model of the Student Union at the University of Tennessee, Knoxville
Fig. 8. Eight selected reference points shown on Google Maps
Fig. 9. Results of BIM-UAV connective animation
Fig. 10. Quantitative evaluation of matching accuracy based on IoU
Fig. 11. Interface of the AR system prototype: Inspection specifications
Fig. 12. Interface of the AR system prototype: Safety issue screenshots
Fig. 13. Interface of the AR system prototype: Inspection reports
Fig. 14. The BIM model of a long-distance water diversion project
Fig. 15. Fifteen selected reference points along the water diversion project
Fig. 16. Average calculation errors under different distance ranges
... Kopsida et al. [34] employed KinectFusion for the 3D reconstruction of buildings, achieving registration between as-built models and asplanned BIM models by estimating camera poses and utilizing the iterative closest point (ICP) algorithm. Liu et al. [35] aligned the real camera pose coordinates with the virtual camera coordinates in the BIM model and achieved registration between the 3D reconstruction model and the BIM model. In Chen's study [36], a registration method based on real images and BIM-projected images was proposed. ...
Article
Full-text available
Defect inspection of existing buildings is receiving increasing attention for digitalization transfer in the construction industry. The development of drone technology and artificial intelligence has provided powerful tools for defect inspection of buildings. However, integrating defect inspection information detected from UAV images into semantically rich building information mod-eling (BIM) is still challenging work due to the low defect detection accuracy and the coordinate difference between UAV images and BIM models. In this paper, a deep learning-based method coupled with transfer learning is used to detect defects accurately; and a texture mapping-based defect parameter extraction method is proposed to achieve the mapping from the image U-V coordinate system to the BIM project coordinate system. The defects are projected onto the surface of the BIM model to enrich a surface defect-extended BIM (SDE-BIM). The proposed method was validated in a defect information modeling experiment involving the No. 36 teaching building of Nantong University. The results demonstrate that the methods are widely applicable to various building inspection tasks.
... The introduction of quadrocopters (drones) into the practice of forensic construction and technical expertise seems to be an important step in the development of this area of forensic activity. This will improve the process of conducting inspections of objects, reduce time and improve the availability of information about them, as well as increase the safety of work [10][11][12][13][14]. ...
Article
Full-text available
The article deals with the problems of using special methods of forensic construction-technical expertise. The classification of methods and means on various grounds, in particular, on the nature of the impact on objects and on the principle of operation is proposed, the specifics of their use in full-scale and laboratory studies are analysed. Special attention is paid to the use of unmanned aerial vehicles for inspection of construction objects and land plots functionally related to them. In some cases, objects may be difficult to access or dangerous for the expert. Thanks to drones, experts can quickly get an overview of the condition of the site and then examine the areas of interest in more detail. In addition, the use of quadrocopters contributes to the safety of conducting investigations. In general, the use of quadrocopters in forensic construction and technical expertise is a promising and useful tool. It is shown that timely replenishment of the construction expert's instrumental arsenal with modern techniques and methods of research based on the latest achievements of science and technology is the key to sustainable development of forensic construction and technical expertise as one of the most demanded by Russian legal proceedings areas of expert activity.
... The system consists of a drone that acquires both firstperson and overhead views at the building construction site, a controller that operates the drone, and a PC on a server that performs AR rendering with occlusion handling. Similarly, [45] present a method for building inspection that merges aerial drone animations and BIM and visualizes this using AR technology. ...
Chapter
Full-text available
The concept of the digital twin is a tenet of digitalization across fields and has been gaining popularity in digital construction and architecture as well. Yet, as with any new terminology, there is still ambiguity when it comes to defining what a digital twin is, how one should be constructed, and most importantly, where it can be useful. In this chapter, through a systematic review of the literature of 113 research papers, we map and summarize the state of the art of digital twins in architecture. Our findings show that digital twins are an ecology of practices and understandings and the notion means different things for the different fields that make up architecture. We contribute with a map of this ecology that shows the studies on two axes: space (the scale of the twin - ranging from building element to city scale) and time (moment in a building’s life). We then discuss how no matter how accurate, digital twins will never be twins, how twins need to engage critically with data and conceptually consider the infrastructures of data storage and processing they make use of as well as the life-span of a twin and how it correlates to that of its physical counterpart.
... Recent technological advancement is poised to transform the existing IDR approach. The development of robotics, for example, can significantly expand the inspection coverage that is limited by human capability [9,10]. The ever-increasing artificial intelligence (AI) power can automate data processing [11,12], and information modeling makes cross-party data interoperability a reality [13,14]. ...
Article
Full-text available
Keywords built environment sustainability facility management defect information modeling building information modeling The built environment is subject to various defects as it ages. A well-maintained built environment depends on surveying activities to inspect, document, and rehabilitate the defects that occurred. The advancement of digital technologies paves the pathway towards (1) comprehensive defect inspection by systematic mapping, (2) their consistent documentation by digital modeling, and (3) timely retrofitting by proactive management. However, the three steps of defect mapping , modeling, and management (D3M) remain largely fragmented and have yet to be synergized. Exploiting the pivotal role of building information modeling (BIM) in built asset management, this paper puts forward a cohesive framework for integrated D3M. It leverages the rich geometric-semantic information in BIM to assist defect mapping and enriches the BIM by industry foundation classes (IFCs)-represented defect information. The defect-enriched BIM facilitates defect management in a data-driven manner. The framework was applied in multiple real-life infrastructure and civil works projects. It demonstrates how the BIM-based D3M framework can enhance the maintenance of those that have been built, and ultimately contribute to a safe and sustainable built environment. Future studies are called for to substantiate each of the 3Ms by leveraging BIM as both an enabler and a beneficiary.
... BIM is a multi-source database that holds information about a building project throughout its life cycle, including planning, design, construction, operation, maintenance, and even deconstruction (Azhar et al., 2012). BIM is gradually being embraced for the operation and maintenance of building projects after being mostly employed in the design and construction phase in the past (Liu et al., 2021). In recent years, there has been a surge in using BIM in construction safety management. ...
... The system aided in facility management and acted as an inspection tool that is powered by back-end processes stored on an offline server. Ref. [29] developed an inspection method using drone-based photogrammetry, BIM, and AR. The users synchronized the visualization of the BIM model with aerial video using a coordinate transformation algorithm. ...
Article
Full-text available
In the era of aging civil infrastructure and growing concerns about rapid structural deterioration due to climate change, the demand for real-time structural health monitoring (SHM) techniques has been predominant worldwide. Traditional SHM methods face challenges, including delays in processing acquired data from large structures, time-intensive dense instrumentation, and visualization of real-time structural information. To address these issues, this paper develops a novel real-time visualization method using Augmented Reality (AR) to enhance vibration-based onsite structural inspections. The proposed approach presents a visualization system designed for real-time fieldwork, enabling detailed multi-sensor analyses within the immersive environment of AR. Leveraging the remote connectivity of the AR device, real-time communication is established with an external database and Python library through a web server, expanding the analytical capabilities of data acquisition, and data processing, such as modal identification, and the resulting visualization of SHM information. The proposed system allows live visualization of time-domain, frequency-domain, and system identification information through AR. This paper provides an overview of the proposed technology and presents the results of a lab-scale experimental model. It is concluded that the proposed approach yields accurate processing of real-time data and visualization of system identification information by highlighting its potential to enhance efficiency and safety in SHM by integrating AR technology with real-world fieldwork.
... The fourth industrial revolution (Industry 4.0) has introduced various digital technologies such as artificial intelligence (AI), big data (BD), digital twin (DT), building information modelling (BIM), the Internet of things (IoT), and advanced analytical tools into industrial practises The construction industry has proven performance improvement through this digitization process (Ding et al., 2019;Heaton & Parlikad, 2020;Liu et al., 2021;Sampaio et al., 2022). The incorporation of emerging technologies for maintenance activities to improve performance through data-driven, predictive, and automated approaches is referred to as Maintenance 4.0. ...
Article
Full-text available
Facility maintenance management (FMM) is essential for ensuring long-term value and sustaining project goals throughout the life cycle delivery process. However, in underdeveloped nations such as Ethiopia, facility maintenance management is an immature and underutilised process that requires a holistic intervention for practical improvement. The main aim of this study was to identify and prioritise critical factors that affect the effectiveness of FMM, with a focus on public universities in Ethiopia. Initially, a total of thirty-three (33) crucial variables were identified through a systematic literature review and desk study. To collect primary data, a survey research design approach was utilised using questionnaires and informant interviews. A total of seventy-five (75) data sets were obtained from 180 online surveys for conducting exploratory factor analysis (EFA). The outcome of the study revealed thirteen (13) critical attributes grouped into four factors that affect the effectiveness of facility maintenance management practices. The final four-factor model includes F1, internal processes and organisation; F2, community culture, learning, and growth; F3, impacts of design and construction quality; and F4, facility maintenance approach and management. This study indicated that facility maintenance management practices in public universities in Ethiopia are immature and require extensive enhancement. The identified influencing factors highlight the need for a comprehensive intervention to promote improved facility maintenance management practices and applications in Ethiopia. Further research is needed to analyse a wider range of attributes and data using confirmatory factor analysis.
... Recent technological advancement is poised to transform the existing IDR approach. The development of robotics, for example, can significantly expand the inspection coverage that is limited by human capability [9,10]. The ever-increasing artificial intelligence (AI) power can automate data processing [11,12], and information modeling makes cross-party data interoperability a reality [13,14]. ...
Article
Full-text available
The built environment is subject to various defects as it ages. A well-maintained built environment depends on surveying activities to inspect, document, and rehabilitate the defects that occur. The advancement of digital technologies paves the pathway towards (1) comprehensive defect inspection by systematic mapping, (2) their consistent documentation by digital modeling, and (3) timely retrofitting by proactive management. However, the three steps of defect mapping, modeling, and management (D3M) remain largely fragmented and have yet to be synergized. Exploiting the pivotal role of building information modeling (BIM) in built asset management, this paper puts forward a cohesive framework for integrated D3M. It leverages the rich geometric-semantic information in BIM to assist defect mapping and enriches the BIM with Industry Foundation Classes (IFC)-represented defect information. The defect-enriched BIM facilitates defect management in a data-driven manner. The framework was applied in multiple real-life infrastructure and civil works projects. It demonstrates how the BIM-based D3M framework can enhance the maintenance of existing built assets, and ultimately contribute to a safe and sustainable built environment. Future studies are called for to substantiate each of the 3Ms by leveraging BIM as both an enabler and a beneficiary.
Article
Full-text available
Augmented reality (AR) systems are becoming next-generation technologies for intelligently visualizing the real world in 3D. This research proposes a sensor-fusion-based pipeline inspection and retrofitting approach for AR systems, which can be used in pipeline inspection and retrofitting processes in industrial plants. The proposed methodology utilizes prebuilt 3D point cloud data of the environment, a real-time Light Detection and Ranging (LiDAR) scan, and an image sequence from the camera. First, the current pose of the sensor platform is estimated by matching the LiDAR scan against the prebuilt point cloud; from this pose, the prebuilt point cloud is augmented onto the camera image using the LiDAR-camera calibration parameters. Next, based on the user's selection in the augmented view, the geometric parameters of a pipe are estimated. In addition to pipe parameter estimation, retrofitting in the existing plant using the augmented scene is illustrated. Finally, the step-by-step procedure of the proposed method was experimentally verified at a water treatment plant. Results show that integrating AR with building information modelling (BIM) greatly benefits the post-occupancy evaluation process and the pre-retrofitting and renovation process for identifying, evaluating, and updating the geometric specifications of a construction environment.
Chapter
Full-text available
A smooth transportation system plays a pivotal role in the development of a country, since it enables rapid connections among industrial zones and territories. Bridges in particular, which commonly have a service life of around 50 years, require close attention to maintenance to keep them in a good service state. Nowadays most bridges have their own bridge maintenance system (BMS), but bridge engineers are still challenged by difficulties in storing damage records and repair histories, as well as in assessing the current behavior of the structural system. This paper proposes an innovative BMS that combines a schematic BIM-based information management system with an automated inspection task using an augmented reality (AR) device. Following a preventive maintenance strategy, a data schema for the BIM model was investigated. An integrated digital model is created to store, manipulate, and share the inspection data and maintenance history. Meanwhile, the onsite inspection task is performed in a timely manner; the novelty lies in the AR device's versatile capacity for real-time data manipulation. Besides its capture features, a chain of computer-vision algorithms is embedded in the AR device to enhance the precision and performance of the inspection task. The technical damage report is then fed back to the management system, and an assessment model is discussed. A pilot application to an existing cable-stayed bridge, in use for a year, is introduced and shows good potential for bridge maintenance.
Article
Full-text available
Vision-based techniques are being used to inspect structures such as buildings and infrastructure. Due to the varied backgrounds in acquired images, conventional vision-based techniques rely heavily on manual processing to extract the relevant structures of interest for subsequent analysis in many applications, such as distress detection. This practice is laborious, time-consuming, and error-prone. To address the challenge, this study proposes a new method that automatically matches a georeferenced real-life photo with a building information model-rendered synthetic image to allow the extraction of the relevant structures of interest. Field experiments were conducted to validate and evaluate the proposed method. The average accuracy of the method is 79.21%, and the processing speed is 140 s per image. The proposed method has the potential to reduce the workload of image processing for vision-based structural inspection.
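The agreement between a BIM-rendered synthetic image and a real photo, as evaluated in studies of this kind (and in the "matching degree" reported in the main article's abstract), can be approximated by a silhouette intersection-over-union. The function below is an illustrative stand-in, not the authors' metric:

```python
import numpy as np

def matching_degree(bim_mask, photo_mask):
    """Intersection-over-union between a BIM-rendered building
    silhouette and a photo-derived silhouette, given as boolean
    arrays of identical shape."""
    bim_mask = np.asarray(bim_mask, dtype=bool)
    photo_mask = np.asarray(photo_mask, dtype=bool)
    inter = np.logical_and(bim_mask, photo_mask).sum()
    union = np.logical_or(bim_mask, photo_mask).sum()
    # Two empty masks agree trivially
    return float(inter) / union if union else 1.0

# Two toy 2x2 silhouettes overlapping in one of two occupied pixels
score = matching_degree([[1, 1], [0, 0]], [[1, 0], [0, 0]])  # → 0.5
```

In practice the masks would come from rendering the BIM at the estimated camera pose and from segmenting the photo; the metric itself stays this simple.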
Article
Full-text available
Keywords: infrastructure; water diversion project; safety inspection; unmanned aerial vehicle (UAV); dynamic BIM; virtual-real interactive system. Abstract: As indispensable infrastructure for supplying industrial and residential water, water diversion projects will incur unmeasurable losses in the event of structural failure. Safety inspection is critical for the operation of these projects. Traditional manual inspection is inefficient, has limited coverage, and cannot connect off-site decision-makers with field workers to evaluate detected abnormal phenomena. In addition, building information modeling (BIM), an intuitive and data-driven information platform, has not been integrated with the existing safety inspection procedure to better understand the detected risk issues. To tackle these drawbacks, a safety inspection method that integrates unmanned aerial vehicles (UAVs) and dynamic BIM is proposed in this paper. An overall workflow for dynamic BIM-augmented UAV safety inspection is developed, which consists of data collection, dynamic BIM construction, and UAV-BIM coupling. A dynamic BIM model is created by aggregating timely-updated safety information with a BIM model in the Web environment. The synchronous navigation of UAV video and dynamic BIM is realized by matching the virtual camera parameters with the real ones. A field experiment in Tianjin, China demonstrates the efficacy of the proposed method, which not only improves safety inspection efficiency, but also enables off-site managers to view the inspection video and make timely, comprehensive safety evaluations with the support of dynamic BIM. Although the proposed method is introduced from the perspective of water diversion projects, it can also be applied to the inspection of other large-scale infrastructure, e.g., pavements, dams, and railways.
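Matching the virtual BIM camera to the real UAV camera presupposes converting the UAV's WGS-84 telemetry into the local project frame. A minimal sketch of such a conversion (geodetic to Earth-centered ECEF, then to East-North-Up offsets about a project origin) is given below; it illustrates standard geodesy, not the F-Trans algorithm of the main article:

```python
import math

# WGS-84 ellipsoid constants
A = 6378137.0                # semi-major axis (m)
F = 1 / 298.257223563        # flattening
E2 = F * (2 - F)             # first eccentricity squared

def geodetic_to_ecef(lat, lon, h):
    """Convert WGS-84 geodetic coordinates (degrees, metres) to ECEF."""
    lat, lon = math.radians(lat), math.radians(lon)
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)  # prime vertical radius
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + h) * math.sin(lat)
    return x, y, z

def wgs84_to_local_enu(lat, lon, h, lat0, lon0, h0):
    """East-North-Up offsets (m) of a point from a project origin."""
    x, y, z = geodetic_to_ecef(lat, lon, h)
    x0, y0, z0 = geodetic_to_ecef(lat0, lon0, h0)
    dx, dy, dz = x - x0, y - y0, z - z0
    lat0r, lon0r = math.radians(lat0), math.radians(lon0)
    # Rotate the ECEF offset into the origin's local tangent plane
    east = -math.sin(lon0r) * dx + math.cos(lon0r) * dy
    north = (-math.sin(lat0r) * math.cos(lon0r) * dx
             - math.sin(lat0r) * math.sin(lon0r) * dy
             + math.cos(lat0r) * dz)
    up = (math.cos(lat0r) * math.cos(lon0r) * dx
          + math.cos(lat0r) * math.sin(lon0r) * dy
          + math.sin(lat0r) * dz)
    return east, north, up

# Example: a point 0.001° (~111 m) due north of a project origin in Tianjin
e, n, u = wgs84_to_local_enu(39.101, 117.16, 10.0, 39.100, 117.16, 10.0)
```

The resulting ENU offsets, after a further planar alignment (translation and rotation) to the BIM project coordinate system, are what drive the virtual camera in step with the UAV.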
Conference Paper
Full-text available
Automated data capture systems could significantly improve the efficiency and productivity of the architecture, engineering, construction and facility management (AEC/FM) industry. However, automatically collecting spatiotemporal information in an unstructured environment such as a construction site or a workplace remains a time-consuming and challenging task. This paper presents a new approach to automated data capture and processing, referred to as object characterization. In object characterization, the goal is to identify common objects in a scene and extract rich semantic information about those objects. A novel 2D-3D object detection algorithm is designed for the detection and characterization of common worksite objects. The proposed system has applications in automated surveying and data collection, especially in applications that leverage unmanned aerial vehicles or mobile robots. To demonstrate this utility, the proposed system is deployed on a mobile robot and used to detect newly placed objects in a worksite environment.
Conference Paper
The cost of inspecting and monitoring vertical structures is increasing and is still a very high-risk process when using current methods such as scaffolding and tie wires that are sometimes hundreds of feet high. Using unmanned aerial vehicles (UAVs) may help reduce these costs and risks, as well as improve the quality of structural inspection and monitoring. This paper focuses on researching the capabilities of two different UAVs for investigating, inspecting, and analyzing vertical structures. The research highlights the capabilities of the UAVs, ranging from the physical geometry process, including height, proximity to the structure, latitude, longitude, and flying modes, to the image-capturing process, including camera, sensors, gimbals, and payload. Also, Optar AI V1.0 computer software was used to examine detailed crack configurations found in the captured visual data and to analyze structural stresses through the software’s visual analyzer to determine whether there is a need for further inspection.
Article
The purpose of fire safety equipment (FSE) inspection and maintenance is to ensure that this equipment is in good working condition during emergencies, so that fire damage can be kept to a minimum. At present, the maintenance and inspection of FSE must be performed according to the standards and methods specified by fire safety regulations, which makes it necessary to consult relevant files and drawings. However, reading maintenance and inspection information can be a very time-consuming endeavor, as the majority of these files are in paper form. To solve these issues, building information modeling (BIM) was used in this study to construct the FSE elements so that the required information could be rapidly acquired by FSE inspectors. A cloud database for equipment inspection and maintenance was then generated by organizing the compiled information. This database was combined with augmented reality (AR) technology to facilitate FSE inspection and maintenance using mobile devices, thus overcoming the constraints imposed by paper files on these tasks. The results of the demonstration and validation showed that the proposed BIM-AR FSE system provides highly comprehensive, mobile, and effective access to FSE information. The combination of information and real-life objects via AR effectively facilitated the presentation of information in an immediate, visual, and convenient manner.
Article
Purpose Automated technologies have been applied to facility management (FM) practices to address the labour demands of, and time consumed by, inputting and processing manual data. Less attention has been focussed on the automation of visual information, such as images, for improving timely maintenance decisions. This study aims to develop image classification algorithms to improve information flow in the inspection-repair process through building information modelling (BIM). Design/methodology/approach To improve and automate the inspection-repair process, image classification algorithms were used to connect images with a corresponding image database in a BIM knowledge repository. Quick response (QR) code decoding and Bag of Words were chosen to classify images in the system. Graphical user interfaces (GUIs) were developed to facilitate activity collaboration and communication. A pilot case study of an inspection-repair process was used to demonstrate the applications of this system. Findings The system developed in this study associates the inspection-repair process with a digital three-dimensional (3D) model, GUIs, a BIM knowledge repository and image classification algorithms. By implementing the proposed application in a case study, the authors found that improvement of the inspection-repair process and automated image classification with a BIM knowledge repository (such as the one developed in this study) can enhance FM practices by increasing productivity and reducing the time and costs associated with decision-making. Originality/value This study introduces an innovative approach that applies image classification and leverages a BIM knowledge repository to enhance the inspection-repair process in FM practice. The system designed provides automated image-classifying data from a smart phone, eliminates the time required to input image data manually and improves communication and collaboration between FM personnel in the maintenance decision-making process.
Article
This paper is intended to provide the state-of-the-art and of-the-practice on visual inspection, monitoring, and analysis of infrastructure using unmanned aerial vehicles (UAVs). Several researchers have inspected various civil infrastructures, including bridges, buildings, and other structures, by capturing close-up images or recording videos, while operating UAVs. Various image analysis tools, such as the algorithm Morphological Link for Crack (Morpholink-C), were able to conduct precise measurements of crack thickness and length. Corrosion has also been detected using texture and color algorithms to investigate UAV-based images. Other analysis methods include structurally integrated sensors, such as digital image correlation equipment, which have helped to capture structural behaviors using UAVs. After the literature review was completed, a nationwide survey was distributed to Departments of Transportation (DOTs) to evaluate the current UAV-enabled inspection techniques that different DOTs have used or are planning to use for visual damage assessment for critical transportation infrastructures, especially bridges. Furthermore, a pertinent UAV selection was completed to indicate suitable UAVs for bridge inspection. Primary findings have shown that UAV-enabled infrastructure inspection techniques have been successfully developed to detect a broad variety of damage (including cracks and corrosions), and a few DOTs have used UAVs to inspect bridges as a more economical and versatile tool.