
Advanced Engineering Informatics 38 (2018) 27–40

Contents lists available at ScienceDirect

Advanced Engineering Informatics


journal homepage: www.elsevier.com/locate/aei

Automated continuous construction progress monitoring using multiple workplace real-time 3D scans

Zoran Pučko, Nataša Šuman, Danijel Rebolj
Faculty of Civil Engineering, Transportation Engineering and Architecture, University of Maribor, Slovenia

A R T I C L E  I N F O

Keywords: Automation; Construction progress monitoring; Data acquisition; Scan-vs-BIM; 4D point cloud; BIM elements identification

A B S T R A C T

In recent years, exponential growth has been detected in research efforts focused on automated construction progress monitoring. Despite various data acquisition methods and approaches, success has been limited. This paper proposes a new method in which changes are constantly perceived and the as-built model is continuously updated during the construction process, instead of periodically scanning the whole building under construction. It turned out that low-precision 3D scanning devices that closely observe active workplaces are sufficient for correct identification of the built elements. Such scanning devices are small enough to fit onto workers' protective helmets and onto the applied machinery. In this way, workers capture all workplaces inside and outside the building in real time and record partial point clouds, their locations, and time stamps. The partial point clouds are then registered and merged into a complete 4D as-built point cloud of the building under construction. Identification of as-designed BIM elements within the 4D as-built point cloud then results in the 4D as-built BIM. Finally, the comparison of the 4D as-built BIM and the 4D as-designed BIM enables identification of the differences between both models and thus the deviations from the time schedule. The differences are reported in virtual real time, which enables more efficient project management.

1. Introduction

The construction industry's interest in timely and accurate information on the progress of a construction project is increasing [1–4]. Unfortunately, continuous monitoring and control in all phases of the project are among the most difficult tasks in construction project management. This includes the measurement of progress through site inspections and comparison with the project plan, while the quality of progress data highly depends on the surveyor's experience and the quality of measurements [5]. Manual visual observations and traditional progress monitoring help to obtain feedback about progress measurement [1,6], equipment and material tracking [7,8], safety planning [9], productivity tracking [10], and causes of schedule and cost overruns [11,12]. However, such a process is time-consuming, error-prone, and infrequent [13,14]. Project managers spend a lot of resources on acquiring and updating project progress information and on creating finance, quality, and time-related reports [15], while the development of automated systems for construction progress monitoring and evaluation has not yet reached the efficiency and reliability required for adoption by the industry. One promising approach to automation is based on Building Information Modelling (BIM) technology, which is becoming a standard in the construction industry. BIM is a rich source of information in terms of 3D geometry (architectural, structural and MEP), knowledge sharing and interoperability through the whole life-cycle of a building. A 4D BIM model can be created by integrating the 3D BIM design model with the schedule plan [16,17] for project optimisation and for construction simulation and progress monitoring. However, documenting changes and preparing the as-built schedule remains a persistent problem in the construction stage, requiring integration of BIM-based models and real-time field data into an as-built BIM. The development of new tools and methodologies allowing automated progress monitoring, in terms of data acquisition, information retrieval, progress estimation and visualisation of the results, is therefore of vital importance [18–20].

Many authors [1,2,21–26] have already identified a strong demand for the development of automated solutions, whereby remote sensing has proven to be the most effective data acquisition method [3,4,7,9,10,13,20,23,27–35]. Field data capturing technologies include image-based technologies, Laser Scanning (LS), Radio Frequency Identification (RFID), Ultra-Wideband (UWB), the Global Positioning System (GPS) and wireless sensor networks (WSN). The BIM-based automated progress monitoring methods use the obtained 4D as-built (AB) BIM, compare it with the 4D as-designed (AD) BIM, and identify the differences between both models in a well-interpreted way.


Corresponding author.
E-mail address: [email protected] (Z. Pučko).

https://doi.org/10.1016/j.aei.2018.06.001
Received 30 October 2017; Received in revised form 8 April 2018; Accepted 1 June 2018
Available online 18 June 2018
1474-0346/ © 2018 Elsevier Ltd. All rights reserved.

Such an approach, also known as Scan-vs-BIM [29], provides efficient and timely data analysis and visualisation of discrepancies at an early stage to prevent potential upcoming delays, and enables timely decisions for corrective actions [2,13,22].

An overview of research on automated progress monitoring is given in papers that compare different approaches, results, capturing technologies, and shortcomings. Kopsida et al. [2] compare the advantages and disadvantages of real-time data capturing techniques on the construction site based on photogrammetry, videogrammetry, and laser scanning. In addition, the review by Omar & Nehdi [3] illustrates the categorization of the monitoring methods and the data acquisition technologies, as well as methods for data processing. Alizadehsalehi & Yitmen [4] discuss the impact of the combination of data capturing techniques with BIM in construction companies, whereas Pătrăucean et al. [26] separate point cloud based methods for creating as-built (AB) BIM models according to the existence of an as-designed (AD) BIM model. Further, some findings on point cloud based methods using laser scanning [23–28,36–38] regard these as expensive and time-consuming. Omar & Nehdi [3] claim that laser scanning is still not widely employed due to its high cost, the need for a clear line of sight, and the difficulty of using it for interior works. Equally important are other obstacles, such as the additional work required (e.g. equipment installation and calibration), marker or target installation, determining the locations of equipment, engaging experts for equipment handling, and the limited usefulness of the obtained results. On the other hand, less expensive point cloud based methods, used for reconstructions through images [23,24,33,39,40], do not solve the problem of progress monitoring for interior tasks, since most images are taken outdoors.

Omar & Nehdi [3] introduce a method where 3D range cameras, or RGB-D cameras respectively, have been used for data acquisition, producing range images, also referred to as depth images. The method is able to cover a wide field of view and is less costly than laser scanning, but more expensive than photogrammetry. It is suitable for short-range applications and has great potential for site spatial sensing and modelling applications. As stated in [41], the RGB-D sensors and digital cameras used for photogrammetry are nowadays even in the same price range. Furthermore, Kinect is one of the most popular 3D scanning devices, used for semantic analysis of buildings [42], geometric quality of depth data and indoor mapping applications [43], close range 3D measurements and modelling [44], and automated recognition of construction worker actions [34]. Until today, two versions of Kinect have been developed and compared in [45], and their functionality was further improved in [46].

According to the findings in the reviewed works, none of the observed methods is yet able to provide comprehensive and reliable monitoring of construction, which would cover the whole building (outdoor and indoor) through the entire construction process. Thus, our research focuses on methods based on comparison of the 4D as-built (AB) and 4D as-designed (AD) models, referred to as Scan-vs-BIM, while our goal was to develop a method which would resolve the need for additional efforts of point cloud acquisition, and the problem of incomplete point clouds due to obscured building elements or missing parts of the construction (e.g. indoor or outdoor scanning only). The basic idea was to continuously generate 4D point clouds on the go by gathering 3D scans from workers' helmets, equipped with appropriate low-cost 3D scanning devices, such as Kinect [43,47]. In our experiments, we used the Kinect v2 sensor, but many others would fit this category (e.g. the Xtion Pro Live 3D sensor or the Structure sensor). The method is further explained in the paper.

2. Methodology

The proposed Scan-vs-BIM method for automated construction progress monitoring requires the existence of a 3D and a 4D AD BIM. The 4D AD BIM is permanently compared to the on-site situation in the form of a 4D AB BIM, which is generated through a series of partial point clouds. They are generated by 3D scanners observing each workplace and thus capturing all changes made. The obvious way to capture every change is to mount a scanner on every actor on a building site. In the case of workers, the protective helmet is the most practical place, as it moves with the eyes and consequently with the hands and tools of a worker. In the case of machines, a scanner shall be mounted at a location that follows the view of the active parts of the machine. In this way, scanners capture all changes performed on the building, and the partial point clouds ensure views of all emerging components, inside and outside the building.

Partial point cloud data is enhanced with location and time information, which ensures its adequate registration, including merging into the 4D AB PC of a specific time interval i (e.g. an hour or a day). At the end of the time interval Δt, the 4D AB PC(i) is merged into the total 4D AB PC, which includes all changes from the beginning of construction, i.e. the complete as-built situation.

In the next step, an identification process matching elements of the 3D AD BIM against the 4D AB PC is carried out, resulting in a structured model of the on-site situation, the so-called 4D AB BIM. The 4D AB BIM is then compared to the 4D AD BIM and a list of differences is generated. Since the difference elements are linked to activities, the delayed activities, or those ahead of schedule, are recognised and listed. The main steps of the proposed Automated Continuous Construction Progress Monitoring (ACCPM) method are summarised in Table 1.

The basic supposition of our method is that work is always performed by or in the presence of workers, or by autonomous machines. Therefore, it is logical to scan and communicate the situation with their help in a way which does not hinder anyone in their primary work. The proposed method overcomes many deficiencies detected in previously known methods. It is anticipated to use equipment from the category of 3D sensing technologies for the production of range images.

Workers' protective helmets shall be equipped with scanning devices that are acceptably low in mass, volume and cost, ensure continuous point cloud acquisition in the vicinity of the worker's area (up to 3 m), enable indoor and outdoor operation, enable location and time detection, and are fully mobile with power autonomy for a working day. Positioning and wireless communication functionality shall be supported for easier processing.

The essential novelty of the ACCPM method is in the way of point cloud acquisition, which (i) is continuous, (ii) does not require any additional monitoring or surveying activities, and (iii) takes place automatically only where changes occur (Fig. 1). In all other methods it is not known which parts of the building have been added or changed since the previous scan; therefore, the entire building needs to be scanned every time construction progress has to be checked. Since such total scanning requires significant effort, it is not done daily, but rather at longer intervals. In ACCPM each change is scanned and merged with other partial scans within a given time interval Δt, creating a Δt AB BIM. The 4D AB BIM is then an accumulation of a continuous sequence of all Δt AB BIM from the beginning of the project until the present moment. Hereby Δt is typically shorter than the scan intervals in other methods and can be set to any desired value, e.g. an hour or a day.

2.1. Partial point cloud acquisition

Point clouds are generated by wearable, helmet-mounted scanners that scan each workplace from a close distance of around 1 m. Hereby the scanning margin distances can be set according to the type of scanning device used. Since each scanning device is constantly observing the work in progress in front of a worker (or a machine), every change in the building is recorded (Fig. 2). The scanning devices transfer their partial scans to a server either wirelessly or through a docking station where a helmet is placed at the end of the day. Each way has its advantages and disadvantages regarding the configuration of the helmet scanning system (memory, Wi-Fi, power supply) and the required infrastructure.
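The per-scan payload just described (points plus the location and time stamp that later drive registration) might be sketched as follows. This is an illustrative data structure, not a format from the paper; the field names and the one-hour default interval are assumptions.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class PartialScan:
    """One partial scan as a helmet scanner could upload it to the server."""
    scanner_id: str          # helmet or machine identifier
    timestamp: float         # acquisition time, seconds since project start
    position: tuple          # scanner position estimate (X, Y, Z) in the RCS
    orientation_deg: float   # compass heading, degrees from North
    points: np.ndarray       # (N, 3) scanner-local point coordinates

def interval_index(timestamp, t0, dt=3600.0):
    """Map a time stamp to its interval i, so the scan joins 4D AB PC(i).
    Δt defaults to one hour, one of the interval lengths the paper mentions."""
    return int((timestamp - t0) // dt)
```

Grouping incoming scans by `interval_index` is what lets each Δt batch be merged into its own 4D AB PC(i) before accumulation into the total 4D AB PC.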


Table 1
Steps of the proposed automated construction progress monitoring method.

1. 3D scanning → a single scanner provides multiple frames during a specific time interval Δt (multiple frames of scanner "a", created in Δt).
2. Registration of frames for each scanner → locally registered partial point clouds (pPCa, pPCb).
3. Registration of partial point clouds into the reference coordinate system (a) → 4D AB PC(i) for the i-th time interval Δt; positioning and orientation information within the reference system is required for each pPC (pPCa + pPCb → 4D AB PC(i)).
4. 4D AB PC(i) cut & clean.
5. Identification of BIM elements contained in 4D AB PC(i), using the 3D AD BIM as the reference model → Δt AB BIM.
6. Adding Δt AB BIM, which represents the changes in Δt, to the complete 4D AB BIM: 4D AB BIM = Σ(i=1..n) Δt(i) AB BIM.
7. Finding differences between 4D AD BIM and 4D AB BIM: Eb = 4D AD BIM − 4D AB BIM (elements behind schedule); Ea = 4D AB BIM − 4D AD BIM (elements ahead of schedule). When Eb = {} and Ea = {}, there is no deviation from the schedule.

(a) For the reference coordinate system see further details in Section 2.4.
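Steps 6 and 7 of Table 1 reduce to set operations on element identifiers. A minimal sketch in Python (the element IDs are hypothetical; the real models are BIM objects, not strings):

```python
def accumulate_ab_bim(interval_models):
    """Step 6: the union of all Δt(i) AB BIM gives the total 4D AB BIM."""
    total = set()
    for model in interval_models:
        total |= model
    return total

def schedule_deviation(ad_bim, ab_bim):
    """Step 7: Eb = elements behind schedule, Ea = elements ahead of it."""
    eb = ad_bim - ab_bim   # planned by now but not yet scanned as built
    ea = ab_bim - ad_bim   # scanned as built but not yet planned
    return eb, ea

# Example: three scan intervals accumulated, then compared with the
# as-designed state expected at the same point in time.
intervals = [{"wall_01"}, {"wall_02", "slab_01"}, {"door_01"}]
ab = accumulate_ab_bim(intervals)
eb, ea = schedule_deviation({"wall_01", "wall_02", "slab_01", "window_01"}, ab)
# eb -> {'window_01'} (behind schedule), ea -> {'door_01'} (ahead of schedule)
```

When both difference sets are empty, construction matches the schedule exactly, which is the termination condition stated in step 7.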


Fig. 1. Comparison of current Scan-vs-BIM methods and the proposed ACCPM method.

Fig. 2. A scenario of indoor and outdoor continuous partial point cloud acquisition on-site using multiple helmet-mounted scanners.

2.2. Registration of partial point cloud frames

On the basic level, scanners acquire point cloud frames, which together constitute a partial point cloud. Multiple frames could be either processed locally or sent to the server. Since frames are generated at very short intervals, they represent an overlapping and thus well-connected sequence of point clouds, which can be registered into one single partial point cloud (pPC) automatically by using the Iterative Closest Point (ICP) algorithm [48]. ICP represents a robust solution for iterative calculation of point correspondences within each point cloud frame. The coordinate system is unique for each pPC and originates in the starting position of the scanner.

2.3. Registration of multiple partial point clouds

On the other hand, registering two or more partial point clouds (pPC) into the reference coordinate system (RCS) is a more challenging problem. In the case of a known exact position of each pPC in the RCS, registration is a simple transformation into the RCS and the union of all pPCs. If the exact position is not available, two partial point clouds can be aligned when at least four equivalent point pairs are known. According to [33], the process can be automated by various methods, for example by using a camera that is bundled with the range scanner in order to obtain combined 3D point measurements and geometry-registered intensity images, with at least five matched keypoint pairs representing the relative camera motion between the two intensity images, using structure-from-motion vision techniques [49–52].
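The frame-to-frame ICP registration of Section 2.2 can be sketched in a few lines of NumPy. This is a minimal point-to-point variant with brute-force nearest-neighbour search (real implementations use k-d trees and convergence thresholds); it is illustrative, not the software used in this work.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst
    (Kabsch algorithm: SVD of the centred cross-covariance)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=50):
    """Point-to-point ICP: repeat nearest-neighbour matching and rigid
    re-alignment until src settles onto dst."""
    cur = src.copy()
    for _ in range(iters):
        # brute-force nearest neighbour in dst for every point of cur
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(axis=2)
        R, t = best_rigid_transform(cur, dst[d2.argmin(axis=1)])
        cur = cur @ R.T + t
    return cur
```

Because consecutive helmet-scanner frames overlap heavily and move only slightly, the nearest-neighbour correspondences are mostly correct from the start, which is exactly the regime in which plain ICP converges reliably.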


The second technique is a semi-automated plane-based registration system for coarse registration of laser-scanned 3D point clouds with project 3D models [53], using a one-click RANSAC-supported extraction method. The next technique is principal component analysis (PCA) [54], which uses ICP and the Levenberg–Marquardt algorithm. Another technique is the method for computing persistent point feature histograms [55], which assembles pairs of histogram features in the first point cloud and checks for corresponding pairs (points with similar histograms and distances to each other) in the second one. Similarly, the Fast Point Feature Histograms (FPFH) method for 3D registration [56] uses the sample consensus-based initial alignment algorithm (SAC-IA), which performs fast searches to find a good alignment solution that can be further refined using a non-linear optimization method.
Another promising technique that covers the problem of positioning and registration of point clouds is simultaneous localisation and mapping (SLAM), which constructs the unknown environment and simultaneously keeps track of the locations of the scanner. SLAM does not solve the construction of an unknown environment perfectly, but at operational compliance [57–69]. Different algorithms, e.g. Kalman filters, particle filters and Monte Carlo methods, are mostly used in combination, according to the purposes and methods, for example in the Google Tango Project.

Our method proposes acquisition of multiple partial point clouds, which are related to various locations and generated simultaneously and continuously. Research continues to find the most appropriate algorithm to fill the gap of precise partial point cloud registration.

2.4. Partial point cloud positioning and orientation

As mentioned in Section 2.3, the ideal information for exact registration of pPCs is their position within the RCS. To obtain it, the scanner position and orientation have to be recorded automatically along with the time stamp during partial point cloud acquisition.

Generally, there are various positioning methods, such as GPS navigation, Wi-Fi triangulation, the use of Bluetooth beacons, etc. [70–72]. The positioning error for GPS is 5–20 m, for Wi-Fi 5–15 m and for Bluetooth 1–3 m, which is insufficient for pPC registration. Taneja et al. [73] evaluate in their study the effects of positioning data quality (absolute and relative) and navigation models (network and metric) on map-matching of indoor positioning data with eight different algorithms, in order to obtain accurate and reliable indoor positioning. The main finding is that the one-size-fits-all principle is not applicable in the case of map-matching of indoor positioning data; the study instead suggests providing accurate starting positions and recalibration points at fixed known locations with Bluetooth beacons or radio frequency identification (RFID) tags. For further investigation, the study assumes adding Gaussian noise to the raw data during the Monte Carlo simulation process with positioning data fusion along with map-matching, fusing relative and absolute positioning data.

In fact, integrated platforms for upgrading the scanner for precise positioning and orientation are currently being developed.

2.5. Point cloud cleaning and cutting

Registration of multiple pPCs leads to duplication of points, which does not contribute to better quality or more accurate identification of elements, but just increases the size of the point cloud and slows down the entire process. It is, therefore, appropriate to clean the duplicate points by setting an appropriate minimal distance between two neighbouring points, which will still satisfy the criteria of point cloud density and local element coverage for correct element identification, as described in [74].

During 3D scanning, temporary objects emerge in the point cloud which are not included in the 3D model (such as depots of installation material, building machinery and equipment, other workers, etc.). Such areas of a point cloud can lead to false identification and should be removed by the cutting method. Therefore, BIM element bounding boxes are created for this purpose, representing the volumes of the elements increased by 10%. All points outside of the bounding boxes are obsolete and therefore cut off.

Both cleaning and cutting can significantly reduce the number of points in the point cloud and will accelerate the entire process. The result is a point cloud of the entire as-built situation, the 4D AB PC, which includes all elements that have been built in a specific time interval Δt (Fig. 3). In other words, this 4D AB PC is not necessarily complete, since elements built before the time interval Δt might be missing. However, the operation of completing the 4D model is performed in the following step, as it is easier to merge elements on the object model level than on the point cloud level.

Fig. 3. 4D as-built point cloud (4D AB PC) of the observed building after positioning and orientation of partial point clouds, partial point cloud registration, point cloud cutting and cleaning.

2.6. Generating the 4D AB BIM by identification of 3D BIM elements contained within 4D AB PC

3D BIM includes the exact geometry of each building element. In this step, the 4D AB PC(i) and the 3D BIM are compared to identify all elements of the 3D BIM contained in the 4D AB PC. For identification, software developed at the University of Maribor is used (the method is explained in detail in [74]), which identifies elements correctly when more than 1/3 of their surface is covered by cloud points. The result is a Δt AB BIM, which contains all elements found in the observed time interval Δt. This means some elements that already exist on-site could be missing; therefore, the created Δt AB BIM is merged with the previous ones to generate the total 4D AB BIM. Fig. 4 shows an example of the identification process to generate the 4D AB BIM model.

Fig. 4. Elements of 3D BIM, identified within 4D AB PC, result in 4D AB BIM, which actually represents the intersection of both input sets.

2.7. 4D AB BIM and 4D AD BIM comparison

4D AB BIM represents all elements of the construction project that have already been built up to the specific point in time. Based on the existence of the 4D AD BIM and comparison with the 4D AB BIM, it is easy to identify deviations from the schedule plan. The difference is represented as a list of elements that are either built ahead of the planned time or have not been built on time.
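The cleaning and cutting operations of Section 2.5 can be sketched as follows. Duplicate thinning is approximated here by a grid round-off at the minimal spacing (a cheap stand-in for true nearest-neighbour filtering), and cutting keeps only points inside element bounding boxes enlarged by 10%; both are illustrative, not the authors' implementation.

```python
import numpy as np

def clean_duplicates(points, min_dist=0.005):
    """Thin the cloud so that no two kept points share one min_dist-sized
    grid cell, enforcing an approximate minimal neighbour spacing."""
    cells = np.floor(points / min_dist).astype(np.int64)
    _, keep = np.unique(cells, axis=0, return_index=True)
    return points[np.sort(keep)]

def cut_to_elements(points, boxes, margin=0.10):
    """Keep only points inside some BIM-element axis-aligned bounding box,
    with each box enlarged by 10% as described in Section 2.5."""
    mask = np.zeros(len(points), dtype=bool)
    for lo, hi in boxes:
        lo, hi = np.asarray(lo, float), np.asarray(hi, float)
        centre = (lo + hi) / 2
        half = (hi - lo) / 2 * (1 + margin)
        mask |= np.all(np.abs(points - centre) <= half, axis=1)
    return points[mask]
```

Running `clean_duplicates` first shrinks the cloud uniformly; `cut_to_elements` then drops temporary objects (machinery, material depots, other workers) that fall outside every enlarged element box.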


Fig. 5. Difference between 4D AD BIM and 4D AB BIM and the resulting elements.

If the list of elements is empty, it means there are no differences and everything was done at the right time. Since each element is linked to an activity in the project schedule, deviations in the schedule can be identified and listed. Fig. 5 illustrates the difference operation on an example.

Fig. 5 shows a set of elements that are missing in the 4D AB BIM and are thus behind the schedule. These elements are emphasized in Fig. 6 in the context of the 4D AD BIM, while Fig. 7 shows the delay in the project schedule caused by the missing elements.

Fig. 6. Differences between both models, the 4D AD BIM with missing elements emphasized.

Trimble Vico Office [75], a BIM project management software, was used for the comparison of the 4D AD BIM and 4D AB BIM models. In Vico Office, a direct connection between 3D and 4D BIM is established, so it is easy to represent differences as isolated resulting elements (Fig. 5), as highlighted elements in the context of the 4D AD BIM (Fig. 6), and also as deviations in the schedule plan (planned vs. actual), as shown in Fig. 7. A fully automatic BIM model comparison and schedule updating is planned in the continuation of our project. For BIM model comparison, the Solibri Model Checker [76] could be used, and for schedule updates some promising methods are given in publications [77–79].

Fig. 7. Differences in time schedule (difference activities).

3. Method validation

The validation of the proposed method was performed on Kindergarten Pekre during construction. The building has two floors and is designed in an "L" form with dimensions of 11.10 m × 17.10 m and 21.10 m × 9.90 m, and heights of 9.47 m and 9.17 m. The gross area of one floor is 398.70 m², or 797.40 m² in total. The main construction materials are clay bricks and reinforced concrete slabs.

The 3D BIM geometry model was created using ArchiCAD 20, with the local coordinate axes and origin set as shown in Fig. 8. The geolocation (latitude φ, longitude λ, height h) and orientation were determined as shown in Fig. 9.

Fig. 8. 3D BIM geometry model and its local coordinate system.

Because our methodology is in the testing phase, monitoring was not carried out permanently but on selected dates, when more significant work progress was performed. This means that on the specific dates a simulation of partial point cloud data acquisition at different workplaces was done within a specific time interval Δt (of the order of magnitude of an hour). For the purpose of the methodology validation, the building was built, as modelled, with four floors as shown in Fig. 10. Each floor was divided into workplace zones.

3.1. Partial point cloud acquisition

In our experiments, we used the Kinect v2 scanner, which has the following characteristics [44,46]:

• RGB camera with a resolution of 1920 × 1080 2D pixels
• IR camera with a resolution of 512 × 424 3D pixels
• Frame rate up to 30 Hz
• Field of view for depth sensing of 70 degrees horizontally and 60 degrees vertically
• Useful range from 0.8 to 4.5 m
• Depth accuracy in the central position < 2 mm within a distance of 0.5–3.0 m from the Kinect v2
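To illustrate how such a depth sensor's output becomes points in the reference coordinate system: the sketch below derives approximate intrinsics from the resolution and field of view listed above (a calibrated camera matrix would be used in practice) and assumes the compass orientation is a pure rotation about the vertical axis. Both are illustrative assumptions, not the authors' implementation.

```python
import math
import numpy as np

W, H = 512, 424                                   # depth resolution, as listed
FX = (W / 2) / math.tan(math.radians(70 / 2))     # focal length from 70° HFOV
FY = (H / 2) / math.tan(math.radians(60 / 2))     # focal length from 60° VFOV
CX, CY = W / 2, H / 2                             # principal point at centre

def backproject(u, v, depth_m):
    """Depth pixel (u, v) with range depth_m -> point in the scanner's
    local frame (origin at the IR sensor, z pointing forward)."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return np.array([x, y, depth_m])

def local_to_rcs(points, position, orientation_deg):
    """Transform scanner-local points into the reference coordinate system,
    given the scanner position (X, Y, Z) and a heading in degrees, modelled
    here as a rotation about the vertical axis plus a translation."""
    a = math.radians(orientation_deg)
    rz = np.array([[math.cos(a), -math.sin(a), 0.0],
                   [math.sin(a),  math.cos(a), 0.0],
                   [0.0,          0.0,         1.0]])
    return np.atleast_2d(points) @ rz.T + np.asarray(position, float)
```

With per-scan position and orientation recorded (as later tabulated for the validation workplaces), this chain takes each depth frame all the way into the RCS, ready for merging into the 4D AB PC.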


Fig. 9. Building geolocation and orientation.

Fig. 10. The four floors of the test building: foundation (a), ground floor (b), 1st floor (c) and roof (d).

Kinect uses its own local coordinate system with the origin (x = 0, y = 0, z = 0) located at the centre of the IR sensor [80], as shown in Fig. 11.

Fig. 11. Scanner local coordinate system and origin.

The anticipated scanning technology does not ensure a high point cloud accuracy and density; it does, however, fit the required criteria and enables efficient element identification for the required classes of elements according to the used metrics [74].

In total, 77 workplaces were observed and 3388 partial point clouds


Fig. 12. Three workers (worker A – (a), worker B – (b) and worker C – (c)) mounted a window frame on the ground floor. Real scene image (d) and appropriate part
of the 3D BIM model (e).

Fig. 13. A worker built a partition wall on the 1st floor. Partial point cloud (a), real scene image (b) and appropriate part of the 3D BIM model (c).

Fig. 14. Partial point clouds from worker A (a), worker B (b) and worker C (c).

were created during construction. The first example of a partial point cloud is the workplace where three workers mounted a window frame on the ground floor (Fig. 12). The second example is the workplace where a worker built a partition wall on the 1st floor (Fig. 13).

3.2. Partial point cloud multiple frames registration

For the registration of partial point cloud frames, the CloudCompare software was used. In both examples, the ICP algorithm was used to align the multiple frames of the individual partial point clouds from each worker. In the first example, the partial point cloud from worker A has 407,981 points, that from worker B 648,459 points, and that from worker C 864,612 points (Fig. 14). In the second example, the partial point cloud results in 648,459 points (Fig. 15).

3.3. Partial point cloud registration

Fig. 15. Partial point cloud of worker D's workplace.

The first example was demanding in terms of registration, because the partial point clouds of each worker were far apart and could not be


Table 2
Appropriate values for precise positioning and orientation of the three workers
A, B, and C.
Worker X [m] Y [m] Z [m] Orientation [°]

A 26.69 9.61 1.60 216.06


B 28.36 9.88 1.85 214.58
C 29.81 9.49 1.77 261.04

positioning and orientation was determined manually, using a laser


Fig. 16. Final single point cloud of the workplace where three workers
distance meter and a compass (see Fig. 17). The position of the 3D
mounted a window frame on the ground floor. scanner is given by X, Y, Z coordinates, which represent the coordinates
in the reference coordinate system and are aligned with the coordinate
system of the 3D BIM model. The orientation is given in degrees related
to North.
The partial point clouds were registered automatically using the ICP algorithm. In general, two partial point clouds can be aligned by picking at least four equivalent points belonging to both point clouds, where one is the reference cloud and the other the aligned cloud. This requires a lot of manual work, is time-consuming, sometimes has to be repeated, and the result is not precise, which limits its usefulness. As described in Section 2.3, there is no automated solution yet; it remains an open problem. One possible solution for automation would be to attach a precise positioning and orientation subsystem to the Kinect. In our first example, the precise positioning and orientation was done manually (see Section 3.4). The result was a point cloud with 1,921,052 points (Fig. 16).

3.4. Partial point cloud positioning and orientation

In our methodology validation, the partial point cloud precise positioning and orientation was determined manually, using a laser distance meter and a compass (Fig. 17).

Fig. 17. Laser distance meter (a) and compass (b).

For the first example, the precise positioning and orientation are shown in Fig. 18, with the corresponding coordinates and orientation angles in Table 2. For the second example, they are shown in Fig. 19 and Table 3.

3.5. Point cloud cleaning and cutting

Duplicate points in point clouds are surplus points that are not needed for correct identification of elements; a point is treated as a duplicate when it lies closer than 5 mm to a neighbouring point. The result of duplicate point removal is a point cloud with a higher level of uniformity and a lower level of density, which speeds up the subsequent procedures. In the first example, the single point cloud was reduced from 1,921,052 to 816,307 points (Fig. 20).

As described in Section 2.5, the areas with temporary objects represent obsolete points in the point cloud and should be cut away. For the second example, cutting and cleaning resulted in a point cloud with 278,935 points (Fig. 21).

Following the steps described above for all 3388 partial point clouds, the final point cloud of the entire as-built scene, the 4D AB PC at the end of construction, included 92,631,785 points (Fig. 22).

3.6. Generating 4D AB BIM by identification of 3D BIM elements contained in 4D AB PC

Identification of BIM elements within the point cloud was performed using software developed at the University of Maribor (the method is explained in detail in [74]). The input to the programme is

Fig. 18. Precise positioning and orientation of the three workers (A, B and C), who mounted a window frame on the ground floor.
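The manual alignment of two partial point clouds from at least four picked equivalent point pairs reduces to a least-squares rigid transform, which has a closed-form SVD (Kabsch) solution. The sketch below illustrates that general technique with numpy on hypothetical coordinates; it is not the authors' implementation.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t such that R @ src_i + t ~ dst_i
    (Kabsch method). src and dst are Nx3 arrays of corresponding points."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    h = (src - cs).T @ (dst - cd)                 # cross-covariance of centred pairs
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))        # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    return r, cd - r @ cs

# Four picked point pairs (hypothetical coordinates): dst is src rotated 30°
# about the vertical axis and translated.
src = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
a = np.radians(30.0)
r_true = np.array([[np.cos(a), -np.sin(a), 0.],
                   [np.sin(a),  np.cos(a), 0.],
                   [0., 0., 1.]])
dst = src @ r_true.T + np.array([2.0, 3.0, 0.5])

r, t = rigid_transform(src, dst)
aligned = src @ r.T + t                           # coincides with dst
```

With real picked points the correspondences are noisy, so the recovered transform is only approximate, which is one reason the paper calls manual alignment imprecise.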

Z. Pučko et al. Advanced Engineering Informatics 38 (2018) 27–40

Fig. 19. Precise positioning and orientation of the worker building a partition wall on the 1st floor.
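The pose values in Tables 2 and 3 (position X, Y, Z and an orientation angle) define a rigid transform from scanner-local to site coordinates. A minimal sketch, assuming the orientation is a heading about the vertical axis (the paper does not state the exact convention); worker D's values from Table 3 are used.

```python
import numpy as np

def partial_cloud_to_site(points, x, y, z, heading_deg):
    """Transform scanner-local points (Nx3) into site coordinates.
    Assumes the orientation angle is a rotation about the vertical (Z) axis;
    this convention is an illustrative assumption."""
    a = np.radians(heading_deg)
    rz = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
    return points @ rz.T + np.array([x, y, z])

# Worker D's pose from Table 3: one local point 1 m ahead of the scanner
local = np.array([[1.0, 0.0, 0.0]])
site = partial_cloud_to_site(local, 25.04, 19.94, 5.87, 340.57)
```

The scanner origin itself maps to the measured position (25.04, 19.94, 5.87), so errors in the manual measurement propagate directly into the registered cloud.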

the 3D AD BIM, the 4D AD BIM, and the 4D AB PC, where the 4D BIM and PC have to be related to the same time interval. The output is a 4D AB BIM, which includes all identified elements, each carrying an attribute that represents the matching with the 4D AD BIM. Using this attribute, the visual output can show the missing elements in a different colour. Fig. 23 shows the input (a) and output (b); in this case there were no missing elements (all are shown in green), meaning that all elements (the window frame) had been finished as planned.

Table 3
Precise positioning and orientation values for worker D.

Worker   X [m]    Y [m]    Z [m]    Orientation [°]
D        25.04    19.94    5.87     340.57

In the second example, according to the schedule, a partition wall had to be built and the door mounted. The result of the identification process shows that the wall (green) exists and the door (red) is missing (Fig. 24).

In general, the identification process was performed in the appropriate time interval Δt for all elements, where the appropriate models (3D AD BIM, 4D AD BIM and 4D AB PC) are used as input, and the output represents the Δt AB BIM. For a more illustrative view, our paper presents two examples: the first has the time stamp 22.08.2017 and the second the time stamp 19.05.2017.

Fig. 25 shows the final result of the identification process for both examples, the Δt AB BIM models, related to the time interval Δt.

Since the resulting Δt AB BIM contains only the changes made in the observed time interval Δt, we need to add it to all previously created Δt(i) AB BIM to generate the total 4D AB BIM (see Table 1):

4D AB BIM = Σ_{i=1}^{n} Δt(i) AB BIM

Fig. 20. Cleaned point cloud of the first example.

3.7. 4D AB BIM and 4D AD BIM comparison

Comparison of the 4D AB BIM and the 4D AD BIM models was performed for both examples. There are no differences in the first example, which means that the mounting of the window frame on the ground floor, carried out by the three workers, was done on time. In other words, no deviations appeared in the schedule plan. However, there is a deviation in the second example (see Section 3.6), which can be presented either as a list of elements or in the form of highlighted elements in the 4D AD BIM model, as shown in Fig. 26.

The highlighted element (the missing door) is linked to the relevant

Fig. 21. Point cloud of the second example before cutting (a), after cutting (b) and after final cleaning (c).
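The summation of the Δt(i) AB BIM models into the total 4D AB BIM is, in effect, a union of per-interval element sets. A minimal sketch with hypothetical element IDs; the real models are of course full BIM objects, not plain dictionaries.

```python
# Each Δt AB BIM is modelled here as a mapping from a (hypothetical) element
# ID to the time stamp at which the element was identified as built.
delta_bims = [
    {"wall_1f_01": "2017-05-19"},
    {"door_1f_01": "2017-06-02"},
    {"window_frame_gf_01": "2017-08-22"},
]

def accumulate(deltas):
    """4D AB BIM = union of all Δt(i) AB BIM; the earliest stamp wins if an
    element appears in more than one interval."""
    total = {}
    for delta in deltas:
        for elem, stamp in delta.items():
            total.setdefault(elem, stamp)
    return total

ab_bim_4d = accumulate(delta_bims)   # all elements with their build time stamps
```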


Fig. 22. Entire point cloud of the as-built scene, the 4D AB PC model.
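A 4D as-built point cloud such as the one in Fig. 22 can be represented by stamping each point with its acquisition time. The sketch below combines this with a grid subsampling that approximates the 5 mm duplicate-removal threshold of Section 3.5; the exact algorithm used in the paper is not specified, so this is only illustrative.

```python
import numpy as np

def dedup_5mm(points):
    """Keep one point per 5 mm grid cell (approximates duplicate removal)."""
    keys = np.floor(points / 0.005).astype(np.int64)
    _, idx = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(idx)]

def merge_4d(partials):
    """Merge (time, Nx3) partial clouds into one Nx4 array: x, y, z, time."""
    rows = []
    for t, pts in partials:
        pts = dedup_5mm(pts)
        rows.append(np.column_stack([pts, np.full(len(pts), float(t))]))
    return np.vstack(rows)

pts = np.array([[0.000, 0.0, 0.0],
                [0.001, 0.0, 0.0],   # closer than 5 mm to the first point
                [0.020, 0.0, 0.0]])
cloud_4d = merge_4d([(1.0, pts)])    # one partial scan, time stamp 1.0
```

The time column makes it possible to slice the merged cloud by interval Δt, which is what the identification step in Section 3.6 requires.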

Fig. 23. Aligned 3D BIM model and point cloud as the input for the identification process in the first example (a), and the results of the identification process (b).

Fig. 24. Aligned 3D BIM model and point cloud as the input for the identification process in the second example (a), and the results of the identification process (b).
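The identification step shown in Figs. 23 and 24 is described in detail in [74]. As a simplified stand-in, an element can be declared built when enough cloud points fall within its tolerance-grown bounding box; all thresholds and geometry below are illustrative, not the paper's actual criteria.

```python
import numpy as np

def element_exists(cloud, bbox_min, bbox_max, min_points=50, tol=0.01):
    """Simplified stand-in for the identification in [74]: an element counts
    as built when at least min_points cloud points lie inside its axis-aligned
    bounding box grown by tol metres. Thresholds are illustrative."""
    lo = np.asarray(bbox_min) - tol
    hi = np.asarray(bbox_max) + tol
    inside = np.all((cloud >= lo) & (cloud <= hi), axis=1)
    return int(inside.sum()) >= min_points

# A synthetic cloud densely covering a 4 m x 0.1 m x 2.5 m partition wall
rng = np.random.default_rng(0)
wall_cloud = rng.uniform([0, 0, 0], [4.0, 0.1, 2.5], size=(5000, 3))

wall_built = element_exists(wall_cloud, [0, 0, 0], [4.0, 0.1, 2.5])   # True
door_built = element_exists(wall_cloud, [10, 0, 0], [11, 0.1, 2.1])   # False
```

In the paper's second example the wall passes such a test (green) while the door has no supporting points (red, missing).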

activity in the project schedule, where deviations in the schedule can be identified and listed. Fig. 27 illustrates the planned schedule, the actual progress report and the re-planned schedule.

The identified deviation in the second example (the missing door) represents a delay in the schedule, which was caused by the subcontractor responsible for door delivery.

4. Conclusion and ongoing research

The paper presents a novel approach to automated construction project progress monitoring, based on the comparison of the 4D as-built BIM and the 4D as-designed BIM models (Scan-vs-BIM). The acquisition of point clouds on the construction site requires the presence of workers with protective helmets equipped with 3D scanning devices. Since workers constantly observe their workplace, continuous point cloud acquisition of all changes during the construction process is ensured. The proposed method overcomes many deficiencies of known existing methods. As a result, point cloud acquisition is continuous and does not require any additional monitoring or surveying activities. Moreover, data is acquired in real time, where


Fig. 25. Δt AB BIM model of the first (a), and of the second example (b).

Fig. 26. 4D AD BIM of the second example with the highlighted element missing in the 4D AB BIM.

Fig. 27. Planned schedule (a), re-planned schedule with baseline (b) and actual progress report (c).
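Linking missing elements to schedule activities, as in Figs. 26 and 27, lets delayed activities be listed automatically. A minimal sketch; the activity names are illustrative, and the dates echo the two example time stamps from Section 3.6.

```python
from datetime import date

# Planned finish dates per activity, and the dates at which the corresponding
# elements were detected in the 4D AB BIM (names and dates are illustrative).
planned = {
    "mount window frame":   date(2017, 8, 22),
    "build partition wall": date(2017, 5, 19),
    "mount door":           date(2017, 5, 19),
}
detected = {
    "mount window frame":   date(2017, 8, 22),
    "build partition wall": date(2017, 5, 19),
}

def deviations(planned, detected, today):
    """Classify each activity: delayed if overdue and nothing was detected,
    otherwise on time or late according to the detected date."""
    report = []
    for activity, due in planned.items():
        done = detected.get(activity)
        if done is None:
            report.append((activity, "delayed" if due < today else "pending"))
        else:
            report.append((activity, "on time" if done <= due else "late"))
    return report

report = deviations(planned, detected, today=date(2017, 8, 23))
# "mount door" is reported as delayed, matching the missing door in Fig. 26
```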

changes occur, either indoors or outdoors; it enables the use of autonomous mobile low-cost 3D scanning devices, while real-time detection of the built elements enables early detection of deviations from the schedule plan.

Furthermore, the 3D scanning devices do not need additional expert handling, special maintenance activities or preliminary preparation for use. They do not need to be highly accurate, as they are only required to produce range images and gather multiple partial point clouds of workplaces in the vicinity of a worker. The partial point clouds ensure adequate registration into the total 4D AB PC (the point cloud of the whole building under construction) of a specific time frame. This 4D AB PC is used together with the 3D AD BIM as the input for the element identification process, which results in a structured model of the on-site situation, the 4D AB BIM. The 4D AB BIM is then compared to the 4D AD BIM and a list of differences is generated. Since the difference elements are linked to activities in the project schedule, the delayed activities, as well as those ahead of due time, are recognised and listed.

In conclusion, the research project is not yet finished and there are some open problems and disadvantages. For the automation of partial


point cloud registration, the worker helmets should be equipped with a subsystem for precise positioning and orientation, and possibly also with a subsystem for communication with the server. In our experiment, we simulated some activities manually which should otherwise be automated. Further, several solutions are anticipated for the precise positioning and orientation, e.g. inertial positioning using Bluetooth beacons, UWB tags and anchors, Wi-Fi-based positioning, and some others. The advantage is that we can improve an approximate location with the help of some significant elements, which we can recognise from the BIM model; the existence of the 3D AD BIM is thus also significant for the registration of partial point clouds. For communication with the server, a Wi-Fi network could be installed on site and the protective helmets upgraded with a Wi-Fi subsystem. To simplify the helmet equipment, another solution would be to upload data at the end of the working day, when helmets are put into docking stations. We have also anticipated helmet equipment in an affordable price range.

After solving the mentioned open problems, we plan to move from the testbed to a real construction project, where we plan to use (probably awkward) prototype scanning helmets, upgraded with positioning and communication subsystems, and a server for partial point cloud registration and comparison of the 4D AD and 4D AB BIM.

References

[1] C. Kim, H. Son, C. Kim, Automated construction progress measurement using a 4D building information model and 3D data, Autom. Constr. 31 (2013) 75–82, http://dx.doi.org/10.1016/j.autcon.2012.11.041.
[2] M. Kopsida, I. Brilakis, P. Vela, A review of automated construction progress and inspection methods, Proc. 32nd CIB W78 Conf. Constr. IT, 2015, pp. 421–431.
[3] T. Omar, M.L. Nehdi, Data acquisition technologies for construction progress tracking, Autom. Constr. 70 (2016) 143–155, http://dx.doi.org/10.1016/j.autcon.2016.06.016.
[4] S. Alizadehsalehi, I. Yitmen, The impact of field data capturing technologies on automated construction project progress monitoring, Procedia Eng. (2016) 97–103, http://dx.doi.org/10.1016/j.proeng.2016.08.504.
[5] M. Golparvar-Fard, F. Peña-Mora, S. Savarese, Monitoring of construction performance using daily progress photograph logs and 4D as-planned models, University of Illinois, Urbana-Champaign, n.d., pp. 53–63.
[6] Y. Turkan, F. Bosche, C.T. Haas, R. Haas, Automated progress tracking using 4D schedule and 3D sensing technologies, Autom. Constr. 22 (2012) 414–421, http://dx.doi.org/10.1016/j.autcon.2011.10.003.
[7] J. Yang, O. Arif, P.A. Vela, J. Teizer, Z. Shi, Tracking multiple workers on construction sites using video cameras, Adv. Eng. Informatics 24 (2010) 428–434, http://dx.doi.org/10.1016/j.aei.2010.06.008.
[8] M. Memarzadeh, M. Golparvar-Fard, J.C. Niebles, Automated 2D detection of construction equipment and workers from site video streams using histograms of oriented gradients and colors, Autom. Constr. 32 (2013) 24–37, http://dx.doi.org/10.1016/j.autcon.2012.12.002.
[9] S. Chi, C.H. Caldas, Image-based safety assessment: automated spatial safety risk identification of earthmoving and surface mining activities, J. Constr. Eng. Manage. 138 (2012) 341–351, http://dx.doi.org/10.1061/(ASCE)CO.1943-7862.0000438.
[10] J. Gong, C.H. Caldas, An object recognition, tracking, and contextual reasoning-based video interpretation method for rapid productivity analysis of construction operations, Autom. Constr. 20 (2011) 1211–1226, http://dx.doi.org/10.1016/j.autcon.2011.05.005.
[11] B.N. Baker, B.C. Murphy, D. Fisher, Factors affecting project success, A Comp. Anal. between Law Pract. Penins. Malaysia Repub. Singapore Common Wealth Law Bull. (2000) 145–173, http://psrcentre.org/images/extraimages/271214036.pdf.
[12] S.A. Assaf, S. Al-Hejji, Causes of delay in large construction projects, Int. J. Proj. Manage. 24 (2006) 349–357, http://dx.doi.org/10.1016/j.ijproman.2005.11.010.
[13] M. Golparvar-Fard, F. Peña-Mora, C.A. Arboleda, S. Lee, Visualization of construction progress monitoring with 4D simulation model overlaid on time-lapsed photographs, J. Comput. Civ. Eng. 23 (2009) 391–404, http://dx.doi.org/10.1061/(ASCE)0887-3801(2009)23:6(391).
[14] R. Navon, R. Sacks, Assessing research issues in Automated Project Performance Control (APPC), Autom. Constr. 16 (2007) 474–484, http://dx.doi.org/10.1016/j.autcon.2006.08.001.
[15] S. Scott, S. Assadi, A survey of the site records kept by construction supervisors, Constr. Manag. Econ. 17 (1999) 375–382, http://dx.doi.org/10.1080/014461999371574.
[16] M. Golparvar-Fard, F. Peña-Mora, S. Savarese, Integrated sequential as-built and as-planned representation with tools in support of decision-making tasks in the AEC/FM industry, J. Constr. Eng. Manag. 137 (2011), http://dx.doi.org/10.1061/(ASCE)CO.1943-7862.0000371.
[17] A. Braun, S. Tuttas, A. Borrmann, A concept for automated construction progress monitoring using BIM-based geometric constraints and photogrammetric point clouds, J. Inf. Technol. Constr. 20 (2015) 68–79, http://www.itcon.org/2015/5.
[18] B. Akinci, S. Kiziltas, E. Ergen, I. Karaesmen, F. Keceli, Modeling and analyzing the impact of technology on data capture and transfer processes at construction sites: a case study, J. Constr. Eng. Manage. 132 (2006) 1148–1157, http://dx.doi.org/10.1061/(ASCE)0733-9364(2006)132:11(1148).
[19] H. Fathi, I. Brilakis, A videogrammetric as-built data collection method for digital fabrication of sheet metal roof panels, Adv. Eng. Informatics 27 (2013) 466–476, http://dx.doi.org/10.1016/j.aei.2013.04.006.
[20] A. Bhatla, S.Y. Choe, O. Fierro, F. Leite, Evaluation of accuracy of as-built 3D modeling from photos taken by handheld digital cameras, Autom. Constr. 28 (2012) 116–127, http://dx.doi.org/10.1016/j.autcon.2012.06.003.
[21] C. Kim, C. Kim, H. Son, Fully automated registration of 3D data to a 3D CAD model for project progress monitoring, Autom. Constr. 35 (2013) 587–594, http://dx.doi.org/10.1016/j.autcon.2013.01.005.
[22] N. Hyland, S.O. Keeffe, C. Dore, S. Brodie, Automatic validation of as-is and as-generated IFC BIMs for advanced scan-to-BIM methods, (2017).
[23] M. Golparvar-Fard, F. Peña-Mora, S. Savarese, Automated progress monitoring using unordered daily construction photographs and IFC-based building information models, J. Comput. Civ. Eng. 29 (2015) 4014025, http://dx.doi.org/10.1061/(ASCE)CP.1943-5487.0000205.
[24] H. Fathi, F. Dai, M. Lourakis, Automated as-built 3D reconstruction of civil infrastructure using computer vision: achievements, opportunities, and challenges, Adv. Eng. Informatics 29 (2015) 149–161, http://dx.doi.org/10.1016/j.aei.2015.01.012.
[25] Z.A. Memon, M. Zaimi, A. Majid, M. Mustaffar, A systematic approach for monitoring and evaluating the construction project progress, J. Inst. Eng. Malay. 67 (2006), http://dspace.unimap.edu.my/dspace/bitstream/123456789/13560/1/026-032_systematic approach.pdf (accessed September 8, 2017).
[26] V. Pătrăucean, I. Armeni, M. Nahangi, J. Yeung, I. Brilakis, C. Haas, State of research in automatic as-built modelling, Adv. Eng. Informatics 29 (2015) 162–171, http://dx.doi.org/10.1016/j.aei.2015.01.001.
[27] F. Bosché, Automated recognition of 3D CAD model objects in laser scans and calculation of as-built dimensions for dimensional compliance control in construction, Adv. Eng. Informatics 24 (2010) 107–118, http://dx.doi.org/10.1016/j.aei.2009.08.006.
[28] K.K. Han, D. Cline, M. Golparvar-Fard, Formalized knowledge of construction sequencing for visual monitoring of work-in-progress via incomplete point clouds and low-LoD 4D BIMs, Adv. Eng. Informatics 29 (2015) 889–901, http://dx.doi.org/10.1016/j.aei.2015.10.006.
[29] F. Bosché, M. Ahmed, Y. Turkan, C.T. Haas, R. Haas, The value of integrating Scan-to-BIM and Scan-vs-BIM techniques for construction monitoring using laser scanning and BIM: the case of cylindrical MEP components, Autom. Constr. 49 (2015) 201–213, http://dx.doi.org/10.1016/j.autcon.2014.05.014.
[30] R. Maalek, F. Sadeghpour, Accuracy assessment of ultra-wide band technology in tracking static resources in indoor construction scenarios, Autom. Constr. 30 (2013) 170–183, http://dx.doi.org/10.1016/j.autcon.2012.10.005.
[31] A.M. Costin, J. Teizer, B. Schoner, RFID and BIM-enabled worker location tracking to support real-time building protocol control and data visualization, J. Inf. Technol. Constr. 20 (2015) 495–517.
[32] C. Zhang, D. Arditi, Automated progress control using laser scanning technology, Autom. Constr. 36 (2013) 108–116, http://dx.doi.org/10.1016/j.autcon.2013.08.012.
[33] I. Brilakis, M. Lourakis, R. Sacks, S. Savarese, S. Christodoulou, J. Teizer, A. Makhmalbaf, Toward automated generation of parametric BIMs based on hybrid video and laser scanning data, Adv. Eng. Informatics 24 (2010) 456–465, http://dx.doi.org/10.1016/j.aei.2010.06.006.
[34] V. Escorcia, M.A. Dávila, M. Golparvar-Fard, J.C. Niebles, Automated vision-based recognition of construction worker actions for building interior construction operations using RGBD cameras, Constr. Res. Congr. 2012 (2012) 879–888, http://dx.doi.org/10.1061/9780784412329.089.
[35] T. Krijnen, J. Beetz, An IFC schema extension and binary serialization format to efficiently integrate point cloud data into building models, Adv. Eng. Informatics (2016), http://dx.doi.org/10.1016/j.aei.2017.03.008.
[36] P.W. Theiler, Automated Registration of Terrestrial Laser Scanner Point Clouds, 2015.
[37] S. Bechtold, B. Höfle, HELIOS: a multi-purpose LiDAR simulation framework for research, planning and training of laser scanning operations with airborne, ground-based mobile and stationary platforms, ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. (2016) 161–168, http://dx.doi.org/10.5194/isprsannals-III-3-161-2016.
[38] S. Hong, I. Park, J. Lee, K. Lim, Y. Choi, H.G. Sohn, Utilization of a terrestrial laser scanner for the calibration of mobile mapping systems, Sensors (Switzerland) 17 (2017), http://dx.doi.org/10.3390/s17030474.
[39] S. El-Omari, O. Moselhi, Integrating 3D laser scanning and photogrammetry for progress measurement of construction work, Autom. Constr. 18 (2008) 1–9, http://dx.doi.org/10.1016/j.autcon.2008.05.006.
[40] G.F. Mani, P.M. Feniosky, S. Savarese, D4AR: a 4-dimensional augmented reality model for automating construction progress monitoring data collection, processing and communication, Electron. J. Inf. Technol. Constr. 14 (2009) 129–153.
[41] Z. Pučko, D. Rebolj, Automated construction progress monitoring using continuous multipoint indoor and outdoor 3D scanning, in: Lean Comput. Constr. Congr. – Vol. 1, Proc. Jt. Conf. Comput. Constr., 2017, pp. 105–112, http://doi.org/10.24928/JC3-2017/0021.
[42] I. Armeni, O. Sener, A. Zamir, H. Jiang, 3D semantic parsing of large-scale indoor spaces, in: CVPR, 2016, pp. 1534–1543, http://doi.org/10.1109/CVPR.2016.170.


[43] K. Khoshelham, S.O. Elberink, Accuracy and resolution of Kinect depth data for indoor mapping applications, Sensors (Basel) 12 (2012) 1437–1454, http://dx.doi.org/10.3390/s120201437.
[44] E. Lachat, H. Macher, M.A. Mittet, T. Landes, P. Grussenmeyer, First experiences with Kinect v2 sensor for close range 3D modelling, Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. – ISPRS Arch. 40 (2015) 93–100, http://doi.org/10.5194/isprsarchives-XL-5-W4-93-2015.
[45] O. Wasenmüller, D. Stricker, Comparison of Kinect v1 and v2 depth images in terms of accuracy and precision, Lect. Notes Comput. Sci. 10117 LNCS, 2017, pp. 34–45, http://dx.doi.org/10.1007/978-3-319-54427-4_3.
[46] L. Yang, L. Zhang, H. Dong, A. Alelaiwi, A. El Saddik, Evaluating and improving the depth accuracy of Kinect for Windows v2, IEEE Sens. J. 15 (2015) 4275–4285, http://dx.doi.org/10.1109/JSEN.2015.2416651.
[47] E. Lachat, H. Macher, M.-A. Mittet, T. Landes, P. Grussenmeyer, First experiences with Kinect v2 sensor for close range 3D modelling, Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. XL-5/W2, 2015, http://dx.doi.org/10.5194/isprsarchives-XL-5-W4-93-2015.
[48] S. Rusinkiewicz, M. Levoy, Efficient variants of the ICP algorithm, 2001, http://dx.doi.org/10.1109/IM.2001.924423.
[49] O. Ozyesil, V. Voroninski, R. Basri, A. Singer, A survey of structure from motion, 2017, pp. 1–40, http://dx.doi.org/10.1017/S096249291700006X.
[50] M.J. Westoby, J. Brasington, N.F. Glasser, M.J. Hambrey, J.M. Reynolds, "Structure-from-Motion" photogrammetry: a low-cost, effective tool for geoscience applications, Geomorphology 179 (2012) 300–314, http://dx.doi.org/10.1016/j.geomorph.2012.08.021.
[51] F. Dellaert, S.M. Seitz, C.E. Thorpe, S. Thrun, Structure from motion without correspondence, in: Proc. IEEE Conf. Comput. Vis. Pattern Recognition (CVPR 2000), vol. 2, n.d., pp. 557–564, http://doi.org/10.1109/CVPR.2000.854916.
[52] K. Häming, G. Peters, The structure-from-motion reconstruction pipeline – a survey with focus on short image sequences, Kybernetika 46 (2010) 926–937.
[53] F. Bosché, Plane-based registration of construction laser scans with 3D/4D building models, Adv. Eng. Informatics 26 (2012) 90–102, http://dx.doi.org/10.1016/j.aei.2011.08.009.
[54] J. Serafin, G. Grisetti, Using extended measurements and scene merging for efficient and robust point cloud registration, Rob. Auton. Syst. 92 (2017) 91–106, http://dx.doi.org/10.1016/j.robot.2017.03.008.
[55] R.B. Rusu, N. Blodow, Z.C. Marton, M. Beetz, Aligning point cloud views using persistent feature histograms, 2008 IEEE/RSJ Int. Conf. Intell. Robot. Syst. (IROS), 2008, pp. 3384–3391, http://dx.doi.org/10.1109/IROS.2008.4650967.
[56] R.B. Rusu, N. Blodow, M. Beetz, Fast Point Feature Histograms (FPFH) for 3D registration, 2009 IEEE Int. Conf. Robot. Autom., 2009, pp. 3212–3217, http://dx.doi.org/10.1109/ROBOT.2009.5152473.
[57] N. Zikos, V. Petridis, 6-DoF low dimensionality SLAM (L-SLAM), J. Intell. Robot. Syst. Theory Appl. 79 (2015) 55–72, http://dx.doi.org/10.1007/s10846-014-0029-6.
[58] J. Engel, T. Schöps, D. Cremers, LSD-SLAM: large-scale direct monocular SLAM, Lect. Notes Comput. Sci. 8690 LNCS, 2014, pp. 834–849, http://dx.doi.org/10.1007/978-3-319-10605-2_54.
[59] T. Pire, T. Fischer, G. Castro, P. De Cristóforis, J. Civera, J. Jacobo Berlles, S-PTAM: stereo parallel tracking and mapping, Rob. Auton. Syst. 93 (2017) 27–42, http://doi.org/10.1016/j.robot.2017.03.019.
[60] R. Mur-Artal, J.M.M. Montiel, J.D. Tardós, ORB-SLAM: a versatile and accurate monocular SLAM system, IEEE Trans. Robot. 31 (2015) 1147–1163.
[61] D. Zou, P. Tan, CoSLAM: collaborative visual SLAM in dynamic environments, IEEE Trans. Pattern Anal. Mach. Intell. 35 (2013) 354–366, http://dx.doi.org/10.1109/TPAMI.2012.104.
[62] M. Magnabosco, T.P. Breckon, Cross-spectral visual simultaneous localization and mapping (SLAM) with sensor handover, Rob. Auton. Syst. 61 (2013) 195–208, http://dx.doi.org/10.1016/j.robot.2012.09.023.
[63] N. Karlsson, E. Di Bernardo, J. Ostrowski, L. Goncalves, P. Pirjanian, M.E. Munich, The vSLAM algorithm for robust localization and mapping, Proc. IEEE Int. Conf. Robot. Autom. 2005, 2005, pp. 24–29, http://dx.doi.org/10.1109/ROBOT.2005.1570091.
[64] C.-C. Wang, C. Thorpe, S. Thrun, M. Hebert, H. Durrant-Whyte, Simultaneous localization, mapping and moving object tracking, Int. J. Rob. Res. 26 (2007) 889–916, http://dx.doi.org/10.1177/0278364907081229.
[65] V. Nguyen, A. Harati, R. Siegwart, A lightweight SLAM algorithm using orthogonal planes for indoor mobile robotics, IEEE Int. Conf. Intell. Robot. Syst. 2007, pp. 658–663, http://dx.doi.org/10.1109/IROS.2007.4399512.
[66] S. Lynen, T. Sattler, M. Bosse, J. Hesch, M. Pollefeys, R. Siegwart, Get out of my lab: large-scale, real-time visual-inertial localization, Robot. Sci. Syst. XI (2015), http://dx.doi.org/10.15607/RSS.2015.XI.037.
[67] T. Schops, T. Sattler, C. Hane, M. Pollefeys, 3D modeling on the go: interactive 3D reconstruction of large-scale scenes on mobile devices, in: Proc. 2015 Int. Conf. 3D Vision (3DV 2015), 2015, pp. 291–299, http://doi.org/10.1109/3DV.2015.40.
[68] M. Klingensmith, I. Dryanovski, S. Srinivasa, J. Xiao, Chisel: real time large scale 3D reconstruction onboard a mobile device using spatially hashed signed distance fields, Robot. Sci. Syst. XI (2015), http://dx.doi.org/10.15607/RSS.2015.XI.040.
[69] T. Sattler, M. Havlena, K. Schindler, M. Pollefeys, Large-scale location recognition and the geometric burstiness problem, in: 2016 IEEE Conf. Comput. Vis. Pattern Recognit., 2016, pp. 1582–1590, http://doi.org/10.1109/CVPR.2016.175.
[70] S. Mengin, C. Portes, J. Vialle, B. Vincent, Indoor positioning & navigation, Agone, 1–32 (2016).
[71] Pozyx – centimeter positioning for Arduino, n.d., https://www.pozyx.io/ (accessed October 5, 2017).
[72] Precise indoor positioning UWB RTLS – FCC certified, 30 cm accuracy, n.d., https://www.eliko.ee/products/kio-rtls/ (accessed October 5, 2017).
[73] S. Taneja, B. Akinci, J.H. Garrett, L. Soibelman, H.A. Karimi, Effects of positioning data quality and navigation models on map-matching of indoor positioning data, J. Comput. Civ. Eng. 30 (2016) 4014113, http://dx.doi.org/10.1061/(ASCE)CP.1943-5487.0000439.
[74] D. Rebolj, Z. Pučko, N.Č. Babič, M. Bizjak, D. Mongus, Point cloud quality requirements for Scan-vs-BIM based automated construction progress monitoring, Autom. Constr. 84 (2017) 323–334, http://dx.doi.org/10.1016/j.autcon.2017.09.021.
[75] BIM Solutions | General Contractor Solutions, n.d., http://gc.trimble.com/product-categories/bim-solutions (accessed April 3, 2018).
[76] Solibri, Solibri Model Checker, 2016, https://www.solibri.com/products/solibri-model-checker/ (accessed March 24, 2018).
[77] H. Hamledari, B. McCabe, S. Davari, A. Shahi, Automated schedule and progress updating of IFC-based 4D BIMs, J. Comput. Civ. Eng. 31 (2017) 1–16, http://dx.doi.org/10.1061/.
[78] H. Son, C. Kim, Y.K. Cho, Automated schedule updates using as-built data and a 4D building information model, J. Manag. Eng. 33 (2017) 1–13, http://dx.doi.org/10.1061/(ASCE)ME.1943-5479.0000528.
[79] T. Valenko, U. Klanšek, An integration of spreadsheet and project management software for cost optimal time scheduling in construction, Organ. Technol. Manage. Constr. Int. J. 9 (2017) 1627–1637, http://dx.doi.org/10.1515/otmcj-2016-0028.
[80] Microsoft, Coordinate mapping, Kinect Wind. SDK 2.0 (2014), https://msdn.microsoft.com/en-us/library/dn785530.aspx.

