Automated continuous construction progress monitoring
ARTICLE INFO

Keywords: Automation; Construction progress monitoring; Data acquisition; Scan-vs-BIM; 4D point cloud; BIM elements identification

ABSTRACT

In recent years, exponential growth has been detected in research efforts focused on automated construction progress monitoring. Despite various data acquisition methods and approaches, the success is limited. This paper proposes a new method, in which changes are perceived constantly and the as-built model is updated continuously during the construction process, instead of periodically scanning the whole building under construction. It turned out that low-precision 3D scanning devices, closely observing active workplaces, are sufficient for correct identification of the built elements. Such scanning devices are small enough to fit onto workers' protective helmets and onto the machinery in use. In this way, workers capture all workplaces inside and outside of the building in real time and record partial point clouds together with their locations and time stamps. The partial point clouds are then registered and merged into a complete 4D as-built point cloud of the building under construction. Identification of as-designed BIM elements within the 4D as-built point cloud then yields the 4D as-built BIM. Finally, comparing the 4D as-built BIM with the 4D as-designed BIM reveals the differences between the two models and thus the deviations from the time schedule. The differences are reported in virtual real time, which enables more efficient project management.
⁎ Corresponding author. E-mail address: [email protected] (Z. Pučko).
https://doi.org/10.1016/j.aei.2018.06.001
Received 30 October 2017; Received in revised form 8 April 2018; Accepted 1 June 2018
Available online 18 June 2018
1474-0346/ © 2018 Elsevier Ltd. All rights reserved.
Z. Pučko et al. Advanced Engineering Informatics 38 (2018) 27–40
Such an approach, also known as Scan-vs-BIM [29], provides efficient and timely data analysis and visualisation of discrepancies at an early stage to prevent potential upcoming delays, and enables timely decisions on corrective actions [2,13,22].

An overview of research on automated progress monitoring is given in papers that compare different approaches, results, capturing technologies, and shortcomings. Kopsida et al. [2] compare the advantages and disadvantages of real-time data capturing techniques on the construction site based on photogrammetry, videogrammetry, and laser scanning. In addition, the review by Omar & Nehdi [3] illustrates the categorization of the monitoring methods and the data acquisition technologies, as well as methods for data processing. Alizadehsalehi & Yitmen [4] discuss the impact of combining data capturing techniques with BIM in construction companies, whereas Pătrăucean et al. [26] classify point cloud based methods for creating as-built (AB) BIM models according to the existence of an as-designed (AD) BIM model. Further, several studies of point cloud based methods using laser scanning [23–28,36–38] regard these as expensive and time-consuming. Omar & Nehdi [3] claim that laser scanning is still not widely employed due to its high cost, need for a clear line of sight, and difficulty of use for interior works. Equally important are other obstacles, such as additional work required (e.g. equipment installation and calibration), marker or target installation, determining the locations of equipment, engaging experts for equipment handling, and limited usefulness of the obtained results. On the other hand, less expensive point cloud based methods used for reconstruction from images [23,24,33,39,40] do not solve the problem of progress monitoring for interior tasks, since most images are taken outdoors.

Omar & Nehdi [3] introduce a method where 3D range cameras (RGB-D cameras) have been used for data acquisition, producing range images, also referred to as depth images. The method is able to cover a wide field of view and is less costly than laser scanning, but more expensive than photogrammetry. It is suitable for short-range applications and has great potential for site spatial sensing and modelling applications. As stated in [41], RGB-D sensors and the digital cameras used for photogrammetry are nowadays even in the same price range. Furthermore, Kinect is one of the most popular 3D scanning devices, used for semantic analysis of buildings [42], assessment of the geometric quality of depth data and indoor mapping applications [43], close-range 3D measurements and modelling [44], and automated recognition of construction worker actions [34]. To date, two versions of Kinect have been developed and compared in [45], and their functionality was further improved in [46].

According to the findings in the reviewed works, none of the observed methods is yet able to provide comprehensive and reliable monitoring of construction that would cover the whole building (outdoor and indoor) through the entire construction process. Thus, our research focuses on methods based on a 4D as-built (AB) vs. 4D as-designed (AD) comparison, referred to as Scan-vs-BIM, while our goal was to develop a method which would remove the need for additional point cloud acquisition efforts and solve the problem of incomplete point clouds due to obscured building elements or missing parts of the construction (e.g. indoor or outdoor scanning only). The basic idea was to continuously generate 4D point clouds on the go by gathering 3D scans from workers' helmets equipped with appropriate low-cost 3D scanning devices, such as Kinect [43,47]. In our experiments we used the Kinect v2 sensor, but many other devices would fit this category (e.g. the Xtion Pro Live 3D sensor or the Structure sensor). The method is further explained in the paper.

2. Methodology

The proposed Scan-vs-BIM method for automated construction progress monitoring requires the existence of a 3D and a 4D AD BIM. The 4D AD BIM is permanently compared to the on-site situation in the form of a 4D AB BIM, which is generated through a series of partial point clouds. They are generated by 3D scanners observing each workplace and thus capturing all changes made. The obvious way to capture every change is to mount a scanner on every actor on a building site. In the case of workers, the protection helmet is the most practical place, as it moves with the eyes and consequently with the hands and tools of a worker. In the case of machines, a scanner shall be mounted at a location that follows the view of the active parts of the machine. In this way, scanners capture all changes performed on the building, and the partial point clouds ensure views of all emerging components, indoors and outdoors.

Partial point cloud data is enhanced with location and time information, which ensures its adequate registration, including merging into the 4D AB PC of a specific time interval i (e.g. an hour or a day). At the end of the time interval Δt, the 4D AB PC(i) is merged into the total 4D AB PC, which includes all changes from the beginning of construction, i.e. the complete as-built situation.

In the next step, an identification process matches elements from the 3D AD BIM against the 4D AB PC, resulting in a structured model of the on-site situation, the so-called 4D AB BIM. The 4D AB BIM is then compared to the 4D AD BIM and a list of differences is generated. Since the difference elements are linked to activities, the delayed activities, as well as those ahead of schedule, are recognised and listed. The main steps of the proposed Automated Continuous Construction Progress Monitoring (ACCPM) method are summarised in Table 1.

The basic supposition of our method is that work is always performed by or in the presence of workers, or by autonomous machines. Therefore, it is logical to scan and communicate the situation with their help in a way which does not hinder anyone in their primary work. The proposed method overcomes many deficiencies detected in previously known methods. It is anticipated to use equipment from the category of 3D sensing technologies for the production of range images.

Workers' protection helmets shall be equipped with scanning devices that are acceptably low in mass, volume and cost, ensure continuous point cloud acquisition in the vicinity of the worker's area (up to 3 m), enable indoor and outdoor operation, enable location and time detection, and are fully mobile with power autonomy for a working day. Positioning and wireless communication functionality shall be supported for easier processing.

The essential novelty of the ACCPM method lies in the way of point cloud acquisition, which (i) is continuous, (ii) does not require any additional monitoring or surveying activities, and (iii) takes place automatically only where changes occur (Fig. 1). In all other methods it is not known which parts of the building have been added or changed since the previous scan; therefore the entire building needs to be scanned every time construction progress has to be checked. Since such total scanning requires significant effort, it is not done daily, but rather at longer intervals. In ACCPM each change is scanned and merged with other partial scans within a given time interval Δt, creating a Δt AB BIM. The 4D AB BIM is then an accumulation of a continuous sequence of all Δt AB BIMs from the beginning of the project until the present moment. Hereby Δt is typically shorter than the scan intervals in other methods and can be set to any desired value, e.g. an hour or a day.

2.1. Partial point cloud acquisition

Point clouds are generated by wearable, helmet-mounted scanners that scan each workplace from a close distance of around 1 m. Hereby the scanning margin distances can be set according to the type of scanning device used. Since each scanning device is constantly observing the work in progress in front of a worker (or a machine), every change in the building is recorded (Fig. 2). The scanning devices transfer their partial scans to a server either wirelessly or through a docking station where a helmet is put at the end of the day. Overall, each way has its advantages and disadvantages regarding the configuration of the helmet scanning system (memory, Wi-Fi, power supply) and the required infrastructure.
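The grouping of time-stamped partial scans into interval point clouds described above can be sketched as follows. This is a minimal Python illustration with hypothetical data structures (the paper does not specify an implementation); `PartialScan`, `merge_into_intervals` and `total_point_cloud` are illustrative names only:

```python
from dataclasses import dataclass

# A partial point cloud as described in Section 2.1: points assumed to be
# already transformed into the reference coordinate system, plus a time stamp.
@dataclass
class PartialScan:
    timestamp: float   # seconds since the start of construction
    points: list       # [(x, y, z), ...] in reference coordinates

def merge_into_intervals(scans, dt):
    """Group partial scans into time intervals of length dt and merge
    each group into one interval point cloud, the 4D AB PC(i)."""
    intervals = {}
    for scan in scans:
        i = int(scan.timestamp // dt)          # index of interval Δt(i)
        intervals.setdefault(i, []).extend(scan.points)
    return intervals

def total_point_cloud(intervals):
    """The total 4D AB PC is simply the union of all interval clouds,
    accumulated from the beginning of construction."""
    total = []
    for i in sorted(intervals):
        total.extend(intervals[i])
    return total
```

With Δt set to one hour (3600 s), a scan taken at t = 10 s and one at t = 3700 s fall into intervals 0 and 1 respectively, and the total cloud is their union.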
Table 1
Steps of the proposed automated construction progress monitoring method.

Step 3: Registration of the partial point clouds (pPCa, pPCb, …) into the reference coordinate system* → 4D AB PC(i) for the i-th time interval Δt. Positioning and orientation information within the reference system is required for each pPC.
Step 6: Adding the Δt AB BIM, which represents the changes in Δt, to the complete 4D AB BIM: 4D AB BIM = ∑_{i=1}^{n} Δt(i) AB BIM.
Step 7: Finding differences between the 4D AD BIM and the 4D AB BIM: Eb = 4D AD BIM − 4D AB BIM (elements behind schedule); Ea = 4D AB BIM − 4D AD BIM (elements ahead of schedule). When Eb = {} and Ea = {}, there is no deviation from the schedule.

* For the reference coordinate system see further details in Section 2.4.
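Step 7 of Table 1 is essentially a set difference over element identifiers. A minimal sketch, under the assumption (not stated in the paper) that each BIM element can be keyed by a unique ID:

```python
def schedule_deviation(ad_elements, ab_elements):
    """Compare as-designed and as-built element sets for one time interval.

    ad_elements / ab_elements: sets of element IDs expected / identified.
    Returns (Eb, Ea) as in step 7 of Table 1: elements behind schedule
    and elements ahead of schedule.
    """
    eb = ad_elements - ab_elements   # planned but not yet built
    ea = ab_elements - ad_elements   # built ahead of plan
    return eb, ea

# When both returned sets are empty, construction is exactly on schedule.
```

For example, if the schedule expects a wall and a door but only the wall has been identified in the as-built model, `Eb` contains the door and `Ea` is empty.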
Fig. 1. Comparison of current Scan-vs-BIM methods and the proposed ACCPM method.
Fig. 5. Difference between 4D AD BIM and 4D AB BIM and the resulting elements.
Fig. 10. The four floors of the test building: foundation (a), ground floor (b), 1st floor (c) and roof (d).
Kinect uses its own local coordinate system with the origin (x = 0, y = 0, z = 0) located at the centre of the IR sensor [80], as shown in Fig. 11.

The anticipated scanning technology does not ensure high point cloud accuracy and density; it does, however, fit the required criteria and enables efficient element identification for the required classes of elements according to the used metrics [74].
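Registering a partial scan therefore means transforming points from the scanner's local coordinate system into the site reference system, given the scanner's position and orientation (cf. the X, Y, Z and Orientation values in Tables 2 and 3). A minimal sketch, assuming a rotation about the vertical axis by the orientation angle followed by a translation; the function name is illustrative and this is not the paper's actual registration code:

```python
import math

def local_to_reference(points, x, y, z, orientation_deg):
    """Transform scanner-local points into the reference coordinate system.

    points: iterable of (px, py, pz) in the scanner's local system.
    x, y, z: scanner position in the reference system [m].
    orientation_deg: horizontal orientation (rotation about the vertical
    axis) in degrees, as listed in Tables 2 and 3.
    """
    a = math.radians(orientation_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    out = []
    for px, py, pz in points:
        # rotate about the vertical axis, then translate to (x, y, z)
        rx = px * cos_a - py * sin_a
        ry = px * sin_a + py * cos_a
        out.append((rx + x, ry + y, pz + z))
    return out
```

A point one metre ahead of a scanner at (10, 20, 5) facing 90° ends up at approximately (10, 21, 5) in the reference system.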
In total, 77 workplaces were observed and 3388 partial point clouds were recorded.
Fig. 11. Scanner local coordinate system and origin.
Fig. 12. Three workers (worker A – (a), worker B – (b) and worker C – (c)) mounted a window frame on the ground floor. Real scene image (d) and appropriate part of the 3D BIM model (e).
Fig. 13. A worker built a partition wall on the 1st floor. Partial point cloud (a), real scene image (b) and appropriate part of the 3D BIM model (c).
Fig. 14. Partial point clouds from worker A (a), worker B (b) and worker C (c).
Table 2
Appropriate values for precise positioning and orientation of the three workers
A, B, and C.
Worker X [m] Y [m] Z [m] Orientation [°]
Fig. 17. Laser distance meter (a) and compass (b).

registered automatically by using the ICP algorithm. In general, two partial point clouds can be aligned by picking at least four equivalent points belonging to both point clouds, where one is the reference cloud and the other the aligned cloud. This requires a lot of manual work: it is time-consuming, it sometimes has to be repeated, the result is not precise, and its usefulness is limited. As described in Section 2.3, there is no automated solution; it is still an open problem. One possible solution for automation would be to attach a precise positioning and orientation subsystem to Kinect. In our first example, the precise positioning and orientation was done manually (see Section 3.4). The result was a point cloud with 1,921,052 points (Fig. 16).

3.4. Partial point cloud positioning and orientation

In our methodology validation, the partial point cloud precise

Duplicate points in point clouds, which are surplus points not needed for correct identification of elements, are defined as points whose distance to a neighbouring point is smaller than 5 mm. The result of duplicate point removal is a point cloud with a higher level of uniformity and a lower level of density, which speeds up the subsequent procedures. In the first example, the single point cloud was reduced from 1,921,052 to 816,307 points (Fig. 20).

As described in Section 2.5, the areas with temporary objects represent obsolete points in the point cloud and should be cut away. For the second example, the cutting and cleaning resulted in a point cloud with 278,935 points (Fig. 21).

Following the steps described above for all 3388 partial point clouds, the final point cloud of the entire as-built scene, the 4D AB PC at the end of construction, included 92,631,785 points (Fig. 22).

3.6. Generating the 4D AB BIM by identification of 3D BIM elements contained in the 4D AB PC

Identification of BIM elements within the point cloud was performed using software developed at the University of Maribor (the method is explained in detail in [74]). The input to the programme is
Fig. 18. Precise positioning and orientation of the three workers (A, B and C), who mounted a window frame on the ground floor.
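The duplicate-point removal with the 5 mm threshold can be sketched with a simple spatial grid filter: a point falling within 5 mm of an already-kept point is dropped. This is an illustrative stand-in, not the authors' implementation:

```python
def remove_duplicates(points, min_dist=0.005):
    """Drop points closer than min_dist (metres) to an already-kept point.

    Uses a uniform grid with cell size min_dist, so only the 27 cells
    around a candidate point need to be checked for near neighbours.
    """
    grid = {}      # (i, j, k) -> list of kept points in that cell
    kept = []
    for p in points:
        key = tuple(int(c // min_dist) for c in p)
        too_close = False
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                for dk in (-1, 0, 1):
                    cell = (key[0] + di, key[1] + dj, key[2] + dk)
                    for q in grid.get(cell, ()):
                        if sum((a - b) ** 2 for a, b in zip(p, q)) < min_dist ** 2:
                            too_close = True
        if not too_close:
            kept.append(p)
            grid.setdefault(key, []).append(p)
    return kept
```

A point 1 mm from a kept point is removed, while one 10 mm away survives, which yields the more uniform, less dense cloud described above.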
Fig. 19. Precise positioning and orientation of the worker building a partition wall on the 1st floor.
Table 3
Appropriate values for precise positioning and orientation of the worker D.

Worker | X [m] | Y [m] | Z [m] | Orientation [°]
D | 25.04 | 19.94 | 5.87 | 340.57

Fig. 20. Cleaned point cloud of the first example.

the 3D AD BIM, the 4D AD BIM, and the 4D AB PC, where the 4D BIM and PC have to be related to the same time interval. The output is a 4D AB BIM, which includes all identified elements, each carrying an attribute that represents the matching with the 4D AD BIM. Using this attribute, the visual output can show the missing elements in a different colour. Fig. 23 shows the input (a) and output (b); in this case there were no missing elements (all are shown in green), meaning that all elements (the window frame) had been finished as planned.

In the second example, according to the schedule, a partition wall had to be built and the door mounted. The result of the identification process shows that the wall (green) exists and the door (red) is missing (Fig. 24).

In general, the identification process was done in the appropriate time interval Δt for all elements, where the appropriate models (3D AD BIM, 4D AD BIM and 4D AB PC) are used as input, and the output represents the Δt AB BIM. Our paper presents two examples for a more illustrative view: the first example has the time stamp 22.08.2017 and the second example the time stamp 19.05.2017.

Fig. 25 shows the final result of the identification process for both examples: the Δt AB BIM models related to the time interval Δt.

Since the resulting Δt AB BIM contains only the changes made in the observed time interval Δt, we need to add it to all previously created Δt(i) AB BIMs to generate the total 4D AB BIM (see Table 1):

4D AB BIM = ∑_{i=1}^{n} Δt(i) AB BIM

3.7. 4D AB BIM and 4D AD BIM comparison

Comparison of the 4D AB BIM and the 4D AD BIM models was performed for both examples. There are no differences in the first example, which means that the mounting of the window frame on the ground floor, carried out by the three workers, was done on time; in other words, no deviations from the schedule plan appeared. However, there is a deviation in the second example (see Section 3.6), which can be presented either as a list of elements or in the form of highlighted elements in the 4D AD BIM model, as shown in Fig. 26.

The highlighted element (the missing door) is linked to the relevant

Fig. 21. Point cloud of the second example before cutting (a), after cutting (b) and after final cleaning (c).
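The accumulation 4D AB BIM = ∑ Δt(i) AB BIM amounts to a running union of interval models. A minimal sketch, treating each Δt(i) AB BIM as a mapping from element ID to the interval in which the element was identified (the representation is assumed for illustration, not taken from the paper's software):

```python
def accumulate_ab_bim(interval_bims):
    """Union a sequence of Δt(i) AB BIMs into the total 4D AB BIM.

    interval_bims: list of dicts {element_id: interval_index}, one per
    Δt(i), in chronological order. The earliest identification wins, so
    the result records when each element first appeared as built.
    """
    total = {}
    for bim in interval_bims:
        for element_id, interval in bim.items():
            total.setdefault(element_id, interval)
    return total
```

Re-detecting an already-identified element in a later interval does not change its recorded identification time, which keeps the 4D history consistent.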
Fig. 22. Entire point cloud of the as-built scene, the 4D AB PC model.
Fig. 23. Aligned 3D BIM model and point cloud as the input for the identification process in the first example (a), and the results of the identification process (b).
Fig. 24. Aligned 3D BIM model and point cloud as the input for the identification process in the second example (a), and the results of the identification process (b).
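Because each BIM element is linked to an activity in the project schedule, a difference list can be turned directly into a list of delayed activities. A brief sketch under the assumption of a simple element-to-activity mapping (illustrative names, not the authors' software):

```python
def delayed_activities(missing_elements, element_to_activity):
    """Map elements missing from the 4D AB BIM to schedule activities.

    missing_elements: iterable of element IDs behind schedule (Eb).
    element_to_activity: dict {element_id: activity_name}.
    Returns the sorted list of activities that are behind schedule.
    """
    return sorted({element_to_activity[e] for e in missing_elements})
```

In the second example below, the missing door maps to its door-mounting activity, which is then reported as delayed.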
activity in the project schedule, where deviations in the schedule can be identified and listed. Fig. 27 illustrates the planned schedule, the actual progress report and the re-planned schedule.

The identified deviation in the second example (the missing door) represents a delay in the schedule, which was caused by the subcontractor responsible for door delivery.

4. Conclusion and ongoing research

The paper presents a novel approach to automated construction project progress monitoring, based on the comparison of the 4D as-built BIM and the 4D as-designed BIM models (Scan-vs-BIM). The acquisition of point clouds on the construction site requires the presence of workers with protection helmets equipped with 3D scanning devices. Since workers constantly observe their workplace, continuous point cloud acquisition of all changes during the construction process is ensured. The proposed method overcomes many deficiencies of known existing methods. As a result, the new method of point cloud acquisition is continuous and does not require any additional monitoring or surveying activities. Moreover, data acquisition takes place in real time where
Fig. 25. Δt AB BIM model of the first (a), and of the second example (b).
Fig. 26. 4D AD BIM of the second example with the highlighted element missing in the 4D AB BIM.
Fig. 27. Planned schedule (a), re-planned schedule with baseline (b) and actual progress report (c).
changes occur, either indoors or outdoors; it enables the use of autonomous mobile low-cost 3D scanning devices, while real-time detection of the built elements enables early detection of deviations from the schedule plan.

Furthermore, the 3D scanning devices do not need additional experts' work for handling, special maintenance activities, or preliminary preparatory work. They do not need to be highly accurate, as they are only required to produce range images and gather multiple partial point clouds of workplaces in the vicinity of a worker. The partial point clouds ensure adequate registration into the total 4D AB PC (the point cloud of the whole building under construction) of a specific time frame. This 4D AB PC is used together with the 3D AD BIM as the input for the element identification process, which results in a structured model of the on-site situation, the 4D AB BIM. The 4D AB BIM is then compared to the 4D AD BIM and a list of differences is generated. Since the difference elements are linked to activities in the project schedule, the delayed activities, as well as those ahead of due time, are recognised and listed.

In conclusion, the research project is not yet finished and there are some open problems and disadvantages. For the automation of partial
point cloud registration, the worker helmets should be equipped with a subsystem for precise positioning and orientation, and possibly also with a subsystem for communication with the server. In our experiment, we simulated some activities manually which should otherwise be automated. Further, several solutions are anticipated for the precise positioning and orientation, e.g. inertial positioning, Bluetooth beacons, UWB tags and anchors, Wi-Fi-based positioning, and some others. The advantage is that we can improve an approximate location with the help of some significant elements which we can recognise from the BIM model. The existence of the 3D AD BIM is therefore also significant for the registration of partial point clouds. For communication with the server, a Wi-Fi network could be installed on site and the protection helmets upgraded with a Wi-Fi subsystem. To simplify the helmet equipment, another solution would be to upload the data at the end of the working day, when the helmets are put into docking stations. We have also anticipated helmet equipment in an affordable price range.

After solving the mentioned open problems, we plan to move from the testbed to a real construction project, where we plan to use (probably awkward) prototype scanning helmets, upgraded with positioning and communication subsystems, and a server for partial point cloud registration and comparison of the 4D AD and 4D AB BIM.

References

[1] C. Kim, H. Son, C. Kim, Automated construction progress measurement using a 4D building information model and 3D data, Autom. Constr. 31 (2013) 75–82, http://dx.doi.org/10.1016/j.autcon.2012.11.041.
[2] M. Kopsida, I. Brilakis, P. Vela, A review of automated construction progress and inspection methods, in: Proc. 32nd CIB W78 Conf. Constr. IT, 2015, pp. 421–431.
[3] T. Omar, M.L. Nehdi, Data acquisition technologies for construction progress tracking, Autom. Constr. 70 (2016) 143–155, http://dx.doi.org/10.1016/j.autcon.2016.06.016.
[4] S. Alizadehsalehi, I. Yitmen, The impact of field data capturing technologies on automated construction project progress monitoring, Procedia Eng. (2016) 97–103, http://dx.doi.org/10.1016/j.proeng.2016.08.504.
[5] M. Golparvar-Fard, F. Peña-Mora, S. Savarese, Monitoring of construction performance using daily progress photograph logs and 4D as-planned models, University of Illinois, Urbana-Champaign, (n.d.), pp. 53–63.
[6] Y. Turkan, F. Bosche, C.T. Haas, R. Haas, Automated progress tracking using 4D schedule and 3D sensing technologies, Autom. Constr. 22 (2012) 414–421, http://dx.doi.org/10.1016/j.autcon.2011.10.003.
[7] J. Yang, O. Arif, P.A. Vela, J. Teizer, Z. Shi, Tracking multiple workers on construction sites using video cameras, Adv. Eng. Informatics 24 (2010) 428–434, http://dx.doi.org/10.1016/j.aei.2010.06.008.
[8] M. Memarzadeh, M. Golparvar-Fard, J.C. Niebles, Automated 2D detection of construction equipment and workers from site video streams using histograms of oriented gradients and colors, Autom. Constr. 32 (2013) 24–37, http://dx.doi.org/10.1016/j.autcon.2012.12.002.
[9] S. Chi, C.H. Caldas, Image-based safety assessment: automated spatial safety risk identification of earthmoving and surface mining activities, J. Constr. Eng. Manage. 138 (2012) 341–351, http://dx.doi.org/10.1061/(ASCE)CO.1943-7862.0000438.
[10] J. Gong, C.H. Caldas, An object recognition, tracking, and contextual reasoning-based video interpretation method for rapid productivity analysis of construction operations, Autom. Constr. 20 (2011) 1211–1226, http://dx.doi.org/10.1016/j.autcon.2011.05.005.
[11] B.N. Baker, B.C. Murphy, D. Fisher, Factors affecting project success, Common Wealth Law Bull. (2000) 145–173, http://psrcentre.org/images/extraimages/271214036.pdf.
[12] S.A. Assaf, S. Al-Hejji, Causes of delay in large construction projects, Int. J. Proj. Manage. 24 (2006) 349–357, http://dx.doi.org/10.1016/j.ijproman.2005.11.010.
[13] M. Golparvar-Fard, F. Peña-Mora, C.A. Arboleda, S. Lee, Visualization of construction progress monitoring with 4D simulation model overlaid on time-lapsed photographs, J. Comput. Civ. Eng. 23 (2009) 391–404, http://dx.doi.org/10.1061/(ASCE)0887-3801(2009)23:6(391).
[14] R. Navon, R. Sacks, Assessing research issues in Automated Project Performance Control (APPC), Autom. Constr. 16 (2007) 474–484, http://dx.doi.org/10.1016/j.autcon.2006.08.001.
[15] S. Scott, S. Assadi, A survey of the site records kept by construction supervisors, Constr. Manag. Econ. 17 (1999) 375–382, http://dx.doi.org/10.1080/014461999371574.
[16] M. Golparvar-Fard, F. Peña-Mora, S. Savarese, Integrated sequential as-built and as-planned representation with tools in support of decision-making tasks in the AEC/FM industry, J. Constr. Eng. Manag. 137 (2011), http://dx.doi.org/10.1061/(ASCE)CO.1943-7862.0000371.
[17] A. Braun, S. Tuttas, A. Borrmann, A concept for automated construction progress monitoring using BIM-based geometric constraints and photogrammetric point clouds, J. Inf. Technol. Constr. 20 (2015) 68–79, http://www.itcon.org/2015/5.
[18] B. Akinci, S. Kiziltas, E. Ergen, I. Karaesmen, F. Keceli, Modeling and analyzing the impact of technology on data capture and transfer processes at construction sites: a case study, J. Constr. Eng. Manage. 132 (2006) 1148–1157, http://dx.doi.org/10.1061/(ASCE)0733-9364(2006)132:11(1148).
[19] H. Fathi, I. Brilakis, A videogrammetric as-built data collection method for digital fabrication of sheet metal roof panels, Adv. Eng. Informatics 27 (2013) 466–476, http://dx.doi.org/10.1016/j.aei.2013.04.006.
[20] A. Bhatla, S.Y. Choe, O. Fierro, F. Leite, Evaluation of accuracy of as-built 3D modeling from photos taken by handheld digital cameras, Autom. Constr. 28 (2012) 116–127, http://dx.doi.org/10.1016/j.autcon.2012.06.003.
[21] C. Kim, C. Kim, H. Son, Fully automated registration of 3D data to a 3D CAD model for project progress monitoring, Autom. Constr. 35 (2013) 587–594, http://dx.doi.org/10.1016/j.autcon.2013.01.005.
[22] N. Hyland, S.O. Keeffe, C. Dore, S. Brodie, Validation of as-is and as-generated IFC BIMs for advanced scan-to-BIM methods, 2017.
[23] M. Golparvar-Fard, F. Peña-Mora, S. Savarese, Automated progress monitoring using unordered daily construction photographs and IFC-based building information models, J. Comput. Civ. Eng. 29 (2015) 4014025, http://dx.doi.org/10.1061/(ASCE)CP.1943-5487.0000205.
[24] H. Fathi, F. Dai, M. Lourakis, Automated as-built 3D reconstruction of civil infrastructure using computer vision: achievements, opportunities, and challenges, Adv. Eng. Informatics 29 (2015) 149–161, http://dx.doi.org/10.1016/j.aei.2015.01.012.
[25] Z.A. Memon, M. Zaimi, A. Majid, M. Mustaffar, A systematic approach for monitoring and evaluating the construction project progress, J. Inst. Eng. Malay. 67 (2006) (accessed September 8, 2017), http://dspace.unimap.edu.my/dspace/bitstream/123456789/13560/1/026-032_systematic approach.pdf.
[26] V. Pătrăucean, I. Armeni, M. Nahangi, J. Yeung, I. Brilakis, C. Haas, State of research in automatic as-built modelling, Adv. Eng. Informatics 29 (2015) 162–171, https://doi.org/10.1016/j.aei.2015.01.001.
[27] F. Bosché, Automated recognition of 3D CAD model objects in laser scans and calculation of as-built dimensions for dimensional compliance control in construction, Adv. Eng. Informatics 24 (2010) 107–118, http://dx.doi.org/10.1016/j.aei.2009.08.006.
[28] K.K. Han, D. Cline, M. Golparvar-Fard, Formalized knowledge of construction sequencing for visual monitoring of work-in-progress via incomplete point clouds and low-LoD 4D BIMs, Adv. Eng. Informatics 29 (2015) 889–901, http://dx.doi.org/10.1016/j.aei.2015.10.006.
[29] F. Bosché, M. Ahmed, Y. Turkan, C.T. Haas, R. Haas, The value of integrating Scan-to-BIM and Scan-vs-BIM techniques for construction monitoring using laser scanning and BIM: the case of cylindrical MEP components, Autom. Constr. 49 (2015) 201–213, http://dx.doi.org/10.1016/j.autcon.2014.05.014.
[30] R. Maalek, F. Sadeghpour, Accuracy assessment of Ultra-Wide Band technology in tracking static resources in indoor construction scenarios, Autom. Constr. 30 (2013) 170–183, http://dx.doi.org/10.1016/j.autcon.2012.10.005.
[31] A.M. Costin, J. Teizer, B. Schoner, RFID and BIM-enabled worker location tracking to support real-time building protocol control and data visualization, J. Inf. Technol. Constr. 20 (2015) 495–517.
[32] C. Zhang, D. Arditi, Automated progress control using laser scanning technology, Autom. Constr. 36 (2013) 108–116, http://dx.doi.org/10.1016/j.autcon.2013.08.012.
[33] I. Brilakis, M. Lourakis, R. Sacks, S. Savarese, S. Christodoulou, J. Teizer, A. Makhmalbaf, Toward automated generation of parametric BIMs based on hybrid video and laser scanning data, Adv. Eng. Informatics 24 (2010) 456–465, http://dx.doi.org/10.1016/j.aei.2010.06.006.
[34] V. Escorcia, M.A. Dávila, M. Golparvar-Fard, J.C. Niebles, Automated vision-based recognition of construction worker actions for building interior construction operations using RGBD cameras, in: Constr. Res. Congr. 2012, 2012, pp. 879–888, http://dx.doi.org/10.1061/9780784412329.089.
[35] T. Krijnen, J. Beetz, An IFC schema extension and binary serialization format to efficiently integrate point cloud data into building models, Adv. Eng. Informatics (2016), http://dx.doi.org/10.1016/j.aei.2017.03.008.
[36] P.W. Theiler, Automated Registration of Terrestrial Laser Scanner Point Clouds, 2015.
[37] S. Bechtold, B. Höfle, HELIOS: a multi-purpose LiDAR simulation framework for research, planning and training of laser scanning operations with airborne, ground-based mobile and stationary platforms, ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. (2016) 161–168, http://dx.doi.org/10.5194/isprsannals-III-3-161-2016.
[38] S. Hong, I. Park, J. Lee, K. Lim, Y. Choi, H.G. Sohn, Utilization of a terrestrial laser scanner for the calibration of mobile mapping systems, Sensors 17 (2017), http://dx.doi.org/10.3390/s17030474.
[39] S. El-Omari, O. Moselhi, Integrating 3D laser scanning and photogrammetry for progress measurement of construction work, Autom. Constr. 18 (2008) 1–9, http://dx.doi.org/10.1016/j.autcon.2008.05.006.
[40] G.F. Mani, P.M. Feniosky, S. Savarese, D4AR - a 4-dimensional augmented reality model for automating construction progress monitoring data collection, processing and communication, Electron. J. Inf. Technol. Constr. 14 (2009) 129–153.
[41] Z. Pučko, D. Rebolj, Automated construction progress monitoring using continuous multipoint indoor and outdoor 3D scanning, in: Lean Comput. Constr. Congr. - Vol. 1, Proc. Jt. Conf. Comput. Constr., 2017, pp. 105–112, http://doi.org/10.24928/JC3-2017/0021.
[42] I. Armeni, O. Sener, A. Zamir, H. Jiang, 3D semantic parsing of large-scale indoor spaces, in: CVPR, 2016, pp. 1534–1543, http://doi.org/10.1109/CVPR.2016.170.
[43] K. Khoshelham, S.O. Elberink, Accuracy and resolution of Kinect depth data for indoor mapping applications, Sensors (Basel) 12 (2012) 1437–1454, http://dx.doi.org/10.3390/s120201437.
[44] E. Lachat, H. Macher, M.A. Mittet, T. Landes, P. Grussenmeyer, First experiences
[61] …, Trans. Pattern Anal. Mach. Intell. 35 (2013) 354–366, http://dx.doi.org/10.1109/TPAMI.2012.104.
[62] M. Magnabosco, T.P. Breckon, Cross-spectral visual simultaneous localization and mapping (SLAM) with sensor handover, Rob. Auton. Syst. 61 (2013) 195–208,
with kinect V2 sensor for close range 3D modelling, Int. Arch. Photogramm. Remote http://dx.doi.org/10.1016/j.robot.2012.09.023.
Sens. Spat. Inf. Sci. - ISPRS Arch. 40 (2015) 93–100. http://doi.org/10.5194/ [63] N. Karlsson, E. Di Bernardo, J. Ostrowski, L. Goncalves, P. Pirjanian, M.E. Munich,
isprsarchives-XL-5-W4-93-2015. The vSLAM algorithm for robust localization and mapping, Proc. – IEEE Int. Conf.
[45] O. Wasenmüller, D. Stricker, Comparison of kinect v1 and v2 depth images in terms Robot. Autom. 2005, 2005, pp. 24–29, , http://dx.doi.org/10.1109/ROBOT.2005.
of accuracy and precision, Lect. Notes Comput. Sci. (Including Subser. Lect. Notes 1570091.
Artif. Intell. Lect. Notes Bioinformatics). 10117 LNCS, 2017, pp. 34–45, , http://dx. [64] C.-C. Wang, C. Thorpe, S. Thrun, M. Hebert, H. Durrant-Whyte, Simultaneous lo-
doi.org/10.1007/978-3-319-54427-4_3. calization, mapping and moving object tracking, Int. J. Rob. Res. 26 (2007)
[46] L. Yang, L. Zhang, H. Dong, A. Alelaiwi, A. El Saddik, Evaluating and improving the 889–916, http://dx.doi.org/10.1177/0278364907081229.
depth accuracy of kinect for Windows v2, IEEE Sens. J. 15 (2015) 4275–4285, [65] V. Nguyen, A. Harati, R. Siegwart, A lightweight SLAM algorithm using Orthogonal
http://dx.doi.org/10.1109/JSEN.2015.2416651. planes for indoor mobile robotics, IEEE Int. Conf. Intell. Robot. Syst. 2007, pp.
[47] E. Lachat, H. Macher, M.-A. Mittet, T. Landes, P. Grussenmeyer, first experiences 658–663, , http://dx.doi.org/10.1109/IROS.2007.4399512.
with kinect v2 sensor for close range 3D modelling, Int. Arch. Photogramm. Remote [66] S. Lynen, T. Sattler, M. Bosse, J. Hesch, M. Pollefeys, R. Siegwart, Get out of my lab:
Sens. Spat. Inf. Sci. XL-5/W2, 2015, http://dx.doi.org/10.5194/isprsarchives-XL-5- large-scale, real-time visual-inertial localization, Robot. Sci. Syst. XI. (2015), http://
W4-93-2015. dx.doi.org/10.15607/RSS.2015.XI.037.
[48] S. Rusinkiewicz, Efficient Variants of the ICP Algorithm a r c Levoy, 2001, http:// [67] T. Schops, T. Sattler, C. Hane, M. Pollefeys, 3D Modeling on the go: interactive 3D
dx.doi.org/10.1109/IM.2001.924423. reconstruction of large-scale scenes on mobile devices, in: Proc. – 2015 Int. Conf. 3D
[49] O. Ozyesil, V. Voroninski, R. Basri, A. Singer, A Survey of Structure from Motion, Vision, 3DV 2015, 2015, pp. 291–299. http://doi.org/10.1109/3DV.2015.40.
2017, pp. 1–40, , http://dx.doi.org/10.1017/S096249291700006X. [68] M. Klingensmith, I. Dryanovski, S. Srinivasa, J. Xiao, Chisel: real time large scale 3D
[50] M.J. Westoby, J. Brasington, N.F. Glasser, M.J. Hambrey, J.M. Reynolds, “Structure- reconstruction onboard a mobile device using spatially hashed signed distance
from-Motion” photogrammetry: a low-cost, effective tool for geoscience applica- fields, Robot. Sci. Syst. XI. (2015), http://dx.doi.org/10.15607/RSS.2015.XI.040.
tions, Geomorphology. 179 (2012) 300–314, http://dx.doi.org/10.1016/j. [69] T. Sattler, M. Havlena, K. Schindler, M. Pollefeys, Large-scale location recognition
geomorph.2012.08.021. and the geometric burstiness problem, in: 2016 IEEE Conf. Comput. Vis. Pattern
[51] F. Dellaert, S.M. Seitz, C.E. Thorpe, S. Thrun, Structure from motion without cor- Recognit., 2016, pp. 1582–1590. http://doi.org/10.1109/CVPR.2016.175.
respondence, in: Proc. IEEE Conf. Comput. Vis. Pattern Recognition. CVPR 2000 [70] S. Mengin, C. Portes, J. Vialle, B. Vincent, Indoor positioning & navigation, Agone.
(Cat. No.PR00662), vol. 2, n.d., pp. 557–564. http://doi.org/10.1109/CVPR.2000. 1–32 (2016).
854916. [71] Pozyx – centimeter positioning for arduino, n.d. https://www.pozyx.io/ (accessed
[52] K. Häming, G. Peters, The structure-from-motion reconstruction pipeline – a survey October 5, 2017).
with focus on short image sequences, Kybernetika 46 (2010) 926–937. [72] Precise Indoor Positioning UWB RTLS – FCC certified, 30cm accuracy, n.d. https://
[53] F. Bosché, Plane-based registration of construction laser scans with 3D/4D building www.eliko.ee/products/kio-rtls/ (accessed October 5, 2017).
models, Adv. Eng. Informatics 26 (2012) 90–102, http://dx.doi.org/10.1016/j.aei. [73] S. Taneja, B. Akinci, J.H. Garrett, L. Soibelman, H.A. Karimi, Effects of positioning
2011.08.009. data quality and navigation models on map-matching of indoor positioning data, J.
[54] J. Serafin, G. Grisetti, Using extended measurements and scene merging for efficient Comput. Civ. Eng. 30 (2016) 4014113, http://dx.doi.org/10.1061/(ASCE)CP.1943-
and robust point cloud registration, Rob. Auton. Syst. 92 (2017) 91–106, http://dx. 5487.0000439.
doi.org/10.1016/j.robot.2017.03.008. [74] D. Rebolj, Z. Pučko, N.Č. Babič, M. Bizjak, D. Mongus, Point cloud quality re-
[55] R.B. Rusu, N. Blodow, Z.C. Marton, M. Beetz, Aligning point cloud views using quirements for Scan-vs-BIM based automated construction progress monitoring,
persistent feature histograms, 2008 IEEE/RSJ Int. Conf. Intell. Robot. Syst. IROS, Autom. Constr. 84 (2017) 323–334, http://dx.doi.org/10.1016/j.autcon.2017.09.
2008, pp. 3384–3391, , http://dx.doi.org/10.1109/IROS.2008.4650967. 021.
[56] R.B. Rusu, N. Blodow, M. Beetz, Fast Point Feature Histograms (FPFH) for 3D re- [75] BIM Solutions|General Contractor Solutions, n.d. http://gc.trimble.com/product-
gistration, 2009 IEEE Int. Conf. Robot. Autom. 2009, pp. 3212–3217, , http://dx. categories/bim-solutions (accessed April 3, 2018).
doi.org/10.1109/ROBOT.2009.5152473. [76] Solibri, Solibri|Solibri Model Checker, Solibri, 2016. https://www.solibri.com/
[57] N. Zikos, V. Petridis, 6-DoF low dimensionality SLAM (L-SLAM), J. Intell. Robot. products/solibri-model-checker/ (accessed March 24, 2018).
Syst. Theory Appl. 79 (2015) 55–72, http://dx.doi.org/10.1007/s10846-014- [77] H. Hamledari, S.M. Asce, B. Mccabe, M. Asce, S. Davari, A. Shahi, Automated
0029-6. schedule and progress updating of IFC-based 4D BIMs, J. Comput. Civ. Eng. 31
[58] J. Engel, T. Schöps, D. Cremers, LSD-SLAM: Large-Scale Direct monocular SLAM, (2017) 1–16, http://dx.doi.org/10.1061/.
Lect. Notes Comput. Sci. (Including Subser. Lect. Notes Artif. Intell. Lect. Notes [78] H. Son, C. Kim, A.M. Asce, Y.K. Cho, M. Asce, Automated schedule updates using as-
Bioinformatics). 8690 LNCS, 2014, pp. 834–849, , http://dx.doi.org/10.1007/978- built data and a 4D building information model, J. Manag. Eng. 33 (2017) 1–13,
3-319-10605-2_54. http://dx.doi.org/10.1061/(ASCE)ME.1943-5479.0000528.
[59] T. Pire, T. Fischer, G. Castro, P. De Cristóforis, J. Civera, J. Jacobo Berlles, S-PTAM: [79] T. Valenko, U. Klanšek, An integration of spreadsheet and project management
Stereo Parallel Tracking and Mapping, Rob. Auton. Syst. 93 (2017) 27–42. http:// software for cost optimal time scheduling in construction, Organ. Technol. Manage.
doi.org/10.1016/j.robot.2017.03.019. Constr. Int. J. 9 (2017) 1627–1637, http://dx.doi.org/10.1515/otmcj-2016-0028.
[60] R. Mur-Artal, J.M.M. Montiel, J.D. Tardós, Orb-slam: a versatile and accurate [80] Microsoft, Coordinate mapping, Kinect Wind. SDK 2.0. 785530 (2014) 785530.
monocular slam system, IEEE Trans. Robot. 31 (2015) 1147–1163. https://msdn.microsoft.com/en-us/library/dn785530.aspx.
[61] D. Zou, P. Tan, CoSLAM: Collaborative visual SLAM in dynamic environments, IEEE
40