LMB Technical Bulletin No. 2 Series of 2017

The LMB Memorandum Circular (LMC) No. 2017-003 recognizes UAS as one of the instruments that may be used in the conduct of land survey. The LMB Technical Bulletin No. 2 series of 2017 contains standards and guidelines needed in the implementation of the LMC.

GUIDELINES ON THE USE OF UNMANNED AERIAL SYSTEMS (UAS) IN SUPPORT OF LAND SURVEY

Foreword

This Technical Bulletin contains the standards and guidelines needed in the implementation of LMB Memorandum Circular No. 2017-003, "On the Use of Unmanned Aerial Systems (UAS) to Support Land Surveys," as well as background information on UAS-related research activities undertaken by the University of the Philippines Department of Geodetic Engineering and Walter Volkmann of Florida, USA-based Micro Aerial Projects L.L.C., in collaboration with the Foundation for Economic Freedom (FEF) and the Land Management Bureau of the Department of Environment and Natural Resources (DENR).

Acting Director
Land Management Bureau

Table of Contents

Foreword
A.1 Background
  A.1.1 Basic differences between traditional photogrammetry and UAV photogrammetry
  A.1.2 UAS components
    A.1.2.1 Types of UAVs
    A.1.2.2 UAS sensors (cameras)
  A.1.3 Advantages and Challenges in Using UASs
A.2 Basic principle of UAV photogrammetry: Structure-from-Motion (SfM)/Multi-View Stereopsis
A.3 UAS Validation
  A.3.1 Typical test bed configuration
  A.3.2 Test Bed Measurements
A.4 Typical UAS Mapping Workflow
  A.4.1 Project Design
  A.4.2 Reconnaissance
  A.4.3 Flight Planning
  A.4.4 Establishment of Ground Control Points (GCPs)
  A.4.5 Aerial image acquisition
  A.4.6 Image processing
  A.4.7 Quality control (QC) of UAS mapping products
References

Table of Figures

Figure 1. Digitized parcel boundaries from extracted UAS-based image approach - Indonesia
Figure 2. Parcel boundaries manually digitized from UAS ortho-photo - Germany
Figure 3. Cadastral plan with UAS ortho-photo backdrop in Albania
Figure 4. Cadastral plan with UAS image-derived coordinates in Albania
Figure 5. Different camera orientations between traditional photogrammetry and UAV photogrammetry
Figure 6. Typical types of UAVs: fixed-wing and rotor-type
Figure 7. Typical UAS sensors
Figure 8. Structure-from-Motion (SfM)
Figure 9. 'Structure' refers to the scene features; 'motion' refers to the moving camera positions
Figure 10. Multi-view stereopsis (MvS) automatic feature matching algorithm incorporated in SfM
Figure 11. Multi-view stereopsis (MvS) automatic feature matching algorithm incorporated in SfM
Figure 12. SIFT object matching. Distinctive image features from scale-invariant keypoints
Figure 13. MVS-SIFT Work Flow
Figure 14. Sample test bed design: 250m x 250m, about 300 images with 70%/80% overlap
Figure 15. Typical UAS Mapping Work Flow
Figure 16. Relation of ground sample distance (GSD), camera focal length (f), pixel (p), and UAV flying height (h) above terrain
Figure 17. Image footprint and overlap
Figure 18. Pixel size and image resolution
Figure 19. Inspect project area for potential obstructions and sources of interference during GNSS observations and UAV flight
Figure 20. Project area on Google Earth, flight plan and GCPs
Figure 21. Image footprints showing forward overlaps
Figure 22. Well-distributed GCPs
Figure 23. Pre-marked GCPs in Barangay Poblacion, Norzagaray, Bulacan
Figure 24. Pre-marked GCPs at Bacoor test bed site
Figure 25. GCP on asphalt pavement with 2-inch concrete nail
Figure 26. Road marking and curb edges, which can be used as natural control points (highlighted with orange circles)
Figure 27. GCP observation in Barangay Poblacion, Norzagaray
Figure 28. GCP observation
Figure 29. Sparse point cloud with relative camera positions after photo alignment process (Fushe Milot, Albania)
Figure 30. Dense point cloud of Fushe Milot (Albania)
Figure 31. Polygon Mesh or Triangulated Irregular Network (TIN)
Figure 32. DSM and DTM
Figure 33. UAS ortho-photo, GSD 10cm
Figure 34. Normal error distribution (the normal, or Gaussian, distribution)
Figure 35. Effect of the number of GCPs on RMSE and standard deviation
Figure 36. Effect of the number of GCPs on RMSE and standard deviation
Figure 37. The original target (left) and the corresponding targets in the image
Figure 38. Determination of the target center points using the centroid operator
Figure 39. Example of poor-quality ortho-image
Figure 40. Example of good-quality ortho-image
Figure 41. Warped 'straight' fence
Figure 42. Warped automobile; image with motion blur
Figure 43. Pseudo-ortho-photo (shows building lean) and true ortho-photo (does not show building lean)
Figure 44. Shadows in ortho-imagery

Tables

Table 1. Basic Differences between Traditional Photogrammetry and UAV Photogrammetry
Table 2. Advantages and challenges associated with UASs
Table 3. Recommended minimum number of checkpoints for UAS ortho-photo mapping
Table 4. Horizontal accuracy classes for digital ortho-photos

A.1 Background

Existing mapping procedures related to cadastral surveys generally involve substantial fieldwork and various manual processes. UAV-based mapping workflows allow for the extraction of property boundaries visible on high-resolution UAV imagery using image analysis methods (Figures 1-4).

Figure 1. Digitized parcel boundaries from extracted UAS-based image approach - Indonesia. Source: Ramadhani (2016)
Figure 2. Parcel boundaries manually digitized from UAS ortho-photo - Germany. Source: Crommelinck et al. (2016)
Figure 3. Cadastral plan with UAS ortho-photo backdrop in Albania. Source: Volkmann and Barnes (2014)
Figure 4. Cadastral plan with UAS image-derived coordinates in Albania. Source: Volkmann (2014)

A.1.1 Basic differences between traditional photogrammetry and UAV photogrammetry

Figure 5. Different camera orientations between traditional photogrammetry and UAV photogrammetry. Source: Nissen (2018)

Traditional photogrammetry uses near-vertical (tilt < 3°) aerial photography to generate maps, while UAV photogrammetry acquires aerial images with varying tilt and orientation (Figure 5) and then uses image processing software, such as structure-from-motion, to generate ortho-photo maps. Furthermore, traditional photogrammetry relies on GCPs for aerial triangulation to generate the supplementary photo-control points needed in rectifying the aerial photographs; UAV photogrammetry, on the other hand, uses numerous identifiable ground features automatically matched by software to rectify the aerial images.

Table 1 below summarizes the basic differences between traditional photogrammetry and UAV photogrammetry:

Parameter      | Traditional Photogrammetry                  | UAV Photogrammetry
Aircraft       | Manned: pilot, navigator                    | Unmanned, remotely controlled
Flying height  | 500-10,000 meters                           | 5-150 meters
Platform       | Stable; small yaw, pitch and roll           | Less stable; larger yaw, pitch and roll
Endurance      | Long (hours)                                | Short (20-30 minutes with battery)
Camera         | High-performance, metric, calibrated, high optical quality; weight ~80-100 kg | Consumer, non-metric, often uncalibrated, moderate optical quality; weight ~1-5 kg
Area coverage  | Large; modest number of images              | Small; large number of images
Resolution     | Medium-high: ~0.05-1.00 m                   | High: ~0.01-0.10 m
Flight pattern | Usually parallel strips, carefully planned forward and side overlaps | Ideally parallel strips, sometimes more variable; large overlaps (~60-80%) planned to protect against gaps or tilts

Table 1. Basic Differences between Traditional Photogrammetry and UAV Photogrammetry. Source: Settengren and Walker (2016)

A.1.2 UAS components

UASs consist of three fundamental components (Volkmann and Barnes, 2014):

(a) UAV, commonly known as a drone: an aircraft without a human pilot on board, remotely controlled, carrying navigation equipment such as GNSS, an advanced autopilot module (APM), and an inertial measurement unit (IMU); a camera; and a battery (lithium-polymer);

(b) Laptop or base station, on which the initial flight planning is done and which shows real-time navigation, imagery and telemetry information (latitude, longitude, GNSS quality, etc.); and

(c) Remote control (RC) transmitter, which can be used to manually control the UAV and which receives basic data on the status of the vehicle (battery voltage, GNSS quality, roll, pitch and heading).

A.1.2.1 Types of UAVs

UAVs can be divided into two distinct types: fixed-wing and rotor, or vertical take-off and landing (VTOL). VTOL platforms are generally designed as multi-rotor copters with 3, 4 ('quadcopter'), 6 ('hexacopter') or 8 propellers ('octocopter'). Fixed-wing UAVs are designed much like conventional aircraft, which require forward motion to fly and space to take off and land. VTOL vehicles, on the other hand, require very little space for take-off and landing and have the ability to hover if so desired. VTOL UAVs are therefore preferable for smaller areas where high-resolution imagery is required. Fixed-wing UAVs are likely more productive over large areas, although these will generally be mapped at a lower resolution because of the speed at which these UAVs must fly and the restrictions on camera shutter speeds and continuous exposure rates.

Figure 6. Typical types of UAVs: fixed-wing and rotor-type.
Source: https://na.unep.net/geas/archive/pdfs/GEAS_May2013_EcoDrones.pdf

When selecting a UAV, it is important to consider whether coverage of the whole area is needed at the same time, or whether it is better to map incrementally as the spatial data is needed so that the data is more current.

A.1.2.2 UAS sensors (cameras)

UAV mappers use a wide range of cameras for their missions. Most cameras used for UAV mapping are lightweight and can be programmed to shoot pictures at regular intervals or controlled remotely. Small UASs used for mapping purposes have take-off weights from 1 kg to 5 kg and are equipped with non-metric digital cameras in the 10-16 megapixel range.

Figure 7. Typical UAS sensors. Source: US Oregon Department of Transportation (DOT), http://www.oregon.gov/odot

A.1.3 Advantages and Challenges in Using UASs

Advantages                                   | Challenges
Lightweight and easy to transport            | Limited flight time depending on model
Low-cost high-resolution images              | Limited by camera weight
Low-cost operations                          | Airspace limitations and legal restrictions
Can fly at a variety of altitudes            | Can be limited by wind speed and gusts
Can map areas not accessible by land or water on an on-demand schedule | Limited amount of appropriate image processing software
Quick availability of raw data               | Due to small image footprints, numerous images must be captured
Video recording capabilities                 | Time-intensive to create ortho-mosaics with minimal geographic reference datum errors

Table 2. Advantages and challenges associated with UASs. (Sources: Hardin and Hardin, 2010; Niethammer et al., 2012; CielMap, 2012; Laliberte et al., 2010)

A.2 Basic principle of UAV photogrammetry: Structure-from-Motion (SfM)/Multi-View Stereopsis

The SfM approach originated in the computer vision community and incorporates automatic feature-matching algorithms (Westoby et al., 2012).
Instead of a single stereo pair - the standard input in traditional photogrammetry - the SfM technique requires multiple, overlapping photographs as input to feature extraction and 3-D reconstruction algorithms. By using multiple overlapping images, SfM incorporates simultaneous, highly-redundant, iterative bundle adjustment procedures and extracts a database of features automatically. As a result, very accurate point matching between photographs can be achieved, and an extremely dense point cloud can be extracted. (Madden, M. et al., 2015)

Figure 8. Structure-from-Motion (SfM)
Figure 9. 'Structure' refers to the scene features; 'motion' refers to the moving camera positions. Source: http://image.slideserve.com

The SfM technique incorporates two automatic feature-matching algorithms, namely, (a) Multi-View Stereopsis (MvS), and (b) Scale Invariant Feature Transform (SIFT).

The term 'stereopsis' (or stereoscopic depth) refers to the perception of depth and 3-dimensional structure associated with binocular vision, or what is referred to as seeing in 3 dimensions. A basic requirement for traditional photogrammetry is the geometric stability of the aerial camera. UASs, however, use non-metric cameras which are not geometrically stable. To overcome this problem, UAS mapping uses a combination of traditional photogrammetry and computer vision techniques, known as Multi-View Stereopsis (MvS). MvS matching and reconstruction is a key ingredient in the automated acquisition of geometric object and scene models from multiple photographs. Figure 10 and Figure 11 below illustrate the MvS technique.

Figure 10. Multi-view stereopsis (MvS) automatic feature matching algorithm incorporated in SfM. Source: Left, Fig 6a - Snavely (2010)
Figure 11. Multi-view stereopsis (MvS) automatic feature matching algorithm incorporated in SfM. Source: Ruiz, J.J. et al.
(2013)

The SIFT algorithm, originally proposed by Professor David Lowe (University of British Columbia, Canada) in 1999 and refined in 2004, detects and describes local features in images. Cited in more than 12,000 publications as of 2016, SIFT is widely used in image search, object recognition, gesture recognition, video tracking and related tasks. (Figure 12)

Figure 12. SIFT object matching. Distinctive image features from scale-invariant keypoints. Source: Lowe (2004)

The workflow associated with MvS and SIFT automatic feature matching is shown in Figure 13.

Figure 13. MVS-SIFT Work Flow. Source: Barnes et al. (2014)

A.3 UAS Validation

UAS validation can be achieved by making comparative measurements over a test bed with a network of ground validation points for which highly-accurate 3-dimensional coordinates have been independently determined with the use of established terrestrial survey methods.

A.3.1 Typical test bed configuration

Figure 14. Sample test bed design: 250m x 250m, about 300 images with 70%/80% overlap. Source: Volkmann (2017)

A.3.2 Test Bed Measurements

The purpose of the test bed is to provide a calibration site for the UAS with well-defined conditions, designed to test the geometric performance of the UAS under real-world conditions, and where the different hardware configurations, flight patterns, cameras, etc. of various UAS operators can be compared under controlled conditions. The test will determine the achievable accuracy of the UAS, although it may not always be clear which factors have a significant impact on the achieved result: the software, the quality of the camera, or perhaps the number and distribution of the control points used. Aside from hardware-related factors, the efficiency and effectiveness of the software play a major role in obtaining accurate results.
The calibration test will likewise investigate the performance of different software packages.

A.4 Typical UAS Mapping Workflow

The general sequence of the activities undertaken in a UAS mapping project includes the following tasks (Figure 15): project design, reconnaissance, flight planning, establishment of ground control, aerial image acquisition, image processing, and quality control.

Figure 15. Typical UAS Mapping Work Flow. Source: Volkmann and Barnes (2014)

A.4.1 Project Design

Project design includes the strategy and preparations for the mapping project implementation. Aside from the size of the area to be mapped, flight planning is based on major variables that include (a) the desired ground sample distance (GSD), (b) the camera focal length, f, and (c) the UAV flying height, h. (Figure 16)

A.4.1.1 Size of area to be mapped: the larger the area to be mapped, the more the choice of UAV leans towards the fixed-wing type.

A.4.1.2 Ground sample distance (GSD): small GSD values generally require lower flying heights and slower ground speeds. For high-resolution (GSD ~ 1-2 cm) projects, the preferred platform is a multi-rotor UAV.

Figure 16. Relation of ground sample distance (GSD), camera focal length (f), pixel (p), and UAV flying height (h) above terrain. Source: Settengren and Walker (2016)

GSD is used to express the spatial resolution of digital maps. It is the dimension of a square on the ground covered by one pixel (p) on the image and is a function of the resolution of the camera sensor, the camera focal length (f) and the UAV flying height (h) above terrain.

A.4.1.3 Computation of mission parameters. (Source: Volkmann and Barnes, 2014)

From Figure 16, by ratio and proportion, the desired GSD and flying height can be derived:

    focal length (f) / pixel (p) = flying height (h) / GSD

then,

    GSD = (h x p) / f    and    h = (f x GSD) / p

From Figure 17 below, the following mission parameters can be computed:

Figure 17. Image footprint and overlap. Source: Barnes et al.
(2014)

where:
    s = exposure distance interval
    d = spacing between flight lines
    w = GSD x (width of sensor, in number of pixels)
    l = GSD x (length of sensor, in number of pixels)

(a) Exposure distance interval (s), which is dependent on the desired forward overlap. If the forward overlap is a% and s/w = (100 - a)/100, then

    s = (100 - a) x (w/100)

(b) Spacing between flight lines (d), which is dependent on the desired side-lap. If the side-lap is b% and d/l = (100 - b)/100, then

    d = (100 - b) x (l/100)

UAS mapping research shows that forward overlaps of 80% and side-laps of 70% yield accurate mapping results.

A.4.1.4 Relation of pixel size to image resolution

Figure 18. Pixel size and image resolution (4.9 cm/pixel and 18.2 cm/pixel). Source: Aerotas (San Francisco, California), UAS Mapping Conference, Palm Springs, 13 September 2016

With any given lens setting, the smaller the size of the pixel, the higher the resolution will be and the clearer the object in the image will be. (Source: Wikipedia)

Resolution in aerial photography is measured as ground sample distance (GSD) - the length on the ground corresponding to the side of one pixel in the image, or the distance between pixel centers measured on the ground (these are equivalent). A larger GSD (10 cm) means that fewer details will be resolvable in the image and it will be of lower quality, while a smaller GSD (5 cm) means the exact opposite. GSD goes up as the drone flies higher and goes down as the drone flies lower. GSD is also affected by the camera's focal length, as well as its pixel size. (Greenwood, 2016)

A.4.1.5 Nature of terrain: For areas with limited launch and landing space, multi-rotor UAVs, or VTOL platforms, are the recommended type of UAVs.
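The mission-parameter formulas in A.4.1.3 above (GSD, flying height, exposure interval and flight-line spacing) can be illustrated with a short computational sketch. The function names and the example camera values (8 mm lens, 4.5 um pixels, 4608 x 3456 px sensor) are assumptions for illustration only, not specifications from this bulletin.

```python
# Illustrative sketch of the A.4.1.3 mission-parameter formulas.
# The camera values used below are assumptions, not bulletin specifications.

def flying_height(gsd_m, focal_len_m, pixel_size_m):
    """h = (f x GSD) / p"""
    return focal_len_m * gsd_m / pixel_size_m

def ground_sample_distance(h_m, focal_len_m, pixel_size_m):
    """GSD = (h x p) / f"""
    return h_m * pixel_size_m / focal_len_m

def exposure_interval(footprint_w_m, forward_overlap_pct):
    """s = (100 - a) x (w / 100)"""
    return (100 - forward_overlap_pct) * footprint_w_m / 100.0

def line_spacing(footprint_l_m, side_lap_pct):
    """d = (100 - b) x (l / 100)"""
    return (100 - side_lap_pct) * footprint_l_m / 100.0

# Example: target GSD of 3 cm with an 8 mm lens and 4.5 um pixels
h = flying_height(0.03, 0.008, 4.5e-6)   # ~53.3 m above terrain
fp_w = 0.03 * 4608                       # footprint width  w = GSD x sensor width (px)
fp_l = 0.03 * 3456                       # footprint length l = GSD x sensor length (px)
s = exposure_interval(fp_w, 80)          # ~27.6 m between exposures at 80% forward overlap
d = line_spacing(fp_l, 70)               # ~31.1 m between flight lines at 70% side-lap
```

Note how the recommended 80%/70% overlaps translate directly into short exposure intervals and tight line spacing, which is why low-altitude, high-resolution missions produce so many images.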
Furthermore, VTOL UAVs can be controlled more precisely as these can fly at slow speeds; thus they are safer to use in rolling terrain than fixed-wing UAVs.

A.4.1.6 Flying height and ground speed: When the GSD and focal length dictate a particularly low flying height, the exposure distance intervals tend to be short, i.e., the camera must expose at very short time intervals. Fast ground speeds at low altitudes require short exposure times to avoid image blur, or image smear. This is not favorable for fixed-wing UAVs, which must maintain a minimum airspeed during flight.

A.4.1.7 Regulatory limitations. The Civil Aviation Authority of the Philippines (CAAP) imposes certain basic limitations on non-commercial UAV flights, such as:

(a) Flying height ceiling. UAV flights are limited to a ceiling of 400 feet (~120 meters) above ground level; and

(b) Required radius from airports. No UAV flights may be conducted within a 10-km radius of an airport.

Refer to CAAP Memorandum Circular No. 29-15, Series of 2015, dated 8 December 2015, for other relevant provisions.

A.4.1.8 UAS Pre-requisites for Geodetic Engineers

In the conduct of a survey using UAS, the following pre-requisites must be considered by the Geodetic Engineer:

a. A UAS survey can be used if boundary points are placed on the ground prior to UAS data acquisition. Such surveys include, but are not limited to, cadastral surveys, original surveys and verification surveys.

b. Survey processes that require stake-out activities may not be possible using UAS. These include, but are not limited to, relocation surveys and subdivision surveys.

c. A UAS survey (with full or minimal GCPs) must be connected to a reference point consistent with the horizontal datum used by the land agency.

d. Processed images and digitized lines that indicate boundary lines must be consistent with the datum and projection system used by the land agency.

e. Digitization must only be done on points/corners that are distinguishable in the processed ortho-image.
Otherwise, if some points are not visible, use other methods such as conventional methods with optical instruments (e.g. total station, theodolite and tape) or GNSS technology and processes.

A.4.1.9 UAS Pre-requisites for DENR-LMS and Other Related Land Agencies

In implementing UAS surveys and in the evaluation of survey returns from UAS surveys, the following pre-requisites must be considered by the concerned land agencies:

a. To fully maximize the data and information obtained through UAS, the land agency must have a digital geographic information system (GIS) capable of handling, analyzing and maintaining both vector and raster data. The GIS software must be complemented by competent and capable human resources.

b. If the land agency currently lacks GIS capability, the current system may still be adopted, but the land agency must be able to provide apt storage facilities for the additional survey return requirements, such as compact discs and hard copies of ortho-images.

A.4.2 Reconnaissance

It is advisable to conduct a field reconnaissance of the project area to check for the presence of power lines or cell site towers that may not be visible, or do not appear, on satellite imagery. The design of the flight plan may need to be changed unexpectedly at the time of the UAV's actual flight due to the presence of obstructions that may cause interference during flight. (Figure 19)

Figure 19. Inspect project area for potential obstructions and sources of interference during GNSS observations and UAV flight. Source: Blanco et al. (2017)

A.4.3 Flight Planning

The design of flight paths is an important component of UAV mapping. This is typically done using software packages: many drone manufacturers offer proprietary software with their drones. Mission Planner, an open-source software package, is the single most widely used solution. (Greenwood, 2016)
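As an illustration of what such planning software computes, the sketch below lays out parallel flight lines over a simple rectangular area from the across-track footprint and the desired side-lap. The function and its inputs are hypothetical and far simpler than what Mission Planner actually does (arbitrary polygons, turn radii, terrain following).

```python
# Hypothetical sketch: serpentine ('lawnmower') flight lines over a
# rectangular block. Real planners handle arbitrary polygons and turns.
import math

def plan_flight_lines(area_width_m, area_length_m, footprint_across_m, side_lap_pct):
    """Return (start, end) waypoints for parallel lines at the given side-lap."""
    spacing = footprint_across_m * (100 - side_lap_pct) / 100.0
    n_lines = math.ceil(area_width_m / spacing) + 1   # +1 line to cover the far edge
    lines = []
    for i in range(n_lines):
        x = min(i * spacing, float(area_width_m))
        if i % 2 == 0:                                # alternate direction each line
            lines.append(((x, 0.0), (x, float(area_length_m))))
        else:
            lines.append(((x, float(area_length_m)), (x, 0.0)))
    return lines

# Example: 250 m x 250 m block, 100 m across-track footprint, 70% side-lap
lines = plan_flight_lines(250, 250, 100, 70)   # 10 lines spaced 30 m apart
```

Alternating the line direction produces the serpentine pattern a fixed-wing or multi-rotor UAV actually flies, minimizing dead transit between strips.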
UAV flight paths should be designed to ensure a sufficient amount of both forward and lateral photographic overlap to allow post-processing software to identify common points between images. There is no universally accepted overlap standard, as higher or lower figures may be appropriate for different situations, such as dense built-up areas or relatively featureless landscapes. Recommended overlaps are 80% (forward) and 60%-80% (lateral, or side-lap).

Mission planning software computes the ground coverage size, or "footprint," of the photograph, which is dependent on the camera's focal length, the size of its sensor, and how high the UAV is flying above ground level. (Abdullah, 2014) From this ground coverage calculation, the software can compute the number of flight paths needed to cover the project area and determine the spacing needed between these flight lines to ensure adequate overlap. The software then determines the minimum number of images needed to adequately cover this area, as well as the most suitable flight altitude to ensure adequate coverage and a sharp ground resolution.

The following inputs are needed to generate a UAV flight plan: (a) the area to be mapped, outlined by a polygon; (b) the flying height; (c) the camera geometry (focal length and sensor dimensions); and (d) the forward overlap and side-lap. (Figure 20 and Figure 21)

Figure 20. Project area on Google Earth, flight plan and GCPs. Source: Cramer (2011)
Figure 21. Image footprints showing forward overlaps and side-laps. Source: Munjy (2015)

(a) Flight plans shall be referenced to the WGS84 datum, which is the same datum used by UAVs for navigation. With automated flight planning software based on satellite imagery, such as Google Earth, this is automatically achieved. In addition, intended waypoint coordinates and planned flying heights can be easily verified.
(b) For fixed-wing flight plans, it is advisable to avoid crosswinds in order to eliminate or minimize 'crabbing', or the angular deviation between the planned UAV course and the actual UAV heading (horizontal camera orientation). Due to the difficulty of predicting wind directions accurately, fixed-wing UAV users may also need to change flight plans at the launch site to make the flight lines parallel to the prevailing wind direction. Changes in the flight plan, whether due to obstructions or prevailing winds, are facilitated by the highly-automated flight planning software.

(c) For areas where a prevailing downwind or upwind is significant, fixed-wing UAVs will travel faster on downwind courses, with the wind behind them, than on upwind courses, with the wind facing them. In this case, the camera should be set to trigger at regular distance, rather than time, intervals.

Multi-rotor platforms are not dependent on either airspeed or wind direction; thus, planning of multi-rotor flights is easier than for fixed-wing UAVs. However, multi-rotor systems tend to have relatively higher power consumption and therefore shorter flight times. Many battery-powered quadcopters, for example, can typically fly for 15-20 minutes per battery pack.

A.4.4 Establishment of Ground Control Points (GCPs)

A.4.4.1 Pre-marking, or signalization, of GCPs

A minimum number of ground control points (around five) is generally required by the software for the referencing process to function. More ground control points permit more accurate results. Ground control points should not be clustered but have to be scattered around the area to be mapped for best results. (Figure 22) GCPs, distributed across the mapping area, should be pre-marked, or signalized, with targets of appropriate design and dimensions, ideally prior to the flight, so that these can be positively identified on the ortho-photo images. (Figure 23, Figure 24 and Figure 25)
The required size of the targets in the field depends on the planned flight altitude and the focal length of the camera. GCPs can also be naturally occurring ground features. The images shown in Figure 26 were taken from a height of about 40m above ground; at such low altitudes, no targets are needed as long as clearly defined terrain vertices are visible in the area.

Figure 22. Well-distributed GCPs. Source: Day-Keystone Aerial Surveys (2016)
Figure 23. Pre-marked GCPs in Barangay Poblacion, Norzagaray, Bulacan. Source: Blanco et al. (2017)
Figure 24. Pre-marked GCPs at Bacoor test bed site: 20cm x 20cm white-painted plastic panel used as checkpoint; 10cm-diameter round plastic disk. Source: Volkmann (2017)
Figure 25. GCP on asphalt pavement with 2-inch concrete nail. Source: Clarke and Ramirez (2009)
Figure 26. Road marking and curb edges, which can be used as natural control points (highlighted with orange circles). Source: Manyoky et al. (2011)

A.4.4.2 Field survey of GCPs

The coordinates of GCPs should be connected to either NAMRIA or DENR-LMS stations, referenced to either the WGS84 or PRS92 datum, using either electronic total stations or differential GNSS, to an accuracy of better than 2 cm. The corresponding certification of coordinates should be obtained from the appropriate agency and submitted with the survey returns if the UAV output (ortho-imagery) is used to support the mapping component of the land survey. (Figure 27 and Figure 28)

Figure 27. GCP observation in Barangay Poblacion, Norzagaray. Source: Blanco et al. (2017)
Figure 28. GCP observation in Bacoor, Cavite. Source: Volkmann (2017)

A.4.4.3 An alternative methodology: UAV mapping without GCPs

An evolving, cutting-edge technology with UAVs is 'mapping without ground control points'. The UAV is equipped with a survey-grade, multi-frequency GNSS receiver to locate the camera exposure positions, thus eliminating the need for GCPs.
Image processing is done by post-processing kinematic (PPK) methods. Side-lap is also significantly reduced, increasing the area flown per mission.

A.4.5 Aerial image acquisition

A.4.5.1 Checking of platform assembly and pre-flight procedures

After the UAV is placed at the planned take-off position, the link between the remote-control (RC) transmitter and the base station is established, and the flight mission plan is uploaded to the memory of the UAV flight controller.

A.4.5.2 Conduct of UAV flight operation

(a) The launching of the UAV may be in either manual or automatic mode, although it is always good practice to use the manual mode for the first take-off to test the operation of the UAV.

(b) It is important to test for radio interference in the project area, since, after take-off, the only means to take manual control of the UAV is via the wireless connection.

(c) To test and ensure the stability of VTOL UAVs, allow the UAV to hover about two meters above the ground before the actual flight. For hand-launched fixed-wing UAVs, perform a simulated flight pattern in the vicinity of the UAV operator. These tests are conducted to detect possible site-specific conditions that may affect the safety and feasibility of the planned flights.

(d) When the flight is switched to automatic mode:

i. Observe the UAV at all times, or follow the progress and behavior of the UAV on the RC transmitter and the base station computer;
ii. The operator should be ready to override the automatic mode, if necessary;
iii. Monitor the energy levels of the UAV flight batteries;
iv. Monitor other air traffic in the project area; and
v. Identify the nearest and safest emergency landing sites, should the need arise.
Uncontrolled changes in the attitude of the UAS during data collection may contribute to considerable challenges during image processing, particularly those associated with abrupt variations in scale, viewing angle and illumination that occur during windy conditions or as a result of the operator's actions. In general, lighter platforms are more prone to increased variability in attitude, with results that range from the non-representation of a given object of interest to the reduction of overlapping areas, which may affect further processing steps involving mosaicking and 3D reconstruction of the scene.

A.4.5.3 Landing and post-flight tasks

It is good practice to inspect the UAV immediately after landing to check the following:

(a) Any signs of overheating of the motors or electronic components; and

(b) The quality of the aerial images after the first flight; make adjustments to the camera settings, if necessary, for further flights.

Before leaving the project area, check on Google Earth whether the designed flight plan and coverage were achieved.

A.4.6 Image processing

A.4.6.1 Generation of sparse (low-density) point cloud

The original images serve as the basic input to the 3D modeling phase of MvS. The information on acquisition coordinates and the roll, pitch and yaw of the UAV platform is used to preliminarily orient the images. The images are then aligned as follows (Gindraux et al., 2017):

(a) A feature-detection algorithm is applied to detect features, or 'keypoints', on every image (Fonstad et al., 2013). The number of detected keypoints depends on the image resolution, image texture and illumination;

(b) Matching keypoints are identified and inconsistent matches removed (Westoby et al., 2012); and

(c) A bundle-adjustment algorithm is used to simultaneously solve the 3D geometry of the scene, the different camera positions and the camera parameters (focal length, coordinates of the principal point and radial lens distortions).
(Seitz et al., 2006). The output of this step is a sparse point cloud (Figure 29).

Figure 29. Sparse point cloud with relative camera positions after the photo alignment process (Fushe Milot, Albania).
Source: Barnes et al. (2014)

A.4.6.2 Introduction of GCPs

The GCPs are manually identified on the images and their coordinates imported. The GCP coordinates are used to refine the camera calibration parameters and to optimize the geometry of the output point cloud (Smith et al., 2015). A tagged marker is manually positioned on the target image of the GCP for each photograph on which it appears. With forward overlaps and side-laps of 80% and 70%, respectively, a single GCP can appear on several photographs, making this phase the most labor-intensive task in the processing.

The GCPs are centered over their corresponding images in the aerial photographs. When this phase is done, linear transformation parameters are computed to transform the model from an arbitrary local system to the GCP system, either WGS84 or PRS92. An error analysis by the software enables the user to identify and eliminate blunders in the GCP establishment. The sparse point cloud is then re-transformed for optimum positioning in the GCP coordinate system.

A.4.6.3 Building the Dense Point Cloud

Multi-view stereo image matching algorithms and the improved camera positions are applied to increase the density of the sparse point cloud. The density of the final georeferenced dense point cloud is strongly related to the number of matching keypoints. Different cloud quality parameters ('low,' 'medium,' and 'high') are available for the "build dense cloud" step. This image processing phase usually takes the longest of all processing steps (Figure 30).

Figure 30. Dense point cloud of Fushe Milot (Albania).
Source: Barnes et al.
(2014)

A.4.6.4 Generating the Digital Surface Model (DSM)

The dense point cloud is then used to generate a surface model in the form of a polygon mesh or triangulated irregular network (TIN) for engineering applications (Figure 31), or in raster format for 3D visualization in a geographic information system (GIS). A texture atlas is then built and applied to the surface features of the model, which is also used to distinguish a bare-earth elevation model from a non-bare-earth elevation model. The digital terrain model (DTM) is prepared from the DSM by filtering out all features above terrain level (Figure 32).

Figure 31. Polygon Mesh or Triangulated Irregular Network (TIN).
Source: Wikipedia

Figure 32. DSM and DTM.
Source: www.scisoftware.com

A.4.6.5 Generating the Ortho-imagery Outputs

A texture map derived from all images is applied to the polygon mesh and used to generate an ortho-photo - a rectified mosaic of aerial photographs corrected for terrain relief and photograph tilt and having a uniform scale across the mosaicked image (Figure 33).

Figure 33. Source: Hughes (2015)

A.4.7 Quality control (QC) of UAS mapping products

Quality control activities are performed to validate the accuracy and completeness of the outputs and aim to identify defects, if any, so that these may be corrected before the outputs are finally released to the client. Ortho-photo and DSM quality mainly depends on camera resolution, flight height and the accuracy of the GCPs. The deliverables (ortho-imagery and 3D outputs) should meet the specifications stipulated by the end-user. The following quality assurance and quality control checks and tests should be satisfied:

(a) Data integrity.
The output files are in the correct format and have not been corrupted; complete coverage of the area of interest has been achieved; accurate georeferencing information exists for each image file, in the specified coordinate system; and the size of the image pixels matches the data product specification.

(b) Spatial accuracy.

The outputs must meet the end-user requirements for horizontal accuracy, which must be assessed using root-mean-square error (RMSE) statistics in the horizontal plane, i.e., RMSEx, RMSEy and RMSEr. Horizontal accuracy is assessed for each well-defined point (e.g., GCP) using the difference between the reference coordinate from an electronic total station or an RTK GPS and its corresponding coordinate from the ortho-photo. Checkpoints, or GCPs which are not used for the calibration or geo-referencing of the ortho-imagery, are used as independent validation control points.

The independent source of higher accuracy for checkpoints shall be at least three times more accurate than the required accuracy of the geospatial data being tested (ASPRS, 2014). Accuracy testing by an independent source of higher accuracy is the preferred method for testing positional accuracy. The independent source of higher accuracy shall be of the highest accuracy feasible and practicable to evaluate the accuracy of the dataset.

Aside from the measurement of the coordinates of checkpoints, the process of accuracy assessment includes the preparation of field sketches/photographs to aid the image analyst in the proper identification of the checkpoints in the ortho-image.

(i) Checkpoint density and distribution

Both the total number and the spatial distribution of checkpoints play an important role in the accuracy of any geospatial data. For accuracy testing purposes, the distribution of the checkpoints will be project-specific and must be determined by mutual agreement between the data provider and the end user.
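The checkpoint-based assessment described above can be sketched in code. This is an illustrative sketch only: the checkpoint coordinate pairs below are hypothetical, and the functions simply apply the FGDC/NSSDA formulas adopted by these guidelines (RMSEx, RMSEy, RMSEr from reference-versus-ortho differences, and the 95% horizontal accuracy statistic).

```python
import math

def horizontal_rmse(checkpoints):
    """Compute (RMSEx, RMSEy, RMSEr) from checkpoint pairs.

    checkpoints: list of ((E_ref, N_ref), (E_ortho, N_ortho)) tuples, where the
    reference coordinates come from a total station or RTK GPS survey and the
    ortho coordinates are read from the ortho-photo.
    """
    n = len(checkpoints)
    sum_dx2 = sum((ref[0] - img[0]) ** 2 for ref, img in checkpoints)  # eastings
    sum_dy2 = sum((ref[1] - img[1]) ** 2 for ref, img in checkpoints)  # northings
    rmse_x = math.sqrt(sum_dx2 / n)
    rmse_y = math.sqrt(sum_dy2 / n)
    rmse_r = math.sqrt(rmse_x ** 2 + rmse_y ** 2)
    return rmse_x, rmse_y, rmse_r

def nssda_horizontal_accuracy(rmse_x, rmse_y):
    """NSSDA horizontal accuracy at the 95% confidence level.

    Equals 2.4477 x RMSEx = 2.4477 x RMSEy when RMSEx and RMSEy are equal;
    the two are averaged here for the general case.
    """
    return 2.4477 * (rmse_x + rmse_y) / 2.0

# Hypothetical checkpoint pairs in metres: (eastings, northings)
pts = [((100.03, 200.04), (100.00, 200.00)),
       ((300.00, 400.00), (300.04, 399.97))]
rx, ry, rr = horizontal_rmse(pts)
print(f"RMSEx={rx:.3f} m, RMSEy={ry:.3f} m, RMSEr={rr:.3f} m, "
      f"Accuracy_r(95%)={nssda_horizontal_accuracy(rx, ry):.3f} m")
```

In practice the same computation would be run over the full set of independent checkpoints agreed with the end user, and the resulting Accuracy_r compared against the stipulated specification.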
The minimum number of checkpoints that shall be tested will depend on the terrain and existing land-use and shall be distributed to reflect the geographic area of interest and the distribution of errors in the dataset (Table 3). The generally-accepted rule is for the accuracy to be determined with a confidence level of 95% (±2σ in Figure 34); for example, for 20 test points, 19 points must pass the specified threshold.

Table 3. Recommended minimum number of checkpoints for UAS ortho-photo mapping

Terrain/Land-use                                           Check Points   95% Confidence Level
Relatively flat, open, few houses and sparse vegetation         10              10
Relatively flat, dense housing and thick vegetation             10              10
Gently sloping, open, few houses and sparse vegetation          16              16
Gently sloping, dense housing and thick vegetation              20              19

Checkpoints may be distributed more densely in the vicinity of important features and more sparsely in areas that are of little or no interest. Where it is not practically possible, due to limitations in access or terrain, to apply the specified numbers in Table 3, data providers should use their best professional judgment in selecting locations for checkpoints.

Figure 34. Normal error distribution. The statistical model used to determine confidence level is most often the normal, or Gaussian, distribution.
Source: Wikimedia Commons

(ii) Horizontal Accuracy Determination

This policy adopts the equations used by the United States Federal Geographic Data Committee (FGDC) in its National Standard for Spatial Data Accuracy (NSSDA) of 1998, likewise adopted in 2014 by the American Society for Photogrammetry and Remote Sensing (ASPRS) for surveys from small UASs. Horizontal positioning accuracy at the 95% confidence level is expressed by the following formula:

Accuracy_r = 2.4477 × RMSEx = 2.4477 × RMSEy (when RMSEx = RMSEy)

where the root-mean-square errors (RMSE) are expressed by the equations:

RMSEx = sqrt[ (1/n) Σ (ΔXi)² ]
RMSEy = sqrt[ (1/n) Σ (ΔYi)² ]
RMSEr = sqrt( RMSEx² + RMSEy² )

and where:
r = radial displacement
ΔX = difference in eastings between reference GCP and image coordinates
ΔY = difference in northings between reference GCP and image coordinates
n = number of well-defined test points within the test

(iii) Effect of Number of GCPs on RMSE

Figures 35 and 36 show the direct relationship between the number of GCPs and the derived root-mean-square error (RMSE) and standard deviation, based on UAV-based studies. RMSE levels off at 4 cm to 10 cm with at least ten (10) GCPs. Gindraux et al. (2017) show that DSM accuracy increases asymptotically with an increasing number of GCPs until a certain GCP density is reached, as similarly reported by Rock et al. (2011) and Tonkin and Midgley (2016).

Figure 35. Effect of the number of GCPs on RMSE and standard deviation.
Source: Long et al. (2016)

Figure 36. Effect of the number of GCPs on RMSE and standard deviation.
Source: Gindraux et al. (2017)

(iv) Effect of image blur on RMSE

One of the main problems preventing full automation of the data processing of UAV imagery is the degrading effect of image blur caused by camera movement during image acquisition. Blur can be caused by the normal flight movement of the UAV as well as by strong winds, turbulence or sudden operator inputs. It disturbs the visual analysis and interpretation of the data, causes errors and can degrade the accuracy of automatic photogrammetric processing algorithms. The detection and removal of blurred images is currently done manually, which is both time-consuming and prone to error, particularly for large image sets. Due to these distortions, the laid-out targets cannot be detected accurately in the images, causing difficulties in manually measuring the centers of these points. Examples of such targets are shown in Figure 37.
Figure 37. The original target (left) and the corresponding targets in the image with motion blur.
Source: Manyoky et al. (2011)

These targets could not be used as tie points for the bundle block adjustment in the data evaluation because their centers could not be defined accurately enough.

A possible solution for measuring these targets would be the application of a centroid operator (see Figure 38). The centroid operator determines the central point of distorted targets in images. However, due to the white trail of the blurred images, even the centroid operator could not determine the center of all targets (Manyoky et al., 2011).

Figure 38. Determination of the target center points using the centroid operator.
Source: Manyoky et al. (2011)

(c) Completeness and aesthetics.

The outputs must meet the customer's expectations in terms of the following general ortho-image quality characteristics, which can be checked visually:

• Tonal quality should produce uniform contrast and tone across the image (Figure 39).
• No scratches, image blemishes, haze, debris or artifacts.
• No smears or blurs.
• No clouds or cloud shadows - flying conditions, flight direction and time should be such that individual photos have tolerable cloud cover and sufficient contrast in features of interest (Figure 40).
• No warped or wavy features - distinct linear ground features, such as fences, roads and curbs, should not deviate from their apparent path (Figure 41 and Figure 42).
• No mosaic lines through buildings.
• Building lean - tall structures should have minimal lean in order not to obscure adjacent ground features (Figure 43).
• Shadows should not be so dark that they obscure image content. In areas with tall trees or significant urban development with tall buildings, photography should be acquired when the sun angle creates minimal shadow
(Figure 44).
• Edge match and feature misalignment. Horizontal displacement along seam lines is not allowed, to ensure seamless imagery. Ortho-images are inspected for completeness to ensure that no gaps or image misplacements exist within and between adjacent images.

Figure 39. Example of poor-quality ortho-image.
Source: Wierzbicki et al. (2015)

Figure 40. Example of good-quality ortho-image.
Source: Wierzbicki et al. (2015)

Figure 41. Warped 'straight' fence.
Source: Lufa.diy.drones.com

Figure 42. Warped automobile.
Source: Lufa.diy.drones.com

Figure 43. Building lean.
Source: BAE Systems, San Diego, California, USA (http://www.baesystems.com)

Figure 44. Shadows in ortho-imagery.
Source: https://gis.stackexchange.com

A.4.7.1 Appropriate mapping scale standards based on RMSEr of digital ortho-photos

These guidelines shall adopt the following mapping scales deemed applicable for UAS-based mapping outputs, based on Table B6 of the ASPRS Positional Accuracy Standards for Digital Geospatial Data (Edition 1, Version 1.0, November 2014), as published in Photogrammetric Engineering and Remote Sensing (PERS), Vol. 81, No. 3, March 2015, pp. A1-A26.

Planimetric Accuracy
Limiting RMSEr (meters)    Map Scale
0.0125                     1:50
0.025                      1:100
0.050                      1:200
0.125                      1:500
0.25                       1:1,000
0.50                       1:2,000
1.25                       1:5,000
2.50                       1:10,000
5.00                       1:20,000

References

Abdullah, Q.A. (2014). Designing Flight Route. The Pennsylvania State University, John A. Dutton e-Education Institute, College of Earth and Mineral Sciences, 2014. https://www.e-education.psu.edu/geog97g/node/658

American Society for Photogrammetry and Remote Sensing (2014). ASPRS Positional Accuracy Standards for Digital Geospatial Data (Edition 1, Version 1.0, November 2014). Photogrammetric Engineering and Remote Sensing, Vol. 81, No. 3, March 2015, pp. A1-A26.

Barnes, G., W. Volkmann, R. Sherko, and K. Kelm (2014).
Drones for Peace: Part 1 of 2 - Design and Testing of a UAV-based Cadastral Surveying and Mapping Methodology in Albania. In Proceedings, World Bank Conference on Land and Poverty, Washington, D.C., 24-27 March 2014, 28 p. (Accessed 17 March 2017 from http://www.fao.org/fileadmin/user_upload/nr/land_tenure/UAS.pdf)

Blanco, A.C., L.P. Balicanta, Y.F. Pagdonsolan and R.R.T. Francisco (2017). A Study on the Use of Unmanned Aerial Systems (UAS) in the Conduct of Land Surveying: A Research Project in Support of Technology for Property Rights. Department of Geodetic Engineering, University of the Philippines, Diliman, Quezon City.

CielMap (2012). Low-Cost Unmanned Aircraft Systems for Development Projects: A Quiet Revolution. UNEP Global Environmental Alert Service (GEAS), May 2013.

Crommelinck, S., R. Bennett, M. Gerke, M.Y. Yang and G. Vosselman (2016). Review of Automatic Feature Extraction from High-Resolution Optical Sensor Data for UAV-Based Cadastral Mapping. Remote Sensing, 22 August 2016, 8, 689, 28 p. www.mdpi.com/journal/remotesensing

Crommelinck, S., R. Bennett, M. Gerke, M.Y. Yang and G. Vosselman (2017). Contour Detection for UAV-based Cadastral Mapping. Remote Sensing, 18 February 2017, 9, 171, 13 p. www.mdpi.com/journal/remotesensing

Cunningham, K., G. Walker, E. Stahlke and R. Wilson (2011). Cadastral Audit and Assessments using Unmanned Aerial Systems. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. XXXVIII-1/C22, UAV-g, Conference on Unmanned Aerial Vehicles in Geomatics, Zurich, Switzerland, 2011. (Accessed on 12 March 2017 from http://www.teamconsulting.co/images/UAV-Geomatics)

Day, D.R. (2016). Unmanned Aerial System Survey Point Collection Accuracy Assessment. Keystone Aerial Surveys, Inc. https://kasurveys.com/documents/KAS_UASAccuracy.pdf

Federal Geographic Data Committee - Geospatial Positioning Accuracy Standards, National Standard for Spatial Data Accuracy, Part 3 (1998).
Reston, Virginia, USA.

Greenwalt, C. and M. Schultz (1962). Principles of Error Theory and Cartographic Applications.

Greenwood, F. (2016). Chapter 4: How to Make Maps with Drones. Drones and Aerial Observation, pp. 35-47. http://drones.newamerica.org/primer/Chapter%204.pdf

Hardin, P.J. and T.J. Hardin (2010). Small-Scale Remotely Piloted Vehicles in Environmental Research. Geography Compass, Vol. 4, Issue 9, 1 September 2010, pp. 1297-1311. http://onlinelibrary.wiley.com (Accessed 12 March 2017)

Kedzierski, M., A. Fryskowska, D. Wierzbicki and P. Delis (2015). Cadastral Mapping based on UAV Imagery. In Proceedings, 15th International Scientific and Technical Conference, October 26-29, 2015, Yucatan, Mexico, pp. 12-15. (Accessed 29 March 2017 from http://conf.racurs.ru/images/2015/pdf)

Kelm, K. (2014). UAVs Revolutionize Land Administration. GIM International, October 2014, pp. 35-37. (Accessed 15 March 2017 from https://www.gim-international.com)

Kurczynski, Z., K. Bakula, M. Karabin, M. Kowalczyk, J.S. Markiewicz, W. Ostrowski, P. Podlasiak and D. Zawieska (2016). The Possibility of Using Images Obtained from the UAS in Cadastral Works. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. XLI-B1, pp. 909-915, XXIII ISPRS Congress, 12-19 July 2016, Prague, Czech Republic. (Accessed on 24 March 2017 from https://www.researchgate.net/publication/307530775)

Laliberte, A., J. Herrick, A. Rango and C. Winters (2010). Acquisition, Ortho-rectification and Object-based Classification of Unmanned Aerial Vehicle (UAV) Imagery for Rangeland Monitoring. ASPRS Photogrammetric Engineering and Remote Sensing (PERS), 76:661-672.

Long, N., B. Millescamps, F. Pouget, A. Dumon, N. Lauchasee and X. Bertin (2016). Accuracy Assessment of Coastal Topography Derived from UAV Images.
International Society for Photogrammetry and Remote Sensing (ISPRS) XXIII Congress, Prague, Czech Republic, Poster Presentation. https://www.researchgate.net/publication/303827235 (Accessed on 2 April 2017)

Lowe, D.G. (2004). Distinctive Image Features from Scale-Invariant Keypoints. International Journal of Computer Vision, 2004.

Madden, M., T. Jordan, S. Bernardes, D.L. Cotten, N. O'Hare and A. Pasqua (2015). Unmanned Aerial Systems and Structure from Motion Revolutionize Wetlands Mapping. (Accessed on 27 March 2017 from https://www.academia.edu/17677913)

Manyoky, M., P. Theiler, D. Steudler and H. Eisenbeiss (2011). Unmanned Aerial Vehicle in Cadastral Applications. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. XXXVIII-1/C22, 57-62, ISPRS Workshop, 14-16 September 2011, Zurich, Switzerland. (Accessed 26 March 2017 from https://www1.ethz.ch/igp/geometh/uav_g/proceedings/)

Niethammer, U., M.R. James, S. Rothmund and M. Joswig (2012). UAV-based Remote Sensing of the Super-Sauze Landslide: An Evaluation. Engineering Geology, 128: 2-11, March 2012. https://www.researchgate.net/publication (Accessed 14 March 2017)

Nissen, E. (2016). Structure-from-Motion. Colorado School of Mines, Nissen_OpenTopo-ASU-Apr2016-structure-from-motion.pdf, 43 p. (Accessed 23 March 2017 from https://cloud.sdsc.edu/1... ...16UNAVCO/nissen.pdf)

Ramadhani, S.A. (2016). Using Unmanned Aircraft System Images to Support Cadastral Boundary Data Acquisition in Indonesia. Thesis, Master of Science in Geo-information Science and Earth Observation, University of Twente, Enschede, The Netherlands, February 2016, 86 p. (Accessed on 17 March 2017 from http://www.itc.nl/library/papers_2016/msc/a/ramadhani.pdf)

Rijsdijk, M., W.H.M. van Hinsbergh, W. Witteveen, G.H.M. ten Buuren, G.A. Schakelaar, G. Poppinga, M. van Persie and R. Ladiges (2013).
Unmanned Aerial Systems in the Process of Juridical Verification of Cadastral Border. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. XL-1/W2, 325-331, UAV-g2013, 4-6 September 2013, Rostock, Germany. (Accessed 19 March 2017 from http://www.uav-g.org/Presentations/UAS... Rijsdijk_M.pdf)

Ruiz, J.J. et al. (2013). Evaluating the Accuracy of DEM Generation Algorithms from UAV Imagery. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. XL-1/W2, UAV-g2013, 4-6 September 2013, Rostock, Germany, pp. 333-337.

Seitz, S.M., B. Curless, J. Diebel, D. Scharstein and R. Szeliski (2006). A Comparison and Evaluation of Multi-View Stereo Reconstruction Algorithms. In Proceedings, 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06), New York, NY, USA, 17-22 June 2006, Volume 1.

Settergren, R. and S. Hiker (2016). UAS Mapping Under the Hood. In Proceedings, UAS Mapping Technical Demonstration and Symposium, Palm Springs, Florida, USA, 12 September 2016. Powerpoint Presentation, 126 slides. (Accessed on 31 March 2017)

Siebert, T., R. Wackrow and J.H. Chandler (2016). Automatic Detection of Blurred Images in UAV Image Sets. ISPRS Journal of Photogrammetry and Remote Sensing, 122 (2016): 1-16. www.elsevier.com

Snavely, N. (2010). Structure from Motion. Lecture 11 Powerpoint, Computer Vision CS6670, Department of Computer Science, Cornell University, New York, USA. (Accessed on 31 March 2017 from http://www.slideserve.com)

Volkmann, W. (2017). Results of an Experiment to Determine Accuracies Obtained in Structure from Motion (SfM) Modeling With and Without the Use of ... www.researchgate.net/publication/ (Accessed 17 March 2017)

Whitehead, K. and C.H. Hugenholtz (2015). Applying ASPRS Accuracy Standards to Surveys from Small Unmanned Aircraft Systems (UAS).
American Society for Photogrammetry and Remote Sensing (ASPRS) - Photogrammetric Engineering and Remote Sensing, Volume 81, No. 10, October 2015, pp. 787-793. https://www.researchgate.net/publication/ (Accessed 10 March 2017)

Wikimedia Commons. Wikipedia.
