GEODETSKI VESTNIK | Vol. 67 | No. 4 | 2023 | Peer-reviewed article | pp. 459-472

ACCURACY ANALYSIS OF UAV PHOTOGRAMMETRY USING RGB AND MULTISPECTRAL SENSORS
(ANALIZA TOČNOSTI UAV-FOTOGRAMETRIJE Z UPORABO RGB IN MULTISPEKTRALNEGA SENZORJA)

Nikola Santrač, Pavel Benka, Mehmed Batilović, Radoš Zemunac, Sanja Antić, Milica Stajić, Nenad Antonić

UDC: 528.7 | COBISS.SI classification: 1.01
Received: 23 April 2023 | Accepted: 9 August 2023
DOI: 10.15292/geodetski-vestnik.2023.04.459-472

KEY WORDS: UAV photogrammetry, multi-altitude, data quality, ground control points, flight parameters, multispectral sensor, accuracy

ABSTRACT

In recent years, unmanned aerial vehicles (UAVs) have become an increasingly important tool for the rapid collection of high-resolution (spatial and spectral) imagery of the Earth's surface. The quality of the final products depends strongly on the values chosen for the various flight-planning parameters, the type of sensor, and the data processing. In this paper, ground control points (GCPs) were first measured with the Global Navigation Satellite System (GNSS) Real-Time Kinematic (RTK) method; because of the low height accuracy of GNSS RTK, the heights of all points were then determined by detailed leveling. The study provides a basic quality assessment covering four main aspects: (1) the effect of an RGB sensor versus a five-band multispectral sensor on accuracy and data volume, (2) the impact of the number of GCPs on the accuracy of the final products, (3) the impact of different flight altitudes and cross flight strips, and (4) the accuracy of multi-altitude models. The results suggest that the sensor type, flight configuration, and GCP setup strongly affect the quality and quantity of the final product data, whereas creating a multi-altitude model does not yield the expected data quality. With its particular combination of sensors and parameters, the results and recommendations presented in this paper can assist professionals and researchers in their future work.
1 INTRODUCTION

3D models generated from images captured by high-resolution cameras on unmanned aerial vehicles (UAVs) have become a popular, low-cost alternative for large-scale mapping (Smith, Carrivick, and Quincey, 2016). Recent advances in sensors and flying systems have expanded the use of UAVs in fields such as forestry (Kranjec, Čekada, and Kobal, 2021), agriculture (Su et al., 2022), surveying (Sertić et al., 2022), natural disaster monitoring (Touge et al., 2022), and many others.

The quality of the final output depends strongly on the selection of several flight-mission and data-processing parameters. Achieving a flight mission with the desired accuracy requires a good understanding of the different flight settings, which depend on site conditions, weather, and available lighting. Crucial elements to consider when capturing data with UAVs include the flying altitude, the forward overlap and sidelap percentages, and the speed of the aircraft while taking images (Nagendran et al., 2018).

The Structure from Motion (SfM) technique reconstructs a surface or object by matching tie points obtained from multiple images; each point carries a position and a color extracted from an image (Westoby et al., 2012). The method does not require metric cameras, which makes SfM a feasible approach given the lower cost and wide availability of non-metric cameras (Smith, Carrivick, and Quincey, 2016). The SfM algorithm generates a sparse point cloud in a local reference system. There are two ways to georeference the images: direct georeferencing using navigation sensors (mainly Global Navigation Satellite Systems, GNSS) (Türk et al., 2022), and indirect georeferencing using ground control points (GCPs). GCPs have been shown to mitigate systematic lateral and vertical deformations in the resulting data products (James and Robson, 2014).

The distribution of GCPs has been analyzed extensively. A meta-study (Singh and Frazier, 2018) found a weak negative correlation between residual statistics and the number of GCPs collected per hectare, but no clear relationship between the number of GCPs and the size of the study area. The distribution of GCPs significantly affects the spatial accuracy of the orthophoto, and a systematic distribution of GCPs is recommended (Manfreda et al., 2019; Villanueva and Blanco, 2019). However, the optimal number of GCPs is not clear-cut: for relatively small study sites, 5-6 GCPs have been suggested for stable vertical error and 5 GCPs for stable horizontal error (Manfreda et al., 2019; Tonkin and Midgley, 2016). For large areas (larger than 1000 ha), on the other hand, 5 GCPs resulted in low spatial quality, and Sanz-Ablanedo et al. (2018) recommended using a minimum of 50 GCPs. Oniga et al. (2020) obtained similar results and recommended integrating 15 or 20 GCPs in image processing.
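To make the indirect georeferencing step concrete, the sketch below fits a 3D similarity (Helmert) transformation from SfM model coordinates to measured GCP coordinates using the closed-form Umeyama solution. This is a simplified illustration of the principle only (photogrammetric packages instead introduce GCP observations directly into the bundle block adjustment), and the function and variable names are illustrative assumptions, not part of any specific software.

    import numpy as np

    def fit_similarity(model_xyz, gcp_xyz):
        # Closed-form (Umeyama) estimate of scale s, rotation R and translation t
        # such that gcp ~ s * R @ model + t; both inputs are (n, 3) arrays, n >= 3.
        mu_m, mu_g = model_xyz.mean(axis=0), gcp_xyz.mean(axis=0)
        dm, dg = model_xyz - mu_m, gcp_xyz - mu_g
        cov = dg.T @ dm / len(model_xyz)              # 3 x 3 cross-covariance
        U, S, Vt = np.linalg.svd(cov)
        D = np.eye(3)
        if np.linalg.det(U) * np.linalg.det(Vt) < 0:  # enforce a proper rotation
            D[2, 2] = -1.0
        R = U @ D @ Vt
        s = np.trace(np.diag(S) @ D) / (dm ** 2).sum(axis=1).mean()
        t = mu_g - s * R @ mu_m
        return s, R, t

    # Residuals at the GCPs after the fit give a first indication of block deformation:
    # s, R, t = fit_similarity(model_xyz, gcp_xyz)
    # residuals = gcp_xyz - (s * (R @ model_xyz.T).T + t)

How many GCPs are used and how they are distributed determines how well such a transformation, and more importantly the full bundle adjustment, constrains the image block, which is exactly what the experiments described below vary.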
Improving spatial accuracy can also be accomplished through oblique imagery (Luo et al., 2022) or cross flight strips (Gerke and Przybilla, 2016). Residual measurements are commonly performed either within the point cloud or after the bundle block adjustment. However, such measurements might not fully capture the displacements of image points in the final product, as additional offsets can be introduced during orthophoto generation.

The final products of SfM are the interior and exterior image orientation parameters and a sparse point cloud. A Multi-View Stereo (MVS) algorithm then uses the oriented images to operate on each image pixel and generate a dense point cloud, which is the foundation for deriving orthophotos and DTMs in subsequent processing (Vitti et al., 2019). After creating the dense point cloud, a proper 3D model can be constructed by applying triangulation algorithms to build a mesh. With the exterior and interior orientation parameters, orthophotos can then be produced, in which all objects with a certain height are accurately positioned in a 2D plane (Verhoeven, 2011). The first type of 2.5D model created is the Digital Elevation Model (DEM) or Digital Surface Model (DSM), which includes the heights of all objects above the ground. The Digital Terrain Model (DTM) is a 2.5D model that excludes above-ground elements such as buildings and trees and is produced by triangulating the ground points into the mesh.

In terms of spectral discrimination, remote sensing imagery is typically based on the reflectance of specific leaves and canopies in the visible (red, green, blue; RGB) and non-visible (near-infrared, NIR, and far-infrared, thermal) ranges of the spectrum. Reflectance is measured using visible, thermal (Goddijn-Murphy et al., 2022), multispectral, and hyperspectral sensors (Imai et al., 2019). Many studies have used RGB imaging sensors, but these are limited to visible wavebands and lack sufficiently sensitive band information (Lu et al., 2021). Hyperspectral imaging sensors offer a large number of wavebands, but they are expensive and their data processing is complex (Qin et al., 2016). Multispectral image sensors are well suited to vegetation index estimation, combining low cost, sufficient spectral information, and easy data processing (Osco et al., 2019). Furthermore, multi-rotor UAVs are widely used in agricultural remote sensing because of their good stability and adaptability (Shendryk et al., 2020).

Although the dependencies between the various factors that influence data quality are comparatively weak, the results of existing studies are highly heterogeneous. This paper therefore aims to provide best-practice guidance for optimal flight configurations by synthesizing the results of a thorough quality assessment that considers four main aspects: (1) the influence of sensors, (2) ground-truthing, (3) the impact of cross flight strips, and (4) the accuracy of orthophotos obtained by processing multi-altitude image datasets.
The latter three aspects aim to improve data quality during and after photogrammetric processing, while the first aspect also addresses the quantity of data obtained during capture and processing.

2 MATERIALS AND METHODS

2.1 Study area

The research was conducted on 2 ha of agricultural land without vegetation in the vicinity of Bački Petrovac, Serbia, shown in Figure 1. Following Stöcker et al. (2020), a predominantly flat terrain without natural or artificial objects was selected to minimize the impact of terrain on tie-point generation. The research assumes that the accuracy and quality of the data are influenced by the four criteria described in the introduction. The detailed procedure is shown in Figure 2.

Figure 1: Study area
Figure 2: Workflow

2.2 UAV and data collection

The research was conducted with a low-cost UAV, the DJI Phantom 4, equipped with its standard integrated RGB sensor and an additional MicaSense RedEdge-M multispectral sensor weighing 170 grams, mounted with the MX Mount Kit. The MicaSense camera has five bands: RED (663-673 nm), GREEN (550-570 nm), BLUE (465-485 nm), REDEDGE (712-722 nm), and NIR (820-860 nm). These bands cover the visible and NIR parts of the spectrum and allow the calculation of vegetation indices used to monitor plant condition (Kelcey and Lucieer, 2012).

The flight plan was defined in the DroneDeploy software package. In the mission planner, many parameters affect the number of images collected over the study area as well as the Ground Sampling Distance (GSD), i.e. the distance between the centers of adjacent pixels on the ground. Proper recording and sensor calibration should result in a model with relative accuracy within 1-3 times the GSD value. The absolute accuracy of the model is, however, lower: 1-2 times the GSD value along the E and N axes and 1-3 times along the vertical H axis (Sanz-Ablanedo et al., 2018). Three flight missions were carried out at altitudes of 50 m, 80 m, and 120 m above ground, using forward and cross flight strips. According to Stöcker et al. (2020), an overlap of 70-80%, both forward and sidelap, is sufficient for agricultural land. For this reason, the forward and side overlaps were identical for all three missions (75% and 70%, respectively).
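As a rough planning aid, the flying height needed for a target GSD can be checked in advance with the standard photogrammetric relation GSD = H · p / f, where H is the flying height, p the physical pixel size, and f the focal length. The sketch below is a generic illustration; the sensor values plugged in are nominal assumptions, not the calibrated parameters of the cameras used in this study, so it will not exactly reproduce the GSD values reported in Table 1.

    def gsd_cm(height_m, focal_mm, pixel_um):
        # Ground sampling distance in centimeters: GSD = H * pixel_size / focal_length.
        return height_m * 100.0 * (pixel_um * 1e-3) / focal_mm

    def height_for_gsd_m(target_gsd_cm, focal_mm, pixel_um):
        # Flying height needed to reach a target GSD (inverse of the relation above).
        return target_gsd_cm / 100.0 * focal_mm / (pixel_um * 1e-3)

    # Illustrative only: an assumed small-format camera with f = 8.8 mm and 2.4 um pixels.
    print(gsd_cm(50.0, 8.8, 2.4))           # ~1.4 cm per pixel at 50 m
    print(height_for_gsd_m(3.0, 8.8, 2.4))  # ~110 m flying height for a 3 cm GSD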
GCPs were systematically distributed over the study area and were used to determine the exterior orientation. They were materialized using wooden square plates measuring 25 cm x 25 cm, divided into four equal fields, two black and two white. The E and N coordinates of 41 control points were determined using the GNSS RTK method. The GNSS antenna used was a Trimble R8, and each point was measured for 5 seconds with an accuracy of better than 2 cm. The coordinates were defined in the official coordinate system of the Republic of Serbia, ETRS89/UTM zone 34 (EPSG: 25834). Differential corrections were obtained from the nearest reference station (Novi Sad) in the Gentoo system. Because of the large height errors of the GNSS RTK method (Krzyżek and Kudrys, 2022), the heights were determined by detailed leveling. The local height datum was defined by a single point whose height was determined with the GNSS RTK method, and the heights of the other points were determined relative to it. A Leica DNA03 leveling instrument with a bar-code staff was used, which according to Mazić et al. (2013) can achieve an accuracy of 1.0 mm.

2.3 Flight altitude

Following basic photogrammetric principles and formulas (Kraus, 2007), it is clear that reducing the flight altitude increases the number of collected images and improves the quality of the final product. Several different flight patterns were examined in this study. Following literature that suggests the use of forward and cross imaging strips (ISO, 2013), the image datasets were divided into those using forward and cross flight strips and those using forward strips only. Since the missions were flown at three altitudes (50 m, 80 m, and 120 m), the first image dataset consists of images collected at 50 m with all flight directions, while the second consists of images from the same altitude with forward flight strips only. The same procedure was applied to the remaining two flight altitudes, resulting in six combinations based on flight altitude and flight strips. In addition, two multi-altitude image datasets were added: the first combines images taken at 50 m and 120 m in all directions, and the second combines images from the missions flown at 50 m, 80 m, and 120 m in all directions. Table 1 lists these combinations with their flight altitude and flight strips, the number of images, and the average GSD in centimeters. The trajectory of each mission is shown in Figure 3.

Table 1: Image datasets (x marks the flight strips flown)

Variation | Flight altitude (m) | Forward / side strips | RGB sensor: images | RGB sensor: GSD (cm) | MicaSense sensor: images | MicaSense sensor: GSD (cm)
I    | 50            | x / x | 208 | 1.62 | 1855 | 3.78
II   | 50            | x / – | 102 | 1.70 |  800 | 3.79
III  | 80            | x / x |  92 | 2.61 |  875 | 5.98
IV   | 80            | x / – |  45 | 2.70 |  350 | 5.94
V    | 120           | x / x |  49 | 3.77 |  485 | 8.94
VI   | 120           | x / – |  23 | 4.03 |  190 | 8.97
VII  | 50 + 120      | x / x | 257 | 1.70 | 2340 | 3.84
VIII | 50 + 80 + 120 | x / x | 349 | 1.90 | 3215 | 3.99

Figure 3: Trajectory of the flight missions
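The image counts in Table 1 follow from how overlap translates into exposure spacing and flight-line separation. A minimal sketch of this textbook relation is given below; it ignores terrain relief, turns, and triggering tolerances, and the pixel counts are assumed example values rather than the exact image formats of the two cameras used here.

    def footprint_m(gsd_cm, pixels):
        # Ground footprint of one image side, in meters.
        return gsd_cm / 100.0 * pixels

    def spacing_m(footprint, overlap):
        # Distance between consecutive exposures (forward overlap) or adjacent
        # flight lines (sidelap) for a given fractional overlap.
        return footprint * (1.0 - overlap)

    # Illustrative values: the 50 m RGB GSD from Table 1 (1.62 cm) and an assumed
    # 4000 x 3000 px image format, with the 75 % / 70 % overlaps used in the missions.
    base = spacing_m(footprint_m(1.62, 3000), 0.75)   # ~12 m between exposures
    line = spacing_m(footprint_m(1.62, 4000), 0.70)   # ~19 m between flight lines
    print(base, line)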
2.4 Distribution and establishment of GCPs

Measuring GCP coordinates is often the most time-consuming on-site operation of a UAV photogrammetric survey (Forlani et al., 2018), but it remains the safest and most accurate way to georeference a UAV photogrammetric block (Hugenholtz et al., 2016). GCPs are usually placed along the outside edge of the imaging area, with several inside it. Checkpoints (ChPs) serve for independent accuracy assessment and are distributed throughout the area. To investigate the impact of the number of GCPs on the final accuracy of the model, 41 points were set up. For each approach, eight scenarios were tested in which the number of GCPs used was varied: 3, 4, 5, 6, 8, 10, 15, and 21. The remaining points were used as independent ChPs for determining the horizontal and vertical accuracy of the final products. The distribution of GCPs and ChPs for each scenario is shown in Figure 4.

Figure 4: Distribution of GCPs

2.5 Data processing

The photogrammetric processing was performed in Pix4Dmapper at the original image resolution. Dense point clouds were generated, and DSMs and orthophotos were produced with a resolution of 1 GSD. The MicaSense multispectral sensor captures five separate images, one per band, so an alignment step was needed to group the images taken at each trigger. The final products varied depending on the camera used, resulting in two types of orthophotos: the first, from the RGB sensor images, contained three bands (Blue, Green, and Red); the second, from the multispectral sensor images, contained five bands (Blue, Green, Red, RedEdge, and NIR). Figure 5a shows the orthophoto obtained by processing images from the RGB sensor, while the orthophoto in Figure 5b was obtained from the multispectral sensor images (RGB = NIR-RedEdge-Red).

Figure 5: a) RGB orthophoto, b) Multispectral orthophoto

The accuracy was evaluated on the final products (orthophoto and DSM) to determine the absolute accuracy and to reveal any errors introduced during the orthorectification process (Fras, Šušteršič, and Kežul, 2021). The positions of the ChPs were visually identified and marked on the orthophoto in QGIS 3.22.15, and the H coordinate was determined from the DSM using the Sample Raster Value tool in the QGIS toolbox. To describe the overall planimetric and vertical error of each processing scenario, the root mean square error (RMSE) was calculated according to the ISO (2013) standard. The GNSS and leveling measurements of the ChP coordinates were treated as the true values, and the coordinates extracted from the orthophoto and DSM as the predicted values. The final accuracy is expressed in GSD.

3 RESULTS AND DISCUSSION

The accuracy of the UAV photogrammetry products was determined by locating and marking the ChPs on the orthophoto; the heights of the ChPs were then obtained from the DSM with the Sample Raster Value tool in QGIS. The GSD is critical in determining the accuracy. To compare results obtained from flights at different altitudes and with different sensors, accuracy is reported relative to the GSD, so that it remains comparable across images with varying GSDs; values are also given in centimeters. Table 2 displays the RMSE of the horizontal and vertical ChP residuals for all RGB sensor datasets, while Table 4 lists the overall positional (3D) RMSE, with the first value in each cell referring to the RGB sensor.
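For reference, the checkpoint evaluation described above can be reproduced with a few lines of code: sample the DSM at each ChP position digitised on the orthophoto, compare against the GNSS/leveling reference coordinates, and compute the horizontal, vertical, and 3D RMSE (the quantities reported in Tables 2-4, optionally divided by the GSD). This is a minimal sketch under the assumption that the rasterio library is available; the study itself used the QGIS Sample Raster Value tool, and the file name and data structures are illustrative.

    import numpy as np
    import rasterio  # assumed available; the study used QGIS instead

    def checkpoint_rmse(dsm_path, chp_ref, chp_img):
        # chp_ref: {id: (E, N, H)} from GNSS RTK + leveling, treated as the true values.
        # chp_img: {id: (E, N)} digitised on the orthophoto (predicted values).
        de, dn, dh = [], [], []
        with rasterio.open(dsm_path) as dsm:
            band = dsm.read(1)
            for pid, (e_ref, n_ref, h_ref) in chp_ref.items():
                e_img, n_img = chp_img[pid]
                row, col = dsm.index(e_img, n_img)   # DSM height at the marked ChP
                de.append(e_img - e_ref)
                dn.append(n_img - n_ref)
                dh.append(float(band[row, col]) - h_ref)
        de, dn, dh = map(np.asarray, (de, dn, dh))
        rmse_h = np.sqrt(np.mean(de**2 + dn**2))            # horizontal RMSE
        rmse_z = np.sqrt(np.mean(dh**2))                    # vertical RMSE
        rmse_3d = np.sqrt(np.mean(de**2 + dn**2 + dh**2))   # positional (3D) RMSE
        return rmse_h, rmse_z, rmse_3d

Dividing each RMSE by the dataset's average GSD then gives the GSD-normalised figures used for comparison across flight altitudes.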
The model with the best horizontal accuracy in GSD terms (0.54 GSD, 2.05 cm) falls under category V and is based on 21 GCPs. Similarly, the model with the best horizontal accuracy in centimeters (1.96 cm, 0.72 GSD) belongs to category IV and also uses 21 GCPs. For vertical accuracy, the best model in GSD terms (0.80 GSD, 3.24 cm) corresponds to category VI and is based on 6 GCPs, whereas the best vertical accuracy in centimeters (1.48 cm, 0.91 GSD) belongs to category I with 8 GCPs. In terms of positional accuracy, the most accurate models in GSD (1.24 GSD, 3.34 cm) and in centimeters (3.09 cm, 1.82 GSD) belong to categories IV and II, respectively, both with 21 GCPs. A comparison between models using only forward flight and those combining forward and cross strips, at the same altitude and with an equal number of GCPs, shows that most models using forward and cross strips achieve better horizontal accuracy, whereas vertical and positional accuracy show less clear trends and depend more strongly on the altitude at which the images were acquired.

Table 2: RGB sensor – horizontal (h) and vertical (z) RMSE of the ChP residuals per dataset (I-VIII); each cell is h / z, and for each number of GCPs the first row is expressed in GSD units and the second in centimeters

GCPs | Unit | I           | II          | III         | IV          | V           | VI          | VII         | VIII
3    | GSD  | 3.73 / 2.20 | 3.07 / 3.26 | 2.31 / 1.58 | 2.12 / 1.85 | 1.51 / 0.96 | 1.45 / 0.88 | 3.41 / 1.11 | 3.23 / 1.76
3    | cm   | 6.05 / 3.57 | 5.23 / 5.55 | 6.05 / 4.14 | 5.73 / 5.01 | 5.72 / 3.64 | 5.86 / 3.58 | 5.81 / 1.90 | 6.15 / 3.35
4    | GSD  | 4.07 / 2.49 | 3.07 / 3.00 | 2.24 / 1.49 | 2.01 / 1.25 | 0.99 / 0.95 | 1.15 / 0.94 | 3.14 / 1.05 | 2.73 / 3.62
4    | cm   | 6.60 / 4.04 | 5.23 / 5.11 | 5.87 / 3.89 | 5.43 / 3.40 | 3.75 / 3.61 | 4.64 / 3.80 | 5.34 / 1.79 | 5.19 / 6.88
5    | GSD  | 2.97 / 1.20 | 2.74 / 2.31 | 1.94 / 1.44 | 1.69 / 1.14 | 1.41 / 1.43 | 1.21 / 1.02 | 2.88 / 3.40 | 2.63 / 1.23
5    | cm   | 4.82 / 1.95 | 4.66 / 3.94 | 5.08 / 3.78 | 4.57 / 3.10 | 5.32 / 5.41 | 4.91 / 4.13 | 4.90 / 5.78 | 5.00 / 2.34
6    | GSD  | 2.95 / 1.01 | 2.65 / 1.18 | 1.83 / 0.93 | 1.90 / 0.91 | 1.19 / 0.99 | 1.20 / 0.80 | 2.88 / 2.72 | 2.42 / 2.27
6    | cm   | 4.78 / 1.65 | 4.51 / 2.02 | 4.79 / 2.43 | 5.13 / 2.47 | 4.52 / 3.74 | 4.84 / 3.24 | 4.90 / 4.64 | 4.61 / 4.32
8    | GSD  | 2.70 / 0.91 | 2.22 / 1.00 | 1.71 / 0.91 | 1.58 / 0.92 | 1.15 / 0.96 | 1.21 / 1.02 | 2.24 / 2.57 | 2.02 / 2.03
8    | cm   | 4.39 / 1.48 | 3.78 / 1.71 | 4.47 / 2.39 | 4.28 / 2.49 | 4.35 / 3.64 | 4.88 / 4.14 | 3.82 / 4.38 | 3.84 / 3.86
10   | GSD  | 1.87 / 1.37 | 1.80 / 1.52 | 1.33 / 1.01 | 1.45 / 1.01 | 0.86 / 1.26 | 0.91 / 1.35 | 1.54 / 1.35 | 1.44 / 1.49
10   | cm   | 3.03 / 2.22 | 3.06 / 2.60 | 3.48 / 2.66 | 3.92 / 2.74 | 3.27 / 4.77 | 3.67 / 5.47 | 2.63 / 2.30 | 2.74 / 2.84
15   | GSD  | 2.00 / 1.07 | 1.85 / 0.94 | 1.21 / 0.87 | 1.24 / 0.90 | 0.84 / 1.13 | 1.13 / 1.10 | 1.54 / 1.24 | 1.51 / 1.02
15   | cm   | 3.24 / 1.74 | 3.15 / 1.60 | 3.18 / 2.28 | 3.37 / 2.45 | 3.20 / 4.29 | 4.56 / 4.46 | 2.63 / 2.12 | 2.88 / 1.95
21   | GSD  | 1.38 / 1.36 | 1.17 / 1.39 | 0.79 / 0.98 | 0.72 / 1.01 | 0.54 / 1.21 | 0.84 / 0.99 | 1.19 / 1.60 | 1.22 / 1.17
21   | cm   | 2.25 / 2.21 | 2.00 / 2.37 | 2.07 / 2.58 | 1.96 / 2.74 | 2.05 / 4.59 | 3.42 / 4.01 | 2.03 / 2.72 | 2.32 / 2.23

For the multispectral sensor, the horizontal and vertical accuracy results are given in Table 3, and the positional accuracy by the second value in each cell of Table 4.
Category VIII contains the model with the best horizontal accuracy in centimeters (3.84 cm, 0.96 GSD), based on 21 GCPs, while the best horizontal accuracy in GSD terms (0.66 GSD, 5.99 cm) is found in category VI with 4 GCPs. For vertical accuracy, the most accurate model in both centimeters and GSD (2.16 cm, 0.56 GSD) belongs to category II, with 6 GCPs. In terms of positional accuracy, category II provides the best accuracy in centimeters (4.85 cm, 1.28 GSD) with 6 GCPs, whereas category III provides the best positional accuracy in GSD terms (1.17 GSD, 6.99 cm) with 21 GCPs. When comparing models with only forward flight to those combining forward and cross strips, at equivalent altitudes and GCP counts, no definitive trend emerges, as the results mostly indicate similar performance.

Comparing positional RMSE with flight altitude yields interesting results. For the RGB sensor, there is a noticeable trend of better positional accuracy expressed in GSD as the flight altitude increases; for the multispectral sensor, no clearly defined trend in GSD terms is observed. The RGB multi-altitude models also show interesting behavior: the multi-altitude category VII models achieve slightly better horizontal accuracy than the category I models, but much worse than the category V models, while the other multi-altitude model, category VIII, achieves even better horizontal and positional accuracy than category VII.
Table 3: Multispectral sensor – horizontal (h) and vertical (z) RMSE of the ChP residuals per dataset (I-VIII); each cell is h / z, and for each number of GCPs the first row is expressed in GSD units and the second in centimeters

GCPs | Unit | I            | II           | III          | IV           | V             | VI           | VII         | VIII
3    | GSD  | 1.92 / 2.32  | 2.01 / 2.83  | 0.95 / 0.85  | 1.47 / 5.99  | 1.02 / 4.85   | 0.89 / 2.18  | 1.91 / 1.41 | 1.47 / 11.00
3    | cm   | 7.26 / 8.79  | 7.63 / 10.75 | 5.71 / 5.11  | 8.77 / 35.62 | 9.12 / 43.42  | 8.07 / 19.63 | 7.36 / 5.42 | 5.90 / 43.91
4    | GSD  | 1.86 / 2.70  | 1.60 / 0.91  | 1.60 / 2.85  | 1.10 / 4.11  | 1.19 / 3.41   | 0.66 / 1.95  | 1.40 / 1.13 | 3.79 / 10.83
4    | cm   | 7.04 / 10.23 | 6.08 / 3.46  | 9.62 / 17.10 | 6.58 / 24.45 | 10.67 / 30.55 | 5.99 / 17.56 | 5.38 / 4.35 | 15.13 / 43.24
5    | GSD  | 2.03 / 1.75  | 1.56 / 1.20  | 1.67 / 3.44  | 1.15 / 1.60  | 0.92 / 3.69   | 0.87 / 2.61  | 1.71 / 1.37 | 3.27 / 15.24
5    | cm   | 7.69 / 6.63  | 5.95 / 4.58  | 9.99 / 20.62 | 6.88 / 9.53  | 8.25 / 33.02  | 7.88 / 23.45 | 6.59 / 5.27 | 13.07 / 60.82
6    | GSD  | 1.17 / 1.56  | 1.15 / 0.56  | 1.25 / 2.27  | 0.95 / 1.55  | 1.09 / 3.07   | 0.99 / 2.15  | 1.39 / 1.72 | 2.30 / 9.92
6    | cm   | 4.44 / 5.92  | 4.36 / 2.16  | 7.48 / 13.60 | 5.68 / 9.22  | 9.79 / 27.51  | 8.94 / 19.37 | 5.36 / 6.63 | 9.20 / 39.60
8    | GSD  | 1.35 / 1.00  | 1.34 / 0.86  | 1.28 / 2.15  | 1.00 / 1.55  | 1.11 / 2.00   | 0.89 / 1.91  | 1.41 / 0.81 | 1.39 / 8.57
8    | cm   | 5.12 / 3.79  | 5.10 / 3.28  | 7.67 / 12.90 | 5.99 / 9.21  | 9.96 / 17.93  | 8.00 / 17.21 | 5.45 / 3.12 | 5.57 / 34.23
10   | GSD  | 1.07 / 0.79  | 1.34 / 1.17  | 1.10 / 1.65  | 1.00 / 1.23  | 0.88 / 1.34   | 0.79 / 1.32  | 1.14 / 0.70 | 1.08 / 1.43
10   | cm   | 4.06 / 3.00  | 5.09 / 4.44  | 6.61 / 9.90  | 5.95 / 7.35  | 7.95 / 12.03  | 7.10 / 11.88 | 4.40 / 2.72 | 4.31 / 5.72
15   | GSD  | 1.11 / 1.49  | 1.17 / 1.32  | 0.91 / 1.51  | 0.93 / 1.15  | 0.94 / 1.10   | 0.85 / 1.30  | 1.18 / 0.78 | 1.09 / 1.23
15   | cm   | 4.21 / 5.65  | 4.46 / 5.02  | 5.47 / 9.04  | 5.53 / 6.86  | 8.47 / 9.86   | 7.65 / 11.74 | 4.54 / 3.02 | 4.37 / 4.91
21   | GSD  | 1.06 / 1.54  | 1.07 / 1.15  | 0.78 / 0.87  | 0.74 / 0.99  | 0.76 / 1.16   | 0.87 / 1.51  | 1.06 / 1.00 | 0.96 / 0.88
21   | cm   | 4.01 / 5.83  | 4.06 / 4.37  | 4.67 / 5.25  | 4.42 / 5.92  | 6.84 / 10.43  | 7.88 / 13.63 | 4.08 / 3.84 | 3.84 / 3.53

Table 4: RGB and multispectral sensor – positional RMSE of the ChP residuals per dataset (I-VIII); each cell is RGB / MSS*, and for each number of GCPs the first row is expressed in GSD units and the second in centimeters

GCPs | Unit | I            | II           | III          | IV           | V            | VI           | VII         | VIII
3    | GSD  | 4.33 / 3.01  | 4.48 / 3.47  | 2.81 / 1.28  | 2.81 / 6.17  | 1.79 / 4.96  | 1.70 / 2.36  | 3.59 / 2.38 | 3.68 / 11.10
3    | cm   | 7.01 / 11.37 | 7.61 / 13.15 | 7.33 / 7.65  | 7.58 / 36.64 | 6.74 / 44.34 | 6.85 / 21.16 | 6.10 / 9.13 | 6.99 / 44.28
4    | GSD  | 4.77 / 3.28  | 4.30 / 1.84  | 2.69 / 3.28  | 2.37 / 4.26  | 1.38 / 3.61  | 1.48 / 2.06  | 3.31 / 1.80 | 4.53 / 11.48
4    | cm   | 7.72 / 12.39 | 7.31 / 6.97  | 7.02 / 19.61 | 6.39 / 25.30 | 5.20 / 32.27 | 5.96 / 18.47 | 5.62 / 6.91 | 8.60 / 45.80
5    | GSD  | 3.20 / 2.68  | 3.58 / 1.98  | 2.42 / 3.83  | 2.04 / 1.97  | 2.01 / 3.80  | 1.59 / 2.75  | 4.45 / 2.19 | 2.91 / 15.59
5    | cm   | 5.18 / 10.13 | 6.08 / 7.50  | 6.31 / 22.90 | 5.50 / 11.70 | 7.57 / 33.97 | 6.40 / 24.66 | 7.56 / 8.40 | 5.52 / 62.20
6    | GSD  | 3.12 / 1.96  | 2.90 / 1.28  | 2.05 / 2.59  | 2.10 / 1.82  | 1.55 / 3.26  | 1.44 / 2.37  | 3.97 / 2.22 | 3.32 / 10.19
6    | cm   | 5.05 / 7.40  | 4.93 / 4.85  | 5.35 / 15.48 | 5.67 / 10.81 | 5.84 / 29.14 | 5.80 / 21.25 | 6.74 / 8.52 | 6.30 / 40.65
8    | GSD  | 2.85 / 1.68  | 2.44 / 1.60  | 1.94 / 2.50  | 1.83 / 1.85  | 1.50 / 2.29  | 1.58 / 2.11  | 3.41 / 1.63 | 2.86 / 8.69
8    | cm   | 4.61 / 6.35  | 4.14 / 6.06  | 5.06 / 14.95 | 4.94 / 10.98 | 5.65 / 20.47 | 6.36 / 18.92 | 5.79 / 6.25 | 5.43 / 34.67
10   | GSD  | 2.32 / 1.33  | 2.36 / 1.78  | 1.67 / 1.99  | 1.77 / 1.59  | 1.53 / 1.61  | 1.63 / 1.54  | 2.05 / 1.34 | 2.07 / 1.79
10   | cm   | 3.75 / 5.02  | 4.01 / 6.74  | 4.35 / 11.90 | 4.77 / 9.44  | 5.76 / 14.39 | 6.56 / 13.81 | 3.48 / 5.14 | 3.93 / 7.14
15   | GSD  | 2.27 / 1.86  | 2.07 / 1.77  | 1.50 / 1.76  | 1.54 / 1.48  | 1.41 / 1.45  | 1.58 / 1.56  | 1.98 / 1.41 | 1.83 / 1.64
15   | cm   | 3.67 / 7.03  | 3.51 / 6.70  | 3.91 / 10.52 | 4.15 / 8.79  | 5.31 / 12.96 | 6.36 / 13.99 | 3.36 / 5.41 | 3.47 / 6.54
21   | GSD  | 1.94 / 1.87  | 1.82 / 1.57  | 1.27 / 1.17  | 1.24 / 1.24  | 1.33 / 1.39  | 1.31 / 1.75  | 1.99 / 1.45 | 1.69 / 1.30
21   | cm   | 3.14 / 7.06  | 3.09 / 5.95  | 3.31 / 6.99  | 3.34 / 7.36  | 5.01 / 12.42 | 5.27 / 15.69 | 3.38 / 5.56 | 3.21 / 5.18

* MSS: multispectral sensor

With the multispectral sensor, category VII stands out with exceptionally good positional accuracy and a very uniform distribution of results across the number of GCPs, whereas the other multi-altitude category, category VIII, has much worse results even though it contains many more images.

Figure 6 displays the distribution of horizontal and positional accuracy across all categories in terms of GSD, which makes it possible to assess the impact of the number of GCPs on the final accuracy. Horizontal accuracy improves significantly as the number of GCPs increases. Positional accuracy can be divided into two groups: (i) up to 5 GCPs, where occasional unexpected results with poor accuracy are observed, and (ii) 5 or more GCPs, which generally follow the trend and achieve accuracy below 4 GSD (except for some models from category VIII). It is also worth noting that with increasing flight altitude, the influence of the number of GCPs on both horizontal and positional accuracy decreases. In variant VIII of the multispectral sensor, the RMSE values of the models with 3, 4, 5, 6, and 8 GCPs are much worse than those displayed in Figure 6.

Although UAVs can capture images with a resolution of a few centimeters, this study finds that accuracy can vary from a few centimeters to several decimeters depending on the chosen flight configuration. To fully utilize the capabilities of UAVs, careful mission planning is crucial, balancing the number of images and GCPs collected against the accuracy requirements while also considering factors such as time constraints and hardware limitations. The study compares accuracy, data volume, and total measurement and processing time for different sensors, flight altitudes, flight patterns, and multi-altitude models, and the results show a consistent picture of the trade-offs between accuracy, data volume, and time.

This study also examines the accuracy and potential of using UAVs and multispectral sensors in agriculture, including image capture, classification, and the calculation of various vegetation indices. The RGB camera can capture only three bands, which limits its use, whereas the multispectral sensor covers a wider range of electromagnetic radiation and captures five bands. The results showed that, with the same flight parameters, there were significant differences in GSD depending on the sensor used, and therefore in the final spatial accuracy. The multispectral camera performed well at flight altitudes of 50 m and 80 m (independently of the flight strips); however, increasing the altitude to 120 m resulted in significant deviations in vertical accuracy, while horizontal accuracy remained almost identical to that of the RGB sensor. The possibility of using both sensors at the same time to obtain high spatial and spectral resolution should not be disregarded (Tait et al., 2019). This study did not find significant differences between models using only forward flight patterns and models using both.
In fact, in most cases, models using only forward flight patterns achieved better horizontal accuracy and roughly the same positional accuracy. However, an interesting observation is that vertical accuracy is much worse for models using only forward flight patterns.

Variations in altitude affect the resolution of the orthophotos and DSM: as seen in Table 1, the higher the altitude, the larger the GSD in cm/pixel and thus the lower the resolution. The GSD is crucial for producing high-resolution maps. When capturing data, it is important to consider both the minimum GSD and the area to be covered, as well as the client's requirements for the resolution of the orthophotos and DSM. This helps to save time and to minimize the amount of data handled during processing, which can sometimes take several days. Multi-altitude models did not produce the expected results: the product GSD was essentially the same as the GSD of the lowest flight altitude, while the positional RMSE of the ChPs was significantly worse than for models based on a single flight altitude. An exception was category VII with the multispectral sensor, which gave unexpectedly good results.

Figure 6: Distribution of RMSE

Regarding the impact of GCPs on the final horizontal and positional accuracy, there is little change in positional RMSE beyond 10 GCPs for an area and equipment such as those used here. The use of GCPs can significantly enhance the accuracy of the three-dimensional (3D) information, making their measurement a crucial aspect of georeferencing UAV image blocks. It has been shown that increasing the number of GCPs improves the accuracy of the final products, such as the point cloud, 3D mesh, orthophoto, DSM, or DTM (Gindraux et al., 2017; Agüera-Vega, Carvajal-Ramírez, and Martínez-Carricondo, 2017). However, adding a large number of GCPs is both time-consuming and computationally demanding, so it is important to find an optimal number. The spatial distribution of GCPs and its impact on the accuracy of the georeferencing process also needs to be investigated, especially for small study areas, since the density and relative distance of GCPs may affect the final accuracy. In category VIII, a substantial vertical error offset was observed in the scenarios with fewer than 10 GCPs. Height offsets of this magnitude have been reported previously (Manfreda et al., 2019) and are specific to DJI UAVs.

4 CONCLUSION

This paper presents guidelines for optimal UAV data collection based on a thorough examination of data quality metrics applied to multiple orthophotos created with different flight configurations and sensors. The analysis considered a range of factors, including the use of RGB and multispectral sensors, the flight parameters, and the number of images taken.
The findings emphasize that the type of sensor, the flight altitude and pattern, multi-altitude flights, and the arrangement of ground control points all have a substantial effect on the final data quality of the orthophoto and DSM. To summarize, the following suggestions can be made:

– Using both RGB and multispectral sensors simultaneously provides high spatial and spectral resolution. The RGB camera provides excellent accuracy, while the multispectral camera adds crucial bands for classification and the calculation of vegetation indices.
– Ten ground control points can be recommended as the optimal survey design, as the absolute accuracy does not improve significantly with more GCPs, and additional GCPs may slow down the process in areas similar to our research area.
– Cross flight strips are only necessary if high vertical accuracy is required; for horizontal accuracy, adding cross flight strips is unnecessary.
– Multi-altitude flights are of little use at higher altitudes or when mapping areas rather than individual objects. Their purpose is to increase the density of the point cloud when mapping objects with fine detail and when mapping small areas that can be captured from flight altitudes below 50 meters.

In general, these findings have important implications for UAV development. The fact that data quality can vary significantly with the flight configuration and sensor presents both risks and opportunities. There is a risk that UAVs are used as ready-made products with a limited understanding of photogrammetric principles and of the possibilities for customizing flight configurations; as a result, even if the final product appears to be of good quality, there may be undetected spatial offsets, deformations, or poor elevation results. On the other hand, these results also highlight the potential for customizing UAV workflows. Different flight configurations and ground-truthing methods offer a wide range of options to adapt data collection to financial, personnel, and time constraints, and to align it with customer needs and requirements across various industries. This makes UAV workflows a viable and sustainable tool for delivering reliable and cost-effective information to address current and future challenges.

Future research building on these results could take various directions, such as exploring the integration of thermal and hyperspectral cameras with UAVs and their accuracy and data collection capabilities. The potential applications of multispectral cameras across various industries should not be overlooked. The research could also examine the impact of multi-altitude flights at lower altitudes and with forward flight strips.

Literature and references:

Agüera-Vega, F., Carvajal-Ramírez, F., & Martínez-Carricondo, P. (2017). Assessment of photogrammetric mapping accuracy based on variation ground control points number using unmanned aerial vehicle. Measurement, 98, 221-227.
Forlani, G., Dall'Asta, E., Diotri, F., Morra di Cella, U., Roncella, R., & Santise, M. (2018). Quality assessment of DSMs produced from UAV flights georeferenced with on-board RTK positioning. Remote Sensing, 10(2), 311.
Fras, M. K., Šušteršič, K., & Kežul, A. Š. (2021). Popolni ortofoto v urbanih okoljih. Geodetski vestnik, 65(1).
Gerke, M., & Przybilla, H. J. (2016). Accuracy analysis of photogrammetric UAV image blocks: Influence of onboard RTK-GNSS and cross flight patterns. Photogrammetrie, Fernerkundung, Geoinformation (PFG), (1), 17-30.
Gindraux, S., Boesch, R., & Farinotti, D. (2017). Accuracy assessment of digital surface models from unmanned aerial vehicles' imagery on glaciers. Remote Sensing, 9(2), 186.
Goddijn-Murphy, L., Williamson, B. J., McIlvenny, J., & Corradi, P. (2022). Using a UAV thermal infrared camera for monitoring floating marine plastic litter. Remote Sensing, 14(13), 3179.
Hugenholtz, C., Brown, O., Walker, J., Barchyn, T., Nesbit, P., Kucharczyk, M., & Myshak, S. (2016). Spatial accuracy of UAV-derived orthoimagery and topography: Comparing photogrammetric models processed with direct geo-referencing and ground control points. Geomatica, 70(1), 21-30.
Imai, N. N., Tommaselli, A. M. G., Berveglieri, A., & Moriya, E. A. S. (2019). Shadow detection in hyperspectral images acquired by UAV. The International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, 42, 371-377.
International Organization for Standardization (ISO). (2013). ISO 19157:2013 Geographic information – Data quality. European Committee for Standardization, Brussels, Belgium.
James, M. R., & Robson, S. (2014). Mitigating systematic error in topographic models derived from UAV and ground-based image networks. Earth Surface Processes and Landforms, 39(10), 1413-1420.
Kelcey, J., & Lucieer, A. (2012). Sensor correction of a 6-band multispectral imaging sensor for UAV remote sensing. Remote Sensing, 4(5), 1462-1493.
Kranjec, N., Čekada, M. T., & Kobal, M. (2021). Napovedovanje drevesnih vrst iz geometrije in intenzitete oblaka aerolaserskih točk vrhov drevesnih krošenj. Geodetski vestnik, 65(2), 234-259.
Kraus, K. (2007). Photogrammetry: Geometry from images and laser scans (Vol. 1). Walter de Gruyter.
Krzyżek, R., & Kudrys, J. (2022). Accuracy of GNSS RTK/NRTK height difference measurement. Applied Geomatics, 14(3), 491-499.
Lu, J., Cheng, D., Geng, C., Zhang, Z., Xiang, Y., & Hu, T. (2021). Combining plant height, canopy coverage and vegetation index from UAV-based RGB images to estimate leaf nitrogen concentration of summer maize. Biosystems Engineering, 202, 42-54.
Luo, J., Zhao, T., Cao, L., & Biljecki, F. (2022). Semantic Riverscapes: Perception and evaluation of linear landscapes from oblique imagery using computer vision. Landscape and Urban Planning, 228, 104569.
Manfreda, S., Dvorak, P., Mullerova, J., Herban, S., Vuono, P., Arranz Justel, J., & Perks, M. (2019). Assessing the accuracy of digital surface models derived from optical imagery acquired with unmanned aerial systems. Drones, 3(1), 15.
Mazić, E., Tuno, N., Savšek, S., & Kogoj, D. (2013). Optimalna dolžina vizure digitalnega nivelirja Leica Geosystems DNA03. Geodetski vestnik, 57(2), 233-244.
Nagendran, S. K., Tung, W. Y., & Ismail, M. A. M. (2018). Accuracy assessment on low altitude UAV-borne photogrammetry outputs influenced by ground control point at different altitude. IOP Conference Series: Earth and Environmental Science, 169(1), 012031. IOP Publishing.
Oniga, V. E., Breaban, A. I., Pfeifer, N., & Chirila, C. (2020). Determining the suitable number of ground control points for UAS images georeferencing by varying number and spatial distribution. Remote Sensing, 12(5), 876.
Osco, L. P., Ramos, A. P. M., Moriya, É. A. S., de Souza, M., Junior, J. M., Matsubara, E. T., ... & Creste, J. E. (2019). Improvement of leaf nitrogen content inference in Valencia-orange trees applying spectral analysis algorithms in UAV mounted-sensor images. International Journal of Applied Earth Observation and Geoinformation, 83, 101907.
Qin, Z., Chang, Q., Xie, B., & Shen, J. (2016). Rice leaf nitrogen content estimation based on hyperspectral imagery of UAV in Yellow River diversion irrigation district. Transactions of the Chinese Society of Agricultural Engineering, 32(23), 77-85.
Sanz-Ablanedo, E., Chandler, J. H., Rodríguez-Pérez, J. R., & Ordóñez, C. (2018). Accuracy of unmanned aerial vehicle (UAV) and SfM photogrammetry survey as a function of the number and location of ground control points used. Remote Sensing, 10(10), 1606.
Sertić, H., Paar, R., Tomić, H., & Ravlić, F. (2022). Influence of flight height and image sensor on the quality of the UAS orthophotos for cadastral survey purposes. Land, 11(8), 1250.
Shendryk, Y., Sofonia, J., Garrard, R., Rist, Y., Skocaj, D., & Thorburn, P. (2020). Fine-scale prediction of biomass and leaf nitrogen content in sugarcane using UAV LiDAR and multispectral imaging. International Journal of Applied Earth Observation and Geoinformation, 92, 102177.
Singh, K. K., & Frazier, A. E. (2018). A meta-analysis and review of unmanned aircraft system (UAS) imagery for terrestrial applications. International Journal of Remote Sensing, 39(15-16), 5078-5098.
Smith, M., Carrivick, J., & Quincey, D. (2016). Structure from motion in physical geography. Progress in Physical Geography, 40, 247-275.
Stöcker, C., Nex, F., Koeva, M., & Gerke, M. (2020). High-quality UAV-based orthophotos for cadastral mapping: Guidance for optimal flight configurations. Remote Sensing, 12(21), 3625.
Su, J., Zhu, X., Li, S., & Chen, W. H. (2022). AI meets UAVs: A survey on AI empowered UAV perception systems for precision agriculture. Neurocomputing.
Tait, L., Bind, J., Charan-Dixon, H., Hawes, I., Pirker, J., & Schiel, D. (2019). Unmanned aerial vehicles (UAVs) for monitoring macroalgal biodiversity: Comparison of RGB and multispectral imaging sensors for biodiversity assessments. Remote Sensing, 11(19), 2332.
Tonkin, T. N., & Midgley, N. G. (2016). Ground-control networks for image based surface reconstruction: An investigation of optimum survey designs using UAV derived imagery and structure-from-motion photogrammetry. Remote Sensing, 8(9), 786.
Touge, Y., Hasegawa, M., Minegishi, M., Kawagoe, S., & Kazama, S. (2023). Multitemporal UAV surveys of geomorphological changes caused by postfire heavy rain in Kamaishi city, northeast Japan. Catena, 220, 106702.
Türk, T., Tunalioglu, N., Erdogan, B., Ocalan, T., & Gurturk, M. (2022). Accuracy assessment of UAV-post-processing kinematic (PPK) and UAV-traditional (with ground control points) georeferencing methods. Environmental Monitoring and Assessment, 194(7), 476.
Verhoeven, G. (2011). Taking computer vision aloft – archaeological three-dimensional reconstructions from aerial photographs with PhotoScan. Archaeological Prospection, 18(1), 67-73.
Villanueva, J. K. S., & Blanco, A. C. (2019). Optimization of ground control point (GCP) configuration for unmanned aerial vehicle (UAV) survey using structure from motion (SFM). The International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, 42, 167-174.
Vitti, D. M. D. C., Marques Junior, A., Guimarães, T. T., Koste, E. C., Inocencio, L. C., Veronez, M. R., & Mauad, F. F. (2019). Geometry accuracy of DSM in water body margin obtained from an RGB camera with NIR band and a multispectral sensor embedded in UAV. European Journal of Remote Sensing, 52(sup1), 160-173.
Westoby, M. J., Brasington, J., Glasser, N. F., Hambrey, M. J., & Reynolds, J. M. (2012). 'Structure-from-Motion' photogrammetry: A low-cost, effective tool for geoscience applications. Geomorphology, 179, 300-314.

Nikola Santrač, M.Sc., University of Novi Sad, Faculty of Agriculture, Trg Dositeja Obradovića 8, 21000 Novi Sad, Serbia, e-mail: nikola.santrac@polj.edu.rs
Assoc. Prof. Pavel Benka, Ph.D., University of Novi Sad, Faculty of Agriculture, Trg Dositeja Obradovića 8, 21000 Novi Sad, Serbia, e-mail: pavel.benka@polj.edu.rs
Assist. Prof. Mehmed Batilović, Ph.D., University of Novi Sad, Faculty of Technical Sciences, Trg Dositeja Obradovića 6, 21000 Novi Sad, Serbia, e-mail: mehmed@uns.ac.rs
Radoš Zemunac, M.Sc., University of Novi Sad, Faculty of Agriculture, Trg Dositeja Obradovića 8, 21000 Novi Sad, Serbia, e-mail: zemunac.radoš@polj.uns.ac.rs
Sanja Antić, M.Sc., University of Novi Sad, Faculty of Agriculture, Trg Dositeja Obradovića 8, 21000 Novi Sad, Serbia, e-mail: sanja.antic@polj.uns.ac.rs
Milica Stajić, M.Sc., University of Novi Sad, Faculty of Agriculture, Trg Dositeja Obradovića 8, 21000 Novi Sad, Serbia, e-mail: milica.stajic@polj.uns.ac.rs
Nenad Antonić, M.Sc., Cinteraction, Nikolajevska 2, 21000 Novi Sad, Serbia, e-mail: nenadsbc@gmail.com

Santrač, N., Benka, P., Batilović, M., Zemunac, R., Antić, S., Stajić, M., & Antonić, N. (2023). Accuracy analysis of UAV photogrammetry using RGB and multispectral sensors. Geodetski vestnik, 67(4), 459-472. DOI: https://doi.org/10.15292/geodetski-vestnik.2023.04.459-472