
Drone-based LiDAR accuracy tests – sUAS News

By admin

Feb 28, 2022



For the past 15 months, 3DroneMapping has been working on its drone-based LiDAR system. There have been enormous changes in the INS and LiDAR sensor market, and these sensors are now cheaper, more accurate and smaller than ever before. Most critically, software has kept pace with changes in the small LiDAR market and now offers some very useful tools for getting the absolute best results from such small sensors.

As solid-state LiDAR units become more common, it is now quite possible to place them in more challenging environments. While they still need to be kept clean and not exposed to extreme vibration, EMF or water, they are far more robust. Heavy-lift aircraft are also becoming more popular, as the standard payload for a LiDAR and imager is between 1.2 and 1.6 kg.
So why is LiDAR so important as a payload for aerial survey? Mainly because of LiDAR's ability to "see" through vegetation. There are other factors too, such as:
- Increased swath coverage
- Reduced data capture time
- Reduced processing time
- Ease of ground classification
- Reliability of ground classification
Here are a few samples that compare LiDAR-based pointclouds to those generated from standard photogrammetry (images only, or PhoDAR).

LiDAR in pink, tie points in blue. The blue points give false results over water
As can be seen, LiDAR gives very good coverage over densely vegetated areas. Cultivated fields (in this case maize) have been a huge headache for photogrammetrists when using images alone to determine a DTM. Because this type of photogrammetry can only measure what it "sees", the ground is often not visible and so is not taken into account. Far more ground points are surveyed by LiDAR, and these help determine where natural ground levels actually are.
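To make the ground-classification step concrete, here is a minimal sketch using the open-source PDAL library and its SMRF ground filter. This is not 3DroneMapping's in-house workflow: the file names are placeholders and the filter parameters are illustrative values that would be tuned per project.

```python
import json
import pdal  # open-source point cloud processing library

# Classify ground returns with the Simple Morphological Filter (SMRF),
# then keep only points flagged as ground (ASPRS class 2).
pipeline_def = {
    "pipeline": [
        "input_swath.laz",              # raw LiDAR strip (placeholder name)
        {
            "type": "filters.smrf",     # ground classification
            "slope": 0.15,              # terrain slope tolerance
            "window": 18.0,             # maximum window size (m)
            "threshold": 0.5,           # elevation threshold (m)
            "cell": 1.0                 # cell size (m)
        },
        {
            "type": "filters.range",
            "limits": "Classification[2:2]"   # keep ground points only
        },
        "ground_only.laz"               # output file (placeholder name)
    ]
}

pipeline = pdal.Pipeline(json.dumps(pipeline_def))
n_points = pipeline.execute()
print(f"{n_points} ground points written")
```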
In some cases, reflections from surfaces such as water can cause photogrammetry software to create depressions or spikes in image-based pointclouds. More often than not, these are not classified as noise because neighbouring points are generated that form a pseudo surface.
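As a small illustration of why such pseudo surfaces tend to survive automated cleaning, the snippet below runs a standard statistical outlier removal using the open-source Open3D library (an assumed tool, not one named in the article). The filter only rejects points whose neighbour distances are unusual, so false points over water that are densely surrounded by other false points pass the test and still need manual editing.

```python
import open3d as o3d  # open-source 3D data processing library

# Load an image-derived (PhoDAR) point cloud; the file name is a placeholder.
pcd = o3d.io.read_point_cloud("phodar_cloud.ply")

# Statistical outlier removal: a point is dropped only if its mean distance
# to its nearest neighbours deviates strongly from the global average.
# False points forming a dense pseudo surface over water support each other,
# so most of them survive this test.
clean_pcd, kept_idx = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)

print(f"kept {len(kept_idx)} of {len(pcd.points)} points")
```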
LiDAR sensors only require about 25-35% sidelap. The angles of data capture do not require large overlaps (unless calibration is being done). This means that far less distance needs to be travelled to capture data over the same area. The example below is for 50 ha at 120 m AGL: LiDAR coverage requires only 6.99 km of flight lines, but using a 48 MP camera with a 35 mm lens would take 17.39 km to achieve the same coverage (at 65% sidelap).
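A rough way to reproduce this kind of comparison is to derive the flight-line spacing from the swath width and sidelap and sum the line lengths over the block, as in the sketch below. The swath widths are assumptions chosen to land near the article's figures (roughly 7 km for LiDAR and 17-18 km for imagery over 50 ha at 120 m AGL); they are not published sensor specifications.

```python
def flight_line_length_km(area_ha, swath_m, sidelap):
    """Approximate total flight-line length (km) to cover a square block.

    area_ha -- block size in hectares
    swath_m -- across-track coverage of one pass at the flying height (m)
    sidelap -- fraction of the swath shared with the neighbouring line (0-1)
    """
    side_m = (area_ha * 10_000) ** 0.5        # side of an equivalent square block
    spacing = swath_m * (1.0 - sidelap)       # distance between adjacent flight lines
    n_lines = round(side_m / spacing) + 1     # lines needed to span the block
    return n_lines * side_m / 1000.0          # ignores turns and lead-in/out

# Assumed swath widths at 120 m AGL (illustrative only):
print(flight_line_length_km(50, swath_m=110, sidelap=0.30))  # LiDAR-like case, ~7 km
print(flight_line_length_km(50, swath_m=85, sidelap=0.65))   # camera-like case, ~18 km
```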
Processing time is really dependent on the hardware and software being used and on the workflows in place. Our office finding is that the automated processing steps (tie point generation, alignment, raw data extraction, etc.) take very similar amounts of time for areas of 1000 ha and up. But there is a massive reduction in manual editing and classification when using LiDAR data: very little manual intervention needs to take place, and processing times are as little as 3% of those required to clean PhoDAR data.

LiDAR data can be further refined by strip adjustment software. This software takes into consideration certain drifts in the INS measurements, sensor inaccuracies and reflectance issues. All of this is calculated and the block is adjusted accordingly. The results of such an adjustment can be dramatic and certainly help the subsequent classification and noise identification routines.
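As a highly simplified illustration of the principle (dedicated strip-adjustment software solves a much richer model that includes time-dependent INS drift, boresight and range terms), the toy example below estimates just one vertical correction per strip from height differences measured in strip overlaps, using an ordinary least-squares fit.

```python
import numpy as np

# Each observation: (strip i, strip j, mean height of strip i minus strip j
# over their overlap, in metres). The values here are made up for illustration.
observations = [(0, 1, 0.08), (1, 2, -0.05), (0, 2, 0.04)]
n_strips = 3

# Solve A @ b = d for per-strip biases b, where each row encodes b_i - b_j = dz.
# A final row constrains the biases to sum to zero (a gauge constraint, since
# only relative offsets are observable from the overlaps).
A = np.zeros((len(observations) + 1, n_strips))
d = np.zeros(len(observations) + 1)
for row, (i, j, dz) in enumerate(observations):
    A[row, i], A[row, j], d[row] = 1.0, -1.0, dz
A[-1, :] = 1.0

biases, *_ = np.linalg.lstsq(A, d, rcond=None)
corrections = -biases  # subtracting the estimated bias flattens the overlaps
print("per-strip vertical corrections (m):", np.round(corrections, 3))
```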

Strip alignment


