Why the best 3D mapping drone has camera and laser options
The evolution of 3D mapping for drones
Mapping and surveying have long been predicted to be the most lucrative market segment to benefit from drone technology, due to the huge potential for rapid and accurate mapping data capture.
Photogrammetry has been proving this concept beautifully for almost ten years. But flying a drone with a LIDAR scanner was far from easy until recently. The scanners were simply too heavy and demanded too much power to lift, which constrained their use on small unmanned aircraft.
Everything about LIDAR had to get more compact without sacrificing too much quality. Not only that, but drone LIDAR sensors needed special engineering to tune their operation to the speeds of multicopter, fixed-wing and VTOL drones.
On top of all this, they had to provide data that could be processed without a lot of expertise. Although they now do, engineering around these payloads is still in its relative infancy, and quality is only now starting to make them truly viable.
In fact, we’ve arrived now at a critical new phase of 3D mapping. Specifically, combining quality drone LIDAR data with advanced RGB data—like that captured with an oblique 3D mapping payload—offers the most comprehensive 3D map possible, with the lowest overhead.
Why do you need both LIDAR and photogrammetry data for 3D mapping?
Laser scanning and photogrammetry are two different techniques for acquiring 3D point clouds. In many cases they complement each other with their strengths. Let’s look at how they work before getting into the benefits of using both.
Both drone photogrammetry and drone LIDAR capture point cloud data. However, they do it differently. While drone LIDAR scanners directly create point clouds and data-rich maps via actively sending, receiving and measuring laser light pulses, drone photogrammetry relies on cameras to capture overlapping imagery with location information embedded into every pixel.
What’s the difference between these methods and why does it matter?
- Photogrammetry offers life-like imagery in map form, with resolutions down to a centimeter for deciphering fine detail, and post-processing options that bring accuracy into this range as well.
- Photogrammetry falls short in vegetated areas due to its method: photographs taken from the sky cannot precisely capture ground-level data because of shadows and physical occlusion by the vegetation.
- LIDAR offers point clouds that provide precise horizontal and vertical measurement readings on the surface being captured. The method enables capturing ground-level information wherever vegetation allows light to penetrate to the ground.
- LIDAR outputs can be colorized, but they lack the sharp, recognizable features needed for detailed visual analysis.
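The centimeter-level resolution mentioned above is usually expressed as ground sample distance (GSD). As a rough illustration—the camera parameters below are hypothetical, not a specific sensor's spec—GSD follows directly from pixel size, focal length and flight altitude:

```python
def ground_sample_distance(pixel_size_mm: float, focal_length_mm: float,
                           altitude_m: float) -> float:
    """Estimate ground sample distance in meters per pixel.

    Standard photogrammetry relation: GSD = pixel size * altitude / focal length.
    """
    return pixel_size_mm / focal_length_mm * altitude_m

# Hypothetical camera: 4.4 um (0.0044 mm) pixels, 35 mm lens, flown at 100 m.
gsd_m = ground_sample_distance(0.0044, 35.0, 100.0)
print(f"GSD is roughly {gsd_m * 100:.1f} cm/pixel")
```

Halving the altitude halves the GSD, which is why low, slow flights produce the sharpest maps at the cost of coverage.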
So why would you want both LIDAR and RGB data?
By using two different data acquisition techniques, you can blend the strengths of each and cancel out their weaknesses.
For example, LIDAR can provide ground-level data when flying over densely vegetated areas, such as forests. Its vertical accuracies are not compromised by shadows or vegetation up to a certain density. You can even theoretically fly it in the dark and capture the data points.
Meanwhile photogrammetry will provide sharp, lifelike and colorized features. All of this adds color, shadow and context to a map. Horizontal accuracy with photogrammetry is high for most purposes.
Integrating both data sets makes feature identification and accurate analysis possible even amidst vegetation. You can also cross-check the accuracy of each method against the other.
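Cross-checking the two methods can be as simple as measuring how far each point of one cloud sits from its nearest neighbor in the other. This is a minimal sketch on tiny synthetic clouds—the data and the brute-force search are illustrative only, not how production point-cloud software works:

```python
import math

def mean_nearest_distance(cloud_a, cloud_b):
    """For each point in cloud_a, find its nearest neighbor in cloud_b
    and return the mean of those distances as a crude agreement metric.

    Brute force: fine for small samples, far too slow for real point clouds,
    which use spatial indexes (k-d trees, octrees) instead.
    """
    total = 0.0
    for point in cloud_a:
        total += min(math.dist(point, other) for other in cloud_b)
    return total / len(cloud_a)

# Synthetic example: a LIDAR sample and a photogrammetry sample of the same
# flat surface, with the photogrammetry cloud offset vertically by 3 cm.
lidar = [(float(x), float(y), 0.00) for x in range(5) for y in range(5)]
photo = [(float(x), float(y), 0.03) for x in range(5) for y in range(5)]
print(f"mean offset is about {mean_nearest_distance(lidar, photo):.3f} m")
```

A systematic offset like this one would point to a georeferencing or calibration issue rather than random noise.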
What’s better—double sensor or single sensor capture?
Now that we know that there are benefits to using both photogrammetry and LIDAR for mapping the same area, the question becomes this: is having both sensors capturing on one flight better, or is it better to fly two flights? The devil is in the details.
Some systems on the market today carry dual payloads, and this is well marketed. Yet one of these payloads has to be lighter. In a dual LIDAR-plus-RGB setup, it is the RGB sensor that gets downsized, because drone-carried LIDAR units are still too complex to scale down any further.
So you can get both sets of data in a single flight, but you will sacrifice significant quality and accuracy in your RGB data. Flying two flights may seem time consuming, but with the right drone—one that covers large areas fast—two flights can take no longer than a single flight on a slower platform. The right drone will get you the highest quality with very little, if any, sacrifice in field time.
3D mapping with drones today
We’ve now reached a really interesting time where LIDAR data is more accessible than it’s ever been thanks to an uptick in quality of drone-carried sensors. If you can get both photogrammetry data and a good and reliable LIDAR sensor out of one platform, this is ideal.
Wingtra’s 3D mapping drone
The WingtraOne GEN II mapping drone is a complete data capture solution, which means it can be used for photogrammetry, multispectral and LIDAR scanning. It is known for its efficiency and ability to cover massive areas in a relatively short time without the need for repeated flights. With this platform and its swappable, high-quality payloads, both LIDAR and RGB data can be captured for professional 3D mapping.
With high-quality RGB data and high-quality LIDAR data, you can be sure that your 3D maps serve all of your needs across the entire spectrum of outputs. Plus, you’ll always have high-integrity data to plug into all of your complementary and analytics software. All of this equals peace of mind and increased productivity when basing every step on clear, accurate data.
What are the most common 3D mapping outputs?
3D point clouds enable a range of 3D mapping products. These are some of the most popular in demand today:
- 3D LIDAR point clouds—LIDAR point clouds are detailed 3D representations, containing precise X, Y, and Z coordinates for each data point according to a preferred coordinate system.
- DSMs / DEMs—digital surface models (DSMs) and digital elevation models (DEMs) can be derived from RGB or LIDAR point clouds. The color coding on these maps allows for at-a-glance and deeper analysis of elevations and terrain features.
- Building information models (BIMs): Key for architecture and construction sectors, you can integrate LIDAR and RGB data to get highly detailed BIMs, for precise and rich analytics of buildings and structures.
- Annotated 3D maps: Combining LIDAR and RGB can also yield annotated 3D maps, where specific features are labeled and classified based on both spatial and color information.
- Change detection maps: By integrating LIDAR and RGB drone data over time, you can track changes much more efficiently and over wider areas than ever before. This is useful for monitoring the environment across the whole range of sectors, such as mining, construction, forestry, and natural disaster forecasting and management.
- 3D mesh models: Combining LIDAR and RGB data can generate textured 3D mesh models. These models provide a detailed and realistic surface representation, useful for applications in urban planning, architecture, and even gaming.
All these 3D mapping products are used by different industries. For example, mining companies use orthomosaics for stockpile management, reporting, and auditing, as well as highwall safety checks and pre- and post-blast assessments. Similarly, the construction industry relies on a range of drone maps for real-time site monitoring, while GIS analysts use orthomosaic (raster) maps as a basis for creating topographic (vector) maps, which enable spatial analysis.
The importance of data accuracy for drone 3D mapping applications
Mapping professionals require high data accuracy. This means that the position of a 3D point needs to match its real-world equivalent with a small margin of error, which makes the data and subsequent data analysis more reliable. 3D mapping drones employ satellite positioning technology—GNSS, of which GPS is one constellation—to position the drone and the point data it captures at any given moment. But this alone is only satellite-level accuracy, which is a meter at best.
Today, the recommended technologies to improve the accuracy of drone mapping data are post-processed kinematic (PPK) and real-time kinematic (RTK). Both methods correct the location of drone mapping data and, in the case of high-quality PPK, can even remove the need for GCPs, bringing absolute accuracy down to the centimeter (sub-inch) range.
Another quality indicator of a point cloud is point density, expressed as the number of points per square meter. Point density decreases roughly quadratically with distance from the sensor.
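That inverse-square relationship can be sketched numerically. The baseline figures here are illustrative placeholders, not any particular scanner's specification:

```python
def point_density(base_density: float, base_range_m: float, range_m: float) -> float:
    """Scale point density (points per square meter) by the inverse square
    of the sensing range, reflecting the roughly quadratic falloff of
    point density with distance."""
    return base_density * (base_range_m / range_m) ** 2

# Illustrative baseline: 200 pts/m^2 at a 50 m flying height.
for altitude in (50, 100, 150):
    print(f"{altitude:4d} m -> {point_density(200.0, 50.0, altitude):6.1f} pts/m^2")
```

Doubling the flying height cuts density to a quarter, which is why flight altitude is one of the first parameters to revisit when a LIDAR deliverable needs more detail.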
What is the best drone for 3D mapping?
Several quality drones compete for attention in the 3D drone mapping space. So the first thing to do is ask yourself what exactly you need 3D drone mapping data for. Is it a large area? Is it for vertical asset inspection? Is it for areas with high vegetation and that require sub-cm detail? The system you choose needs to accommodate these needs. For vertical asset management, multirotors are better, since fixed-wings do not hover.
However, for large-area mapping, you’ll want a system that is both efficient and reliable. You’ll also want a VTOL drone that protects its payload while flying in cruise to give you maximum coverage. Why? Because you already have to race against time to get that area mapped as frequently as possible. You are also racing against weather and other time-window constraints, such as airspace or site traffic closures.
All things being otherwise equal, reliability is truly the shining factor when it comes to large areas and fixed-wings.
Last thoughts
Getting 3D maps with a drone is now easier and more cost effective than ever if you choose the right system. This also involves the right support, since this capturing and processing is not always straightforward in complicated or crowded areas.
Make sure you have great support behind you when you choose a drone platform—talk to other professionals in the field and find out what they have to say about the support they receive and the quality of data they obtain. All of this combined with a clear sense of what you need, how often and why will get you a solution you can count on.
PS: If you are a service provider, don’t forget to think about the projects a system can make possible in the future, beyond the ones you already tackle.
FAQ
How do you perform 3D mapping with a drone?
3D mapping with a drone typically involves the following steps:
- Ensure you are operating within regulations with a system that is permitted to fly in the area
- Set the area and plan the flight path for the drone
- Insert the camera, LIDAR scanner or both, as well as a fully-charged battery
- Fly the drone across the flight path and maintain visual line of sight
- Process the images in-house or send them to a software service for processing
- Analyze the data and plug it into analytics and sharing tools for use
What is the best drone for 3D mapping?
The WingtraOne GEN II is the recommended drone for most mapping projects that require large-area coverage and do not involve vertical asset inspection.
What is oblique imagery and why do you need it for 3D mapping?
Oblique imagery is taken with the camera axis tilted relative to the ground—usually at an angle of 40° to 50°. For successful 3D mapping with a nadir sensor, cross-hatch flights with high side and front image overlaps are recommended. Due to its tilted configuration, an oblique sensor captures richer vertical data and eliminates the need for a cross-hatch flight pattern.
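The tilt geometry is basic trigonometry: at a given altitude, the center of an oblique image lands ahead of the drone's nadir point by altitude × tan(tilt). The altitude used here is a placeholder value:

```python
import math

def oblique_ground_offset(altitude_m: float, tilt_deg: float) -> float:
    """Horizontal distance from the drone's nadir point to the center of an
    oblique image, for a camera tilted tilt_deg away from straight down."""
    return altitude_m * math.tan(math.radians(tilt_deg))

# At 100 m altitude with a 45-degree tilt, the image center sits 100 m ahead,
# which is why oblique images see building facades a nadir camera misses.
print(f"{oblique_ground_offset(100.0, 45.0):.1f} m ahead of nadir")
```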
What is the best 3D mapping software for drones?
This really depends on what you are used to working with and whether it offers robust tools for achieving good quality and accuracy. We highly recommend Bentley ContextCapture and Esri SiteScan to process Wingtra oblique data for rich and detailed 3D maps.
LIDAR processing can be tightly coupled to the payload, yet in Wingtra’s case it is software agnostic, with several recommended processing partners.