(a) coordinate system of the Velodyne LiDAR system; (b) geometric relation of laser intrinsic parameters in linear representation, a calibration plane, and the residual.

Source publication
Article
Full-text available
This paper presents a fully-automated method to establish a calibration dataset from on-site scans and recalibrate the intrinsic parameters of a spinning multi-beam 3-D scanner. The proposed method has been tested on a Velodyne HDL-64E S2 LiDAR system, which contains 64 rotating laser rangefinders. By time series analysis, we found that the collect...


Citations

... Early work related to in-situ calibration focused mainly on enhancing the internal characteristic parameters of static terrestrial laser scanners (TLS) or multibeam LiDAR units [20], [21], [22], [23], [24]. Since these systems do not have a georeferencing unit, the reconstructed point clouds are manually georeferenced using control points. ...
Article
Full-text available
Uncrewed aerial vehicles (UAV) carrying sensors such as light detection and ranging (LiDAR) and multiband cameras georeferenced by an onboard global navigation satellite system/inertial navigation system (GNSS/INS) have become a popular means to quickly acquire near-proximal agricultural remote sensing data. These platforms have bridged the gap between high-altitude airborne and ground-based measurements. UAV data acquisitions also allow for surveying remote sites that are logistically difficult to access from the ground. With that said, deriving well-georeferenced mapping products from these mobile mapping systems (MMS) is contingent on accurate determination of the platform trajectory along with the inter-sensor positional and rotational relationships, that is, the mounting parameters of the various sensors with respect to the GNSS/INS unit. Conventional techniques for estimating LiDAR mounting parameters (also referred to as LiDAR system calibration) require carefully planned trajectory and target configurations. Such techniques are time-consuming and, in certain cases, not feasible to accomplish. In this work, an in-situ system calibration and trajectory enhancement strategy for UAV LiDAR is proposed. The strategy exploits the planting geometry of mechanized agricultural fields through an automated procedure that extracts and matches features and uses them to enhance the quality of LiDAR-derived point clouds. The proposed approach is qualitatively and quantitatively evaluated using calibration datasets as well as separately acquired validation datasets to demonstrate the performance of the developed procedure. Quantitatively, the accuracy of the resulting UAV point clouds after system calibration and an accompanying trajectory enhancement improved from as much as 43 cm to 4 cm.
... This structure makes it almost impossible to accurately determine the position of target points. As a result, the calibration of these scanners must rely on plane features rather than point targets [7][8][9][29][30][31][32]. These have been extensively studied in the literature to examine their performance and calibration. ...
... These have been extensively studied in the literature to examine their performance and calibration. For example, Muhammad and Lacroix calibrated an HDL-64E using manually extracted wall surfaces [29]; Glennie and Lichti presented the static calibration and analysis of an HDL-64E S2 [7]; Atanacio-Jiménez et al. used a pattern calibration structure (a large cuboid control target) to calibrate their HDL-64E, minimizing the systematic errors caused by geometric calibration factors and solving for the intrinsic and extrinsic parameters [33]; Chen and Chien used an automatic random sample consensus (RANSAC) plane detection algorithm to extract vertical walls for evaluation of the HDL-64E [30]; Chan et al. used cylindrical targets such as lampposts to calibrate an HDL-32E sensor [34,35]; and Glennie et al. reported a calibration and stability analysis of a VLP-16 [32]. Although some research on flash-LiDAR has been performed [36][37][38], the study of its calibration has been minimal. ...
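Several of the works above rely on RANSAC-style plane extraction from the point cloud. As a rough illustration of the idea (not any of the cited implementations; the function name, iteration count, and inlier threshold below are assumptions), a minimal plane-fitting sketch in Python:

```python
import numpy as np

def ransac_plane(points, n_iter=200, tol=0.02, rng=None):
    """Fit a plane n·p + d = 0 to a 3-D point cloud by RANSAC.

    points : (N, 3) array; tol : inlier distance threshold in metres.
    Returns the unit normal n, offset d, and a boolean inlier mask.
    """
    rng = np.random.default_rng(rng)
    best_mask, best_model = None, None
    for _ in range(n_iter):
        # Sample 3 distinct points and form a candidate plane.
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-12:                  # degenerate (collinear) sample
            continue
        n = n / norm
        d = -n @ p0
        dist = np.abs(points @ n + d)     # point-to-plane distances
        mask = dist < tol
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask, best_model = mask, (n, d)
    return best_model[0], best_model[1], best_mask
```

Real pipelines (e.g., for vertical-wall extraction) would additionally refine the plane on the inliers by least squares and filter by normal direction.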
Article
Full-text available
The geometry and calibration of three-dimensional multi-beam laser scanning (MBLS) are more difficult than for single-beam laser scanning, especially when one laser emitter and multiple laser-echo detectors share the same optical path and scan with a two-axis mirror. This paper focuses on the influence of the main systematic errors on the geometric imagery of the MBLS, and presents a plane-based self-calibration to improve the geometric positioning. First, the model of geometric imaging and systematic errors for the MBLS is presented, and the adjustment for plane-based self-calibration is developed. Second, the influence of systematic errors on the geometric imagery of the MBLS is simulated to identify the main errors. Third, a strong network configuration based on planar calibration is designed and implemented, and the improvement in accuracy is examined via qualitative and quantitative analysis. The results show that the rangefinder offset, horizontal direction circle index, and vertical circle index are the main systematic errors; the distance accuracy is corrected from 29.94 cm to 2.86 cm (an improvement of 86.89%) by the plane-based calibration, and validation indicates a correction from 25.47 cm to 5.60 cm (an improvement of 88.25%).
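The rangefinder offset identified above as a dominant systematic error can be estimated from a known calibration plane by minimizing the point-to-plane misclosure. The following is a minimal one-parameter sketch in Python, not the paper's full self-calibration adjustment; the function name and the single-offset model are assumptions for illustration:

```python
import numpy as np

def estimate_range_offset(dirs, ranges, n, d):
    """Least-squares estimate of a rangefinder offset delta such that points
    p_i = (r_i + delta) * u_i best satisfy the plane n·p + d = 0.

    dirs : (N, 3) unit beam directions u_i; ranges : (N,) raw ranges r_i;
    (n, d) : known calibration plane.
    """
    proj = dirs @ n                     # n·u_i for each beam
    resid = ranges * proj + d           # plane misclosure at zero offset
    # Minimise sum_i (resid_i + delta * proj_i)^2 -> closed form:
    delta = -(proj @ resid) / (proj @ proj)
    return delta
```

A full adjustment would estimate the offset jointly with the angular index errors over many planes; this sketch shows only the core residual.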
... Up to now, LiDAR calibration has remained a concern in the related communities. Regarding the internal calibration of common LiDARs, most methods place the LiDAR in regular scenes [13]-[17] or use well-designed targets [18], [19] to extract co-visible features and optimize the parameters. For spinning LiDAR, traditional calibration methods still require ad-hoc tools or regular calibrated scenes [12], [20]-[23]. ...
... In [14], the LiDAR is placed at a fixed distance from a wide regular wall to refine the prior calibration. Similarly to [14], in [19] the LiDAR is placed inside two tailor-made cuboid targets, each made up of five pattern planes, to estimate the parameters at known positions. ...
Article
Spinning Light Detection and Ranging (LiDAR) is a range scanning device with a larger field of view, formed by the continuous spinning of a LiDAR on a rotating mechanism. Precise and robust calibration of the relative pose between the LiDAR and the rotating mechanism is critical for surveying, mapping, localization, etc. However, existing methods generally require ad-hoc tools, well-designed targets, or regular calibrated scenes. To improve calibration in unprepared scenes, we propose a Filtered Grid Random Sample Consensus (FGRSC) calibration approach, constituted by a Filtered Grid (FG) strategy and a Grid Random Sample Consensus (GRSC) method, which divides the whole scene into small grids once sufficient scanning has been received. First, FG calculates point curvature upon receiving each frame of spinning LiDAR scanning. Then FG normalizes the number of low-curvature points to obtain an approximate probability of containing planes for every grid. Finally, given the probabilities provided by FG, GRSC preselects grids from which to extract and associate planes, estimating the parameters with a weighted optimization. Experimental results from indoor and outdoor unprepared scenes show that FGRSC outperforms state-of-the-art methods.
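The FG step described above amounts to scoring each grid cell by its share of low-curvature (locally flat) points. Below is a toy sketch of that normalization, assuming a simple axis-aligned grid and precomputed per-point curvatures; the function name, cell size, and curvature threshold are placeholders, not values from the paper:

```python
import numpy as np
from collections import defaultdict

def plane_probability_per_grid(points, curvatures, cell=1.0, thresh=0.1):
    """Approximate, per grid cell, the probability that the cell contains a
    plane, as the normalised count of its low-curvature points.

    points : (N, 3) array; curvatures : (N,) per-point curvature estimates.
    Returns {cell index (3-tuple): probability in [0, 1]}.
    """
    counts = defaultdict(int)
    for p, c in zip(points, curvatures):
        if c < thresh:                          # low curvature ~ locally flat
            counts[tuple((p // cell).astype(int))] += 1
    total = max(sum(counts.values()), 1)
    return {k: v / total for k, v in counts.items()}
```

A sampler can then preselect cells with probability proportional to these scores before running plane extraction, which is the role GRSC plays in the paper.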
... Moreover, minimizing the discrepancies between the point cloud and pattern planes attained 0.0156 m of point cloud accuracy [27]. In addition to the installed target-based approaches, the static on-site recalibration approach using planes accomplished 0.013 m of planar misclosure [28]. Besides, the kinematic calibration of HDL-64E on a moving vehicle attained 0.023 m of planar RMS residuals [29]. ...
... The proposed approach showed a similar (or somewhat better) level of RMSE to the static calibrations:

Study                  RMSE                Improvement   Sensor
[25]                   0.013 m             63.8%         HDL-64E
Chen and Chien [28]    0.013 m             40%           HDL-64E
Glennie [29]           0.023 m             37.8%         HDL-64E
Chan and Lichti [30]   0.008 m / 0.014 m   41-71%        HDL-32E
Glennie et al. [31]    0.025 m             20%           VLP-16
Proposed               0.007 m             35-81%        VLP-16

Secondly, the proposed kinematic self-calibration of the backpack-based MBL system can significantly reduce time and cost compared with traditional target-based static calibrations. Since it aims at kinematic in situ calibration, it does not require installing targets and tripods to fix the scan location. ...
Article
Full-text available
Light Detection and Ranging (LiDAR) remote sensing technology provides an efficient means to acquire accurate 3D information from large-scale environments. Among the variety of LiDAR sensors, Multi-Beam LiDAR (MBL) sensors are one of the most extensively applied scanner types for mobile applications. Despite the efficiency of these sensors, their observation accuracy is relatively low for effective use in mobile mapping applications, which require measurements at a higher level of accuracy. In addition, the measurement instability of MBL demonstrates that frequent re-calibration is necessary to maintain a high level of accuracy. Therefore, frequent in situ calibration prior to data acquisition is an essential step in order to meet the accuracy-level requirements and to employ these scanners for precise mobile applications. In this study, kinematic in situ self-calibration of a backpack-based MBL system was investigated to develop an accurate backpack-based mobile mapping system. First, simulated datasets were generated for the experiments and tested in a controlled environment to inspect the minimum network configuration for self-calibration. For this purpose, our in-house simulator program was first used to generate simulation datasets with various observation settings, network configurations, test sites, and targets. Afterwards, self-calibration was carried out using the simulation datasets. Second, real datasets were captured in a kinematic situation so as to compare the calibration results with the simulation experiments. The results demonstrate that the kinematic self-calibration of the backpack-based MBL system could improve the point cloud accuracy, with a Root Mean Square Error (RMSE) of planar misclosure improved by up to 81%. Conclusively, in situ self-calibration of the backpack-based MBL system can be performed using on-site datasets, reaching a higher point cloud accuracy. In addition, by performing automatic calibration using the scan data, this method has the potential to be adapted to on-line re-calibration.
... (Atanacio-Jiménez et al., 2011) calibrated the HDL-64E using large cuboid targets, while (Muhammad, Lacroix, 2010) used extracted wall surfaces. An automatic RANSAC-based plane detection algorithm was used in (Chen, Chien, 2012) for the same purpose. A more recent model, the VLP-16, was evaluated by Wang et al. (Wang et al., 2018) and compared to the cheaper RS-LiDAR manufactured by Robosense. ...
Article
Full-text available
3D LiDAR sensors play an important part in several autonomous navigation and perception systems, with the technology evolving rapidly over time. This work presents the preliminary evaluation results of a 3D solid-state LiDAR sensor. Different aspects of this new type of sensor are studied and its data are characterized for effective utilization in object detection for Autonomous Ground Vehicles (AGV). The paper provides a set of evaluations to analyze the characteristics and performance of such LiDAR sensors. After characterization of the sensor, its performance is also evaluated in a real environment, with the sensor mounted on top of a vehicle and used to detect and classify different objects using a state-of-the-art Super-Voxel based method. The 3D point cloud obtained from the sensor is classified into three main object classes: “Building”, “Ground” and “Obstacles”. The results, evaluated on real data, clearly demonstrate the applicability and suitability of the sensor for such applications.
... This can be done in numerous ways, and can use more than one type of sensor. Some common extrinsic calibration procedures use a LiDAR-camera setup as outlined in [7][8][9][10], multiple LiDAR sensors or multiple sensor views as illustrated by [11][12][13][14][15][16], or views of a fixed target structure for a faster extrinsic calibration prior to operations [17][18][19][20]. These scenarios require the sensors' raw data to be correlated into one coherent picture. ...
Article
Full-text available
This paper presents a study on the data measurements that the Hokuyo UST-20LX Laser Rangefinder produces, which compiles into an overall characterization of the LiDAR sensor relative to indoor environments. The range measurements, beam divergence, angular resolution, error effects due to some common painted and wooden surfaces, and the error due to target surface orientation are analyzed. It was shown that using a statistical average of sensor measurements provides a more accurate range measurement. It was also shown that the major source of errors for the Hokuyo UST-20LX sensor was caused by something that will be referred to as "mixed pixels". Additional error sources are the target surface material and the range relative to the sensor. The purpose of this paper was twofold: (1) to describe a series of tests that can be performed to characterize various aspects of a LiDAR system from a user perspective, and (2) to present a detailed characterization of the commonly-used Hokuyo UST-20LX LiDAR sensor.
... Still considering calibration issues, there have also been considerable efforts on how to perform intrinsic calibration for multibeam Lidar systems, where the results from one beam are used to calibrate the intrinsic parameters of other beams [23,24,25,26,27,28,29,30,31,32]. As for single-beam Lidar systems, [30] proposed a method for the intrinsic calibration of a revolving-head 3D Lidar and the extrinsic calibration of the parameters with respect to a camera. ...
Thesis
Full-text available
The use of sensors is ubiquitous in our IT-based society; smartphones, consumer electronics, wearable devices, healthcare systems, industries, and autonomous cars, to name but a few, rely on quantitative measurements for their operations. Measurements require sensors, but sensor readings are corrupted not only by noise but also, in almost all cases, by systematic deviations of the sensors' actual characteristics from their ideal ones. This thesis presents a set of methodologies to solve the problem of calibrating sensors with statistical estimation algorithms. The methods generally start with an initial statistical sensor modeling phase in which the main objective is to propose meaningful models that are capable of simultaneously explaining recorded evidence and the physical principle of operation of the sensor. The proposed calibration methods then typically use training datasets to find point estimates of the parameters of these models and to select their structure (particularly in terms of the model order) using suitable criteria borrowed from the system identification literature. Subsequently, the proposed methods suggest how to process newly arriving measurements through opportune filtering algorithms that leverage the previously learned models to improve the accuracy and/or precision of the sensor readings. This thesis thus presents a set of statistical sensor models and their corresponding model learning strategies, and it specifically discusses two cases: the first is when we have a complete training dataset (where "complete" refers to having some ground-truth information in the training set); the second is where the training set should be considered incomplete (i.e., not containing information that should be considered ground truth, which implies that other sources of information must be used for the calibration process).
In doing so, we consider a set of statistical models consisting of both the case where the variance of the measurement error is fixed (i.e., homoskedastic models) and the case where the variance changes with the measured quantity (i.e., heteroskedastic models). We further analyze the possibility of learning the models using closed-form expressions (for example, when statistically meaningful, Maximum Likelihood (ML) and Weighted Least Squares (WLS) estimation schemes) and the possibility of using numerical techniques such as Expectation Maximization (EM) or Markov chain Monte Carlo (MCMC) methods (when closed-form solutions are not available or are problematic from an implementation perspective). We finally discuss the problem formulation using classical (frequentist) and Bayesian frameworks, and we present several field examples where the proposed calibration techniques are applied to sensors typically used in robotics applications (specifically, triangulation Light Detection and Rangings (Lidars) and Time of Flight (ToF) Lidars).
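For the heteroskedastic case mentioned above, the WLS estimate weights each sample by its inverse variance. A minimal linear-model sketch in Python (illustrative only; the function name and the assumed noise model with standard deviation proportional to the measured quantity are not from the thesis):

```python
import numpy as np

def wls_line_fit(x, y, var):
    """Weighted least squares fit of y = a*x + b with per-sample noise
    variance var (heteroskedastic model); weights are inverse variances.
    Returns the estimated (a, b).
    """
    w = 1.0 / var                              # inverse-variance weights
    X = np.column_stack([x, np.ones_like(x)])  # design matrix [x, 1]
    A = X.T @ (w[:, None] * X)                 # weighted normal equations
    b = X.T @ (w * y)
    return np.linalg.solve(A, b)
```

Compared with ordinary least squares, this down-weights the noisier samples, which is exactly why the homoskedastic/heteroskedastic distinction matters for sensor calibration.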
... In particular, intrinsic calibration of RSBLs has been extensively treated in the literature (e.g., [23,24,32]). Besides, some recent works have proposed calibration methods to improve factory parameters in MBLs [35,36]. Temporal instability of MBL measurements poses another relevant calibration problem [37]. ...
Article
Full-text available
Multi-beam lidar (MBL) rangefinders are becoming increasingly compact, light, and accessible 3D sensors, but they offer limited vertical resolution and field of view. The addition of a degree-of-freedom to build a rotating multi-beam lidar (RMBL) has the potential to become a common solution for affordable rapid full-3D high resolution scans. However, the overlapping of multiple-beams caused by rotation yields scanning patterns that are more complex than in rotating single beam lidar (RSBL). In this paper, we propose a simulation-based methodology to analyze 3D scanning patterns which is applied to investigate the scan measurement distribution produced by the RMBL configuration. With this purpose, novel contributions include: (i) the adaption of a recent spherical reformulation of Ripley's K function to assess 3D sensor data distribution on a hollow sphere simulation; (ii) a comparison, both qualitative and quantitative, between scan patterns produced by an ideal RMBL based on a Velodyne VLP-16 (Puck) and those of other 3D scan alternatives (i.e., rotating 2D lidar and MBL); and (iii) a new RMBL implementation consisting of a portable tilting platform for VLP-16 scanners, which is presented as a case study for measurement distribution analysis as well as for the discussion of actual scans from representative environments. Results indicate that despite the particular sampling patterns given by a RMBL, its homogeneity even improves that of an equivalent RSBL.
... Still considering calibration issues, there has also been a considerable effort on how to perform intrinsic calibration for multi-beam Lidar systems, where the results from one beam are used to calibrate the intrinsic parameters of other beams [9,26,5,12,13,11,15,25,14,28]. As for single-beam Lidar systems, instead, [25] proposes a method for the intrinsic calibration of a revolving-head 3D Lidar and the extrinsic calibration of its parameters with respect to a camera. ...
Chapter
We present an improved statistical model of the measurement process of triangulation Light Detection and Rangings (Lidars) that takes into account bias and variance effects coming from two different sources of uncertainty: (i) mechanical imperfections in the geometry and properties of their pinhole lens - CCD camera systems, and (ii) inaccuracies in the measurement of the angular displacement of the sensor due to non-ideal measurements from its internal encoder. This model thus extends the one presented in [2] by adding this second source of errors. Besides proposing the statistical model, this chapter considers: (i) specialized and dedicated model calibration algorithms that exploit Maximum Likelihood (ML)/Akaike Information Criterion (AIC) concepts and use training datasets collected in a controlled setup, and (ii) tailored statistical strategies that use the calibration results to statistically process the raw sensor measurements in non-controlled but structured environments where there is a high chance of the sensor detecting objects with flat surfaces (e.g., walls). These newly proposed algorithms are thus specially designed and optimized for precisely inferring the angular orientation of the Lidar sensor with respect to the detected object, a feature that is beneficial especially for indoor navigation purposes.
... An alternative intrinsic parameter calibration strategy, specifically dedicated to multi-beam Lidars, is to calibrate the various beams against each other; see the literature reviews in [17], [18] for more details. ...
... where the parameters of the triangulation Lidar are given by C^{[1]} and β^l_2, those of the odometer by C^{[2]} and β^o_0, and those of the ultrasonic ranger by C^{[3]} and β^u_0. The situation is as before, where the sensors' parameters can be learned independently; one may then repeat the procedures in Sections IV-C1 and IV-C2, and then apply strategy (17) for the particular case of the ultrasonic data y^u and C^{[3]}. ...
Article
Full-text available
This paper describes a new calibration procedure for distance sensors that does not require independent sources of ground-truth information, i.e., that is not based on comparing the measurements from the uncalibrated sensor against measurements from a precise device assumed to be the ground truth. Instead, the procedure assumes that the uncalibrated distance sensor moves in space on a straight line in an environment with fixed targets, so that the intrinsic parameters of the statistical model of the sensor readings are calibrated without requiring tests in controlled environments, but rather in environments where the sensor follows a linear movement and objects do not move. The proposed calibration procedure exploits an approximated EM scheme on top of two ingredients: a heteroscedastic statistical model describing the measurement process, and a simplified dynamical model describing the linear sensor movement. The procedure is designed to be capable of not just estimating the parameters of one generic distance sensor, but rather of integrating the most common sensors in robotic applications, such as Lidars, odometers, and sonar rangers, and learning the intrinsic parameters of all these sensors simultaneously. Tests in a controlled environment led to a reduction of the MSE of the measurements returned by a commercial triangulation Lidar by a factor between 3 and 6, comparable to the efficiency of other state-of-the-art ground-truth-based calibration procedures. Adding odometric and ultrasonic information further improved the performance index of the overall distance estimation strategy by a factor of up to 1.2. Tests also show high robustness against violations of the linear-movement assumption.