Laser Scanner Calibration Dependency on the Line Detection Method

Ago Molder (1), Olev Martens (1, 2), Tonis Saar (1), Raul Land (1)
(1) Thomas Johann Seebeck Department of Electronics, Tallinn University of Technology, Ehitajate St. 5, 19086 Tallinn, Estonia
(2) ELIKO Competence Centre, Maealuse 2/1, 12618 Tallinn, Estonia
ago@kodutalu.ee
http://dx.doi.org/10.5755/j01.eee.21.6.13765
Abstract

Calibration is an essential step for 3D structured-light laser scanners to obtain accurate measurement results. During calibration, a relation is established between the pixels in the 3D laser scanner camera image and real-world units. The laser line detection method plays an important role in this calibration process, but this aspect has not been investigated before. In this paper, the dependence of the calibration and measurement results on laser line detection with pixel or sub-pixel resolution in the calibration process is investigated. The calibration process of the developed 3D laser scanner is described. The results show a dependence between the laser line detection method used in the calibration process and the final measurement results.
ELEKTRONIKA IR ELEKTROTECHNIKA, ISSN 1392-1215, VOL. 21, NO. 6, 2015
Index Terms—3D laser scanner, calibration, sub-pixel.
I. INTRODUCTION
Calibration is a process in 3D laser scanners by which a relation between the camera image pixels and real-world units is established. Calibration is an important step for a 3D laser scanner to be able to obtain an accurate measurement result.
Many 3D laser scanner calibration methods exist. Some
of the calibration methods [1] propose to calibrate the laser
scanner using the known dimensions of an object and
calculating all of the parameters needed for the calibration
from the laser line projected onto the surface of this object
and captured by the camera in 3D laser scanner. Other
calibration methods calibrate the 3D laser scanner by
finding separately camera lens distortions [2], [3] and the
laser light plane parameters using the known distances of an
object from the camera.
All of those methods use laser line detection as a part of the calibration process, which means that the accuracy of the laser scanner is directly related to the precision of the laser line detection algorithm used in the calibration process. So far, the laser line detection accuracy has not been investigated or taken into account in the calibration process.

Manuscript received January 3, 2015; accepted May 28, 2015. This work has been supported by the EU (FP7-SME project "Hermes" and the European Regional Development Fund), the Estonian Science Foundation (target financing SF0140061s12 and grant ETF8905), the Doctoral School in Information and Communication Technology of Estonia, the Estonian IT Academy scholarship, CEBE (Centre for Integrated Electronic Systems and Biomedical Engineering) and Tallinn University of Technology, Thomas Johann Seebeck electronics institute.
The laser line in a real image of interest is typically 1 to 10 pixels wide. Many laser line detection algorithms [4], [5] for 3D laser line scanners detect the laser line with pixel resolution. These algorithms are usually simpler and are thus preferred in cases where the accuracy of the laser scanner is not critical or where processor resources are limited.
In other cases, one-pixel resolution is not sufficient, as the actual laser line center position lies between two pixels; when moving to sub-millimeter measurements, the accuracy of the laser line detection algorithm significantly influences [6] the real measurement results.
A similar issue also exists in other structured-light scanning technologies [7], where the method of detecting the fringe pattern edges during the projector calibration process may influence the end result.
Several sub-pixel laser line detection methods [8], [9] exist, but how much do they affect the real measurement results compared to pixel-resolution methods in the calibration process, and where is the trade-off between the complexity of the laser line detection algorithm and the laser scanner calibration, and thus the measurement, accuracy? This influence has not been investigated previously.
In this paper, the dependence of the measurement results on the laser line detection method with pixel or sub-pixel resolution in the calibration process is investigated. The developed laser scanner is calibrated with both pixel- and sub-pixel-resolution laser line detection algorithms. The measurements after the calibration process are made with the sub-pixel-resolution laser line detection method. The measurement results are compared against real-world objects with known dimensions.
II. 3D LASER SCANNER MODEL AND CALIBRATION
PROCESS
The simplest approximation of a camera is a pinhole camera. In this model, the lens is replaced by an infinitesimally small hole located in the focal plane, through which all light must pass. The image of the depicted object is projected onto the image sensor [1].
In order to convert the laser line in pixels captured by the
camera sensor to the real world measurements in millimeters
one must understand the relationship between the camera
sensor and the laser light plane.
By knowing the camera sensor width W and height H, the optical center of the full-resolution image in pixels can be calculated:

C_x = W / 2, (1)

C_y = H / 2. (2)
Laser line coordinates on the captured image in pixels are defined as u and v. The camera focal lengths f_x and f_y in pixels for a given focus depend on the lens used by the camera and can be estimated by using objects with known dimensions placed in front of the camera at different distances.
Knowing that the light emitted by the line laser forms a proper virtual laser light plane

A X + B Y + C Z + D = 0, (3)

and that the laser line captured by the camera is located on this laser light plane, the laser line position in the camera coordinate system can be calculated:

x' = (u - C_x) / f_x, (4)

y' = (v - C_y) / f_y. (5)
From these, the real-world coordinates of the laser line can be calculated:

Z = -D / (A x' + B y' + C), (6)

X = x' Z, (7)

Y = y' Z. (8)
An ideal pinhole camera does not have any image distortions, because the lens, being the main source of distortion, is replaced by an infinitely small hole. Unfortunately, this is not the case with real-world cameras. Two main types of distortion exist: radial and tangential. Both of these distortions, however, can be taken into account, and the camera image can be corrected. Radial distortions can be corrected with the following equations:
u_corrected = u (1 + k_1 r^2 + k_2 r^4 + k_3 r^6 + ...), (9)

v_corrected = v (1 + k_1 r^2 + k_2 r^4 + k_3 r^6 + ...), (10)

where u_corrected and v_corrected are the image pixel coordinates (u, v) after correction, k_n is the nth radial distortion coefficient, and

r^2 = u^2 + v^2. (11)
Tangential distortions can be corrected with the following
equations:
u_corrected = u + [2 p_1 u v + p_2 (r^2 + 2 u^2)] + ..., (12)

v_corrected = v + [p_1 (r^2 + 2 v^2) + 2 p_2 u v] + ..., (13)
where p_n is the nth tangential distortion coefficient. To calculate the focal length and the distortion coefficients and to undistort the input image, Zhang's method [10] has been used with a chessboard of known size and square side length in millimeters. Figure 4 shows an input image with two measurement objects after the undistortion process.
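For reference, the correction terms (9)-(13) can be sketched as follows. This is a minimal model on normalized coordinates with illustrative names; in practice the full pipeline (coefficient estimation and image remapping) is handled by a calibration library implementing Zhang's method:

```python
def correct_distortion(u, v, k, p):
    """Apply the radial (9)-(10) and tangential (12)-(13) correction terms
    to normalized image coordinates (u, v).
    k = (k1, k2, k3) radial and p = (p1, p2) tangential coefficients."""
    r2 = u * u + v * v                                  # (11): squared radius
    radial = 1 + k[0] * r2 + k[1] * r2 ** 2 + k[2] * r2 ** 3
    du = 2 * p[0] * u * v + p[1] * (r2 + 2 * u * u)     # (12): tangential term
    dv = p[0] * (r2 + 2 * v * v) + 2 * p[1] * u * v     # (13): tangential term
    return u * radial + du, v * radial + dv
```

With all coefficients equal to zero the correction reduces to the identity, as expected for a distortion-free pinhole camera.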
To find the laser light plane parameters without knowing the exact positions of the laser and camera relative to each other and their distance from the ground, the laser line is projected onto a chessboard pattern at different poses of the pattern relative to the camera (Fig. 1).
Fig. 1. Laser line on chessboard images with different poses.
While at least two poses of the chessboard are required to construct a mathematical description of the laser plane, the more poses are used, the more accurate the description of the laser light plane that can be constructed.
By knowing the camera intrinsic parameters (focal length, image optical center, distortion coefficients) and the chessboard size, the chessboard rotation and translation vectors (extrinsic parameters) can be calculated for every chessboard image by using object pose estimation. The extrinsic parameters translate the coordinates of a point (X, Y, Z) into a coordinate system fixed with respect to the camera. The rotation and translation transformation is (when z' != 0)

  [x']       [X]
  [y'] = R * [Y] + T. (14)
  [z']       [Z]
Using this equation, every point on the chessboard can be converted to real-world 3D points. Since the laser line lies on the same chessboard, its real-world 3D location can also be found.
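A minimal sketch of applying (14) to a set of chessboard points (names are illustrative; numpy is used only for the linear algebra):

```python
import numpy as np

def chessboard_to_camera(points, R, T):
    """Transform chessboard-frame 3D points (m x 3 array) into the
    camera-fixed coordinate system using the extrinsic rotation R (3 x 3)
    and translation T (3,), as in (14)."""
    points = np.asarray(points, dtype=float)
    # Row-vector form of x' = R x + T for all points at once.
    return points @ R.T + T
```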
To find the laser light plane parameters A, B, C and D, the least squares fitting (LSF) method is used. LSF determines the parameters of the best-fitted plane by minimizing the sum of squared normal distances, (15) and (16), from all 3D points to the optimal plane:

P_i = F(A, B, C, D) = (A X_i + B Y_i + C Z_i + D) / sqrt(A^2 + B^2 + C^2), (15)

where P_i denotes the normal distance from the ith point to the plane, and

sum_{i=1}^{m} P_i^2 -> min, (16)
where m is the number of points.
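As a sketch, the plane can be fitted with an ordinary algebraic least-squares solve once D is fixed to -1 (the normalization visible in Tables I and II, valid for a plane not passing through the origin). This is a simplification of minimizing the normal distances (15)-(16), not the paper's exact solver, but it illustrates the idea:

```python
import numpy as np

def fit_plane(points):
    """Fit A*X + B*Y + C*Z + D = 0 with D = -1 to 3D points, i.e.
    solve A*X + B*Y + C*Z = 1 for (A, B, C) in the least-squares sense."""
    XYZ = np.asarray(points, dtype=float)
    coeffs, *_ = np.linalg.lstsq(XYZ, np.ones(len(XYZ)), rcond=None)
    A, B, C = coeffs
    return A, B, C, -1.0
```

For points lying exactly on the plane Z = 1000, the fit recovers A = B = 0 and C = 0.001, matching the order of magnitude of C in Tables I and II.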
III. DETERMINING THE LASER LIGHT PLANE PARAMETERS
WITH PIXEL AND SUB-PIXEL LASER LINE DETECTION
METHODS
Two laser line detection methods were used to find the laser light plane parameters A, B, C and D. The first method, with pixel resolution, used a second-order Gaussian derivative [11]. The second method used to detect the laser line on the chessboard image was sub-pixel laser line detection by convolution maximum correction with a parabola, as described in previous work [6].
Fig. 2. Laser line detection from the chessboard with pixel resolution.
Fig. 3. Laser line detection from the chessboard with sub-pixel resolution.
Figure 2 and Fig. 3 show the differences in the laser line detected from the same chessboard image with the pixel- and sub-pixel-resolution methods.
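The parabola correction itself is compact: around the strongest response, a parabola through three neighboring samples gives a closed-form sub-pixel vertex. The sketch below applies it to a raw intensity profile; the method in [6] applies the same correction to a convolution response, so this is an illustration of the principle only:

```python
import numpy as np

def subpixel_peak(profile):
    """Locate the laser line center in one image column with sub-pixel
    resolution: take the brightest sample, then refine its position with
    the vertex of a parabola through it and its two neighbors."""
    i = int(np.argmax(profile))
    if i == 0 or i == len(profile) - 1:
        return float(i)                      # no neighbors to refine with
    y0, y1, y2 = profile[i - 1], profile[i], profile[i + 1]
    denom = y0 - 2.0 * y1 + y2
    if denom == 0:
        return float(i)                      # flat profile: keep pixel position
    return i + 0.5 * (y0 - y2) / denom       # parabola vertex offset in [-0.5, 0.5]
```

For a symmetric two-pixel plateau the estimate falls exactly halfway between the two pixels, which a pixel-resolution detector cannot express.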
IV. RESULTS
In order to see how the A, B, C, D parameters, and thus the measurement results, deviate, sets of chessboard images were chosen for laser light plane parameter estimation. The following A, B, C, D parameters were obtained with the pixel and sub-pixel laser line detection methods, as shown in Table I and Table II. It can be seen that there are small differences between the obtained A, B and C parameters. The biggest difference is in parameter A, which describes the rotation of the laser light plane.
TABLE I. A, B, C, D PARAMETERS WITH PIXEL RESOLUTION OVER VARIOUS IMAGE SETS.

     Image set 1    Image set 2    Image set 3    StDev.
A    1.0655E-05     1.0771E-05     1.0690E-05     5.9414E-08
B    6.8626E-04     6.8016E-04     6.7987E-04     3.6120E-06
C    4.7890E-04     4.7946E-04     4.7949E-04     3.3044E-07
D    -1.0000E+00    -1.0000E+00    -1.0000E+00    0.0000E+00
TABLE II. A, B, C, D PARAMETERS WITH SUB-PIXEL RESOLUTION OVER VARIOUS IMAGE SETS.

     Image set 1    Image set 2    Image set 3    StDev.
A    1.1135E-05     1.1169E-05     1.1037E-05     6.8613E-08
B    6.8617E-04     6.8017E-04     6.7976E-04     3.5861E-06
C    4.7894E-04     4.7948E-04     4.7951E-04     3.2051E-07
D    -1.0000E+00    -1.0000E+00    -1.0000E+00    0.0000E+00
In order to see how laser line detection with pixel or sub-pixel resolution in the calibration process influences the real measurement results, two objects with known dimensions were used for comparison. Objects with heights of 6 mm and 25 mm were placed under the laser line and measured. Before the measurement, the optical distortions were corrected with an average re-projection error of RMS = 0.3122 pixels. The camera was at a height of ~1874 mm from the floor surface. A close-up image (Fig. 5) shows the objects and the laser line detected from the image with sub-pixel resolution.
Fig. 4. Undistorted input image with two measurement objects.
After laser line detection from the image, the detected laser line is converted to real-world millimeters using the two sets of A, B, C, D parameters (obtained with the pixel and sub-pixel laser line detection methods) gained in the calibration process. The obtained measurements are from the camera sensor perspective. To get the actual measurements, the floor surface height is subtracted from the initial measurements.
Figure 6 shows the detected laser line in real millimeters with the pixel-accuracy A, B, C, D (a) and the sub-pixel-accuracy A, B, C, D (b). The two measured objects are clearly distinguishable, of which one is approximately 6 mm high and the other around 25 mm. The deviation between the pixel-accuracy (a) and sub-pixel-accuracy (b) measurements can be seen more clearly in Fig. 7, where the smaller object is shown close up.
Fig. 5. Measured objects and laser line detected with sub-pixel resolution.
Fig. 6. Measured objects in real millimeters with pixel-accuracy A, B, C, D (a) and sub-pixel-accuracy A, B, C, D (b).
It can be seen that there is a rather noticeable difference between the two measurements.
Fig. 7. Measurement of the smaller object in real millimeters with pixel-accuracy A, B, C, D (a) and sub-pixel-accuracy A, B, C, D (b).
By subtracting the measurement results gained with the pixel-accuracy A, B, C, D from those gained with the sub-pixel-accuracy A, B, C, D across the whole laser line, the deviation between the measurements becomes clearer.
Fig. 8. Deviation between measurements using the pixel (Z1) or sub-pixel (Z2) resolution laser line detection method in the calibration process with different image sets (a, b, c, d).
Figure 8 shows the deviation between the two measurements Z1 and Z2 across the whole laser line with different image sets (a, b, c, d). With the current setup and image sets, the deviation is between -1 mm and +1.2 mm, which is considerable when sub-millimeter measurements are required.
TABLE III. MEASUREMENTS OF OBJECT HEIGHTS WITH DIFFERENT A, B, C, D.

Measurement                               Mean (mm)   StDev. (mm)   StDevE. (mm)   Diff. (%)
Real height                               6.000       0             0              0
Height with pixel accuracy A, B, C, D     5.369       0.2414        0.0239         10.52
Height with sub-pixel accuracy A, B, C, D 5.554       0.2407        0.0238         7.43
Real height                               25.000      0             0              0
Height with pixel accuracy A, B, C, D     24.698      0.456         0.0450         1.210
Height with sub-pixel accuracy A, B, C, D 24.864      0.455         0.0449         0.544
It can also be seen that the deviation is rather systematic and linear from one side of the image to the other. This comes mainly from the fact that the A parameter of the laser light plane, which describes its rotation, was the most influenced. Table III gives the deviation of the measurements from the reference objects with the different A, B, C, D values. It can be seen that the sub-pixel laser line detection method in the calibration process has improved the measurement results.
V. CONCLUSIONS
This paper presents how the calibration and measurement results depend on the laser line detection method with pixel or sub-pixel resolution used in the calibration process. The measurement results are compared with reference objects of known dimensions. The experiments show that there is a noticeable dependence on the laser line detection method in the calibration process when performing sub-millimeter measurements. Sub-pixel laser line detection improves the measurement results. However, if the required measurement accuracy is within a few millimeters, pixel-resolution laser line detection in the calibration process is sufficient.
REFERENCES
[1] R. Andersson, "A calibration method for laser-triangulating 3D cameras", Linkopings University, Linkoping, 2008. [Online]. Available: http://www.diva-portal.org/smash/get/diva2:222205/FULLTEXT01.pdf
[2] K. Tornslev, "3D scanning using multibeam laser", Technical University of Denmark, Lyngby, 2005. [Online]. Available: http://etd.dtu.dk/thesis/185879/imm4048.pdf
[3] I. Frosio, N. A. Borghese, P. Tirelli, G. Venturino, G. Rotondo, "Flexible and low cost laser scanner for automatic tire inspection", in Proc. Instrumentation and Measurement Technology Conf. (I2MTC), 2011. [Online]. Available: http://dx.doi.org/10.1109/IMTC.2011.5944190
[4] K. Sung, H. Lee, Y. S. Choi, S. Rhee, "Development of a multiline laser vision sensor for joint tracking in welding", The Welding Journal, pp. 79-s, 2009. [Online]. Available: http://www.aws.org/wj/supplement/wj0409-79.pdf
[5] M. Lopez, J. A. Vilan, J. M. Matias, J. Taboada, "Quality control of wood-pulp chips using a 3D laser scanner and functional pattern recognition", Industrial Electronics, pp. 1773–1778, 2007. [Online]. Available: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=4374874&isnumber=4374555
[6] A. Molder, O. Martens, T. Saar, R. Land, "Laser line detection with sub-pixel accuracy", Elektronika ir Elektrotechnika, vol. 20, no. 5, pp. 132–135, 2014. [Online]. Available: http://dx.doi.org/10.5755/j01.eee.20.5.7114
[7] K. Okarma, M. Grudzinski, "The 3D scanning system for the machine vision based positioning of workpieces on the CNC machine tools", in Proc. Methods and Models in Automation and Robotics, pp. 85–90, 2012. [Online]. Available: http://dx.doi.org/10.1109/mmar.2012.6347906
[8] V. Matiukas, D. Miniotas, "Detection of laser beam's center-line in 2D images", Elektronika ir Elektrotechnika, vol. 7, pp. 67–70, 2009. [Online]. Available: http://www.eejournal.ktu.lt/index.php/elt/article/view/10048/4991
[9] P. Pelgrims, G. V. D. Velde, B. V. D. Vondel, "Sub-pixel edge detection", De Nayer Instituut, 2004. [Online]. Available: http://emsys.denayer.wenk.be/emcam/subpix_eng.pdf
[10] Z. Zhang, "A flexible new technique for camera calibration", IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000. [Online]. Available: http://research.microsoft.com/en-us/um/people/zhang/papers/TR98-71.pdf
[11] Li Qi, Y. Zhang, X. Zhang, S. Wang, F. Xie, "Statistical behavior analysis and precision optimization for the laser stripe center detector based on Steger's algorithm", Optics Express, vol. 21, no. 11, pp. 13442–13449, 2013. [Online]. Available: http://www.opticsinfobase.org/oe/viewmedia.cfm?uri=oe-21-11-13442&seq=0