ELEKTRONIKA IR ELEKTROTECHNIKA, ISSN 1392-1215, VOL. 21, NO. 6, 2015
Abstract—Calibration is an important step for 3D structured-light laser scanners to obtain accurate measurement results. During calibration, a relation between the pixels in the 3D laser scanner camera image and real-world units is established. The laser line detection method plays an important role in this calibration process, but this aspect has not been investigated before. In this paper, the dependency of the calibration and measurement results on whether the laser line is detected with pixel or sub-pixel resolution during calibration is investigated. The calibration process of the developed 3D laser scanner is described. The results show a clear dependency between the laser line detection method used in the calibration process and the final measurement results.
Index Terms—3D laser scanner, calibration, sub-pixel.
I. INTRODUCTION
Calibration is the process by which a relation between camera image pixels and real-world units is established in a 3D laser scanner. It is an important step for obtaining accurate measurement results.
Many 3D laser scanner calibration methods exist. Some of them [1] calibrate the laser scanner using the known dimensions of an object, calculating all parameters needed for the calibration from the laser line projected onto the surface of this object and captured by the scanner camera. Other methods calibrate the 3D laser scanner by separately finding the camera lens distortions [2], [3] and the laser light plane parameters, using known distances of an object from the camera.
All of these methods use laser line detection as part of the calibration process, which means that the accuracy of the laser scanner is directly related to the precision of the laser line detection algorithm used during calibration. So far, the laser line detection accuracy has not been investigated or taken into account in the calibration process.

Manuscript received January 3, 2015; accepted May 28, 2015.
This work has been supported by the EU (FP7-SME project “Hermes” and the European Regional Development Fund), the Estonian Science Foundation (target financing SF0140061s12 and grant ETF8905), the Doctoral School in Information and Communication Technology of Estonia, the Estonian IT Academy scholarship, CEBE (Centre for Integrated Electronic Systems and Biomedical Engineering) and the Thomas Johann Seebeck Department of Electronics, Tallinn University of Technology.
The laser line in a real image of interest is typically 1 to 10 pixels wide. Many laser line detection algorithms [4], [5] for 3D laser line scanners detect the laser line with pixel resolution. These algorithms are usually simpler and are thus preferred in cases where the accuracy of the laser scanner is not critical or where processor resources are limited.
In other cases, one-pixel resolution is not sufficient, as the actual laser line center lies between two pixels, and for sub-millimeter measurements the accuracy of the laser line detection algorithm significantly influences [6] the real measurement results.
A similar issue also exists in other structured-light scanning technologies [7], where the method of detecting the fringe pattern edges during projector calibration may influence the end result.
Several sub-pixel laser line detection methods [8], [9] exist, but how much do they affect the real measurement results, compared to pixel resolution methods, when used in the calibration process, and where is the trade-off between the complexity of the laser line detection algorithm and the resulting calibration and measurement accuracy? This influence has not been investigated previously.
In this paper, the dependency of the measurement results on whether the laser line is detected with pixel or sub-pixel resolution in the calibration process is investigated. The developed laser scanner is calibrated with both pixel and sub-pixel resolution laser line detection algorithms. The measurements after the calibration process are made with the sub-pixel resolution laser line detection method. The measurement results are compared against real-world objects with known dimensions.
II. 3D LASER SCANNER MODEL AND CALIBRATION
PROCESS
The simplest approximation of a camera is a pinhole camera. In this model the lens is substituted with an infinitesimally small hole located in the focal plane, through which all light must pass. The image of the depicted object is projected onto the image sensor [1].
Laser Scanner Calibration Dependency on the Line Detection Method
Ago Molder (1), Olev Martens (1, 2), Tonis Saar (1), Raul Land (1)
(1) Thomas Johann Seebeck Department of Electronics, Tallinn University of Technology, Ehitajate St. 5, 19086, Tallinn, Estonia
(2) ELIKO Competence Centre, Maealuse 2/1, Tallinn 12618, Estonia
ago@kodutalu.ee
http://dx.doi.org/10.5755/j01.eee.21.6.13765

In order to convert the laser line captured by the camera sensor from pixels to real-world measurements in millimeters, one must understand the relationship between the camera sensor and the laser light plane.
By knowing the camera sensor width W and height H, the optical center of the full-resolution image, in pixels, can be calculated:

C_x = W / 2, (1)

C_y = H / 2. (2)
The laser line coordinates on the captured image, in pixels, are denoted u and v. The camera focal lengths f_x, f_y in pixels for a given focus depend on the lens used by the camera, and can be estimated by placing objects with known dimensions in front of the camera at different distances.
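For illustration, the optical center of equations (1)–(2) and the estimated focal lengths can be collected into a standard pinhole intrinsic matrix; the sensor size and focal length values below are hypothetical, not from the paper:

```python
import numpy as np

def intrinsic_matrix(sensor_w_px, sensor_h_px, fx, fy):
    """Build the pinhole camera intrinsic matrix.

    The optical center is approximated as the image center
    (equations (1)-(2)); fx, fy are focal lengths in pixels.
    """
    cx = sensor_w_px / 2.0   # C_x = W / 2  (eq. 1)
    cy = sensor_h_px / 2.0   # C_y = H / 2  (eq. 2)
    return np.array([[fx, 0.0, cx],
                     [0.0, fy, cy],
                     [0.0, 0.0, 1.0]])

# Hypothetical 1280 x 960 sensor with f = 1100 px in both axes.
K = intrinsic_matrix(1280, 960, fx=1100.0, fy=1100.0)
```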
Knowing that the light emitted by the line laser forms a proper virtual laser light plane

AX + BY + CZ + D = 0, (3)

and that the laser line captured by the camera lies on this laser light plane, the laser line position in the camera coordinate system can be calculated:
x' = (u - C_x) / f_x, (4)

y' = (v - C_y) / f_y. (5)

From this the real-world coordinates of the laser line can be calculated:

Z = -D / (Ax' + By' + C), (6)

X = x'Z, (7)

Y = y'Z. (8)
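The pixel-to-world mapping of equations (4)–(8) can be sketched as a small function; this is a minimal sketch assuming the plane of equation (3) is written as AX + BY + CZ + D = 0, with the intrinsics passed in explicitly:

```python
def pixel_to_world(u, v, cx, cy, fx, fy, plane):
    """Map a laser-line pixel (u, v) to camera-frame 3D coordinates.

    plane = (A, B, C, D) describes the laser light plane
    A*X + B*Y + C*Z + D = 0 (eq. 3).  The pixel is normalized
    (eqs. 4-5) and the viewing ray intersected with the plane
    (eqs. 6-8).
    """
    A, B, C, D = plane
    xp = (u - cx) / fx               # x'  (eq. 4)
    yp = (v - cy) / fy               # y'  (eq. 5)
    Z = -D / (A * xp + B * yp + C)   # depth (eq. 6)
    return xp * Z, yp * Z, Z         # X, Y, Z (eqs. 7-8)
```

For a horizontal plane at depth 2000 mm (A = B = 0, C = 1, D = -2000), a pixel at the optical center maps to (0, 0, 2000).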
Ideal pinhole cameras do not have any image distortions, because the lens, the main source of distortion, is substituted with an infinitely small hole. Unfortunately, this is not the case with real-world cameras. Two main types of distortion exist: radial and tangential. Both can, however, be taken into account, and the camera image can be corrected. Radial distortions can be corrected with the following equations:
u_corrected = u (1 + k_1 r^2 + k_2 r^4 + k_3 r^6 + ...), (9)

v_corrected = v (1 + k_1 r^2 + k_2 r^4 + k_3 r^6 + ...), (10)

where u_corrected and v_corrected are the image pixel coordinates (u, v) after correction, k_n is the n-th radial distortion coefficient, and

r^2 = u^2 + v^2. (11)
Tangential distortions can be corrected with the following equations:

u_corrected = u + [2 p_1 u v + p_2 (r^2 + 2u^2)] + ..., (12)

v_corrected = v + [p_1 (r^2 + 2v^2) + 2 p_2 u v] + ..., (13)

where p_n is the n-th tangential distortion coefficient. To calculate the focal length and distortion coefficients and to undistort the input image, Zhang's method [10] has been used with a chessboard of known size and square side length in millimeters.
Figure 4 shows an input image with two measurement objects after the undistortion process.
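The correction model of equations (9)–(13) can be sketched in a few lines. This is only the distortion polynomial applied to normalized image coordinates, not the full Zhang calibration, and the coefficient values in any real use would come from the calibration itself:

```python
def undistort_normalized(u, v, k=(0.0, 0.0, 0.0), p=(0.0, 0.0)):
    """Apply the radial (eqs. 9-11) and tangential (eqs. 12-13)
    corrections to normalized image coordinates (u, v).

    k = (k1, k2, k3) radial coefficients,
    p = (p1, p2) tangential coefficients.
    """
    k1, k2, k3 = k
    p1, p2 = p
    r2 = u * u + v * v                                # r^2 (eq. 11)
    radial = 1.0 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    u_c = u * radial + 2.0 * p1 * u * v + p2 * (r2 + 2.0 * u * u)
    v_c = v * radial + p1 * (r2 + 2.0 * v * v) + 2.0 * p2 * u * v
    return u_c, v_c
```

With all coefficients zero the mapping is the identity; a positive k_1 pushes points radially outward, growing with r^2.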
To find the laser light plane parameters without knowing the exact positions of the laser and camera relative to each other and their distance from the ground, the laser line is projected onto a chessboard pattern at different poses of the chessboard relative to the camera (Fig. 1).

Fig. 1. Laser line on chessboard images with different poses.
While at least two poses of the chessboard are required to construct a mathematical description of the laser plane, the more poses are used, the more accurate the description of the laser light plane becomes.
By knowing the camera intrinsic parameters (focal length, image optical center, distortion coefficients) and the chessboard size, the chessboard rotation and translation vectors (extrinsic parameters) can be calculated for every chessboard image by using object pose estimation. The extrinsic parameters translate the coordinates of a point (X, Y, Z) into a coordinate system fixed with respect to the camera. The rotation and translation transformation is (when z' ≠ 0)

[x', y', z']^T = R [X, Y, Z]^T + T. (14)
With this equation every point on the chessboard can be converted to a real-world 3D point. Since the laser line lies on the same chessboard, its real-world 3D location can be found as well.
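Equation (14) applied to an array of chessboard points is a one-line matrix operation; R and T here stand for the extrinsic parameters that pose estimation would produce, and the values in the usage example are illustrative:

```python
import numpy as np

def to_camera_frame(points_xyz, R, T):
    """Transform an (N, 3) array of chessboard points into the
    camera coordinate system with equation (14): p' = R p + T."""
    return points_xyz @ R.T + T

# Illustrative pose: no rotation, camera 5 units along the z axis.
pts = np.array([[0.0, 0.0, 0.0],
                [1.0, 0.0, 0.0]])
cam_pts = to_camera_frame(pts, np.eye(3), np.array([0.0, 0.0, 5.0]))
```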
To find the laser light plane parameters A, B, C and D, the least-squares fitting (LSF) method is used. LSF determines the parameters of the best-fitted plane by minimizing the sum of squared normal distances, (15) and (16), from all 3D points to the plane:

P_i = F(A, B, C, D) = (AX_i + BY_i + CZ_i + D) / sqrt(A^2 + B^2 + C^2), (15)

where P_i denotes the normal distance from the i-th point to the plane, and
sum_{i=1}^{m} P_i^2 → min, (16)

where m is the number of points.
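The plane fit of equations (15)–(16) has a closed-form solution: the normal of the best-fit plane is the singular vector of the centered point cloud with the smallest singular value. A minimal sketch (the paper does not specify the solver, so SVD is an assumption):

```python
import numpy as np

def fit_plane_lsq(points):
    """Least-squares plane A*X + B*Y + C*Z + D = 0 through an
    (N, 3) array of 3D laser-line points, minimizing the sum of
    squared normal distances (eqs. 15-16).

    The optimal plane passes through the centroid; its normal is
    the right singular vector with the smallest singular value.
    """
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]                 # unit normal (A, B, C)
    A, B, C = normal
    D = -normal @ centroid          # plane passes through centroid
    return A, B, C, D
```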
III. DETERMINING THE LASER LIGHT PLANE PARAMETERS
WITH PIXEL AND SUB-PIXEL LASER LINE DETECTION
METHODS
Two laser line detection methods were used to find the laser light plane parameters A, B, C, D. The first method, with pixel resolution, used a second-order Gaussian derivative [11]. The second method used to detect the laser line in the chessboard image was sub-pixel detection by correcting the convolution maximum with a parabola, as described in previous work [6].
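A minimal sketch of the parabola-correction idea follows; the convolution step of [6] is omitted here, and the parabola is fitted directly through the intensity maximum and its two neighbours, so this illustrates only the sub-pixel refinement, not the full detector:

```python
import numpy as np

def subpixel_peak(profile):
    """Sub-pixel laser-line center from one intensity column.

    The integer maximum is refined with a parabola through the
    peak sample and its two neighbours; the vertex offset is
    0.5 * (y0 - y2) / (y0 - 2*y1 + y2) in pixels.
    """
    i = int(np.argmax(profile))
    if i == 0 or i == len(profile) - 1:
        return float(i)              # no neighbours to fit
    y0, y1, y2 = profile[i - 1], profile[i], profile[i + 1]
    denom = y0 - 2.0 * y1 + y2
    if denom == 0:
        return float(i)              # flat top: keep pixel result
    return i + 0.5 * (y0 - y2) / denom
```

On a smooth, roughly Gaussian laser profile this recovers the center to a small fraction of a pixel, whereas the plain argmax is off by up to half a pixel.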
Fig. 2. Laser line detection from the chessboard with pixel resolution.
Fig. 3. Laser line detection from the chessboard with sub-pixel resolution.
Figure 2 and Fig. 3 show the differences in the laser line detected from the same chessboard image with the pixel and sub-pixel resolution methods.
IV. RESULTS
In order to see how the A, B, C, D parameters, and thus the measurement results, deviate, several sets of chessboard images were chosen for laser light plane parameter estimation. The A, B, C, D parameters obtained with the pixel and sub-pixel laser line detection methods are shown in Table I and Table II.

It can be seen that there are small differences between the obtained A, B and C parameters. The biggest difference is in parameter A, which describes the rotation of the laser light plane.
TABLE I. A, B, C, D PARAMETERS WITH PIXEL RESOLUTION OVER VARIOUS IMAGE SETS.

      Image set 1    Image set 2    Image set 3    StDev.
A     1.0655E-05     1.0771E-05     1.0690E-05     5.9414E-08
B     6.8626E-04     6.8016E-04     6.7987E-04     3.6120E-06
C     4.7890E-04     4.7946E-04     4.7949E-04     3.3044E-07
D     -1.0000E+00    -1.0000E+00    -1.0000E+00    0.0000E+00
TABLE II. A, B, C, D PARAMETERS WITH SUB-PIXEL RESOLUTION OVER VARIOUS IMAGE SETS.

      Image set 1    Image set 2    Image set 3    StDev.
A     1.1135E-05     1.1169E-05     1.1037E-05     6.8613E-08
B     6.8617E-04     6.8017E-04     6.7976E-04     3.5861E-06
C     4.7894E-04     4.7948E-04     4.7951E-04     3.2051E-07
D     -1.0000E+00    -1.0000E+00    -1.0000E+00    0.0000E+00
In order to see how pixel or sub-pixel laser line detection in the calibration process influences the real measurement results, two objects with known dimensions were used for comparison. Objects with heights of 6 mm and 25 mm were placed under the laser line and measured. Before the measurement, the optical distortions were corrected, with an average re-projection error of RMS = 0.3122 pixels.

The camera was at a height of ~1874 mm above the floor surface. A close-up image (Fig. 5) shows the objects and the laser line detected from the image with sub-pixel resolution.
Fig. 4. Undistorted input image with two measurement objects.
After the laser line is detected in the image, it is converted to real-world millimeters using the two sets of A, B, C, D parameters (obtained with the pixel and sub-pixel laser line detection methods) gained in the calibration process. These measurements are from the camera sensor's perspective; to get the actual object heights, the floor surface height is subtracted from the initial measurements.
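As a rough numerical illustration of why parameter A matters, the Table I and Table II values for image set 1 can be substituted into equation (6). This is not the authors' exact pipeline: y' = 0 along the line is assumed for simplicity, and normalized coordinates x' in [-0.5, 0.5] stand in for the image width:

```python
# Laser light plane parameters, image set 1 (Tables I and II).
PIX = (1.0655e-05, 6.8626e-04, 4.7890e-04, -1.0)  # pixel calibration
SUB = (1.1135e-05, 6.8617e-04, 4.7894e-04, -1.0)  # sub-pixel calibration

def depth(xp, yp, plane):
    """Camera-frame depth Z at normalized image coordinates (x', y'),
    from equation (6): Z = -D / (A*x' + B*y' + C)."""
    A, B, C, D = plane
    return -D / (A * xp + B * yp + C)

# Parameter A (the plane rotation) differs most between the two
# calibrations, so the depth deviation changes sign across the image.
deviation = {xp: depth(xp, 0.0, SUB) - depth(xp, 0.0, PIX)
             for xp in (-0.5, 0.0, 0.5)}
```

Under these assumptions the deviation between the two calibrations is on the order of a millimeter at the image edges, consistent in magnitude with the -1 mm to +1.2 mm range reported for Fig. 8.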
Figure 6 shows the detected laser line in real millimeters with the pixel-accuracy A, B, C, D (a) and the sub-pixel-accuracy A, B, C, D (b). The two measured objects are clearly distinguishable, of which one is approximately 6 mm high and the other around 25 mm.

The deviation between the pixel-accuracy (a) and sub-pixel-accuracy (b) measurements can be seen more clearly in Fig. 7, where the smaller object is shown close up.

Fig. 5. Measured objects and laser line detected with sub-pixel resolution.
Fig. 6. Measured objects in real millimeters with pixel-accuracy A, B, C, D (a) and sub-pixel-accuracy A, B, C, D (b).

It can be seen that there is a rather noticeable difference between the two measurements.
Fig. 7. Measurement of the smaller object in real millimeters with pixel-accuracy A, B, C, D (a) and sub-pixel-accuracy A, B, C, D (b).
By subtracting the measurement results gained with the pixel-accuracy A, B, C, D from those gained with the sub-pixel-accuracy A, B, C, D across the whole laser line, the deviation between the measurements becomes clearer.

Fig. 8. Deviation between measurements using the pixel (Z1) or sub-pixel (Z2) resolution laser line detection method in the calibration process with different image sets (a, b, c, d).

Figure 8 shows the deviation between the two measurements Z1 and Z2 across the whole laser line with different image sets (a, b, c, d). With the current setup and image sets the deviation is between -1 mm and +1.2 mm, which is quite a lot for sub-millimeter measurements.
TABLE III. MEASUREMENTS OF OBJECT HEIGHTS WITH DIFFERENT A, B, C, D.

Measurement                                  Mean (mm)  StDev. (mm)  StDevE. (mm)  Diff. (%)
Real height                                  6.000      0            0             0
Height with pixel accuracy A, B, C, D        5.369      0.2414       0.0239        10.52
Height with sub-pixel accuracy A, B, C, D    5.554      0.2407       0.0238        7.43
Real height                                  25.000     0            0             0
Height with pixel accuracy A, B, C, D        24.698     0.456        0.0450        1.210
Height with sub-pixel accuracy A, B, C, D    24.864     0.455        0.0449        0.544
It can also be seen that the deviation is rather systematic and linear from one side of the image to the other. This comes mainly from the fact that the A parameter of the laser light plane, which describes its rotation, was the one most influenced.

Table III gives the deviation of the measurements from the reference objects with the different A, B, C, D values. It can be seen that the sub-pixel laser line detection method in the calibration process has improved the measurement results.
V. CONCLUSIONS
This paper presents how the calibration and measurement results depend on whether the laser line is detected with pixel or sub-pixel resolution in the calibration process. Measurement results are compared with reference objects of known dimensions. The experiments show a noticeable dependency on the laser line detection method used in the calibration process when doing sub-millimeter measurements. Sub-pixel laser line detection improves the measurement results. However, if a measurement accuracy within a few millimeters is sufficient, pixel resolution laser line detection in the calibration process is adequate.
REFERENCES
[1] R. Andersson, "A calibration method for laser-triangulating 3D cameras", Linkopings University, Linkoping, 2008. [Online]. Available: http://www.diva-portal.org/smash/get/diva2:222205/FULLTEXT01.pdf
[2] K. Tornslev, "3D scanning using multibeam laser", Technical University of Denmark, Lyngby, 2005. [Online]. Available: http://etd.dtu.dk/thesis/185879/imm4048.pdf
[3] I. Frosio, N. A. Borghese, P. Tirelli, G. Venturino, G. Rotondo, "Flexible and low cost laser scanner for automatic tire inspection", in Proc. Instrumentation and Measurement Technology Conf. (I2MTC), 2011. [Online]. Available: http://dx.doi.org/10.1109/IMTC.2011.5944190
[4] K. Sung, H. Lee, Y. S. Choi, S. Rhee, "Development of a multiline laser vision sensor for joint tracking in welding", The Welding Journal, pp. 79-s, 2009. [Online]. Available: http://www.aws.org/wj/supplement/wj0409-79.pdf
[5] M. Lopez, J. A. Vilan, J. M. Matias, J. Taboada, "Quality control of wood-pulp chips using a 3D laser scanner and functional pattern recognition", Industrial Electronics, pp. 1773–1778, 2007. [Online]. Available: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=4374874&isnumber=4374555
[6] A. Molder, O. Martens, T. Saar, R. Land, "Laser line detection with sub-pixel accuracy", Elektronika ir Elektrotechnika, vol. 20, no. 5, pp. 132–135, 2014. [Online]. Available: http://dx.doi.org/10.5755/j01.eee.20.5.7114
[7] K. Okarma, M. Grudzinski, "The 3D scanning system for the machine vision based positioning of workpieces on the CNC machine tools", in Proc. Methods and Models in Automation and Robotics, pp. 85–90, 2012. [Online]. Available: http://dx.doi.org/10.1109/mmar.2012.6347906
[8] V. Matiukas, D. Miniotas, "Detection of laser beam's center-line in 2D images", Elektronika ir Elektrotechnika, vol. 7, pp. 67–70, 2009. [Online]. Available: http://www.eejournal.ktu.lt/index.php/elt/article/view/10048/4991
[9] P. Pelgrims, G. V. D. Velde, B. V. D. Vondel, "Sub-pixel edge detection", De Nayer Instituut, 2004. [Online]. Available: http://emsys.denayer.wenk.be/emcam/subpix_eng.pdf
[10] Z. Zhang, "A flexible new technique for camera calibration", IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330–1334, 2000. [Online]. Available: http://research.microsoft.com/en-us/um/people/zhang/papers/TR98-71.pdf
[11] Li Qi, Y. Zhang, X. Zhang, S. Wang, F. Xie, "Statistical behavior analysis and precision optimization for the laser stripe center detector based on Steger's algorithm", Optics Express, vol. 21, no. 11, pp. 13442–13449, 2013. [Online]. Available: http://www.opticsinfobase.org/oe/viewmedia.cfm?uri=oe-21-11-13442&seq=0