To cite this article: Yi Zheng et al 2019 Meas. Sci. Technol. 30 075202
© 2019 IOP Publishing Ltd Printed in the UK
Real-time high-dynamic-range fringe acquisition for 3D shape measurement with a RGB camera

Yi Zheng1, Yajun Wang2, Vignesh Suresh1 and Beiwen Li1

1 Department of Mechanical Engineering, Iowa State University, Ames, IA 50011, United States of America
2 State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430079, People's Republic of China

E-mail: yjwangisu@whu.edu.cn and beiwen@iastate.edu

Received 23 November 2018, revised 2 March 2019
Accepted for publication 5 March 2019
Published 21 May 2019

Abstract

This paper introduces a novel real-time, high-dynamic-range (HDR) three-dimensional (3D) shape measurement method using a RGB camera. The proposed method takes advantage of two effects to realize HDR fringe acquisition for 3D shape measurements: (1) the different responses of a RGB camera's three channels to a single-color light source can be utilized to produce fringe images with different intensity levels; and (2) the projector's dark time can be utilized to produce bright versus dark intensity contrast if bit-wise binary patterns are used. Experiments demonstrate the real-time capabilities of the proposed method by dynamic 3D shape measurements (with a maximum camera frame rate of 166 Hz). Given that a bit-wise defocused binary pattern projection is adopted, the proposed HDR method has the potential for high-speed applications if a high-speed color camera is present.

Keywords: high-dynamic-range, real-time, 3D shape measurement, RGB camera, 1 bit binary patterns

Supplementary material for this article is available online.

(Some figures may appear in colour only in the online journal)

Meas. Sci. Technol. 30 (2019) 075202 (11pp)
https://doi.org/10.1088/1361-6501/ab0ced

1. Introduction

Three-dimensional (3D) shape measurement with digital fringe projection [1] has been applied to many different fields including, but not limited to, manufacturing inspection, mechanical testing and biological science. However, measuring the 3D shape of high-contrast surfaces has always been a challenging task for the optical metrology community due to drastic variations in intensity responses [2, 3]. This becomes even more challenging if the measured scene is dynamic. Therefore, developing advanced high-dynamic-range (HDR) technology capable of measuring dynamically changing scenes remains a pressing need in order to broaden the scope of applications for 3D optical techniques.

Studies of high-contrast surface metrology have advanced significantly in the past few decades with the development of different techniques. Conventional approaches can be divided into the following main categories: (1) using polarization filters [4–7]; (2) imaging with multiple viewing perspectives [8, 9]; and (3) performing image-texture-based analysis [10]. All the aforementioned approaches have achieved successes under different scenarios. Specifically, the method using polarization filters aims at reducing or canceling specular reflections by modulating the polarization state of the projection light. Such a method is straightforward in principle and easy to implement. However, the signal strength of the camera-captured fringe image might also be reduced due to polarization cancellation. Alternatively, the image saturation induced by specular reflections can also be alleviated by performing multi-view measurements. The idea behind it is that if saturation occurs in images captured in one camera view, the corresponding pixels are typically not saturated from a different camera perspective. However, identifying corresponding points and performing data fusion across different camera perspectives is a nontrivial problem in general, which could involve complicated post-processing. The image-texture-based analysis hinges on resolving shiny surface pixels via template matching. However, the algorithm could encounter problems when the imaged scene does not exhibit rich enough features or textural variations.
To address the aforementioned limitations of conventional
high-contrast surface measurements, researchers have started
to seek novel solutions by developing HDR in 3D shape meas-
urement. The mainstream of HDR approaches can be divided
into techniques based on camera exposure control [11–16]
and projector fringe pattern modulation [17–24]. The technol-
ogies based on camera exposure control essentially acquire
fringe images with different camera exposure times, and then
achieve HDR effect in 3D shape measurements either by
fusion of the fringe images with varying exposures [11, 12]
or by identifying the optimal exposure time via post-fringe
analysis [13–16]. Though successful, acquiring images with varying camera exposures is difficult to control automatically. Thus, methods categorized under this type struggle to
achieve real-time 3D shape measurements. A more straight-
forward solution is to use a large number of phase shifts to
alleviate the phase error induced by saturation [25]. Though effective, using a large number of phase-shifting patterns also comes at the cost of measurement speed. To overcome such
limitations, attempts have also been made to modulate the
pixel intensities of projected fringe patterns. Some prevailing
techniques include projecting additional inverted sinusoidal
patterns [17, 18] or adaptively modulating the pattern intensi-
ties based on feedback control [19–24]. The projector-inten-
sity-modulation-based approaches can effectively address the
overexposure/underexposure problems on camera images,
and can typically achieve a high level of automation.
However, using multi-gray-level 8 bit patterns for projection
limits the maximum achievable refresh rate of a typical com-
mercially available video projector (e.g. 120 Hz), which sets
an upper speed limit for dynamic 3D shape measurements.
To avoid limiting the measurement speed by the maximum
8 bit image refresh rate, researchers have started to investigate
the possibilities of using 1 bit patterns to achieve a HDR effect
in 3D shape measurements. The idea is to take advantage of
the fact that a quasi-sinusoidal profile can be produced by blurring a 1 bit binary pattern with a defocused projector [26].
Also, the use of 1 bit binary patterns can achieve a maximum
allowable image refresh rate at the kHz-level. Based on this
idea, Suresh et al [27] recently introduced a HDR method by utilizing the dark time between pattern illuminations to
create fringe images with two brightness levels. However, the
robustness of this two-level method can be affected if a large specular spot in the scene causes both the brighter and the darker images to saturate. Alternatively, Wang and Yang
[28] proposed to project defocused 1 bit binary dithered fringe
patterns with multiple different colors to create fringe images
with different brightness levels. Though the use of color fringe
patterns is effective in producing HDR fringe images, each
pattern needs to be projected multiple times with different
colors to achieve the HDR effect, and thus the measurement
speed is sacrificed by the number of projection colors used.
Given that color information is very useful in producing dif-
ferent levels of brightness, a natural question poses itself: can
a color camera be utilized to achieve a HDR effect? The use
of a RGB camera to realize HDR in 3D shape measurements,
however, has rarely been discussed in the existing literature.
Following this idea, in this research we introduce a novel multi-
level HDR fringe acquisition method for 3D shape measure-
ment by taking advantage of a color camera’s different channel
responses to a single-color fringe projection. Specifically, the
R, G and B channels’ different intensity responses to a single
color light will be utilized to create fringe images with three
different brightness levels. Moreover, the total intensity levels
will be further doubled by incorporating the two-level image
acquisition method [27] assisted by the projector’s dark time.
The proposed approach simultaneously possesses the following
merits: (1) not requiring repeated fringe projection with multiple colors, thus avoiding sacrificing speed; (2) creating a total of six (i.e. 2 × 3) different brightness levels in a single-pattern illumination period to achieve HDR with increased robustness;
and (3) using 1 bit binary patterns with the potential for high-
speed applications if a high-speed color camera is available. To
demonstrate the real-time capabilities, we measured a dynami-
cally changing subject using the proposed HDR method with
the camera’s maximum frame rate of 166 Hz (3D frame rate
achieved: 14 Hz).
2. Principle
This section will first introduce the theoretical foundations
of the three-step phase shifting algorithm. Then, a proposed
real-time HDR method will be introduced which is built based
upon (1) sinusoidal generation from 1 bit binary patterns; (2)
fringe image acquisition with ‘double-shot-in-single-illumi-
nation’ technique; and (3) a color camera’s different channel
responses to a single-color fringe projection.
2.1. Three-step phase-shifting algorithm
Over time, a variety of phase-shifting algorithms have been
developed, including three-step, four-step and least-square
methods. Among them, the three-step phase-shifting algo-
rithm requires the least number of phase-shifting steps and
thus is desirable for real-time or high-speed applications. For
a three-step phase-shifting algorithm, the three phase-shifted
fringe patterns with an equal phase shift of 2π/3 can be mathematically modeled as follows:

$$I_1(x, y) = I'(x, y) + I''(x, y)\cos[\phi(x, y) - 2\pi/3], \tag{1}$$

$$I_2(x, y) = I'(x, y) + I''(x, y)\cos[\phi(x, y)], \tag{2}$$

$$I_3(x, y) = I'(x, y) + I''(x, y)\cos[\phi(x, y) + 2\pi/3]. \tag{3}$$
Here, I′(x, y), I″(x, y) and φ(x, y) respectively represent the average intensity (DC component), the intensity modulation (AC component) and the phase map to be recovered. The phase map can be extracted by simultaneously solving the above equations, with a closed-form solution given by
$$\phi(x, y) = \tan^{-1}\left[\frac{\sqrt{3}\,(I_1 - I_3)}{2I_2 - I_1 - I_3}\right]. \tag{4}$$
Also, the fringe contrast γ(x, y) can be solved by

$$\gamma(x, y) = \frac{I''(x, y)}{I'(x, y)} = \frac{\sqrt{3(I_1 - I_3)^2 + (2I_2 - I_1 - I_3)^2}}{I_1 + I_2 + I_3}. \tag{5}$$
Due to the nature of the arctangent function, the computed phase map φ(x, y) will naturally contain 2π discontinuities. To obtain a continuous phase map, a spatial [29] or temporal [30] phase unwrapping algorithm can be employed. Here we adopted the temporal phase unwrapping approach described in [31] to obtain an absolute phase map without 2π discontinuities. A calibration-based 3D reconstruction approach [32] is then utilized to convert the absolute phase map to 3D coordinates.
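As a concrete sketch of the pipeline above, the per-pixel phase retrieval of equations (4) and (5), plus a basic two-frequency temporal unwrapping step in the spirit of [31], can be written in NumPy as follows (function names and the simplified unwrapping scheme are ours; the enhanced method of [31] adds further error suppression on top of this basic idea):

```python
import numpy as np

def wrapped_phase(I1, I2, I3):
    # Eq. (4): wrapped phase from three fringe images shifted by 2*pi/3;
    # arctan2 resolves the quadrant automatically.
    return np.arctan2(np.sqrt(3.0) * (I1 - I3), 2.0 * I2 - I1 - I3)

def fringe_contrast(I1, I2, I3):
    # Eq. (5): fringe contrast gamma = I''/I'.
    num = np.sqrt(3.0 * (I1 - I3) ** 2 + (2.0 * I2 - I1 - I3) ** 2)
    return num / np.maximum(I1 + I2 + I3, 1e-12)  # guard fully dark pixels

def temporal_unwrap(phi_h, phi_l, T_h=18.0, T_l=240.0):
    # Basic two-frequency unwrapping: the low-frequency phase (assumed to be
    # free of 2*pi jumps over the field of view) determines the fringe order
    # k that removes the discontinuities of the high-frequency phase.
    k = np.round((phi_l * (T_l / T_h) - phi_h) / (2.0 * np.pi))
    return phi_h + 2.0 * np.pi * k
```

The default fringe periods of 18 and 240 pixels are the ones reported in section 3.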
2.2. Proposed high-dynamic-range method

To achieve high-quality 3D shape measurement with HDR, it is crucial to generate camera-captured images with different brightness levels to accommodate high surface contrast. In this research, we developed a novel HDR method
which takes advantage of the following facts: (1) the dark
time between fringe projections can be utilized to create dif-
ferent image brightness levels; and (2) a color camera has dif-
ferent responses in its R, G and B channels when exposed to a
single-color light source.
Figure 1 illustrates the overall idea of our proposed technique. A digital-light-processing (DLP) projector is utilized to shine single-color fringe patterns (e.g. a green fringe pattern is shown without loss of generality). Then, a digital color camera is used for image acquisition. To obtain camera fringe images with different overall intensity levels, a double-shot-in-single-illumination (DSSI) technique (see step 2 below) will first be implemented to obtain two raw color images with different brightnesses. Each of the two raw color images is then separated into three different fringe images respectively in the R, G and B channels. Given that the R, G and B channels will have different responses to a single-color light, the three channel images will have three different overall brightness levels. Following the aforementioned steps, fringe images with 2 × 3 = 6 different overall intensity levels will be generated within a single-pattern illumination period. Finally, fusion algorithms are applied to generate a final HDR fringe image in preparation for 3D reconstruction. Specifically, the step-by-step development of the proposed technology is elaborated as follows.
• Step 1: Single-color 1 bit defocused binary pattern projection. The first step is to illuminate fringe patterns in a single color so that different RGB channel responses can be produced when captured by a color camera. Further, we adopted a sinusoidal fringe generation scheme by defocusing a 1 bit binary pattern [26]. A conceptual illustration of sinusoidal generation with single-color 1 bit binary patterns is shown in figure 2. Here, green light illumination is shown without loss of generality. The pattern will retain its binary structure at the focal plane, yet at an out-of-focus plane the Gaussian blurring effect will produce a quasi-sinusoidal profile. As introduced by Lei and Zhang [26], the usage of 1 bit binary patterns instead of 8 bit patterns for sinusoidal generation has three major advantages: (1) it avoids the 120 Hz speed bottleneck of projecting 8 bit patterns; (2) it bypasses the projector's nonlinear gamma correction; and (3) it bypasses rigid camera-projector synchronization, since projecting 8 bit grayscale intensities is achieved by time integration (i.e. the camera is not required to image the entire projection cycle because precise time-integrated grayscale acquisition is not necessary). More importantly, if rigid camera-projector synchronization could not be avoided, the following step 2 would not even be possible.
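A toy NumPy sketch of this step (our illustration, not the authors' code): generate a 1 bit binary fringe pattern and approximate the projector defocus with a Gaussian blur, whose width sigma stands in for the actual amount of optical defocus:

```python
import numpy as np

def binary_fringe(width, height, period):
    # 1 bit (0/1) squared binary fringe pattern with vertical stripes.
    x = np.arange(width)
    row = ((x % period) < period // 2).astype(float)
    return np.tile(row, (height, 1))

def defocus(pattern, sigma):
    # Model projector defocus as a 1-D Gaussian blur across the stripes;
    # wrap-around padding keeps the periodic pattern seamless.
    radius = int(4 * sigma)
    t = np.arange(-radius, radius + 1)
    kernel = np.exp(-t ** 2 / (2.0 * sigma ** 2))
    kernel /= kernel.sum()
    blur_row = lambda r: np.convolve(np.pad(r, radius, mode="wrap"),
                                     kernel, mode="valid")
    return np.apply_along_axis(blur_row, 1, pattern)
```

With a half-period duty cycle, the blurred pattern stays strictly inside (0, 1) and its fundamental harmonic approximates the desired sinusoid.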
• Step 2: Fringe image acquisition with the DSSI technique. The objective of this step is to produce fringe images with two different intensity levels prior to color channel separation. As mentioned, since 1 bit binary patterns are used, one does not need to precisely synchronize the image acquisition to capture the entire projection cycle. Such flexibility will be utilized to achieve the timing control for the DSSI technique, as illustrated in figure 3. The underlying idea is that two fringe images will be captured by the camera in one projector pattern illumination cycle. For the first capture (Cap 1), the entire camera exposure is within the projector's ON-time; yet for the second capture (Cap 2), part of the camera exposure is moved to the projector's OFF-time. Although the camera exposure time remains the same for the entire image acquisition, Cap 1 and Cap 2 are exposed to the projector's ON-time for different durations. Therefore, camera images with two different intensity levels will be created, with Cap 1 being brighter than Cap 2. Using an object with large reflectivity variations shown in figure 4(a) as a model, figures 4(b) and (c) show two examples of Cap 1 and Cap 2 images obtained using the DSSI technique with green pattern projection, in which an apparent overall intensity difference can be observed between the two images. The details of the hardware control for realizing this step will be elaborated in the experimental system setup at the beginning of section 3.
Figure 1. A schematic diagram of the proposed high-dynamic-range (HDR) method.
Figure 2. A schematic diagram of single-color 1 bit defocused binary pattern projection.
Figure 3. Timing control for the DSSI technique.

• Step 3: Color channel separation. This step uses the different responses of a color camera's R, G and B channels to a single-color light source to create more intensity levels. Since a digital color camera is used to acquire the Cap 1 and Cap 2 images, both images can be separated into three images, respectively, from their R, G and B channels to create 2 × 3 = 6 images in total. Given that the projector illuminates fringe patterns with single-color light, the R, G and B channels will have different intensity responses. Therefore, the resultant separated fringe images will have different overall brightness levels. Figure 5 shows the six fringe images after separating the Cap 1 and Cap 2 images into R, G and B channel images, in which one can clearly see that the six fringe images exhibit obvious intensity differences from each other.
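A toy model of this step (our illustration; the channel gains below are hypothetical placeholders, not measurements of the Grasshopper3 sensor) shows how a single green-light capture yields three channel images at different overall brightness, each saturating independently:

```python
import numpy as np

# Hypothetical relative channel gains of an RGB sensor under green-only
# illumination (illustrative values only): the G channel responds strongest,
# while R and B pick up weaker crosstalk.
GAINS = (0.25, 1.00, 0.45)  # (R, G, B)

def channel_images(irradiance, gains=GAINS):
    # One color capture yields three 8-bit channel images of the same
    # fringe at different overall brightness, each clipping at 255.
    irr = np.asarray(irradiance, dtype=float)
    return [np.clip(g * irr, 0, 255).astype(np.uint8) for g in gains]
```

A bright scene point that saturates the G channel may still be unsaturated in the weaker R and B channels, which is exactly what the fusion in step 4 exploits.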
Figure 4. Demonstration of the DSSI technique. (a) A snapshot of the imaged object; (b) the captured Cap 1 color fringe image; (c) the captured Cap 2 color fringe image.
Figure 5. The generated 8 bit fringe images after color channel separation. (a)–(c) The R, G and B channel images of the Cap 1 color image shown in figure 4(b); (d)–(f) the R, G and B channel images of the Cap 2 color image shown in figure 4(c).
Figure 6. Concept of the proposed HDR algorithm for fusion of fringe images.
Figure 7. The generated mask images M_1(x, y)–M_6(x, y) for the fringe images in figure 5.
Figure 8. The pinhole imaging model of a camera-projector system.

• Step 4: Performing the HDR algorithm. Once the six channel fringe images are generated from the previous step, the next step is to identify the effective pixels to be selected in each image. The selection criterion for each pixel is to pick the one with the highest contrast γ(x, y) (see equation (5)) from all unsaturated corresponding pixel locations. The conceptual idea of such selection is illustrated in figure 6. Here we denote the three-step phase-shifted fringe images of the six brightness levels as I_1^i(x, y), I_2^i(x, y) and I_3^i(x, y) (i = 1, 2, ..., 6), respectively. The corresponding six fringe contrast maps are denoted as γ_i(x, y) (i = 1, 2, ..., 6). To perform the selection, we create six mask images M_i(x, y) to label all the selected pixels following the aforementioned contrast-based selection criterion:

$$M_i(x, y) = \begin{cases} 1, & i = \arg\max_i \{\gamma_i(x, y) \mid I_1^i(x, y) < 255,\ I_2^i(x, y) < 255,\ I_3^i(x, y) < 255\} \\ 0, & \text{otherwise.} \end{cases} \tag{6}$$
The created mask images M_1–M_6 following this selection criterion are shown in figure 7, where all the selected pixels for each of the six fringe images are labeled. Given this, the merged final three-step phase-shifted fringe images I_1^f(x, y), I_2^f(x, y) and I_3^f(x, y) can be computed by

$$I_1^f(x, y) = \sum_{i=1}^{6} I_1^i(x, y) \times M_i(x, y), \tag{7}$$

$$I_2^f(x, y) = \sum_{i=1}^{6} I_2^i(x, y) \times M_i(x, y), \tag{8}$$

$$I_3^f(x, y) = \sum_{i=1}^{6} I_3^i(x, y) \times M_i(x, y). \tag{9}$$
An example merged fringe image I_1^f(x, y) is shown in figure 9(d), which selects each unsaturated pixel with the best contrast. Once I_1^f(x, y), I_2^f(x, y) and I_3^f(x, y) are obtained, one can obtain the phase map by employing the three-step phase-shifting algorithm introduced in section 2.1 and finally obtain the 3D geometry by employing the calibration-based 3D reconstruction [32]. Essentially, the calibration treats the projector as an inverse camera such that both the camera and the projector follow the pinhole imaging model. Given this, standard stereo calibration can be applied to the camera-projector system, which produces the projection matrices from the 3D world coordinates to the 2D camera/projector image coordinates, as shown in figure 8. Based on the established triangulation among a projector pixel, an object point and a camera pixel, the 3D coordinates of an object point can be computed with a proven accuracy of around 0.1 mm [32].
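Equations (6)–(9) amount to a per-pixel argmax over fringe contrast among the unsaturated brightness levels, followed by a masked sum; a compact NumPy sketch (the array layout and function name are ours):

```python
import numpy as np

def hdr_fuse(images, contrasts):
    # images:    (6, 3, H, W) array of I_k^i for levels i = 1..6, steps k = 1..3
    # contrasts: (6, H, W) array of gamma_i from eq. (5)
    # Returns the fused (3, H, W) fringe images of eqs. (7)-(9).
    images = np.asarray(images, dtype=float)
    unsaturated = np.all(images < 255, axis=1)           # (6, H, W)
    score = np.where(unsaturated, contrasts, -np.inf)    # bar saturated levels
    best = np.argmax(score, axis=0)                      # winning level per pixel
    masks = (best[None] == np.arange(6)[:, None, None])  # M_i(x, y), eq. (6)
    return np.einsum('ikhw,ihw->khw', images, masks.astype(float))
```

A level is excluded at a pixel as soon as any one of its three phase-shifted images saturates there, matching the condition inside equation (6).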
The reconstructed 3D geometry is shown in figure 9(h), which demonstrates a high quality across the entire surface with large reflectivity variations. To clearly demonstrate the advantage of our proposed HDR method, we also show three representative fringe images from all six intensity levels (i.e. brightest, darkest and a mid-level) in figures 9(a)–(c), and the corresponding 3D results in figures 9(e)–(g), respectively. To clearly visualize their differences, zoom-in views of all 3D results are shown in figures 9(i)–(l). From the 3D results, it can be clearly seen that the brightest fringe of all six creates apparent saturation-induced measurement errors on the 3D geometry; the 3D result obtained from the darkest fringe is significantly noise-impacted due to a low signal-to-noise ratio; the mid-level fringe produces a better result compared to the former two cases, yet compared to our HDR method it has a less consistent quality of 3D reconstruction across the entire surface with intensity contrast. This example of static 3D shape measurement clearly demonstrates the effectiveness of our proposed HDR method.

Figure 9. Static 3D shape measurement result. (a)–(d) Representative fringe images by selecting the brightest fringe image, the darkest fringe image, a mid-level-intensity fringe image and the generated HDR fringe image, respectively; (e)–(h) corresponding 3D results obtained from the fringe images shown in (a)–(d); (i)–(l) a zoom-in view of (e)–(h).
3. Experiments

To evaluate the performance of our proposed HDR method, we set up a hardware system composed of a DLP development kit (model: DLP LightCrafter 4500; native resolution 912 × 1140 pixels) and a complementary metal-oxide-semiconductor (CMOS) color camera (model: FLIR Grasshopper3 GS3-U3-41C6C-C). The camera is attached with a lens of 12 mm focal length. A resolution of 1280 × 960 pixels was set for image acquisition with a maximum frame rate of 166 frames s−1.
In this experiment, we adopted an enhanced two-frequency phase-shifting method for temporal phase unwrapping [31], within which the high-frequency fringe patterns use a fringe period of 18 pixels and the low-frequency fringe patterns use a fringe period of 240 pixels. It is worth mentioning that the proposed HDR algorithm is applied to both the high-frequency and low-frequency patterns such that saturated areas will not introduce phase errors into the unwrapping process. The
timing control of the camera-projector pair follows figure 3: T1 represents the projector's illumination period and T2 represents the projector's dark period. The projector was configured to illuminate fringe patterns at 83 frames s−1 in order to match the camera's maximum image acquisition rate (166 ÷ 2 = 83 frames s−1). To realize the timing control illustrated in figure 3, a programmable microcontroller (Arduino Uno) and its associated software platform (Arduino IDE) were used to produce external hardware trigger signals for image acquisition and pattern projection. The restrictions for this timing control are that (1) 1/(T1 + T2) equals the projector refresh rate; and (2) T3 = T4 + T5 < (T1 + T2)/2. In the experiments, the time periods T1–T5 shown in figure 3 were configured as T1 = 8 ms, T2 = 4 ms, T3 = 3 ms and T4 = T5 = 1.5 ms. Therefore, the duration of one entire projection cycle was T1 + T2 = 12 ms.
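These restrictions can be sanity-checked with the reported values (a trivial sketch; the T labels follow figure 3):

```python
# All times in milliseconds, following the labels in figure 3.
T1, T2 = 8.0, 4.0   # projector ON-time and dark time
T3 = 3.0            # camera exposure per capture
T4 = T5 = 1.5       # Cap 2 exposure split over ON-time and OFF-time

projector_rate = 1000.0 / (T1 + T2)   # restriction (1): the projector refresh rate
assert round(projector_rate) == 83    # ~83 patterns per second, matching 166/2
assert T3 == T4 + T5                  # restriction (2), first part
assert T3 < (T1 + T2) / 2.0           # both captures fit within one 12 ms cycle
```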
To demonstrate the real-time capability of the proposed HDR method, we measured the 3D geometries of dynamically changing facial expressions. Again, the green color was used for fringe projection. As in the static measurements, we first show representative fringe images with different brightness levels (one brightest, one darkest and one with mid-level intensity) as well as the generated HDR fringe image in figures 10(a)–(d), and then show the corresponding reconstructed 3D geometries in figures 10(e)–(h). An associated supplementary video S1 demonstrates the dynamic 3D shape measurement results. The results clearly show that the 3D geometry reconstructed using the proposed HDR method outperforms those obtained from the single-intensity-level fringe images. In particular, the saturation-induced errors from the brighter fringes were alleviated and the noise produced by the darker fringes was successfully reduced. This experimental result demonstrates the success of the proposed multi-level HDR method in realizing real-time, high-quality 3D shape measurements.

Figure 10. Dynamic 3D shape measurement result when the green color fringe patterns are projected (associated supplementary video S1 (stacks.iop.org/MST/30/075202/mmedia)). (a)–(d) Representative fringe images by selecting the brightest fringe image, the darkest fringe image, a mid-level-intensity fringe image and the generated HDR fringe image, respectively; (e)–(h) corresponding 3D results obtained from the fringe images shown in (a)–(d).
To perform a more thorough evaluation, we also performed similar measurements with different single-color (i.e. red and blue) fringe projections. The sample results are shown in figures 11 and 12, and the corresponding dynamic 3D results are shown in their associated supplementary videos S2 and S3, respectively. From these results, one can observe phenomena similar to what happened with the single green fringe projection. Once again, our proposed HDR method clearly produces better 3D shape measurement results compared to those obtained from single-intensity-level fringe images. In practice, we selected the green color by default given that most digital color cameras use a Bayer filter for color encoding (in which green is the dominant color, occupying 50% of the pixels), so that the camera is typically more responsive to green light projection. However, this color choice can be altered when the dominant color of the object texture has larger red or blue components, given that the object will reflect more red or blue light under these scenarios. These additional experimental results further demonstrate the effectiveness of our proposed HDR method and its real-time capabilities.
4. Discussion

Compared to existing HDR methods for 3D shape measurement, our method simultaneously possesses the following merits.

• Sufficient intensity variations. Our proposed HDR method creates fringe images at six different intensity levels to provide sufficient intensity variation to accommodate the need for measuring high-contrast surfaces. Such a scheme is advantageous compared to methods with fewer intensity levels.
Figure 11. Dynamic 3D shape measurement result when the red color fringe patterns are projected (associated supplementary video S2).
(a)–(d) Representative fringe images by selecting the brightest fringe image, the darkest fringe image, a mid-level-intensity fringe image
and generated HDR fringe image, respectively; (e)–(h) corresponding 3D results obtained from fringe images shown in (a)–(d).
• Does not require additional pattern projection. The different intensity levels are created by (1) color channel separation in the camera images and (2) double-shot image acquisition in a single-pattern illumination cycle, which does not require additional multi-color [28] or complementary pattern projection [17]. In other words, all intensity levels are created within a single-pattern illumination cycle.
• Measurement automation. The proposed technology does
not require manual exposure control. The timing control
to realize HDR in 3D shape measurement is completely
achieved by hardware and thus is fully automatic.
• Potential to reach high speeds. We adopted a sinusoidal
generation method by defocusing bit-wise binary patterns
in the proposed HDR method. Therefore, this method has
the potential to achieve high-speed (e.g. kHz-level) 3D
shape measurements if a high-speed camera-projector
system is available to the user.
5. Summary

This research introduces a novel HDR fringe acquisition method for 3D shape measurement by employing a color RGB camera in a digital fringe projection system. By taking advantage of (1) a color camera's different channel responses to single-color light projection and (2) the brightness contrast produced by double-shot-in-single-illumination acquisition, fringe images at six different overall brightness levels are created. Experiments have demonstrated the robustness of the proposed method through both static and dynamic 3D shape measurements. By performing image acquisition at the camera's maximum frame rate (166 Hz), real-time 3D shape measurements have been achieved. Given that the proposed method uses 1 bit binary patterns, high-speed 3D shape measurements can be made possible if a high-speed color camera is utilized.
Funding
Iowa State University (ISU) (Startup); National Natural
Science Foundation of China (NSFC) (61603360).
Acknowledgment

The first two co-authors (i.e. Yi Zheng and Yajun Wang) contributed equally to this work. The authors have confirmed that any identifiable participants in this study have given their consent for publication.
Figure 12. Dynamic 3D shape measurement result when the blue color fringe patterns are projected (associated supplementary video S3).
(a)–(d) Representative fringe images by selecting the brightest fringe image, the darkest fringe image, a mid-level-intensity fringe image
and generated HDR fringe image, respectively; (e)–(h) corresponding 3D results obtained from fringe images shown in (a)–(d).
ORCID iDs
Yajun Wang https://orcid.org/0000-0002-0165-0561
Beiwen Li https://orcid.org/0000-0001-8130-7730
References

[1] Gorthi S and Rastogi P 2010 Fringe projection techniques: whither we are? Opt. Lasers Eng. 48 133–40
[2] Lin H et al 2017 Review and comparison of high-dynamic range three-dimensional shape measurement techniques J. Sensors 2017 9576850
[3] Feng S et al 2018 High dynamic range 3D measurements with fringe projection profilometry: a review Meas. Sci. Technol. 29 122001
[4] Yoshinori Y, Hiroyuki M, Osamu N and Tetsuo I 2003 Shape measurement of glossy objects by range finder with polarization optical system Gazo Denshi Gakkai Kenkyukai Koen Yoko 2004 3–50
[5] Umeyama S and Godin G 2004 Separation of diffuse and specular components of surface reflection by use of polarization and statistical analysis of images IEEE Trans. Pattern Anal. Mach. Intell. 26 639–47
[6] Chen T, Lensch P H, Fuchs C and Seidel H-P 2007 Polarization and phase-shifting for 3D scanning of translucent objects Proc. IEEE Computer Society Conf. on Computer Vision and Pattern Recognition pp 1–8
[7] Salahieh B, Chen Z, Rodriguez J J and Liang R 2014 Multi-polarization fringe projection imaging for high dynamic range objects Opt. Express 22 10064–71
[8] Hu Q, Harding G K, Du X and Hamilton D 2005 Shiny parts measurement using color separation Proc. SPIE 6000 60000D
[9] Feng S, Chen Q, Zuo C and Asundi A 2017 Fast three-dimensional measurements for dynamic scenes with shiny surfaces Opt. Commun. 382 18–27
[10] Kokku R and Brooksby G 2005 Improving 3D surface measurement accuracy on metallic surfaces Proc. SPIE 5856 619
[11] Zhang S and Yau S-T 2009 High dynamic range scanning technique Opt. Eng. 48 033604
[12] Waddington C and Kofman J 2010 Analysis of measurement sensitivity to illuminance and fringe-pattern gray levels for fringe-pattern projection adaptive to ambient lighting Opt. Lasers Eng. 48 251–6
[13] Ekstrand L and Zhang S 2011 Auto-exposure for three-dimensional shape measurement with a digital-light-processing projector Opt. Eng. 50 123603
[14] Jiang H, Zhao H and Li X 2012 High dynamic range fringe acquisition: a novel 3D scanning technique for high-reflective surfaces Opt. Lasers Eng. 50 1484–93
[15] Zhao H, Liang X, Diao X and Jiang H 2014 Rapid in situ 3D measurement of shiny object based on fast and high dynamic range digital fringe projector Opt. Lasers Eng. 54 170–4
[16] Ekstrand L and Zhang S 2014 Automated high-dynamic range three-dimensional optical metrology technique ASME 2014 Int. Manufacturing Science and Engineering Conf. p V001T04A043
[17] Jiang C, Bell T and Zhang S 2016 High dynamic range real-time 3D shape measurement Opt. Express 24 7337–46
[18] Wang M et al 2017 Enhanced high dynamic range 3D shape measurement based on generalized phase-shifting algorithm Opt. Commun. 385 43–53
[19] Li D and Kofman J 2014 Adaptive fringe-pattern projection for image saturation avoidance in 3D surface-shape measurement Opt. Express 22 9887–901
[20] Waddington C and Kofman J 2014 Camera-independent saturation avoidance in measuring high-reflectivity-variation surfaces using pixel-wise composed images from projected patterns of different maximum gray level Opt. Commun. 333 32–7
[21] Waddington C and Kofman J 2014 Modified sinusoidal fringe-pattern projection for variable illuminance in phase-shifting three-dimensional surface-shape metrology Opt. Eng. 53 084109
[22] Lin H et al 2016 Adaptive digital fringe projection technique for high dynamic range three-dimensional shape measurement Opt. Express 24 7703–18
[23] Lin H et al 2017 Three-dimensional shape measurement technique for shiny surfaces by adaptive pixel-wise projection intensity adjustment Opt. Lasers Eng. 91 206–15
[24] Babaie G, Abolbashari M and Farahi F 2015 Dynamics range enhancement in digital fringe projection technique Precis. Eng. 39 243–51
[25] Chen B and Zhang S 2016 High-quality 3D shape measurement using saturated fringe patterns Opt. Lasers Eng. 87 83–9
[26] Lei S and Zhang S 2009 Flexible 3D shape measurement using projector defocusing Opt. Lett. 34 3080–2
[27] Suresh V, Wang Y and Li B 2018 High-dynamic-range 3D shape measurement utilizing the transitioning state of digital micromirror device Opt. Lasers Eng. 107 176–81
[28] Wang J and Yang Y 2018 High-speed three-dimensional measurement technique for object surface with a large range of reflectivity variations Appl. Opt. 57 9172–82
[29] Su X and Chen W 2004 Reliability-guided phase unwrapping algorithm: a review Opt. Lasers Eng. 42 245–61
[30] Zhang S 2018 Absolute phase retrieval methods for digital fringe projection profilometry: a review Opt. Lasers Eng. 107 28–37
[31] Hyun J-S and Zhang S 2016 Enhanced two-frequency phase-shifting method Appl. Opt. 55 4395–401
[32] Li B, Karpinsky N and Zhang S 2014 Novel calibration method for structured light system with an out-of-focus projector Appl. Opt. 53 3415–26