k = \sum_{i=1}^{n} 2^{\,n-i}\,\bigl(G_0 \oplus G_1 \oplus \cdots \oplus G_i\bigr) \qquad (1)

where k = 1, 2, …, 2^n − 1 is the edge ordinal number; n is the total number of intensity images; i = 1, 2, …, n is the ordinal number of an intensity image; G_i is the binary grey value at the edge point in intensity image i, with G_0 = 0.
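As a minimal sketch of this decoding step (assuming the cumulative-XOR reading of equation (1), with G_0 = 0), the edge ordinal number can be computed from the n grey values read at an edge point:

```python
def edge_ordinal(G):
    """Decode the edge ordinal number k from the Gray code bits G_1..G_n
    read at an edge point: the i-th binary bit is the cumulative XOR
    G_0 ^ G_1 ^ ... ^ G_i with G_0 = 0, weighted by 2^(n-i)."""
    k, b = 0, 0          # b accumulates the XOR chain, starting from G_0 = 0
    n = len(G)
    for i, g in enumerate(G, start=1):
        b ^= g
        k += b << (n - i)   # weight 2^(n-i)
    return k
```

This is the standard Gray-to-binary conversion; the function name and list interface are illustrative, not from the paper.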
Figure 1. Stripe edge ordinal number. Figure 2. Even-width-stripe encoding.
2.2. Acquiring the projecting angle corresponding to an edge in the encoding pattern
According to the edge ordinal number k figured out by equation (1), the corresponding projecting angle can be acquired. The projecting angle range of the projector is set to 2o_1, and the angle between the projecting centreline and the x axis is set to o_0. For example, when three Gray code patterns are projected, the positions of the seven stripe edges within the projecting angle range are as shown in figure 2, where A, B, C, D, E, F and G are the positions of the stripe edges whose ordinal numbers are 1, 2, 3, 4, 5, 6 and 7.
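For illustration, the even-width Gray code patterns themselves can be generated as follows; the pattern width and the MSB-first bit order are assumptions for this sketch, not values taken from the paper:

```python
def gray_patterns(n, width=1024):
    """Generate the n even-width Gray code stripe patterns as 0/1 rows.
    Column x falls in stripe j = x * 2**n // width; pattern i carries the
    i-th bit (MSB first) of the Gray code of j."""
    patterns = []
    for i in range(1, n + 1):
        row = []
        for x in range(width):
            j = x * (2 ** n) // width        # stripe index 0 .. 2^n - 1
            g = j ^ (j >> 1)                 # Gray code of the stripe index
            row.append((g >> (n - i)) & 1)   # bit i of the Gray code
        patterns.append(row)
    return patterns
```

With n = 3 this yields the familiar coarse-to-fine sequence whose seven interior stripe edges are the positions A–G of figure 2.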
Encoding with an even-width-stripe pattern has the advantage of easy realization, but the disadvantage of dividing the projecting angle irregularly: the projecting angle range corresponding to a thinnest stripe decreases from the centre towards both sides. As shown in figure 2, the thinnest-stripe widths satisfy SA = AB = BC = CD = DE = EF = FG = GT, while the projecting angles satisfy o_11 > o_12 > o_13 > o_14. If the regions corresponding to the thinnest stripes are treated as equal angles, the resulting projecting angle error makes the reconstructed surface bend at both sides.
The even-width-stripe projecting angle can be figured out by equation (2), which corrects the error above. Substituting the edge ordinal number k into equation (2) gives the corresponding projecting angle.
o_k = o_0 + \arctan\!\left[\left(\frac{k}{2^{\,n-1}} - 1\right)\tan o_1\right] \qquad (2)
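A minimal sketch of this correction, assuming the relation equation (2) expresses: the even-width stripes divide the tangent of the half-range o_1 evenly, rather than the angle itself, so the centre edge k = 2^(n-1) maps to o_0 and the pattern boundaries map to o_0 ± o_1:

```python
import math

def projecting_angle(k, n, o0, o1):
    """Projecting angle for edge ordinal k with n patterns, centreline
    angle o0 and half-range o1 (radians): the stripe boundaries divide
    tan(o1) evenly, so the angle follows through an arctangent."""
    return o0 + math.atan((k / 2 ** (n - 1) - 1) * math.tan(o1))
```

The function name and argument order are illustrative assumptions.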
The encoding and decoding method based on Gray code stripe edges takes the Gray code value of an edge point from the earlier intensity images and matches it with the corresponding edge in the encoding pattern. As shown in figure 1, the edges marked by the broken line all lie in stripe interiors, not at stripe edges, in the earlier intensity images, so their code values are unlikely to be misjudged. The method theoretically removes the one-bit decoding error of Gray code based on the pixel centre.
When decoding by the pixel centre, one pixel corresponds to many object sampling points, so its grey value cannot be determined accurately; when decoding by the stripe edge, the correspondence between image sampling points and object sampling points can reach sub-pixel accuracy.
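The one-bit property this relies on can be checked directly: consecutive codewords of the reflected binary (Gray) code differ in exactly one bit, so a sample taken exactly on a stripe edge can corrupt at most one bit of the code. A small check (`gray` and `hamming` are illustrative helpers, not names from the paper):

```python
def gray(j):
    """Reflected binary (Gray) code of integer j."""
    return j ^ (j >> 1)

def hamming(a, b):
    """Number of bit positions in which a and b differ."""
    return bin(a ^ b).count("1")
```

Iterating `hamming(gray(j - 1), gray(j))` over consecutive stripe indices always gives 1, which is why a misread at a stripe boundary shifts the decoded stripe by at most one.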
3. Stripe edge sub-pixel location
Sub-pixel location technologies such as fitting, grey moment, resampling, spatial moment and interpolation are widely used at present [4]. This paper adopts the fitting method to locate stripe edges in the intensity image. Because the stripes in the encoding pattern are vertical and those in the intensity image are likewise vertical, they are detected by horizontal scanning. The intensity image is filtered by equation (3), and the grey value f(j) is replaced by g(j).
g(j) = \frac{f(j-2) + f(j-1) + f(j+1) + f(j+2)}{4} \qquad (3)
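A sketch of this filtering step, assuming the filter replaces each grey value by the mean of its two neighbours on either side; border pixels are left unchanged here for simplicity:

```python
def smooth(f):
    """Filter a row of grey values: each interior pixel j is replaced by
    the mean of f(j-2), f(j-1), f(j+1), f(j+2); the two pixels at each
    border are kept as-is."""
    g = list(f)
    for j in range(2, len(f) - 2):
        g[j] = (f[j - 2] + f[j - 1] + f[j + 1] + f[j + 2]) / 4
    return g
```

Note the centre pixel f(j) itself is excluded from the average, which suppresses single-pixel noise before the edge curve is fitted.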
[Figure 3: (a) intensity image with scanned line; (b) grey values along the line; (c) fitted curve with local maximal and local minimal values. Figure 4 flowchart: Start → Input image → Filtering → Sub-pixel location by horizontal scanning → Intensity image binarization → Detect Gray code, edge ordinal number and projecting angle → Figure out sampling point space coordinates → Surface reconstruction → End.]
Figure 3. Sub-pixel location by horizontal scanning. Figure 4. Flow chart for reconstruction.
Here j is the pixel ordinal number along the scanned row. Figure 3(a) is an intensity image whose line i is scanned, and the grey value of each pixel is shown in figure 3(b). A curve is fitted to the grey values after filtering, as shown in figure 3(c), and the average of the wave crest and wave trough on either side of the edge is figured out. Finally, the horizontal position at which the fitted curve takes this average value is determined as the sub-pixel edge position.
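The location step can be sketched as follows, with straight-line interpolation standing in for the paper's fitted curve; the crest and trough columns are assumed to be already detected:

```python
def subpixel_edge(g, jmax, jmin):
    """Sub-pixel edge position between a local maximum at column jmax and
    a local minimum at jmin of the filtered grey values g: find where the
    profile crosses the average of crest and trough, interpolating
    linearly between samples (a stand-in for the fitted curve)."""
    level = (g[jmax] + g[jmin]) / 2
    lo, hi = sorted((jmax, jmin))
    for j in range(lo, hi):
        a, b = g[j], g[j + 1]
        if (a - level) * (b - level) <= 0 and a != b:
            return j + (level - a) / (b - a)   # linear sub-pixel crossing
    return (lo + hi) / 2
```

The half-height criterion matches the text's "average of wave crest and wave trough"; a real implementation would interpolate on the fitted curve rather than between raw samples.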
4. Reconstruction experiments
The measurement system was composed of a surface-light-source projector, a camera and a measured object, all simulated in 3ds Max software [5]. The viewing angle range of the black-and-white camera was 40°, and its CCD resolution was 1024 × 1024; the projector projected continuous encoding patterns whose projecting angle range was 30°. The system measurement range was about 240 × 240 mm. Matlab software was used to process the images, calculate the space point coordinates and reconstruct the measured surface. The program flowchart is shown in figure 4.
A plane with depth value z = 508.499 mm was reconstructed by Gray code based on the pixel centre; the reconstruction result and the error plane are shown in figure 5(a). Those obtained by Gray code based on the stripe edge are shown in figure 5(b). Obviously, in the plane reconstructed by Gray code based on the pixel centre, the depth values of object sampling points at small projecting angles diminished (moved closer to the camera) while those at large projecting angles augmented (moved away from the camera), because the irregular-division error of the projecting angle was not corrected. The depth values of the plane reconstructed by Gray code based on the stripe edge did not deflect.
Figure 5. Plane reconstruction experimental results: (a) Gray code based on the pixel centre; (b) Gray code based on the stripe edge.
Specific reconstruction errors are shown in table 1.
Table 1. Reconstruction experimental findings.

Method                      Z_max (mm)   Z_min (mm)   Maximal error (mm)   Relative error   Variance
Gray code by pixel centre   509.770      507.103      1.271                0.25%            1.408
Gray code by stripe edge    508.753      508.345      0.254                0.05%            0.035
According to table 1, the measurement accuracy of Gray code based on the stripe edge is higher than that of Gray code based on the pixel centre.
A complex 3D model was also reconstructed by the method and system of this paper. The Venus plaster model and its reconstruction experimental results are shown in figure 6: (a) is the measured model, and (b), (c) and (d) are reconstruction results viewed from multiple angles, which reflect the measured surface truly.
Figure 6. Reconstruction experimental results of Venus plaster model.
5. Conclusion
A method for encoding and decoding Gray code based on stripe edges has been presented. The method adopts key techniques such as decoding by Gray code stripe edges and acquiring stripe edges by horizontal-scanning sub-pixel location, so the one-bit decoding error of Gray code can be removed while, at the same time, the advantage of high adaptability to steep parts of the measured surface is retained. Because object sampling points and image sampling points correspond point by point, the quantization error caused by pixel-centre decoding can be removed, and the error of the reconstructed surface bending at both sides of the measured object, caused by dividing the projecting angle irregularly, is corrected. The simulated reconstruction experiments proved the method effective. Future plans include acquiring higher sampling density by reducing the encoding stripe width, and reducing the influence of occlusion on measurement by multi-angle measurement and registration.
Acknowledgements
The support of National Natural Science Foundation of China under research grant 60572030,
Specialized Research Fund for the Doctoral Program of Higher Education under research grant
20050214006, Heilongjiang Province Education Department Overseas Scholars Science Research
Foundation under research grant 1055HZ027, The Key Science and Technology Research Project of
Harbin under research grant 2005AA1CG152 and Heilongjiang Province Graduate Student Innovation
Research Foundation under research grant YJSCX2005-238HLJ is gratefully acknowledged.
References
[1] Joaquim Salvi 2004 Pattern codification strategies in structured light systems Pattern
Recognition 37 827-849
[2] Jens Gühring 2001 Dense 3-D surface acquisition by structured light using off-the-shelf
components Proceedings of SPIE - The International Society for Optical Engineering 4309
220-231
[3] Sukhan Lee 2004 An active 3D robot camera for home environment Proceedings of The IEEE
Sensors 1 477-480
[4] Nelson L 2004 Creating interactive 3-D media with projector-camera systems Proceedings of
The SPIE-The International Society for Optical Engineering 5308 850-861
[5] Zhang Guangjun 2003 Elliptical-center locating of light stripe and its simulation for structured
light based 3D vision inspection Chinese Journal of Scientific Instrument 24 589-593