
Engineering Applications of Artificial Intelligence 20 (2007) 1013–1021


www.elsevier.com/locate/engappai

A machine vision inspector for beer bottle


Feng Duan, Yao-Nan Wang, Huan-Jun Liu, Yang-Guo Li
College of Electrical and Information Engineering, Hunan University, Postcode 410082, Changsha, China
Received 28 August 2006; received in revised form 24 November 2006; accepted 22 December 2006
Available online 26 February 2007

Abstract

A machine-vision-based beer bottle inspector is presented. The mechanical structure and electric control system are illustrated in detail.
A method based on the histogram of edge points is applied for real-time determination of inspection area. For defect detection of bottle
wall and bottle bottom, an algorithm based on local statistical characteristics is proposed. In bottle finish inspection, two artificial neural
networks are used for low-level inspection and high-level judgment, respectively. A prototype was developed and experimental results
demonstrate the feasibility of the inspector. Inspections performed by the prototype have proved the effectiveness and value of proposed
algorithms in automatic real-time inspection.
© 2007 Elsevier Ltd. All rights reserved. doi:10.1016/j.engappai.2006.12.008

Keywords: Machine vision; Beer bottle inspector; Image processing; Local statistical characteristics; Artificial neural network

1. Introduction

Reusable beer bottles are widely adopted in beverage production. Recycled bottles may have defects that can cause negative or even dangerous consequences in production. Hence, all recycled bottles must be cleaned and inspected before refilling, and any bottle with a defect must be ejected from the production line. Inspection of beer bottles by human inspectors results in low speed and efficiency, because the whole inspection process is subjective and very tedious. As a replacement for human inspectors, a beer bottle inspector equipped with a dedicated high-speed image capture and processing system is able to perform the inspection automatically with high speed and accuracy. Some useful solutions for finish inspection have been developed (Huimin et al., 2002; Canivet et al., 1994). This paper presents a novel beer bottle inspector that utilizes state-of-the-art machine vision technologies to implement automatic inspection of the bottle wall, bottle bottom and bottle finish. A prototype was developed, and the inspection algorithms are shown to satisfy the requirements of practical production.

Corresponding author. E-mail address: dzhfdna@hotmail.com (F. Duan).

2. Mechanical structure and system configuration

2.1. Mechanical structure

As shown in Fig. 1, the beer bottle inspector includes the following components and modules. A separator at the entrance of the inspector separates the bottles from each other by a certain distance, so that the subsequent inspection can be performed reliably. A special conveyor consisting of two belts that grip the bottles enables them to be conveyed without anything touching the bottom, and consequently bottom inspection becomes possible. Under the conveyor, a cleaner removes any contamination or foam clinging under the bottle bottom, which could otherwise affect bottom inspection. Due to its excellent consistency of illumination and long life expectancy, LED light is adopted to illuminate the inspection areas of the beer bottle. Several photoelectric sensors equipped at different places in the inspector are responsible for detecting bottles and providing related information to the central control system. Above each inspection position, an industrial CCD camera captures the image of the fast-moving bottles. At the output of the inspector, bad bottles are ejected off the production line by an ejector. Several position limit switches are also equipped in the inspector.


Fig. 1. Beer bottle inspector.

Some dangerous operations can stop the machine. An alarm light and whistle also operate in urgent situations.

2.2. Electric control system

Fig. 2 shows the electric configuration of the beer bottle inspector. Because large bottle images must be processed at very high speed, especially in bottle wall inspection, two high-performance industrial PCs are needed: one is responsible for bottle wall inspection, and the other for bottle finish and bottle bottom inspection. A PLC is used as the low-level controller, responsible for the control of the conveyor, ejector, sensors, protection system and so on. Before the beer bottles enter the inspector, the separator first spaces them at a certain interval. Then the cleaner removes possible foam under the bottom. The related sensors are triggered as the beer bottles are conveyed through the different inspection positions; at the same time, the two industrial PCs start the image capture and complete the real-time inspection. The inspection results are transferred to the PLC, which finally controls the ejector to reject the bad bottles.

Fig. 2. Electric configuration of beer bottle inspector.
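As a rough illustration of this control flow, and not code from the paper, the cooperation between a photoelectric sensor, a camera, the inspection software and the PLC at one inspection position can be sketched as follows; all device interfaces here are hypothetical placeholders.

# Sketch of the trigger -> capture -> inspect -> report flow at one inspection position.
# The sensor, camera and plc objects and the inspect_image function are hypothetical placeholders.
def inspection_station_loop(sensor, camera, inspect_image, plc, station_id):
    """Wait for a bottle, grab a frame, run the inspection and report the verdict to the PLC."""
    while True:
        bottle_id = sensor.wait_for_bottle()     # photoelectric sensor trigger
        frame = camera.capture()                 # industrial CCD camera snapshot
        is_defective = inspect_image(frame)      # wall/bottom/finish algorithm (Section 3)
        plc.report_result(station_id, bottle_id, is_defective)  # PLC tracks the bottle to the ejector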
2.3. Light, illumination system and optical structure
Dedicated illumination and optical systems are crucial in machine vision applications. Stable and reliable light is an important factor for obtaining an excellent image. The direction of the light must be carefully controlled, and special filters are used to produce polarized light for the detection of transparent scraps. In beer bottle inspection, LED light is the first choice due to its high efficiency, excellent performance and ease of control. Fig. 3 shows the optical structure for the different bottle inspections. In order to capture the image of the bottle shoulder at high resolution, two cameras are needed, responsible for inspection of the bottle wall and the bottle shoulder, respectively. Inspection of the bottle wall is realized by a special optical system, which combines views of the bottle wall from different angles into one image. It is also possible to use a dedicated mechanical instrument to rotate the bottle by 90° during conveying and perform the bottle-wall inspection twice to realize 360° inspection.

Fig. 3. Optical and illumination structure (LED lights, mirrors and CCD cameras).

3. Inspection algorithms

The beer bottle inspector is a typical application of machine vision and digital image processing technology in industrial production. The most important module of the software is the set of inspection algorithms, which must be capable of high-speed and accurate operation. In beer bottle inspection, the spoiled or polluted part of a bottle varies in size and position, and there are many factors that cause disturbance, such as bubbles and texture in the bottle glass itself, ambient light and so on. Moreover, the fast-moving bottle causes a blurred image, which is more difficult to deal with. Hence, an ideal and stable image is often unavailable even with a dedicated light and image capture system. An inspector used on a high-speed beer production line must inspect about ten bottles per second. Such a speed requirement renders many conventional image processing algorithms unusable. For bottle wall and bottle bottom inspection, a specific algorithm is presented to search for cracks and tears against the half-transparent background (glass). For bottle finish inspection, the problem is to detect an annular shape and evaluate its quality; therefore, a different algorithm is required.

3.1. Mark and determination of inspection area

It is necessary to mark the inspection area manually first, to decrease the time cost of image processing. Furthermore, prior manual determination of the inspection area is more accurate than having the computer do the same thing, which increases the reliability of the whole system. In Fig. 4, the inspection area is marked with a dotted line. The computer only deals with the image data inside the inspection area; because much useless image data is omitted, high efficiency is achieved.

Fig. 4. Mark of inspection area.

Several photoelectric sensors trigger the image capture when the bottles come to the inspection positions. But this trigger system causes observable differences between captured images. Furthermore, bottles may sway a little on the fast-running conveyor. Consequently, the position of the target varies in the captured images, as shown in Fig. 5. So it is necessary to use an algorithm to locate the inspection area in the captured image; in other words, to determine the center of the bottle bottom and finish and the vertical axis of the bottle wall.

Fig. 5. Position of target varies in captured images.

The algorithm must be accurate, fast and robust against large disturbances, because in real-time application the bottle image contains many uncertain factors and is sometimes disturbed to a great extent. The conventional Hough transform is very slow and not useful in such a high-speed application. Another approach, using the center of gravity of the image, may produce a large error when the image is disturbed greatly. This paper presents a brief and very efficient algorithm, which uses the histogram of edge points to locate the inspection area. For example, in bottle wall inspection, the bottle wall image is first divided into two parts (right and left), in which formulas (1) and (2) are used to calculate the difference image, respectively:

∇f1(i,j) = 2f(i,j) - f(i+1,j) - f(i,j+1),   (1)

∇f2(i,j) = 2f(i,j) - f(i-1,j) - f(i,j+1),   (2)

Xri = (Li + Ri)/2   (i = 1, 2, 3, ..., n).   (3)

In the second step, the edge points from the bottle shoulder to the bottle finish (shown in Fig. 6a) are found according to a carefully selected threshold TE (in our application, TE = 8): if ∇fi(i,j) > TE (i = 1, 2), then point (i,j) is considered an edge point. In each line of the image, only two edge points (Li, Ri) are needed, one in the left part and the other in the right part. A reference coordinate of the vertical axis of the bottle wall is calculated by formula (3). The histogram of Xr (shown in Fig. 7) is obtained from the statistics of Xr. Supposing a window of width T (in our application, T = 6) slides from C1 to Cm in the histogram, the sum of the histogram within the sliding window is given by formula (4). According to formula (5), the coordinate of the axis of the bottle wall can be calculated once the maximum of S(X) is found. The same algorithm is used to locate the center in the images of the bottle bottom and bottle finish. The algorithm uses the statistics to suppress distributed disturbances of large value, and the final result is accurate thanks to the weighted averaging within the sliding window. Experiments have proved that this algorithm is robust: even if there are large errors in the detection of the edge points, it still outputs a very accurate value.

Fig. 6. Edge points in bottle images.

Fig. 7. Histogram of Xr.

This characteristic is crucial in real applications:

S(X) = Σ_{Xr = X-T/2}^{X+T/2} H(Xr),   (4)

Xd = ( Σ_{Xr = X-T/2}^{X+T/2} Xr·H(Xr) ) / ( Σ_{Xr = X-T/2}^{X+T/2} H(Xr) ),   (5)

where X satisfies S(X) = max_{X ∈ [C1, Cm]} S(X).
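As an illustration of formulas (1)-(5), the following Python/NumPy sketch locates the bottle-wall axis from the histogram of edge-point midpoints. It is a minimal reading of the method under stated assumptions: the image is a 2-D grayscale array, the search range covers the full image width instead of [C1, Cm], and the choice of which edge point to keep on each side of a row is not specified in the paper.

import numpy as np

def locate_vertical_axis(img, TE=8, T=6):
    """Estimate the bottle-wall axis column from the histogram of edge-point midpoints."""
    f = img.astype(np.int32)
    h, w = f.shape
    mid = w // 2

    # Formulas (1) and (2): difference operators (boundary rows are handled coarsely via roll).
    d1 = 2 * f[:, :-1] - np.roll(f, -1, axis=0)[:, :-1] - f[:, 1:]
    d2 = 2 * f[:, :-1] - np.roll(f, 1, axis=0)[:, :-1] - f[:, 1:]

    xr = []
    for row in range(h):
        left = np.where(d1[row, :mid] > TE)[0]            # edge candidates in the left part
        right = np.where(d2[row, mid:] > TE)[0] + mid     # edge candidates in the right part
        if left.size and right.size:
            Li, Ri = left[0], right[-1]                   # one edge point per side (assumed choice)
            xr.append((Li + Ri) // 2)                     # formula (3): midpoint Xr_i

    # Histogram of the midpoints and sliding-window sum of width T (formula (4)).
    hist = np.bincount(np.array(xr, dtype=np.int64), minlength=w)
    S = np.convolve(hist, np.ones(T + 1, dtype=np.int64), mode="same")
    X = int(np.argmax(S))                                 # window position maximizing S(X)

    # Formula (5): weighted average of the midpoints inside the best window.
    lo, hi = max(0, X - T // 2), min(w, X + T // 2 + 1)
    cols = np.arange(lo, hi)
    return float((cols * hist[lo:hi]).sum() / max(hist[lo:hi].sum(), 1))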
3.2. Inspection of bottle wall and bottle bottom (i ¼ 1,2,3,4)

In the inspected image, cracks and tears are darker than the neighboring parts and usually have clear edges. Based on this fact, the mask shown in Fig. 8, whose size and structure were optimized by experiments, is chosen for pre-processing to obtain information about edge points and differences between points in the inspected image. After pre-processing, the whole inspection area is divided into many small regions, in which statistical characteristics are computed for a final evaluation based on a set of special rules.

Fig. 8. Mask used in pre-processing.

Step 1: As shown in Fig. 9, pre-processing is performed over the whole inspection area using formulas (6)-(8). G(x,y) is the result of the convolution of the inspected image with the mask shown in Fig. 8. G(x,y) is transferred to a binary image B(x,y) by threshold T1, which has a small value so that B(x,y) keeps more candidate defects. The falling and rising edge points in the horizontal and vertical directions are obtained by formula (8), and the results are saved in Ei(x,y) (i = 1, 2, 3, 4):

G(x,y) = C1 + C2 + C3 + C4 + f(x-7,y-7) + f(x+7,y-7) + f(x-7,y+7) + f(x+7,y+7) - 4f(x,y),   (6)

B(x,y) = 1 if G(x,y) > T1, and 0 if G(x,y) ≤ T1,   (7)

Ei(x,y) = 1 if Ci(x,y) > T2, and 0 if Ci(x,y) ≤ T2   (i = 1, 2, 3, 4),   (8)

Fig. 9. Pre-processing.

where f(x,y) is the gray value of the inspected image and

C1 = f(x+7,y) + f(x+5,y) - 2f(x,y),
C2 = f(x-7,y) + f(x-5,y) - 2f(x,y),
C3 = f(x,y+7) + f(x,y+14) - 2f(x,y),
C4 = f(x,y-7) + f(x,y-14) - 2f(x,y).

Step 2: In each small region, the connected components are found by searching in B(x,y), and the following statistical information is obtained. Fig. 10 shows some samples.

(1) the size of the connected component, S;
(2) the numbers of falling and rising edge points in the horizontal and vertical directions according to Ei(x,y) (i = 1, 2, 3, 4), and from them the ratios of the falling and rising edge points in the horizontal and vertical directions (Rx, Ry);
(3) the histogram of G(x,y), denoted Hg. Hg has 60 levels; if G(x,y) > 60, the point is counted in Hg(60).

Step 3: If S > T3, the following processing and judgment are performed. After calculating Sum and Av according to formulas (9) and (10), a defect is confirmed if one of the following conditions is satisfied:

(1) Av > T5,
(2) T6 < Av ≤ T5 and Sum > T7,
(3) T10 < Av ≤ T6 and Sum > T7 and (T8 < Rx < T9 or T8 < Ry < T9),

where

Sum = Σ_{k=1}^{60} Mk·k,   (9)

Av = Sum / Σ_{k=1}^{60} Mk,   (10)

Mk = Hg(k) if Hg(k) > T4, and 0 if Hg(k) ≤ T4.

In our application, T1 = 24, T2 = 8, T3 = 20, T4 = 2, T5 = 33, T6 = 25, T7 = 660, T8 = 0.49, T9 = 2, T10 = 20.
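The region-level statistics and decision rules of Steps 2-3 (formulas (9) and (10) plus the three conditions above) can be sketched in Python/NumPy as follows. This is only an illustration under stated assumptions: the pre-processed arrays G and E1..E4 are taken as given, the pairing of E1..E4 into rising/falling edge counts for Rx and Ry is an assumption, and the handling of non-positive values of G in the histogram is not specified in the paper.

import numpy as np

# Threshold values quoted in the text.
T3, T4, T5, T6, T7, T8, T9, T10 = 20, 2, 33, 25, 660, 0.49, 2, 20

def region_is_defective(G, E, component_mask):
    """Apply the Step 2/3 statistics to one connected component found in B(x,y).

    G: difference image of formula (6) for the small region.
    E: list of the four binary edge maps E1..E4 of formula (8).
    component_mask: boolean mask of the connected component.
    """
    S = int(component_mask.sum())                 # (1) size of the connected component
    if S <= T3:
        return False                              # too small to be judged

    # (2) ratios of falling/rising edge points; the pairing of E1..E4 is an assumption.
    eps = 1e-9
    Rx = E[0][component_mask].sum() / (E[1][component_mask].sum() + eps)
    Ry = E[2][component_mask].sum() / (E[3][component_mask].sum() + eps)

    # (3) 60-level histogram Hg of G over the component; G > 60 is counted in Hg(60),
    # and non-positive values are clipped to level 1 (their handling is not specified).
    g = np.clip(G[component_mask], 1, 60).astype(np.int64)
    Hg = np.bincount(g, minlength=61)[1:]         # Hg(1)..Hg(60)
    Mk = np.where(Hg > T4, Hg, 0)                 # suppress sparsely populated bins

    Sum = int((Mk * np.arange(1, 61)).sum())      # formula (9)
    Av = Sum / max(Mk.sum(), 1)                   # formula (10)

    # The three decision conditions of Step 3.
    return (Av > T5
            or (T6 < Av <= T5 and Sum > T7)
            or (T10 < Av <= T6 and Sum > T7 and (T8 < Rx < T9 or T8 < Ry < T9)))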
X
60 Consequently, it is possible for the low-level neural
Sum ¼ M k k, (9) network to inspect the same point of the finish with
k¼1 several different input patterns. Before input into the
high-level neural network, the output of the low-level
Sum neural network will be transformed to binary value by a
Av ¼ , (10) threshold to greatly decrease the number of all possible
P
60
Mk input patterns to high-level neural network, which is
k¼1 therefore very reliable. As a result, even the low-level

Fig. 10. Local characteristics of some samples: for each inspected region (gray pixels mark the connected component), the histogram of G(x,y) and the numbers of rising and falling edge points in the horizontal and vertical directions are shown, for defective regions and for a good region with disturbance.

Fig. 11. Inspection using neural networks: the low-level neural network feeds the high-level neural network, which gives the final judgment.



As a result, even though the low-level neural network is sensitive to the input patterns and occasionally produces a wrong output, the final judgment is reliable and robust because the high-level neural network eliminates the errors caused by the low-level network. The low-level neural network is a feed-forward network with 10 input nodes, 8 hidden nodes and 1 output node. The high-level neural network has 10 input nodes, 6 hidden nodes and 1 output node. The Levenberg–Marquardt learning strategy is adopted for training.

As for the low-level neural network (shown in Fig. 12), the inputs of nodes 1-9, representing differences in the image, are calculated by formula (11), and the input of node 10, characterizing the brightness of the inspection region, is calculated by formula (12):

Input_i = Σ_{r=R1}^{R2} G(X(i+1,r), Y(i+1,r)) - Σ_{r=R1}^{R2} G(X(i,r), Y(i,r))   (i = 1 to 9),   (11)

Input_10 = Σ_{i=1}^{9} Σ_{r=R1}^{R2} G(X(i,r), Y(i,r)),   (12)

where

X(i,r) = Xcenter + r·cos(β + i·STEP),
Y(i,r) = Ycenter + r·sin(β + i·STEP).

G(X,Y) is the pixel value of the image; R1 and R2 are the inner and outer radii of the region of interest; Xcenter and Ycenter are the center coordinates of the finish obtained previously; STEP is the sampling step. Sampling starts from β, ranges from R1 to R2 and continues for 9 angular steps. The output of the neural networks is defined as "1" for a good part of the finish and "0" for a defective part. The input of the high-level neural network is defined by

I_HNN = 0.8 if Output_LNN > Th1, and 0.2 otherwise,   (13)

where Output_LNN is the actual output of the low-level neural network and Th1 is a previously chosen threshold. If the output of the high-level neural network is greater than Th2, a defect is confirmed and the bottle is rejected. In our application, STEP = 0.018, Th1 = 0.3 and Th2 = 0.5. The whole inspection process is very simple because the neural networks complete all the complex analysis; all that the inspection workers need to do is train the neural networks in a proper way.
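The finish-inspection computation of formulas (11)-(13) can be sketched in Python as below. It is only an illustration under stated assumptions: the two trained networks appear as placeholder callables (lnn_predict, hnn_predict), and the grouping of the finish into ten overlapping sectors is inferred from the ten input nodes of the high-level network.

import math

def lnn_inputs(image, xc, yc, r1, r2, beta, step=0.018):
    """Build the 10 low-level-network inputs of formulas (11)-(12) for one finish sector.

    image is indexed as image[y][x]; (xc, yc) is the finish center found by the
    localization step; r1 and r2 are the inner and outer radii of the region of interest.
    """
    def ray_sum(i):
        # Sum of gray values along the ray at angle beta + i*step, for radii r1..r2.
        total = 0
        for r in range(r1, r2 + 1):
            x = int(round(xc + r * math.cos(beta + i * step)))
            y = int(round(yc + r * math.sin(beta + i * step)))
            total += image[y][x]
        return total

    rays = [ray_sum(i) for i in range(1, 11)]                 # rays 1..10
    inputs = [rays[i] - rays[i - 1] for i in range(1, 10)]    # formula (11): nodes 1-9
    inputs.append(sum(rays[:9]))                              # formula (12): node 10 (brightness)
    return inputs

def finish_is_defective(sector_inputs, lnn_predict, hnn_predict, th1=0.3, th2=0.5):
    """Two-stage judgment: binarize each low-level output (formula (13)), then query the high-level net."""
    hnn_input = [0.8 if lnn_predict(s) > th1 else 0.2 for s in sector_inputs]  # formula (13)
    return hnn_predict(hnn_input) > th2                                        # > Th2 confirms a defect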

Fig. 12. Low-level inspection.

Fig. 13. Samples firstly used to train the neural network.



The training processes for the two neural networks are separate. The low-level neural network is trained first, as follows. An image database is used to store all sample images. If a huge number of samples is used to train the neural network all at once, it contributes nothing to the convergence of the network. Hence, during the training process we first choose 40 samples to train the network. Those images (shown in Fig. 13) are very typical and small in number (only 40), so the neural network converges quite easily.

In the next step, the neural network is used in practical application. As shown in Fig. 14, when an error occurs, the inspection workers can decide whether it is necessary to add the wrongly inspected case to the sample image database. When the number of new samples added to the database reaches 5, we retrain the neural network using all the samples in the database, and the network is then used in practical application again. In this way, wrong inspection cases gradually appear less frequently until the requirement is finally satisfied. In our application, 185 samples had been used by the time the neural network achieved satisfactory inspection results. When the low-level neural network works well, we begin the training of the high-level neural network, which is quite easy due to the small number of possible input patterns.

Fig. 14. Training neural network during online inspection.
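A minimal sketch of this incremental retraining loop is given below; the train/predict interface of the network, the worker-review callback and the list used as the sample database are hypothetical and not taken from the paper.

RETRAIN_BATCH = 5   # retrain after this many newly collected error cases

def online_training_loop(network, database, image_stream, worker_review):
    """Run the network in production; when a worker flags a wrong result, grow the sample database."""
    new_samples = 0
    for image in image_stream:
        verdict = network.predict(image)                 # network verdict for this bottle image
        correction = worker_review(image, verdict)       # None, or the corrected label chosen by the worker
        if correction is not None:
            database.append((image, correction))         # store the wrongly inspected case
            new_samples += 1
            if new_samples >= RETRAIN_BATCH:
                network.train(database)                  # retrain on all samples collected so far
                new_samples = 0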

4. Experimental results

Fig. 15 shows our prototype, which is equipped with an annular conveyor that enables us to reproduce the continual inspection of a practical production line.

In the prototype, bottle samples are inspected 50 times each at a speed of about 30,000 bottles per hour. The industrial PC has a Pentium 4 2.4 GHz CPU. The execution time of the bottle wall and bottle bottom inspection is less than 150 ms, and the bottle finish inspection costs only 56 ms. Figs. 16 and 17 show images of some typical bottles used in our inspection. After completing enough experiments to adjust the thresholds and to train the neural-network-based finish inspection system, we finally achieved quite satisfying results, as shown in Tables 1–4.

Fig. 15. The prototype with an annular conveyor.

Fig. 16. Some defects in bottle wall and bottom.



Fig. 17. Some typical finish images.

All defects larger than about 36 pixels in the bottle wall or bottle bottom can be correctly detected, and all defective finishes with cracks that may cause leakage (Fig. 17a–c) are inspected correctly. Other very small chinks (Fig. 17d, e, i and j) are also detected with a high correct rate. In addition, the misdetection rate for good bottles is low. However, the system is still not fully satisfactory at distinguishing between a defect and the texture of the bottle itself (Fig. 16l). This problem can be partly solved by using a special machine to smooth the outside of the bottle wall before inspection.

Table 1
Inspection results of defective bottle walls and bottoms

Samples                        Fig. 16a   Fig. 16b   Fig. 16c   Fig. 16d   Fig. 16e   Fig. 16f
Correct inspection rate (%)    100        100        100        98         96         90

Table 2
Inspection results of good bottle walls and bottoms

Samples                        Fig. 16g   Fig. 16h   Fig. 16i   Fig. 16j   Fig. 16k   Fig. 16l
Correct inspection rate (%)    100        100        100        100        100        88

Table 3
Inspection results of defective finishes

Defect samples                 Fig. 17a   Fig. 17b   Fig. 17c   Fig. 17d   Fig. 17e
Correct inspection rate (%)    100        100        100        94         90

Table 4
Inspection results of good finishes

Samples                        Fig. 17f   Fig. 17g   Fig. 17h   Fig. 17i   Fig. 17j
Correct inspection rate (%)    100        100        96         94         92

5. Conclusions

A successful prototype was developed and the feasibility of the system architecture was proved. Extensive online inspections of dozens of carefully selected bottle samples have shown that the inspection algorithms presented in this paper achieve a high correct inspection rate, both for defective bottles and for good ones. In addition, artificial neural networks are adopted in bottle finish inspection, which has proved very convenient for users when adjusting the system for their specific applications.

Acknowledgments

The authors appreciate the close cooperation of Mr. Sang Cao, Mr. Jian Zhong and Mr. Zhenghua Duan in the development of the prototype. The authors also thank Mr. Hongjie Yuan and Mr. Xiaochun Li for their technical support and assistance in collecting references.

References

Canivet, M., Zhang, R.D., Jourlin, M., 1994. Finish inspection by vision for glass production. SPIE Proceedings 2183, 164–169.

Huimin, M., Guangda, S., Junyan, W., Zheng, N., 2002. A glass bottle defect detection system without touching. Proceedings of the First International Conference on Machine Learning and Cybernetics, vol. 2, Beijing, 4–5 November, pp. 628–632.
