Fig. 3. Flowchart representation of the overall approach.

B. Speed Error Fundamentals

Average vehicle speed cameras are located in at least two locations a minimum of hundreds of meters apart. The speed error provided by these approaches decreases with the length of the stretch. Accordingly, most commercial systems are designed to be used over a set distance of several kilometers. In a simple scenario, given two locations Z1 and Z2 (the stretch being S = Z2 − Z1) of a vehicle detected at times t1 and t2 (∆t = t2 − t1) respectively, the average speed will be v = (Z2 − Z1)/(t2 − t1), or v = S/∆t. Now, if we consider localization errors Z1err and Z2err, the worst-case scenario will provide an estimated speed of v′ = (S + Z2err + Z1err)/∆t, with the following speed error:

verr = v′ − v = (Z2err + Z1err)/∆t = v (Z2err + Z1err)/S   (1)

the relative speed error being verr/v = (Z2err + Z1err)/S. License Plate Recognition (LPR) systems are usually able to localize license plates at ranges that may vary between 5-15 m depending on the application. Accordingly, in practice, there is no need to estimate the relative position of the vehicle w.r.t. the camera if the length of the stretch is large enough. For instance, if S = 1 km we can tolerate localization errors of up to 15 m at each camera position, and the relative speed error will still be lower than 3%. The challenge of our approach is that the short length of the stretch (a few meters) demands localization errors on the order of centimeters. As an example, for S = 6 m, localization errors should be lower than 9 cm at each position to obtain a relative speed error lower than 3%.

In order to analyze the range error in more detail, we consider a simple scenario where the camera is placed directly at the road plane reference (see Figs. 4(a) and 4(b)), the images are undistorted, and the range is computed using the width of the license plate. The width of the license plate ∆X = X2 − X1 in world coordinates is a known parameter. After camera calibration we know the focal length in pixels fx = f/dx (f the focal length in mm, and dx the pixel size). After applying a license plate localization method, the width of the license plate can be obtained at the image plane in pixels as ∆u = u2 − u1. Considering the license plate as a plane that moves along the Z axis, its distance to the camera can be computed as Z = fx (∆X/∆u). Even if the pixel localization error is zero, the image discretization involves a discretization error represented by the size of each pixel in world coordinates, Dx = (Z/f) dx = Z/fx. With n the pixel localization error of the projected license plate, the worst-case scenario would provide a distance value of Z′ = fx (∆X + n Dx)/∆u. Thus, the range error will be:

Zerr = Z′ − Z = fx n Dx/∆u = n Z/∆u = n Z²/(fx ∆X)   (2)

Fig. 4. Simple scenario to analyze range errors. (a) Top view. (b) Lateral view.

Merging Eq. (1) with Eq. (2) we have the following speed error:

verr = v [n1 Z1²/(fx1 ∆X) + n2 Z2²/(fx2 ∆X)] / (Z2 − Z1)   (3)

where n1 and n2 are the pixel localization errors of both cameras, and fx1 and fx2 are the focal lengths in pixels of both cameras. Note that if the road stretch begins at Z1 = 0, the speed error increases linearly with the distance to the second point Z2. However, if Z1 ≠ 0, Eq. (3) involves a quadratic curve that will mainly depend on v, Z1 and Z2.

In Fig. 5(a) we show the speed error using Eq. (3) for a fixed Z1 = 3 m, Z2 values from 0-55 m, and speeds from 20-100 km/h. The simulation shows that the speed error increases linearly with the vehicle speed. The obtained curves show a minimum value from which the speed error increases linearly with the distance Z2. However, the most important conclusion is that, for a fixed Z1, there is an optimal value of Z2 that provides the minimum speed error independently of the speed of the vehicle. As can be observed in Fig. 5(b), for a fixed vehicle speed, different values of Z1 involve different optimal Z2 values. Therefore, for a known Z1 we can find the optimal Ẑ2 which minimizes the speed error by differentiating Eq. (3) w.r.t. Z2 and setting it to zero:

∂verr/∂Z2 = . . . = (n2 fx1) Z2² − (2 n2 fx1 Z1) Z2 − n1 fx2 Z1² = 0   (4)

We can now solve Eq. (4) for Z2:
Ẑ2 = Z1 [n2 fx1 + √(n2² fx1² + n1 n2 fx1 fx2)] / (n2 fx1)   (5)

Fig. 5. Speed error (km/h) along Z2 (m). (a) Fixed Z1 = 3 m and vehicle speeds from 20 to 100 km/h. (b) Fixed vehicle speed of 60 km/h and Z1 values from 3 to 6 m.
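As a quick numerical check of Eqs. (3)-(5), the following sketch evaluates the speed error on a fine grid and verifies that the closed-form Ẑ2 coincides with the brute-force minimizer. The pixel localization errors, the focal lengths in pixels, and the plate width (0.52 m, the standard EU plate) are illustrative assumptions, not the paper's calibration values:

```python
import math

def speed_error(v, z1, z2, n1, n2, fx1, fx2, dX):
    """Speed error of Eq. (3): v * (Z1err + Z2err) / (Z2 - Z1)."""
    z1err = n1 * z1 ** 2 / (fx1 * dX)   # range error at the first camera, Eq. (2)
    z2err = n2 * z2 ** 2 / (fx2 * dX)   # range error at the second camera, Eq. (2)
    return v * (z1err + z2err) / (z2 - z1)

def optimal_z2(z1, n1, n2, fx1, fx2):
    """Closed-form minimizer of the speed error, Eq. (5)."""
    a = n2 * fx1
    return z1 * (a + math.sqrt(a ** 2 + n1 * n2 * fx1 * fx2)) / a

# Illustrative (assumed) values: 1 px localization error at both cameras,
# focal lengths in pixels for the 25 mm / 75 mm lenses, EU plate width 0.52 m.
z1, n1, n2, fx1, fx2, dX = 3.0, 1.0, 1.0, 5000.0, 15000.0, 0.52

z2_hat = optimal_z2(z1, n1, n2, fx1, fx2)

# Brute-force minimum of Eq. (3) over a fine grid; as stated above, the
# minimizer does not depend on the vehicle speed v (here 60 km/h).
grid = [z1 + 0.001 * k for k in range(1, 60000)]
z2_best = min(grid, key=lambda z2: speed_error(60.0, z1, z2, n1, n2, fx1, fx2, dX))
```

Note that with n1 = n2 and fx2 = 3 fx1 (a 75 mm vs. 25 mm lens at the same pixel size), Eq. (5) reduces to Ẑ2 = Z1 (1 + √(1 + fx2/fx1)) = 3 Z1, so Z1 = 3 m gives Ẑ2 = 9 m.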
Although this error analysis has been derived from a simple and unrealistic scenario (Fig. 4), the conclusions can be extended to the real one (Fig. 1) by taking into account extrinsic calibration (rotation and translation) errors between the road plane and the camera reference. Eq. (5) will be used to define the rotation of the cameras at the Installation stage (see Fig. 3), so that the measurements taken at the first position can be optimally paired with measurements at the second one. Then, Eq. (5) will also be applied to allow only optimal matches, i.e., measurements Z1 and Z2 that satisfy Eq. (5), to be used when computing the vehicle speed. Thus, we can ensure that the speed error will be minimized.
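The optimal-matching rule above can be sketched as a simple filter. This is a hypothetical helper, not the paper's implementation; in particular the tolerance `tol` is an assumed parameter that the paper does not specify:

```python
import math

def optimal_z2(z1, n1, n2, fx1, fx2):
    # Closed-form optimal second range for a given first range z1, Eq. (5)
    a = n2 * fx1
    return z1 * (a + math.sqrt(a ** 2 + n1 * n2 * fx1 * fx2)) / a

def optimal_pairs(meas25, meas75, n1, n2, fx1, fx2, tol=0.25):
    """Keep only (Z1, Z2) measurement pairs that satisfy Eq. (5) within tol.

    meas25 / meas75 are lists of ranges (m) from the first / second camera;
    tol (m) is a hypothetical tolerance on the optimality condition.
    """
    return [(z1, z2) for z1 in meas25 for z2 in meas75
            if abs(z2 - optimal_z2(z1, n1, n2, fx1, fx2)) <= tol]

# With equal pixel errors and a 3x ratio of focal lengths in pixels,
# Eq. (5) reduces to Z2 = 3 * Z1, so only the pair (3.0, 9.0) survives here:
pairs = optimal_pairs([3.0, 3.5], [8.0, 9.0], 1.0, 1.0, 5000.0, 15000.0)
```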
C. Vehicle Speed Estimation

1) License Plate Localization: As described in the previous section, we use the license plate to compute the range of the vehicle, since we have knowledge of its standard dimensions in world coordinates. A global overview of our license plate localization approach is depicted in Fig. 6. As can be observed, we make use of a combination of the MSER [24] and SWT [25] detectors to isolate the character regions according to some filtering criteria (size, relative positions, etc.). The final character regions are used to perform a fine-tuned search of the license plate borders using the probabilistic Hough transform [26], computing the intersection points between the most prominent horizontal and vertical lines, and correcting them using the corner detector algorithm described in [27]. A specific OCR (Optical Character Recognition) module based on CNNs (Convolutional Neural Networks) is finally applied to verify that the vehicle captured by both cameras is the same.

Fig. 6. Global overview of our license plate localization system.

2) Relative Range Estimation: The overall setup used to estimate the relative range of the vehicle is depicted in Fig. 7. As can be observed, each camera points to a different region and needs to be extrinsically calibrated. Using a chessboard calibration pattern we obtain the relative rotation and translation (Ri, Ti, with i = 25 for the first camera and i = 75 for the second one) to convert points from the world/road reference to the camera reference. The calibration process is defined ensuring that the rotation between the road references of both cameras is the identity matrix. Then, after license plate localization, we estimate the relative distance of the plate w.r.t. the world reference by using its lower corners (pi^1 = (ui^1, vi^1) and pi^2 = (ui^2, vi^2)), and considering that the height Ywi of both corners is the same. In addition, we have previous knowledge of the distance between both corners in world coordinates, ∆X. Considering Ki the intrinsic calibration matrix, and using homogeneous coordinates, the projection of each 3D point onto the image plane can be written as:

s pi = s (ui, vi, 1)^T = [Ki | 03] [R T; 03^T 1] (Xwi, Ywi, Zwi, 1)^T = Mi Pwi   (6)

Then we define the following set of equations:

s pi^1 = Mi Pwi^1   (7)
s pi^2 = Mi Pwi^2   (8)

Xwi^2 − Xwi^1 = ∆X   (9)

which can be rearranged into the linear form Ax = b, A being a 5 × 5 matrix, b a 5 × 1 vector, and x = (Xwi^1, Xwi^2, Ywi, Zwi^1, Zwi^2). Since A is square and has full rank, the system has a unique solution given by x = A^−1 b. In order to manage one unique point, we compute the average value of both corners at each camera, so the final point Pwi = ((Xwi^2 + Xwi^1)/2, Ywi, (Zwi^2 + Zwi^1)/2) will correspond to the lower middle region of the license plate.

Finally, the relative coordinates of the points Pwi are translated to the camera reference, but maintaining their orientation w.r.t. the road reference, i.e.:

P′c25 = Pw25 + R25^−1 T25   (10)

P′c75 = Pw75 + R75^−1 T75   (11)

The final range measurement, which represents the displacement of the lower middle point of the license plate between both cameras, is computed as:

D = ‖P′c75 − P′c25‖   (12)

Fig. 7. Relative positions between cameras and world plane and license plate points.

3) Speed Measurement: The proposed approach is based on high-speed cameras (160 FPS). Accordingly, for each vehicle and depending on its speed, we can have hundreds of measurements at each camera. However, not all of them are used to compute the vehicle speed. Eq. (12) is only applied between pairs of measurements P′c75 and P′c25 that fulfill the restriction defined in Eq. (5), to guarantee minimum speed errors. In addition, we have an accurate timestamp for each measurement thanks to the NTP server installed on the PC. Let us consider j = 1 . . . N the number of optimal pairs of measurements P′c75^j and P′c25^j that yield the displacements D^j, and t25^j and t75^j the corresponding timestamps. Then, the final vehicle speed estimation is given by:

V = (1/N) Σ_{j=1}^{N} D^j / (t75^j − t25^j)   (13)

IV. EXPERIMENTS

In order to validate the proposed approach, we have devised a test scenario at the campus of the University of Alcalá, using our fully autonomous vehicle, equipped with an RTK-DGPS sensor (see Fig. 8), an NTP-GPSD time service, and CAN bus connectivity, among other sensors and actuators. The 25 mm and the 75 mm cameras cover regions centered at 3 m and 9 m respectively, in accordance with Eq. (5). The speed ground truth is obtained from the DGPS sensor by using Northing-Easting measurements with a positional accuracy of 2.5 cm, which for the stretch of 6 m involves relative speed errors < 0.83%. Since the DGPS provides data at 20 Hz and the cameras work at 160 Hz, DGPS measurements were linearly interpolated to synchronize them with each camera frame, using the accurate PC timestamps given by the NTP client corrected by the GPSD time service.

Fig. 8. Test scenario including the vehicle used during the experiments, and the cameras.

An experienced driver was requested to drive at a predefined fixed speed, making use of the speed controller of the vehicle when possible. Eight different runs were made at eight different speeds of approximately {10, 20, 30, 40, 50, 60, 70, 80} km/h. Speed errors were computed using ground truth data from the DGPS. In Table I we show the mean absolute speed error, the standard deviation and the maximum absolute error for each of the speeds. Note that the first column only shows the approximate speed of each of the eight runs. As can be observed, maximum errors are within ±3 km/h in all cases. The mean absolute speed error over all runs is 1.44 km/h, which clearly validates the proposed methodology. In addition, if we do not restrict the speed computation to optimal pairs of measurements between both cameras according to Eq. (5), the mean absolute error is approximately 1 km/h larger, in some cases > 3 km/h. As an example, in Fig. 9 we show the raw speed measurements for a run at 80 km/h, the DGPS ground truth speed, the speed computed by averaging all measurements, and the speed using only optimal pairs of measurements. As can be observed, the error is clearly reduced.

V. CONCLUSION

We have presented a novel two-camera-based approach to perform highly accurate vehicle speed estimation. High-resolution and high-speed cameras with a narrow field of
view, different focal lengths and orientations are mounted on a fixed pole, pointing to two different stretches of the same road lane. We have proved that there is a specific geometry between the cameras that minimizes the speed error. In our experiments, we have obtained a mean absolute speed error of 1.44 km/h for speeds up to 80 km/h. In all cases, the maximum speed error is always < 3 km/h. These are encouraging results that validate our methodology. Future work will involve extensive validation, including nighttime scenarios and comparisons with other approaches.

TABLE I
VEHICLE SPEED ESTIMATION RESULTS IN KM/H

Approx. Speed   Mean Abs. Err.   Std.    Max. Err.
10              0.90             0.31    1.24
20              1.28             0.27    1.51
30              1.76             0.16    1.92
40              1.63             0.27    2.01
50              1.66             0.50    2.17
60              1.70             0.81    2.62
70              0.45             0.17    0.63
80              2.13             0.41    2.52

Fig. 9. Example at 80 km/h (speed in km/h over sample index): raw speed measurements, DGPS ground truth, average speed value with all measurements, and average speed value with optimal pairs of measurements. The periodicity of the curve is not relevant since it is related to the way the pairs of measurements between both cameras are stored.

VI. ACKNOWLEDGMENTS

This work was supported by the Research Grants VISPEED SPIP2015-01737 (General Traffic Division of Spain), IMPROVE DPI2014-59276-R (Spanish Ministry of Economy), and SEGVAUTO-TRIES-CM S2013/MIT-2713 (Community of Madrid).

REFERENCES

[1] WHO, "Global status report on road safety," 2015. Site: http://www.who.int/violence_injury_prevention/road_safety_status/2015/
[2] C. Wilson, C. Willis, J. K. Hendrikz, R. L. Brocque, and N. Bellamy, "Speed cameras for the prevention of road traffic injuries and deaths," Cochrane Database of Systematic Reviews, vol. 11, no. CD004607, 2010.
[3] CEM, "Centro español de metrología, certificado de examen de modelo," 2016. Site: http://www.cem.es/content/ex%C3%A1menes-de-modelos?term_node_tid_depth=139
[4] G. P. Stein, O. Mano, and A. Shashua, "Vision-based ACC with a single camera: Bounds on range and range rate accuracy," in IV2003, IEEE Intelligent Vehicle Symposium, 2003.
[5] T. N. Schoepflin and D. J. Dailey, "Dynamic camera calibration of roadside traffic management cameras for vehicle speed estimation," IEEE Transactions on Intelligent Transportation Systems, vol. 4, no. 2, pp. 90–98, 2003.
[6] F. W. Cathey and D. J. Dailey, "A novel technique to dynamically measure vehicle speed using uncalibrated roadway cameras," in IV2005, IEEE Intelligent Vehicle Symposium, 2005.
[7] L. Grammatikopoulos, G. Karras, and E. Petsa, "Automatic estimation of vehicle speed from uncalibrated video sequences," in International Symposium on Modern Technologies, Education and Professional Practice in Geodesy and Related Fields, 2005.
[8] D. Bauer, A. N. Belbachir, N. Donath, G. Gritsch, B. Kohn, M. Litzenberger, C. Posch, P. Schon, and S. Schraml, "Embedded vehicle speed estimation system using an asynchronous temporal contrast vision sensor," EURASIP Journal on Embedded Systems, vol. 82174, 2007.
[9] X. C. He and N. H. C. Yung, "A novel algorithm for estimating vehicle speed from two consecutive images," in IEEE Workshop on Applications of Computer Vision (WACV07), 2007.
[10] B. Alefs and D. Schreiber, "Accurate speed measurement from vehicle trajectories using AdaBoost detection and robust template tracking," in IEEE Intelligent Transportation Systems Conference (ITSC), 2007.
[11] H. Zhiwei, L. Yuanyuan, and Y. Xueyi, "Models of vehicle speeds measurement with a single camera," in International Conference on Computational Intelligence and Security Workshops, 2007.
[12] C. Maduro, K. Batista, P. Peixoto, and J. Batista, "Estimation of vehicle velocity and traffic intensity using rectified images," in IEEE International Conference on Image Processing, 2008.
[13] T. Celik and H. Kusetogullari, "Solar-powered automated road surveillance system for speed violation detection," IEEE Transactions on Industrial Electronics, vol. 57, no. 9, pp. 3216–3227, 2010.
[14] H. A. Rahim, U. U. Sheikh, R. B. Ahmad, and A. S. M. Zain, "Vehicle velocity estimation for traffic surveillance system," International Journal of Computer, Electrical, Automation, Control and Information Engineering, vol. 4, no. 9, pp. 1465–1468, 2010.
[15] T. T. Nguyen, X. D. Pham, J. H. Song, S. Jin, D. Kim, and J. W. Jeon, "Compensating background for noise due to camera vibration in uncalibrated-camera-based vehicle speed measurement system," IEEE Transactions on Vehicular Technology, vol. 60, no. 1, pp. 30–43, 2011.
[16] O. Ibrahim, H. ElGendy, and A. M. ElShafee, "Speed detection camera system using image processing techniques on video streams," International Journal of Computer and Electrical Engineering, vol. 3, no. 6, pp. 711–778, 2011.
[17] Z. Shen, S. Zhou, C. Miao, and Y. Zhang, "Vehicle speed detection based on video at urban intersection," Research Journal of Applied Sciences, Engineering and Technology, vol. 5, no. 17, pp. 4336–4342, 2013.
[18] J. Lan, J. Li, G. Hu, B. Ran, and L. Wang, "Vehicle speed measurement based on gray constraint optical flow algorithm," Optik, vol. 125, pp. 289–295, 2014.
[19] Y. G. A. Rao, N. S. Kumar, S. H. Amaresh, and H. V. Chirag, "Real-time speed estimation of vehicles from uncalibrated view-independent traffic cameras," in IEEE Region 10 Conference TENCON, 2015.
[20] D. C. Luvizon, B. T. Nassu, and R. Minneto, "Vehicle speed estimation by license plate detection and tracking," in IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2014.
[21] C. Ginzburg, A. Raphael, and D. Weinshall, "A cheap system for vehicle speed detection," preprint arXiv:1501.06751, 2015.
[22] S. Dogan, M. S. Temiz, and S. Kulur, "Real time speed estimation of moving vehicles from side view images from an uncalibrated video camera," Sensors, vol. 10, pp. 4805–4824, 2010.
[23] H.-Y. Lin, K.-J. Li, and C.-H. Chang, "Vehicle speed detection from a single motion blurred image," Image and Vision Computing, vol. 26, pp. 1327–1337, 2008.
[24] J. Matas, O. Chum, M. Urban, and T. Pajdla, "Robust wide-baseline stereo from maximally stable extremal regions," Image and Vision Computing, vol. 22, no. 10, pp. 761–767, 2004.
[25] B. Epshtein, E. Ofek, and Y. Wexler, "Detecting text in natural scenes with stroke width transform," in Computer Vision and Pattern Recognition (CVPR), 2010.
[26] J. Matas, C. Galambos, and J. V. Kittler, "Robust detection of lines using the progressive probabilistic Hough transform," Computer Vision and Image Understanding, vol. 78, no. 1, pp. 119–137, 2000.
[27] J. Shi and C. Tomasi, "Good features to track," in Computer Vision and Pattern Recognition (CVPR), 1994.