
27TH INTERNATIONAL SYMPOSIUM ON BALLISTICS, FREIBURG, GERMANY, APRIL 22-26, 2013

AN AUTOMATED METHOD FOR COMPUTER VISION ANALYSIS OF CANNON-LAUNCHED ARTILLERY VIDEO


Ryan J. Decker 1, Mathias N. Kölsch 2, and Oleg A. Yakimenko 3

1 Armaments Research, Design, and Engineering Center, Picatinny Arsenal, 94 Ramsey Avenue, Picatinny, NJ 07806
2 Naval Postgraduate School, MOVES Institute, WA-254, 700 Dyer Road, Monterey, CA 93943-5001
3 Naval Postgraduate School, SE/Yk, 777 Dyer Road, Monterey, CA 93943-5194

This paper describes an automated method to analyze launch video of artillery projectiles. Image processing techniques are used to segment the projectile shape in each video frame. Within minutes of a shot, the position and orientation of the projectile in three-dimensional space are determined from knowledge of the launch geometry. An overview of several standard methods used by the Army to characterize yawing motion is included, as well as a discussion of the limitations of those methods. Results from real artillery testing with the automated method are validated by comparison against conventional pitch and yaw camera analysis of the value of first-maximum yaw.

INTRODUCTION


Accurate artillery fire requires precise characterization of the ballistic performance of artillery projectiles. To obtain these aerodynamic measurements, weapons developers conduct extensive computer modeling analysis, wind tunnel testing, and instrumented firings of tactical projectiles. Aeroballistic metrics from these tests are necessary to predict a projectile's trajectory in free-flight. Throughout the lifecycle of artillery projectiles, live-fire testing is conducted for design changes or lot-acceptance verification. In these tests, it is becoming increasingly common to record high-speed video of the projectiles as they exit the muzzle of the cannon. Often, these cameras are stationary digital cameras capable of recording over 500,000 frames per second (fps) [1]. Other times, when visual confirmation of the initial flight performance is desired, new state-of-the-art camera systems capable of automated movement to follow a projectile are used [2,3]. One of these systems, the Trajectory Tracker (TT), is a sophisticated optical system that works by rotating a mirror at a calculated rate, so that the projectile remains in the field of view for more than 100m following muzzle exit. A successful track of a 155mm-type artillery projectile can deliver thousands of high-resolution digital images of the projectile during the first few moments of free-flight. Depending on the zoom and position of the camera, these images can resolve hundreds of pixels along the length of the projectile. This paper describes an automated method to quantify the pitching and yawing motion of a projectile during the first few moments of ballistic flight using two TT systems. Image processing tools are used to segment the shape of the projectile in
every frame of a TT launch video, which allows the location and observed pitch angle to be calculated with sub-pixel accuracy. Subsequent automated analysis uses the history of the projectile location and the pitching behavior to calculate estimates for the epicyclic motion, as well as other ballistic parameters such as stability and aeroballistic coefficients. Using two cameras located at different orthographic views of the line-of-fire (LOF) allows the pitching and yawing motion history of the projectile to be calculated in three dimensions (3D). In addition, the automated method is able to make corrections for camera misalignment, and thus requires only a priori knowledge of the camera positions, the cannon trunnion position, and the cannon pointing direction. Of particular interest to aeroballisticians is the value of the first-maximum yaw (FMY) that a stable spinning projectile exhibits during free-flight. This value corresponds to the largest total angle-of-attack that the projectile exhibits during its first epicycle. Since there is a direct relationship between ballistic range and the angle-of-attack that a projectile exhibits, it is important to quantify this value for the initial conditions of six degree-of-freedom (6 DOF) aerodynamic simulations. The automated analysis method described in this paper is compared to the results from manual pitch and yaw camera analysis for measuring the FMY.

CONVENTIONAL METHODS FOR MEASURING PITCH AND YAW OF ARTILLERY PROJECTILES

For decades, facilities known as spark ranges have been used to quantify the epicyclic motion of projectiles in free-flight. When the bullet reaches a station, a flash occurs that casts two orthogonal images of the bullet onto panels known as shadowgraphs. A series of timed shadowgraphs is taken at specific distances downrange and post-processed by hand to measure projectile location, pitch, and yaw at each station. This is an accurate and reliable means of motion characterization, but it requires extensive setup for each shot while delivering a limited number of data points. In addition, spark ranges limit the elevation of fire and, for large calibers, can be restricted to horizontal firings only. Projectiles that discard components such as fin-protection hoods are often excluded from spark range characterization to avoid equipment damage [4].

Another traditional method used to measure pitch and yaw is the use of yaw cards. At pre-determined locations along the line of fire, large pieces of cardboard or other soft material are erected. After the projectile passes through a card, the shape of the hole it made is examined and compared to a template to estimate the pitch and yaw angles at that location [5]. Limitations of yaw cards are that they require extensive setup before and after each shot, they deliver few data points for curve fitting, and they limit the testing conditions to firing at low quadrant elevations (QEs).

The most reliable means currently used by the Army to quantify epicyclic motion is on-board electronics with inertial measurement units. Some systems, including the DFuze system, are equipped with light-sensing yawsondes that can be used to determine projectile attitude relative to the position of the sun [6]. The two most significant limitations of these systems are that they may require unavailable space-claim on non-standard artillery systems, and they are extremely costly. Many developmental artillery programs are unable to afford on-board instrumentation [7].


Manual scoring and data reduction of stationary video systems is also conducted, in which an operator plays back the video and identifies relative angles between the background and regions of the bullet to estimate pose. This type of analysis can be labor-intensive, is limited by the image resolution, and is subject to human error. In addition, a stationary field of view makes it difficult to measure the observed pitch angle in more than one location along the line of fire. Typically, pitch and yaw high-speed cameras are pointed at an estimated location of the FMY. This type of analysis requires precise knowledge of both the location and orientation of all cameras used. A robust software package called Track-Eye, developed by Photo-Sonics, is capable of man-in-the-loop computer vision analysis of aerial platforms, including artillery projectiles, by tracking specific features on objects in an artillery launch video [8]. Results from different cameras can be combined to quantify 6 DOF motion. Some limitations of this type of analysis are that it requires the operator to be trained in the specific program, and it often requires user interaction to re-select tracked points several times during the video. Track-Eye analysis may not be practical when analyzing hundreds of shots' worth of data.

AUTOMATED ANALYSIS METHOD

The developed automated video analysis procedure, performed on each of the two TT launch videos, has two main parts. The first is a segmentation algorithm that extracts important information from the launch video such as the projectile location and orientation in each image frame. The second part involves sequential post-processing of the translational movement, analysis and correction of the observed pitching angle relative to the horizontal axis of the frame (θ_obs), and quantification of the corrected epicyclic motion.
Projectile Shape Segmentation

In each video frame, optical character recognition is used to read important information such as the time, frame number, and video frame rate [9]. Following a conversion from color to grayscale, smoothing operations are applied to suppress noise. Edge detection techniques with variable sensitivity are combined with morphological operations to identify candidate projectile shapes in the image. Candidate shapes in contact with the border are removed [10,11]. This process is illustrated in Fig.1. If the largest candidate shape has an area greater than 2,000 pixels, its silhouette is compared to an Active Shape Model (ASM) [12]. An ASM is a numerical model used to represent the natural shape variability of a training set of similar objects. For artillery projectiles, the training set consists of images of projectiles of varying size, skew angle, and orientation. If the candidate shape is within a threshold distance from the ASM, the pixel locations of that shape are classified as belonging to the projectile. Figure 2 depicts variations in the first dimension of fluctuation (eigenvector) for the projectile ASM.
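The segmentation chain described above can be sketched in MATLAB (the language the paper reports using) roughly as follows. The 2,000-pixel area threshold comes from the text; the Canny detector is one possibility consistent with the cited edge-detection reference, while the smoothing kernel, structuring element, and function name are illustrative assumptions, and the ASM comparison step is omitted.

```matlab
function [mask, stats] = segmentProjectile(rgbFrame)
% Illustrative sketch of the per-frame segmentation chain (not the authors' production code).
gray   = rgb2gray(rgbFrame);                                             % color -> grayscale
smooth = imfilter(gray, fspecial('gaussian', [5 5], 1.0), 'replicate');  % noise suppression
bw     = edge(smooth, 'canny');                                          % binary gradient mask
bw     = imdilate(bw, strel('disk', 2));                                 % dilated gradient mask
bw     = imfill(bw, 'holes');                                            % binary image with filled holes
bw     = imclearborder(bw);                                              % remove shapes touching the border
bw     = bwareaopen(bw, 2000);                                           % keep candidates larger than 2,000 px

cc = bwconncomp(bw);                                                     % largest remaining candidate
if cc.NumObjects == 0
    mask = false(size(bw)); stats = []; return;
end
[~, idx] = max(cellfun(@numel, cc.PixelIdxList));
mask = false(size(bw));
mask(cc.PixelIdxList{idx}) = true;
stats = regionprops(mask, 'Centroid', 'Orientation', 'PixelList');
% The silhouette in 'mask' would then be compared against the Active Shape Model.
end
```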


Figure 1. Projectile shape segmentation (axes show pixel number). Panels: original image, binary gradient mask, dilated gradient mask, binary image with filled holes, cleared border image, segmented image.

Figure 3. Sweep plane profile.

A critical step in fitting the segmented shape to the ASM is the measurement of θ_obs. This is done by defining the nose region as the five percent of pixels that are farthest from the central moment of the entire projectile shape. Then, θ_obs is calculated from the angle between the central moment of the nose region and the central moment of the projectile

\theta_{obs} = \tan^{-1}\left(\frac{Nose_{moment,y} - Center_{moment,y}}{Nose_{moment,x} - Center_{moment,x}}\right)

(1)

This results in a robust measurement of θ_obs that depends on the average of hundreds of pixels rather than on the precision of a single pixel, as in conventional methods such as manual scoring.
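Equation (1) can be implemented directly from the segmented pixel list. The sketch below assumes an N-by-2 [x y] pixel list such as the 'PixelList' output of regionprops, with the five-percent nose fraction taken from the text; variable names are illustrative, and the image-frame sign convention (y increasing downward) is left to the caller.

```matlab
function thetaObs = observedPitch(pixelList)
% pixelList: N-by-2 [x y] pixel coordinates of the segmented projectile (illustrative names).
center = mean(pixelList, 1);                                   % central moment of the whole shape
dist   = hypot(pixelList(:,1) - center(1), pixelList(:,2) - center(2));
[~, order] = sort(dist, 'descend');
nNose  = max(1, round(0.05 * size(pixelList, 1)));             % farthest 5% of pixels = nose region
nose   = mean(pixelList(order(1:nNose), :), 1);                % central moment of the nose region
% Eq. (1): angle between the nose-region centroid and the shape centroid [rad].
thetaObs = atan2(nose(2) - center(2), nose(1) - center(1));
end
```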
Velocity Analysis

An a priori estimate of the projectile muzzle velocity is used as an input by the TT software to determine a predicted sweep path for the mirror. The simplicity of the geometry in Fig.3 allows all calculations to be conducted in a coordinate frame centered at the orthogonal point and lying in the camera sweep plane. For each shot, the TT software generates a scan output file that contains the time-history of the mirror rotation recorded during the tracking operation. This data is essential to the velocity and pitching motion analysis. A critical value extracted from the scan output file is the standoff distance (Dort), which represents the distance from the camera to the LOF in the sweep plane.

Figure 2. Projectile Active Shape Model.


When the projectile has reached the orthogonal point, it is exactly Dort meters from the camera. The number of pixels between the nose and the base of the projectile when it reaches this point (Nproj) can be used to estimate the number of radians per image pixel (K) for the entire launch video

K = 2\,N_{proj}^{-1}\,\tan^{-1}\!\left(0.5\,L\,D_{ort}^{-1}\right)

(2)

where L is the actual projectile length. The number of radians per pixel is required for correcting the angular position estimate of the projectile in the sweep plane, which is calculated as
\phi_x = \phi_{centerpixel,x} + \phi_{correction,x}\,, \qquad \phi_{correction,x} = K\,(Center_{moment,x} - 0.5\,N_{cols}) \qquad (3)

\phi_y = K\,(Center_{moment,y} - 0.5\,N_{rows}) \qquad (4)

where the rotation of the center pixel (φ_centerpixel,x) is taken from the scan output file, and Ncols and Nrows refer to the image resolution (usually 1024×512). Using these corrections for the projectile viewing skew angle history, the velocities in the horizontal (X) and vertical (Y) directions are determined from
V_x = D_{ort}\,\sec^2(\phi_x)\,\frac{\partial \phi_x}{\partial t}\,, \qquad V_y = D_{ort}\,\sec^2(\phi_y)\,\frac{\partial \phi_y}{\partial t}

(5)

and the velocity angle history (θ_V) in the image frame becomes

\theta_V = \tan^{-1}\!\left(V_y\,V_x^{-1}\right)

(6)

A linear fit to the θ_V history is computed for each shot and is later used in correcting the true pitch angle of the projectile.
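Equations (2)-(6) reduce to a few array operations once the scan-output quantities are loaded. In the sketch below the input names (Dort, centerPixelX, and so on) are assumptions standing in for whatever the TT scan file actually provides, and Eq. (5) follows the reconstruction given above.

```matlab
% Assumed inputs (names are illustrative, not actual TT scan-file fields):
%   L     - projectile length [m]        Dort  - standoff distance to the LOF [m]
%   Nproj - pixels nose-to-base at the orthogonal point
%   centerPixelX - mirror rotation of the image center per frame [rad]
%   centColX, centRowY - projectile centroid (column, row) per frame [px]
%   t     - frame time stamps [s]        Ncols, Nrows - image resolution (e.g. 1024 x 512)

K    = 2/Nproj * atan(0.5*L/Dort);                 % Eq. (2): radians per image pixel
phiX = centerPixelX + K*(centColX - 0.5*Ncols);    % Eq. (3): sweep-plane angle, downrange direction
phiY = K*(centRowY - 0.5*Nrows);                   % Eq. (4): sweep-plane angle, vertical direction

Vx = Dort .* sec(phiX).^2 .* gradient(phiX, t);    % Eq. (5): horizontal velocity [m/s]
Vy = Dort .* sec(phiY).^2 .* gradient(phiY, t);    % Eq. (5): vertical velocity [m/s]

thetaV = atan2(Vy, Vx);                            % Eq. (6): velocity angle history [rad]
pV     = polyfit(t, thetaV, 1);                    % linear fit, used for the pitch correction below
```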

Pitch and Yaw Analysis

The value of θ_obs calculated in the segmentation process is a measure of the projectile's apparent pitch angle in the image frame. When the LOF is not parallel to the image plane (when φ_x ≠ 0), the calculated θ_obs value is an over-estimate of the actual pitch angle because the projectile appears shorter, as illustrated in Fig.4. If a projectile were a two-dimensional (2D) body, then the pitch angle correction for the skew angle would be
\theta' = \theta_{obs}\,\cos(\phi_x)

(7)

The accuracy of the simple trigonometric correction in Eq. (7) was evaluated using Computer Aided Design (CAD) software. A 3D model of a 155mm-type projectile was oriented to a pitch angle of 10°. Screenshot images were taken at different viewing skew angles (φ_x) ranging from -50° to 50°. The process was repeated for a pitch angle of 5°. Results showed that the segmentation and pitch measurement algorithm with the correction in Eq. (7) was accurate to within 0.0095° per degree.
To improve this result further, an empirical relationship was established that is accurate for both pitch angles to within 0.0001° per degree for all skew angles in the test set
\theta' = \theta_{obs}\left(\cos(\phi_x) + 0.0114\,|\phi_x|\right)

(8)

These error estimates may not directly correlate to the accuracy of the segmentation algorithm for real projectile video. The resolution of the CAD projectile images provided just over 200 pixels along the axis of the projectile. This number is actually smaller than in most frames of a typical TT launch video, suggesting that the performance could be better for real video. The real video, however, may be subject to increased noise, occlusion, glint, and decreased contrast with the background, which may hinder the ability of the algorithm to segment the projectile as accurately as it does for the CAD images, which contain no artificial noise. A final correction must also be made because the true pitch angle is measured relative to the velocity vector of the projectile, not the horizontal axis of the image frame. A best estimate for the pitch angle relative to the velocity vector in each video frame is computed as
\theta_{best} = \theta' - \theta_V\,\cos(\phi_x)

(9)
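The corrections of Eqs. (7)-(9) then chain onto the velocity fit. In this sketch the skew angle is assumed to enter the empirical term of Eq. (8) in radians, and all angles are kept in radians for consistency with the preceding sketch.

```matlab
% thetaObs and phiX are per-frame vectors [rad] from the previous steps; pV is the linear
% fit to thetaV. Radians are assumed in the empirical term of Eq. (8).
thetaPrime = thetaObs .* (cos(phiX) + 0.0114*abs(phiX));   % Eq. (8): skew-corrected pitch
thetaVfit  = polyval(pV, t);                               % fitted velocity angle per frame
thetaBest  = thetaPrime - thetaVfit .* cos(phiX);          % Eq. (9): pitch relative to the velocity vector
```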

Merging Data From Two Cameras to Estimate True Pitch and Yaw

The pitch value estimated at this point (θ_best) is only representative of the pitching motion relative to the velocity vector in the plane that contains the LOF and is perpendicular to the camera location. In order to estimate the conventional pitch (α) and yaw (β) of the projectile in 3D (referred to as the angle of attack and side-slip angle in flight mechanics), it is necessary to use two TT systems located at different positions. To reduce geometric dilution of precision while maximizing the range over which the bullet remains in view, it has been found that locating the cameras about 40m downrange and 35m to either side of the azimuth of fire works well for analyzing 155mm projectiles at a quadrant elevation of 800 mils (45°) (Fig.5). The algorithm that merges the pitch and yaw analyses from the opposing cameras begins by taking the pitching motion history (θ_best) and the position estimates from the video analysis of each camera. The position history estimate from each camera is averaged for all frames and assumed to be located along the LOF.

Figure 4. Observed pitch angle corrections.


In each time increment where the projectile was successfully segmented by both cameras, the normal of the camera view plane along the LOF and the pointing vector from the projectile location to each camera are calculated as

\vec{N} = \vec{S}_{LOF} \times \vec{Cam}_{XYZ}\,, \qquad \vec{r} = s_{proj}\,\vec{S}_{LOF} - \vec{Cam}_{XYZ}

(10)

Here S_LOF is the pointing direction of the line of fire, s_proj is the downrange distance of the projectile along the line of fire from the position estimate, and Cam_XYZ is the position of the camera relative to the cannon trunnion in an XYZ coordinate system (where the X direction points horizontally down the azimuth of fire, the Y axis points in the crossrange direction, and the Z direction is up). The projectile pointing vector in the camera view plane in XYZ coordinates is then calculated as

\vec{n}_{rot} = {}^{XYZ}_{camera}R\;\vec{N}

(11)

where the rotation matrix {}^{XYZ}_{camera}R is constructed using the quaternion

\vec{Q} = \left[\cos(0.5\,\theta_{best})\;\;\; r_X \sin(0.5\,\theta_{best})\;\;\; r_Y \sin(0.5\,\theta_{best})\;\;\; r_Z \sin(0.5\,\theta_{best})\right]

(12)

Finally, the attitude vector of the projectile is found according to

\vec{i}_{XYZ} = \vec{n}_{left,rot} \times \vec{n}_{right,rot}

(13)

where the subscripts left and right represent the left and right cameras. The resolved pitch (α) and resolved yaw (β) relative to the line of fire are then calculated using the rotation {}^{LOS}_{XYZ}R that corresponds to a rotation by the QE angle

\alpha = \tan^{-1}(i_3 / i_1)\,, \qquad \beta = \tan^{-1}(i_2 / i_1)\,, \qquad \vec{i}_{123} = {}^{LOS}_{XYZ}R\;\vec{i}_{XYZ}

(14)
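A sketch of the two-camera merge, Eqs. (10)-(14) as reconstructed above, is given below for a single time increment. The quaternion-to-matrix conversion is written out so no toolbox is needed; the unit normalization of the pointing vector and the sign convention of the QE rotation are assumptions, and all names are illustrative.

```matlab
function [alpha, beta] = mergeCameras(S_LOF, camL, camR, sProj, thetaBestL, thetaBestR, QE)
% One time increment of the two-camera merge (illustrative; all angles in radians).
% S_LOF: unit vector along the line of fire; camL/camR: camera positions relative to the
% trunnion (XYZ); sProj: downrange distance of the projectile along the LOF.
nRotL = rotatedNormal(S_LOF, camL, sProj, thetaBestL);
nRotR = rotatedNormal(S_LOF, camR, sProj, thetaBestR);
iXYZ  = cross(nRotL, nRotR);                              % Eq. (13): attitude vector in XYZ
% Eq. (14): rotate into the line-of-fire frame by the QE angle about the crossrange (Y) axis
% (sign convention assumed), then resolve pitch and yaw.
Rqe  = [cos(QE) 0 sin(QE); 0 1 0; -sin(QE) 0 cos(QE)];
i123 = Rqe * iXYZ(:);
alpha = atan2(i123(3), i123(1));                          % resolved pitch
beta  = atan2(i123(2), i123(1));                          % resolved yaw
end

function nRot = rotatedNormal(S_LOF, camXYZ, sProj, thetaBest)
N = cross(S_LOF(:), camXYZ(:));                           % Eq. (10): camera view-plane normal
r = sProj*S_LOF(:) - camXYZ(:);  r = r/norm(r);           % Eq. (10): camera-to-projectile axis (unit)
q = [cos(0.5*thetaBest); r*sin(0.5*thetaBest)];           % Eq. (12): axis-angle quaternion [w;x;y;z]
nRot = quatToRotm(q) * N;                                 % Eq. (11): rotate the normal by thetaBest
end

function R = quatToRotm(q)
% Standard unit-quaternion [w x y z] to rotation-matrix conversion (written out, no toolbox).
w = q(1); x = q(2); y = q(3); z = q(4);
R = [1-2*(y^2+z^2)  2*(x*y-w*z)    2*(x*z+w*y);
     2*(x*y+w*z)    1-2*(x^2+z^2)  2*(y*z-w*x);
     2*(x*z-w*y)    2*(y*z+w*x)    1-2*(x^2+y^2)];
end
```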
Quantifying Epicyclic Motion


An additional capability of the proposed method is to characterize the epicyclic motion of the projectile. In free flight, the nose of a projectile gyrates, or cones, around its velocity vector at two distinct frequencies [13]. The slower of these frequencies is known as precession and the faster is known as nutation. In the short amount of travel captured by the launch video, only 1-2 complete nutation cycles are observed. A three-step procedure for quantifying the epicyclic motion, assuming linearized aeroballistics, is as follows (a computational sketch of the full procedure follows Eq. (21)). 1. Subtract the average value of the pitching motion from each of the

Figure 5. Launch geometry, QE = 45° (800 mils).


calculated history points, and find a least-squares fit to a single sinusoid featuring a first estimate for the fast frequency (ω_f) and its corresponding phase shift (Φ_f)

\alpha_{zero\_mean} = K_f \sin(\omega_f t + \Phi_f)

(15)

2. Assume that velocity (V) and spin rate (p) are constant during the segment of flight recorded in the video. The projectile spin rate is calculated by taking the average velocity from the automated velocity analysis and converting it to spin rate
p = 2\pi\,V\,(N_{twist}\,d_{proj})^{-1} \;\;[\mathrm{rad/sec}]

(16)

where dproj is the projectile diameter and Ntwist is the twist rate of the cannon rifling (for U.S. 155mm cannons Ntwist = 1rev/20cal). For axially symmetric spinning projectiles, the relative rates of the fast and slow modes of oscillation are related by the spin rate through the ballistic parameter P
P = I_x\,I_y^{-1}\,p\,, \qquad \omega_s = P - \omega_f

(17)

where I_x and I_y are the axial and transverse moments of inertia of the projectile, respectively. Using the K_f, ω_f, and Φ_f calculated in step 1, a second least-squares fit is performed to determine estimates for K_s and Φ_s

\alpha_{true} = K_f \sin(\omega_f t + \Phi_f) + K_s \sin\!\left((P - \omega_f)\,t + \Phi_s\right)

(18)

3. Perform a final least-squares fit allowing all variables in Eq. (18) to adjust, computing the best estimate for the pitching motion. If the epicyclic motion has been correctly identified, step three can be repeated to fit the yaw-direction data (β_true) by changing only the phase shifts (Φ_s and Φ_f), as shown in the following section.

Having quantified the epicyclic motion, important aeroballistic information can be determined about the flight of the projectile. The first value of interest is the ballistic parameter M, which is calculated from
M = 0.25\left(P^2 - (2\,\omega_f - P)^2\right)

(19)

From the parameter M, the linearized pitching moment coefficient C_mα is found as


C_{m\alpha} = 2\,m_{proj}\left(\rho_{air}\,S_{proj}\,d_{proj}\right)^{-1} k_y^{2}\,M\,V^{-2}\,, \qquad k_y^{2} = I_y\left(d_{proj}^{2}\,m_{proj}\right)^{-1}

(20)

The ratio of the two ballistic parameters can be used to quantify the gyroscopic stability of the projectile for the given launch velocity
S_g = 0.25\,P^2\,M^{-1}

(21)
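The three-step fit and the follow-on parameters, Eqs. (15)-(21), can be sketched as below using lsqcurvefit (Optimization Toolbox; fminsearch would also work). The initial guesses, variable names, and the normalization of Eq. (20) follow the reconstruction above and are assumptions rather than the authors' exact implementation.

```matlab
% Assumed inputs: t [s], alpha (resolved pitch history [rad]), V [m/s], Ix, Iy [kg m^2],
% mProj [kg], dProj [m], Sproj [m^2], rhoAir [kg/m^3], Ntwist = 20 [cal/rev for U.S. 155mm].
p = 2*pi*V/(Ntwist*dProj);                         % Eq. (16): spin rate [rad/s]
P = (Ix/Iy)*p;                                     % Eq. (17): ballistic parameter P

aZero = alpha - mean(alpha);                       % step 1: remove the mean pitch

fast  = @(x,tt) x(1)*sin(x(2)*tt + x(3));          % step 1 model: x = [Kf, wf, phif]
x0    = [max(abs(aZero)), 2*pi*15, 0];             % rough initial guess (assumed)
xFast = lsqcurvefit(fast, x0, t, aZero);

slow  = @(y,tt) fast(xFast,tt) + y(1)*sin((P - xFast(2))*tt + y(2));  % step 2: y = [Ks, phis]
ySlow = lsqcurvefit(slow, [0.5*xFast(1), 0], t, aZero);

epi   = @(z,tt) z(1)*sin(z(2)*tt + z(3)) + z(4)*sin((P - z(2))*tt + z(5));  % step 3: all free
zBest = lsqcurvefit(epi, [xFast, ySlow], t, aZero);       % Eq. (18) with every parameter adjusted

wf = zBest(2);   ws = P - wf;                      % nutation (fast) and precession (slow) rates
M  = 0.25*(P^2 - (2*wf - P)^2);                    % Eq. (19)
ky2     = Iy/(dProj^2*mProj);                      % squared radius of gyration (nondimensional)
CmAlpha = 2*mProj*ky2*M/(rhoAir*Sproj*dProj*V^2);  % Eq. (20) as reconstructed (normalization assumed)
Sg = 0.25*P^2/M;                                   % Eq. (21): gyroscopic stability factor
```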


TABLE I. ANIMATION RESULTS AT DIFFERENT RESOLUTIONS


Pixels along length of projectile:        413       341       227       112       53
Scale ratio (vs. typical real video):     121%      100%      67%       33%       16%
Nutation ω_f (Hz), true = 0.63611:        0.63603   0.63595   0.63622   0.63600   0.46770
Nutation % error:                         -0.013%   -0.025%   0.017%    -0.017%   -26.475%
Precession ω_s (Hz), true = 0.15000:      0.15000   0.15002   0.14997   0.15006   0.00021
Precession % error:                       0.000%    0.013%    -0.020%   0.040%    -99.860%

Figure 6. Calculated pitch and yaw motion from two cameras.

COMPUTER SIMULATIONS AND FIELD TEST RESULTS


Computer Simulations

To evaluate the performance of the automated method in quantifying epicyclic motion, a virtual launch video was generated from a 3D CAD model of an M795 155mm projectile. The coloring, lighting, and frame rate were set to match conditions observed in real launch videos. The commanded motion exhibited roughly two nutation cycles and half of a precession cycle in the 1,000-frame video, which lasted 3.5sec (comparable to the cycles observed in a real 0.1sec launch video). Several different resolutions were investigated. Estimated values for the oscillation frequencies were calculated to within 0.05% error for resolutions of 112 or more pixels along the length of the projectile. Errors were significant at the lowest resolution, where there were only 53 pixels along the projectile length, which may indicate a lower bound for the algorithm. Running in MATLAB R2011a [14] on an HP EliteBook 8730w, the video analysis and post-processing required 55 seconds. The results of this verification study are shown in Table I.

An artillery field test of M795 155mm projectiles was conducted in which two TT systems were employed to evaluate the proposed method. Over 800 videos were recorded and analyzed. The average computation time was less than 4 minutes for each round (videos recorded at 2,000fps). A sample of the output results from one of those rounds is shown in Fig.6. The average velocity calculated during this shot was 765.5m/s.
Results from Real Video Analysis

As expected, the pitch angle measured by each camera in Fig.6 begins near zero as the bullet leaves the muzzle of the cannon, and then exhibits smoothly oscillating sinusoidal motion in each camera plane. After converting the estimates from each camera to the resolved pitch (α) and yaw (β), the pointing history is plotted on the right side of Fig.6 in the convention of Carlucci [15]. The history shows one entire clockwise nutation cycle and also gives an indication of the clockwise precessing motion. Figure 7 displays the results of the automated epicyclic motion analysis for the round in Fig.6. The estimated oscillation frequency values for the round in Fig.7 were
17.6Hz and 4.9Hz for the nutation and precession rates, respectively. These values are higher than expected, but within the bounds of simulation estimates for the M795 projectile with a spatial angle of attack (α_sp = √(α² + β²)) larger than 5° and occurring in the transonic region [16]. Stationary pitch and yaw cameras were also employed during this test. The cameras were oriented to measure the FMY, which was expected to occur around 25m beyond the muzzle of the cannon. Manual data reduction on the pitch and yaw cameras was received three weeks after the completion of the test. The FMY value determined from the pitch and yaw camera analysis differed from the spatial angle of attack at 25m measured by the automated method by an average of -0.24° with a standard deviation of 0.80°. These values are within the error bounds of the manual pitch and yaw camera analysis method [17]. Average velocity estimates from the automated method were consistently 0.9% lower than muzzle velocity radar estimates, which is consistent with the expected drag behavior of the projectile.
Limitations of the Automated Method

The automated method to quantify pitch and yaw described in this report requires two TT or Flight Follower systems, which are costly and require trained technicians to operate. In addition, the algorithm does not function correctly if the projectile does not remain in view or if the projectile resolution falls below 112 pixels along the length of the bullet. Performance is degraded by poor focus of the image and by glint from the sun. In a few cases studied, the image of the projectile disappeared for several frames when the sun was in the background. Horizontal firings where objects appear in the background significantly hinder the segmentation algorithm, and efforts are underway to resolve this problem. There may also be time delay problems between the two TT systems, or between the time recorded on each frame and the time recorded by the mirror rotation encoder. This would affect how the code synchronizes the data streams and computes the attitude vector in 3D. These delays are believed to be less than one millisecond.

Figure 7. Automatically quantifying epicyclic motion.


The ability to capture longer segments of free-flight would improve the characterization of epicyclic motion and the fitting of aerodynamic coefficients to the pitching and drag behavior [18]. This could be accomplished by using sequential optical systems or by moving the cameras farther downrange and farther from the azimuth of fire (which has not yet been tested).
Future Work: Spin Rate Calculation

The ability to segment the projectile shape in a launch video has also enabled experiments with new algorithms to automatically calculate the spin rate of both spin-stabilized and fin-stabilized projectiles that are painted with stripes. As shown in Fig.8, a column of pixels perpendicular to the projectile axis can be extracted from each frame. Hough Transform analysis can then be used to fit lines to the stripe boundaries in an image composed of the pixel columns from consecutive frames and to determine the time at which each stripe passes through the projectile axis [19]. Efforts to calculate the spin rate of tactical projectiles without stripes are being investigated as well. This is done by searching for fiducial markings in specific locations on the projectile as it rotates.

CONCLUSIONS

The method described in this paper proved to be capable of segmenting an artillery projectile shape in each frame of a cannon launch video. Using two optical systems makes it possible to estimate projectile location, velocity, and pose in 3D. First-maximum yaw estimates obtained by this method agree with estimates from conventional (manual) pitch and yaw camera analysis. The key advantage, however, is that the developed algorithm is capable of analyzing launch video within minutes of an artillery projectile being fired, and offers a quick and precise estimate of projectile motion in the first few moments of free-flight. Future validation testing is scheduled to compare the estimates from the automated method to projectiles equipped with DFuze on-board measurement units.

Figure 8. Stripe history analysis for spin rate calculation.

REFERENCES
1. FASTCAM SA5, Ultra High-Speed Video System. 2012. Photron USA, Inc., San Diego, CA.
2. Trajectory Tracker User Manual. 2012. Specialised Imaging, Temecula, CA.
3. Itronix. 2012. DRS's Flight Follower System. Specification Sheet. www.itronix.com/pdf/DRS_Flight_Follower.pdf. Accessed July 2012.
4. Davis, B., Guidos, B., Harkins, T. 2009. Complementary Roles of Spark Range and Onboard Free-Flight Measurements for Projectile Development. Army Research Laboratory Report No. ARL-TR-4910.

5. Use of Yaw Cards During Projectile Development. 2010. Arrow Tech Associates, Inc. Yaw Card White Paper.
6. Davis, B., Harkins, T., Hepner, D., Patton, B., and Hall, R. 2004. Aeroballistic Diagnostic Fuze (DFuze) Measurements for Projectile Development, Test, and Evaluation. Army Research Laboratory, Aberdeen, MD. ARL-TR-3204.
7. Decker, R.J., Yakimenko, O.A., Hollis, M.S., Sweeney, P.J. 2011. On the Development of the Artillery Flight Characterization Electronics Rescue Kit. Proceedings of the 23rd Aerodynamic Decelerator Systems Conference. American Institute of Aeronautics and Astronautics, May 2011.
8. TrackEye Motion Analysis Software. 2012. Photo-Sonics, Inc. www.photosonics.com/trackeye_software.htm. Accessed July 2012.
9. Saroch, A. 2012. Optical Character Recognition. MATLAB source code. The MathWorks Inc.
10. Szeliski, R. 2010. Computer Vision: Algorithms and Applications. Springer.
11. Canny, J. 1986. A Computational Approach to Edge Detection. IEEE Trans. Pattern Analysis and Machine Intelligence, Vol. 8(6), pp. 679-698.
12. Cootes, T.F., Taylor, C.J., Cooper, D.H., Graham, J. 1995. Active Shape Models - Their Training and Application. Computer Vision and Image Understanding, Vol. 61, No. 1, pp. 38-59.
13. McCoy, R. 1999. Modern Exterior Ballistics: The Launch and Flight Dynamics of Symmetric Projectiles. Schiffer Publishing, Atglen, PA.
14. MATLAB. 2011. Version 7.12.0 (R2011a). Natick, MA: The MathWorks Inc.
15. Carlucci, D., Jacobson, S. 2008. Ballistics: Theory and Design of Guns and Ammunition. CRC Press, Boca Raton, FL.
16. Koenig, W. 2012. M795 Aero. Armaments Research, Development and Engineering Center. Unpublished data.
17. Pitch and Yaw Report. 2012. Yuma Proving Ground, Data Reduction & Analysis Branch. Mission ID: 18SEP12. Unpublished report.
18. Tate, J. 2011. Extraction of Aeroballistic Coefficients from Flight Follower Data. Ballistic Engineering Consulting Limited.
19. Hough, P.V.C. 1959. Machine Analysis of Bubble Chamber Pictures. Proceedings of the International Conference on High Energy Accelerators and Instrumentation.
20. Barnett, B. 1966. Trajectory Equations for a Six-Degree-of-Freedom Missile Using a Fixed-Plane Coordinate System. TR-3391, Picatinny Arsenal, Dover, NJ.
21. Driels, M. 2004. Weaponeering: Conventional Weapon System Effectiveness. AIAA, Reston, VA.
22. Kolsch, M. 2010. Lecture Notes, Computer Science 4330, Computer Vision. Naval Postgraduate School, Monterey, CA.
23. PRODAS. 2012. Version 3. Arrow Tech Associates, South Burlington, VT.
24. Yakimenko, O. 2011. Engineering Computations and Modeling in MATLAB/Simulink. AIAA, Reston, VA.

