
2015 AUVSI SUAS Competition

Eos Unmanned Air System

CUAir: Cornell University


Unmanned Air Systems Team
May 21st, 2015

Abstract
The Eos Unmanned Air System (UAS) is designed to complete a number of tasks in
support of an autonomous reconnaissance mission during the Association of Unmanned
Vehicle Systems International (AUVSI) Student Unmanned Air Systems (SUAS) Compe-
tition. These tasks include: manual takeoff and landing; autonomous flight and waypoint
navigation; manual sense, detect, and avoid; interoperability; target detection, localization,
and classification; autonomous search; automatic target localization; actionable intelligence;
off-axis target imagery and classification; emergent target re-tasking and identification; sim-
ulated remote intelligence center interaction; and autonomous air drop release with required
drop accuracy. The airframe has conventional topology with an electric tractor propeller,
high-aspect ratio wings, and a conventional tail. Imagery is gathered from a Point Grey
Flea3-8.8MP camera mounted on a pitch & roll gimbal, managed by an Intel NUC com-
puter, and transferred via two 5.8GHz Ubiquiti Bullet radios to a Netgear R7000 router
on the ground. A 2.4GHz Ubiquiti Bullet connects to the SRIC network. Flight control
and navigation are performed by a modified version of the ArduPlane software running on
the Pixhawk hardware platform. Between Eos and its sister plane, Icarus, over 30 flight
tests have been conducted. Unfortunately, the team experienced a crash recently, without
enough time to repair and prepare to compete, so Eos will not fly a mission at the 2015
SUAS Competition.
CONTENTS Eos Unmanned Air System

Contents
1 System Design Rationale 3

2 Payload System 3
2.1 Camera & Camera Gimbal . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
2.2 Air Drop . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
2.3 Payload Computer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
2.4 Airborne Power System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
2.5 Communication System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7

3 Flight System 8
3.1 Aircraft . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
3.2 Navigation System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12

4 Ground System 14
4.1 Catapult . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
4.2 Communication System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
4.3 Targeting System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
4.4 Automatic Detection & Classification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
4.5 Localization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17

5 Mission Operation 18
5.1 Safety . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19

6 Summary 19

CUAir: Cornell University Unmanned Air Systems Team 2 of 20



1 System Design Rationale


The Eos UAS is designed to maximize performance on mission tasks while also ensuring reliability. The
system's mission performance depends most directly on its ability to capture images and transfer them
to the ground station for processing. Therefore, the camera and network links were selected first. A
Point Grey Flea3-8.8MP machine vision camera with an 8mm lens was chosen for its small size, low mass,
and finely tunable capture parameters. The camera then drove the design of a dual-axis brushless gimbal
to actuate it. The need for high network reliability and bandwidth to transfer images to the ground
system led the team to select two Ubiquiti Bullet M5s coupled with custom data-handling software to
utilize bandwidth and increase fault tolerance. To facilitate integration with the imagery system,
implementation of a sense, detect, and avoid system, and ease of tuning the control parameters, the
open-source ArduPilot firmware was selected as Eos's navigation system. The custom airframe is designed
to support these payloads while ensuring ease of access and providing 25 minutes of flight time.
Additionally, the airframe is designed to be catapult-launched and belly-landed to simplify the
autopilot's role in autonomous takeoff and landing and to allow for testing even in adverse ground
conditions. The ground station, being far less space- and weight-constrained than the aircraft itself,
is designed to supplement the airborne system with additional computing power, persistence, and human
interaction where necessary. The operational roles and layout followed from the ground station design
to provide necessary input, add manual redundancy, and ensure safe operation of the UAS.

2 Payload System
The payload system consists of all airborne components of the UAS which are not required for autonomous
flight. It performs the imaging, SRIC, and air drop tasks defined in the mission specification by employing a
machine vision camera, a two-axis gimbal, a mini servo, and a 64-bit Intel computer with custom software.
The Cornell team has decided not to attempt the infrared target task due to budgetary constraints and the
negative effect that the extra weight would incur on other aspects of the mission. In support of the
mission tasks, the Cornell team has also devised an airborne power system to provide multiple
independently-controllable rails of efficiently regulated power from a single battery, and a
communications system to reliably transfer
images and commands between the payload system and the ground system.

2.1 Camera & Camera Gimbal


The Cornell team decided to capture many images at regular intervals to guarantee full coverage of the
ground. To be effective, the imaging system must support a minimum resolution that guarantees that
targets can be resolved, and a minimum capture rate that guarantees ground coverage. The calculations for
these minimum thresholds are shown in Equations (3), (4), and (7) for this year's imaging system, which
uses a Point Grey Flea3-8.8 with a Tamron M118FM08 lens for 8 megapixels of resolution.
Equations (3) and (4) show that this imaging system can provide a resolution of 1.49 inches per pixel.
The competition rules define targets to have a minimum feature length of 2 inches; therefore, the system
can resolve the smallest target features. Equation (7) shows that the system must capture images at a rate
of at least 0.5 images per second; the camera system is capable of imaging at up to five images per second.
However, the manual targeting operators cannot process five images per second, so the system is restrained
to capture one image per second, which is still more than sufficient to guarantee full ground coverage.
The Cornell team identified three types of cameras that could have been used for imaging targets:
point-and-shoot cameras, machine vision cameras, and digital single lens reflex (DSLR) cameras. The team
experimented with all three types of cameras and determined that the machine vision camera type is the
optimal choice. When compared to machine vision cameras, point-and-shoot cameras produced lower-quality
images at a greater weight and form factor, but at a lower cost. DSLR cameras produced similar or higher-
quality images compared to machine vision cameras, but at a significantly greater weight and size. Although


Figure 1: Images of Targets from the Machine Vision Camera: The left image shows the full image. The
right image is a close-up of a target with high contrast (green on red).

machine vision cameras are more expensive and complicated than the DSLR camera the team has used in past
competitions, it was decided that the reduced weight and form factor of machine vision cameras outweigh the
increased cost and development time. After a trade study considering cost, imaging performance, interface
type, weight, and form factor, the Flea3-8.8 was chosen as the team's imaging system.

The distance per pixel (D) defines the distance interval resolvable in an image. This quantity is a
function of the coverage (C) and the number of total pixels (P). C is a function of the angular field
of view (FOV) and the imaging height (h):

    C(FOV, h) = 2h tan(FOV / 2)                                     (1)

    D(FOV, h, P) = C(FOV, h) / P                                    (2)

The Point Grey Flea3-8.8 with a Tamron M118FM08 lens has a horizontal field of view of 39.6 degrees, a
vertical field of view of 30.02 degrees, and a resolution of 4096x2160. The maximum required imaging
height is 500 feet:

    D_horizontal = D(39.6 deg, 500 ft, 4096) = 0.0879 feet/pixel    (3)

    D_vertical = D(30.02 deg, 500 ft, 2160) = 0.1241 feet/pixel     (4)
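Equations (1) through (4) can be checked numerically. The short sketch below is our own illustrative code, not part of the team's software; it evaluates the coverage and distance-per-pixel relations for the camera parameters above:

```python
import math

def coverage(fov_deg, h):
    """Ground distance covered along one image axis (Eq. 1)."""
    return 2 * h * math.tan(math.radians(fov_deg) / 2)

def distance_per_pixel(fov_deg, h, pixels):
    """Resolvable ground distance per pixel (Eq. 2)."""
    return coverage(fov_deg, h) / pixels

# Flea3-8.8 + Tamron M118FM08 at the 500 ft maximum imaging height
d_h = distance_per_pixel(39.6, 500, 4096)   # Eq. 3
d_v = distance_per_pixel(30.02, 500, 2160)  # Eq. 4
print(f"{d_h:.4f} ft/px horizontal, {d_v:.4f} ft/px vertical")
```

Both values match the paper's figures of 0.0879 and 0.1241 feet per pixel.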

The Flea3-8.8 camera is controlled by FlyCapture's proprietary FlyCap software. This software allows
image capture, image transfer to the payload computer, and in-flight modification of various camera
properties, such as shutter speed and brightness. Custom payload software was developed to integrate
FlyCap with the rest of the system. An additional advantage of the machine vision camera is the ability
to use hardware triggering to command the camera to capture an image at a precise moment. This
functionality is used to precisely synchronize the time of capture with the stream of telemetry data by
causing the capture to occur the instant a new piece of data arrives in the system.
A two-axis, stabilized gimbal system enables the Eos UAS to image off-axis targets and maintain a
downward gaze during turns and altitude changes. The competition rules require that the Eos UAS capture
an image of a target that lies up to 500 feet to the left or right of the aircraft while the aircraft is at an
altitude of 300 feet. This specification requires a field of view of up to 60 degrees to either side of the aircraft
to image the target. The imaging system gives a camera field of view of 20 degrees to either side while the


The image capture frequency (f) defines the minimum capture rate needed to guarantee ground coverage.
This quantity is a function of the vehicle speed (S) and the coverage (C):

    f = S / C                                                       (5)

    f(S, FOV, h) = S / C(FOV, h)                                    (6)

The camera is mounted such that the vertical direction of the image is the direction of motion and is
thus the dimension of interest for the frequency calculation. The minimum required imaging height is
200 feet:

    f_actual = f(35 kts, 30.02 deg, 200 ft) = 0.5508 images/s       (7)
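Equation (7) can likewise be verified. The sketch below is our own code, using the standard knots-to-feet-per-second conversion, and reproduces the roughly 0.55 images-per-second minimum:

```python
import math

def coverage(fov_deg, h):
    """Ground distance covered along one image axis (Eq. 1)."""
    return 2 * h * math.tan(math.radians(fov_deg) / 2)

def capture_rate(speed_ftps, fov_deg, h):
    """Minimum images/second for gap-free ground coverage (Eq. 6)."""
    return speed_ftps / coverage(fov_deg, h)

KT_TO_FTPS = 1.68781  # 1 knot in feet per second
f_min = capture_rate(35 * KT_TO_FTPS, 30.02, 200)  # Eq. 7
print(f"{f_min:.4f} images/s")
```

The result agrees with Equation (7); capturing one image per second therefore comfortably exceeds the minimum.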

Figure 2: Gimbal System Design and Implementation: The left image is the CAD of the gimbal control
board. The middle image is the fully assembled gimbal control board. The right image is the CAD of the
gimbal mechanism.

camera is pointed straight down. Therefore, the gimbal system must be able to roll the camera by at least
40 degrees. To capture multiple images of the off-axis target, the gimbal must also pitch the camera forward
and backward as the aircraft flies by the target. The team further requires the gimbal to keep the camera
pointed at the ground during roll and pitch changes that occur during turns, ascents, and descents. This
requirement improves the total ground coverage of captured images and reduces the distortion in images
processed by the ADLC and MDLC systems.
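The angle requirements above follow from simple geometry. This illustrative check (not team code) recovers both the roughly 60-degree off-axis angle and the 40-degree minimum gimbal roll:

```python
import math

# A target 500 ft to the side, seen from 300 ft altitude, sits at
# atan(500/300) off nadir, which is just under 60 degrees.
required_off_axis = math.degrees(math.atan2(500, 300))

# With the camera pointed straight down, half the horizontal FOV
# (39.6 / 2, about 20 degrees) is already covered to each side;
# the gimbal must roll through the remainder.
half_fov = 39.6 / 2
min_gimbal_roll = 60 - half_fov
print(f"off-axis angle {required_off_axis:.1f} deg, "
      f"gimbal roll >= {min_gimbal_roll:.1f} deg")
```

This matches the stated requirement that the gimbal roll the camera by at least 40 degrees.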
The gimbal system consists of a gimbal mechanism, a control board, and interface software on the
payload computer. It has three operating modes: stabilized, point-at-GPS, and retract, which can be
changed by the imagery system through a serial interface. The gimbal system employs its own inertial
measurement unit (IMU) with 6 degrees of freedom to measure the orientation of the aircraft with
respect to a global North-East-Down (NED) frame, allowing the controller to update at 100Hz. The gimbal
controller receives a 5Hz update from the autopilot with the aircraft's heading and position, which
allows the controller to compensate for the yaw drift of the gyroscope and to point at a given GPS
position on the ground. Feedback from magnetic encoders with 0.1 degrees of accuracy allows a
proportional-integral-derivative controller to track the desired motor angles as determined by the
state and the mode.
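A minimal sketch of such a proportional-integral-derivative tracking loop is shown below. The gains, the toy motor model, and the 100 Hz step are illustrative assumptions, not the team's actual controller:

```python
class PID:
    """Proportional-integral-derivative loop tracking a motor angle."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, desired, measured):
        error = desired - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error + self.ki * self.integral
                + self.kd * derivative)

# Illustrative run at the controller's 100 Hz rate against a toy
# motor that integrates its command (assumed dynamics, not the real gimbal).
pid = PID(kp=2.0, ki=0.5, kd=0.05, dt=0.01)
angle = 0.0
for _ in range(2000):                   # 20 simulated seconds
    command = pid.update(40.0, angle)   # track a 40-degree roll target
    angle += command * 0.01             # toy first-order motor model
print(f"settled near {angle:.1f} deg")
```

In the real system the measured angle comes from the magnetic encoders and the desired angle from the active mode (stabilized, point-at-GPS, or retract).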
The gimbal mechanism itself was constructed from laser-cut pieces of aircraft-grade birch plywood to
produce a lightweight yet stiff design. Both the weight and the stiffness are critical to the
performance of the larger system: the aircraft must remain light so that it can be catapult-launched
safely, and any error in the camera's orientation has a large impact on the accuracy of the
localization algorithm. The gimbal mechanism is mounted rigidly to the rear wall of the fuselage using
four screws. The physical gimbal and control board are shown in Figure 2.


2.2 Air Drop


The Air Drop system must drop a relief canister such that it lands within some threshold distance of a
given target. To accomplish this, a process on the on-board computer uses the plane's position, ground
velocity, altitude, and wind speed to determine the time to release the relief canister, or whether it
can reach the target at all.

Figure 3: Logic for determining the best time to trigger AirDrop.

When armed from the ground station, the master aerial software node (MASN) requests a drop time
from the airdrop planning module (ADPM). The ADPM uses the aircraft state data to simulate a drop and
determine where the relief canister would land relative to the aircraft. The ADPM then finds the time
at which the relief canister would land closest to the target (the black dot in Figure 3) and the
distance of this drop position from the target center. The team sets a threshold radius for each pass
so that a second attempt can be made if the first is not satisfactory.
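The simulate-and-pick-closest logic can be sketched as follows. This is an illustrative model, not the ADPM itself: it assumes a constant ground velocity, a drag-free fall, and treats wind as an optional constant drift term:

```python
import math

G = 32.174  # ft/s^2

def landing_point(x, y, vx, vy, alt, wx=0.0, wy=0.0):
    """Predict where a canister released now would land (no-drag model;
    (wx, wy) is a hypothetical constant wind-drift term)."""
    t_fall = math.sqrt(2 * alt / G)
    return (x + (vx + wx) * t_fall, y + (vy + wy) * t_fall)

def best_release_time(x, y, vx, vy, alt, target, horizon=30.0, dt=0.1):
    """Simulate releases along the predicted track and return the
    (time, miss-distance) pair that lands closest to the target."""
    best = (None, float("inf"))
    t = 0.0
    while t <= horizon:
        lx, ly = landing_point(x + vx * t, y + vy * t, vx, vy, alt)
        miss = math.hypot(lx - target[0], ly - target[1])
        if miss < best[1]:
            best = (t, miss)
        t += dt
    return best

# Aircraft 800 ft short of the target, flying toward it at 60 ft/s, 300 ft up
t_rel, miss = best_release_time(-800, 0, 60, 0, 300, target=(0, 0))
print(f"release in {t_rel:.1f} s, predicted miss {miss:.1f} ft")
```

The returned miss distance plays the role of the threshold check: if it exceeds the per-pass radius, the release is held for another pass.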

2.3 Payload Computer


The payload computer is responsible for coordinating the operations of the payload system: it obtains
and transmits data from the air system back to the ground, and acts on requests from the ground, as in
the case of the air drop. The payload software runs on this computer and orchestrates these tasks. It
first establishes a WebSocket connection (a two-way network protocol over TCP) with the ground server.
During normal operation, the payload system responds to requests from the ground system, updating its
internal state as appropriate. During a network failure, the payload system continues to operate,
capturing images and controlling the gimbal, but will not perform an air drop until the connection is
re-established.
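This degrade-gracefully behavior can be expressed as a small state machine. The class below is a hypothetical sketch (the real payload talks to the ground server over a WebSocket); it captures the rule that imaging continues through an outage while the air drop is gated on the link:

```python
class PayloadNode:
    """Sketch of payload behavior across connection loss.
    Class and method names are illustrative, not the team's API."""
    def __init__(self):
        self.connected = False
        self.images_captured = 0

    def on_connect(self):
        self.connected = True

    def on_disconnect(self):
        self.connected = False

    def capture_image(self):
        # Image capture proceeds regardless of link state.
        self.images_captured += 1

    def request_airdrop(self):
        # Air drop is permitted only while the ground link is up.
        return self.connected

node = PayloadNode()
node.capture_image()               # works while offline
assert not node.request_airdrop()  # refused until connected
node.on_connect()
assert node.request_airdrop()      # allowed once re-established
```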
The primary requirements for the payload computer hardware are: enough processing power to compress raw
images from the camera, compatibility with hardware drivers, and reliability. To provide driver
compatibility, the team limited its search to Intel x86 architecture computers, the newest versions of
which are all capable of compressing images quickly enough. Initially, the Portwell NANO-6060 was
chosen for its size and weight; the lack of a case helped on both counts. However, the team found it to
be unreliable, partially because it was often placed in a dusty environment or in contact with other
components. The slightly heavier Intel NUC was then selected as a replacement for its protective case
and still relatively small form factor. The NUC proved itself as the team's on-board computer in last
year's system.


2.4 Airborne Power System


The airborne power system is responsible for providing regulated power to all electronics on board the
aircraft with the exception of the propulsion motor. This includes the gimbal, machine vision camera,
payload computer, Ubiquiti Bullets, and the control servos of the plane.
All of these systems require 22V, 12V, or 6V, so a 6S (22.2V), 3300-mAh lithium polymer battery
provides power, which is then regulated to the necessary voltages. An isolated Murata DC-DC step-down
switching regulator provides a 12V line, while TI adjustable switching regulators are used to provide
the 6V line and another 12V line, because those two feed custom circuitry whose specifications could
change.
Though switches under one of the aircraft hatches provide direct power cut-offs, the airborne power
system also includes an ATmega1284P microcontroller which can toggle power to any payload as commanded
from the ground station through the on-board computer. This allows the team to start components in a
specific order, reset components from the ground without needing to land, or shut down components, such
as servos, while the aircraft is on the ground to conserve power.
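The rail-switching behavior described above might look like the following sketch. Rail names and the command interface are illustrative assumptions; the real logic runs on the microcontroller, commanded through the on-board computer:

```python
class PowerController:
    """Sketch of the microcontroller's rail-switching behavior."""
    def __init__(self, rails):
        self.state = {rail: False for rail in rails}  # all off initially

    def handle_command(self, rail, on):
        # Invoked when the ground station requests a toggle.
        if rail not in self.state:
            raise ValueError(f"unknown rail: {rail}")
        self.state[rail] = on

    def startup_sequence(self, order):
        # Bring components up in a specific, controlled order.
        for rail in order:
            self.handle_command(rail, True)

pdu = PowerController(["computer", "camera", "gimbal", "radios", "servos"])
pdu.startup_sequence(["computer", "radios", "camera", "gimbal"])
pdu.handle_command("servos", False)  # conserve power on the ground
print(sorted(r for r, on in pdu.state.items() if on))
```

A failed component can likewise be power-cycled from the ground by issuing an off command followed by an on command, without landing the aircraft.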

2.5 Communication System


The wireless communications system in the Eos UAV consists of three logically distinct wireless links:
the navigation link, used solely to interface with the autopilot; the data link, used to transfer
images and payload commands; and the Simulated Remote Intelligence Center (SRIC) link, used to perform
the SRIC task. Figure 5 shows a high-level overview of the communication system.

Data Link The primary task of the data link is to transfer images taken during the mission to the
ground where they can be processed. It must provide a constant, high-bandwidth connection so the images
are conveyed with minimal delay. Two long-range Ubiquiti Bullet M5HP (Bullet M5) WiFi-to-Ethernet
bridges were selected to provide this link for their size, simplicity, reliability, and previous
success. The Bullet M5s are interfaced with the Intel NUC computer over Gigabit Ethernet and retransmit
data using the IEEE 802.11n WiFi standard operating at 5.8 GHz with a maximum transmit power of 25 dBm,
using WPA2 encryption. To ensure a constant connection with the ground, the Bullet M5s are configured
to enter a continuous loop in which they attempt to connect to a predefined network whenever the
connection is dropped. The use of two radios also provides an element of redundancy: if one stops
working, the other will carry out all data transfers, albeit in a limited capacity. Through previous
system iterations the team learned that if one 2.4 GHz and one 5.8 GHz link are used for data transfer,
the lower frequency acts as a bottleneck for all transfers; using two 5.8 GHz links eliminates this
issue. With these changes, speeds of 50-60 Mbps are achievable at distances up to a kilometer, with
speeds up to 130 Mbps achievable at close range. Since the minimum bandwidth required to transfer
images in real time is roughly 30 Mbps, the current communications system exceeds requirements.
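The redundancy behavior can be sketched as a round-robin sender that falls back to the surviving radio. The link interface here is hypothetical, standing in for the two Bullet M5 bridges:

```python
import itertools

class DualLinkSender:
    """Round-robin transfers across two radios, degrading to
    whichever link is still alive (illustrative sketch)."""
    def __init__(self, links):
        self.links = links
        self._cycle = itertools.cycle(range(len(links)))

    def send(self, payload):
        # Try each link once, starting from the next in rotation.
        for _ in range(len(self.links)):
            link = self.links[next(self._cycle)]
            if link.alive:
                link.sent.append(payload)
                return True
        return False  # no link up; caller queues and retries later

class FakeLink:
    """Stand-in for a radio bridge, for demonstration only."""
    def __init__(self):
        self.alive = True
        self.sent = []

a, b = FakeLink(), FakeLink()
tx = DualLinkSender([a, b])
tx.send("img1"); tx.send("img2")   # alternates across both radios
b.alive = False
tx.send("img3")                    # degrades to the surviving link
print(len(a.sent), len(b.sent))
```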
Both Bullet M5s are connected to omnidirectional 3dBi monopole antennas and thus radiate energy in a
near-spherical pattern. This allows for a constant wireless connection regardless of the aircraft's
orientation. The data link antennas are oriented vertically to maximize their horizontal range and
placed at each wingtip to increase physical separation and reduce potential multipath interference.
Given the metallic radiating elements in the aircraft, radiation simulations revealed the pattern, such
as in Figure 4, in which Eos's antennas transmit energy, and thus data. These results were taken into
account when determining antenna placement.

SRIC Task The Simulated Remote Intelligence Center (SRIC) task requires the Eos UAS to connect to a
remote wireless hot-spot in the competition field. While connected, it must download a message from a
remote computer attached to the network and upload an image to the same computer. The task is automated
by a UNIX script which uses the SSH protocol to modify a Ubiquiti Bullet M2's configuration on-the-fly

Figure 4: Radiation Pattern with two 5.8 GHz links located at the wingtips of the aircraft

Figure 5: Communication System Block Diagram

to connect to the SRIC network. The script then loops, waiting to connect to the network. Once the
connection is confirmed, the script opens an FTP connection to the SRIC computer and downloads the SRIC
message. It then transfers an image to the computer in binary mode, writing the image byte by byte to
the FTP server. The script asks a user on the ground to confirm the task's success before terminating.
The script is built to be as modular as possible and allows for automatic execution by the Intel NUC as
well as remote execution from the ground.
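The connect-loop, download, upload flow can be sketched in Python with the FTP session abstracted behind a hypothetical transport interface (the real implementation is a UNIX shell script driving SSH and FTP):

```python
import time

def run_sric_task(transport, image, retries=5, delay=0.01):
    """Sketch of the SRIC loop: wait for the network, then download
    the message and upload an image. `transport` stands in for the
    real FTP session (hypothetical interface, illustrative filenames)."""
    for _ in range(retries):
        if transport.connect():
            message = transport.download("message.txt")
            transport.upload("target.jpg", image)
            return message
        time.sleep(delay)  # network not up yet; keep looping
    return None

class FakeTransport:
    """Connects on the second attempt, like a radio joining the hot-spot."""
    def __init__(self):
        self.attempts = 0
        self.uploaded = {}

    def connect(self):
        self.attempts += 1
        return self.attempts >= 2

    def download(self, name):
        return "SRIC message contents"

    def upload(self, name, data):
        self.uploaded[name] = data

ftp = FakeTransport()
msg = run_sric_task(ftp, b"jpeg-bytes")
print(msg, ftp.attempts)
```

Abstracting the transport is what makes the flow testable on the ground and runnable either automatically on the Intel NUC or remotely.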

3 Flight System
The flight system comprises the aircraft and navigation system, whose primary responsibilities are to
carry the payload system into the air and around the field so that it can perform its mission. The
majority of the requirements on these systems are derived from the payloads themselves rather than
directly from mission specifications.

3.1 Aircraft
Though critical to the completion of the mission, the airframe is not directly assessed during the
mission. The primary objective of the airframe is to facilitate other systems in accomplishing mission
tasks. As such, the major drivers for the airframe design are: accessibility, to ease integration and
troubleshooting; stability and controllability, to facilitate autopilot tuning and autonomous control;
efficiency, to ensure that the system can remain aloft and utilize the mission time provided;
durability and repairability, to allow for uninterrupted testing; and the ability to catapult-launch
and belly-land, to make autonomous launch and landing significantly simpler for the autopilot.

3.1.1 Design
Eos's airframe is constructed from a carefully selected set of composites, foams, and woods, including
fiberglass, carbon fiber, Kevlar, Garolite, Pactiv R10 Unfaced Polystyrene Foam, birch plywood, and
poplar plywood. The team utilized wet layups of fiberglass-sandwiched foam for the majority of the
aircraft body, and strategically integrated Kevlar on the bottom of the fuselage to provide abrasion
resistance. Carbon


Figure 6: Eos CAD Rendering

fiber spars were integrated in the wings and tail near joints for added strength. The team used laser-cut
plywood ribs to join the composite parts and serve as breakpoints in the event of a crash.

Payloads & Accessibility Eos is designed to carry and protect 6.7 pounds of electronics and batteries,
as required by the payload system. The width and height of the fuselage were determined by the
measurements of the largest piece of equipment, the camera gimbal, which required 6.5in of width and
6in of height.
The team designed the aircraft such that the entire top of the fuselage is removable. The wings are
held on by a single 1/4"-20 thumb screw and a sliding support to make them easy to remove, and the
remainder of the top of the fuselage is covered by two magnetic hatches. With this design, accessing a
payload requires removing at most one screw, and payloads that require high accessibility are mounted
below the hatches instead of the wings.

Stability & Controllability As the team determined desired flight characteristics, stability emerged as
the most imperative. Foremost, the team uses a high-wing configuration, which also increases the usable
space within the fuselage and decreases the likelihood of a wing strike on belly-landing. The wings
also feature a two-degree dihedral, as this value has worked for the team in the past and is
recommended by various sources. Further, Eos's design incorporates two degrees of washout (the tip has
a shallower angle of attack than the root) so the root section of the wing stalls before the tip,
ensuring that the pilot retains roll control of the aircraft.

Efficiency The entire geometry of Eos was designed to be sleek and aerodynamic. Though the payloads
require a relatively wide and tall fuselage, the team created a long and slender nosecone and empennage
to increase the aircraft's efficiency. For the wings, the team used XFoil to find that the MH114
airfoil was best suited for the mission: the MH114 has a very high lift-to-drag ratio while still
having very gentle stall characteristics. This airfoil has also proven itself, as the team has used it
for all of its aircraft thus far. To increase the efficiency of Eos's wings specifically, the team
increased the aspect ratio, resulting in a much longer, more slender wing. The aspect ratio the team
chose for Icarus was 12, allowing for not only a longer flight time but also a more stable flight.
Further, Eos's wings have a linear taper with the tip chord at half the root chord to mimic an
elliptical wing, reducing drag, while remaining easy to manufacture.
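As a sanity check on the wing geometry, the aspect ratio implied by the span and area in Table 1, and the chord lengths implied by the 2:1 linear taper, follow from AR = b^2/S and S = b(c_root + c_tip)/2. This is our own calculation, not a figure quoted by the team:

```python
span = 108.0   # in, from Table 1
area = 853.4   # in^2, from Table 1

aspect_ratio = span ** 2 / area          # AR = b^2 / S
# Linear taper with tip chord = root chord / 2 gives
# S = span * (c_root + c_tip) / 2 = span * c_root * 0.75
root_chord = area / (span * 0.75)
tip_chord = root_chord / 2
print(f"AR {aspect_ratio:.1f}, root chord {root_chord:.1f} in, "
      f"tip chord {tip_chord:.1f} in")
```

The geometric aspect ratio implied by Table 1 comes out somewhat higher than the value of 12 quoted for Icarus, consistent with Eos being a distinct airframe.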


Figure 7: Airframe Features: Catapult Mounting Holes (left); Modular Design (center); Wing Mount and
Hatches (right)

Durability & Repairability To enhance the durability and repairability of the airframe, during the design
process, the team considered a number of scenarios that could potentially damage the UAS. These include
rough belly-landings, wing-tip strikes, camera damage, and full crashes.
One case is when a wingtip contacts the ground on landing. This scenario would produce a large rearward
force on one wing, which could easily break some component. Rather than design a system to withstand
this load, the team created a strategic break-point to absorb the impact: the wing is held in place by
a 1/4"-20 thumbscrew in the front and by a piece of plywood in the rear. If a wing strike occurs, the
rear piece of wood will snap and allow the striking wing to rotate backward around the thumbscrew. The
plywood can be easily replaced, so testing and operation of the UAS are barely affected.
Another scenario that Eos was designed for is an especially fast vertical descent, which produces large
impact forces on the nosecone and fuselage. Again, rather than designing the system to survive, the
team planned for ease of repair. Both the nosecone and the fuselage are created from female molds,
which allow backup parts to be made in a span of days rather than weeks. In the event that the aircraft
impacts the ground in a crash, even more violent than a rough belly landing, the aircraft is designed
such that the plywood connection braces take the brunt of the impact and leave the payloads and
composite components in relatively good shape.
Belly-landing is especially difficult for this mission because it requires that the camera be exposed to
produce good imagery. To mitigate the risk of damaging the camera, the empennage is designed with a lip
on the back that will impact the ground before the camera. Furthermore, the gimbal rotates the camera
upwards during landing, to prevent debris from hitting the lens.
Apart from designing for repairability after a mishap, precautions were taken so that the aircraft
would not get damaged in the first place. Eos was designed so that its wings, vertical stabilizer,
horizontal stabilizer, empennage, and fuselage all separate, allowing it to be transported in
foam-filled ruggedized cases. Additionally, the wings are designed as two halves that come together and
connect onto the fuselage using a bolt and a slide-in wooden piece. Figure 7 shows Eos separated into
its components.

Catapult Launch & Belly-Landing As discussed earlier, the team decided to leverage its mechanical skill
to produce a system that simplified the navigation system's requirements for autonomous launch and
landing. Hand-launch was deemed too unreliable and too restrictive on aircraft size, so catapult launch
emerged as a simple way of getting the aircraft airborne so the navigation system could simply fly it.
Eos's design incorporates four holes for the arms of the catapult to hook into. The connection points
are reinforced internally with Garolite and are carefully placed so that the aircraft is mostly pulled
from in front of its center of gravity, rather than pushed, which could cause it to jump off the
catapult.


For landing, the team chose to forgo landing gear and allow the aircraft to skid on its belly. This
means the autopilot does not need to perfectly orient wheels with the aircraft's direction of travel;
failure to do so would result in the aircraft flipping over. The arched Kevlar bottom creates an
extremely durable surface that disperses the landing load to the fuselage walls, resulting in a
fuselage strong enough to endure many belly landings. After the motor is turned off, the propeller
folds against the edge of the nose cone to avoid splintering upon impact.

3.1.2 Testing & Evaluation


After designing and manufacturing Icarus (a test aircraft) and Eos, the team successfully completed 25
test flights. Through these test flights, the team was able to assess how well Eos accomplished the
primary objectives of stability and controllability, efficiency, durability and repairability, and the
ability to catapult launch and belly-land.

Table 1: Eos Aircraft Flight Characteristics

Characteristic    Spec          Characteristic     Spec

Wing Span         108 in        Payload Weight     2.5 lbs
Length            64 in         Takeoff Weight     18.2 lbs
Height            16 in         Cruise Speed       31 KIAS
Wing Area         853.4 in^2    Min Turn Radius    100 ft
Aspect Ratio      .5            Flight Time        30 min

Flight Characteristics After takeoff, Eos can easily climb at 25 KIAS and can fly level as slow as 27
m/s without flaps. In flight, Eos can achieve a turn radius of 100 feet. At cruise speed, the throttle
is set to 45 percent, which results in a 25-minute flight time before the 9-cell, 5000mAh battery
reaches 25 percent charge. Upon landing, the flaps slow Eos down to 21 m/s, which allows Eos to reach a
full stop in less than 30 feet after initial touchdown.
In flight, Eos exhibited no sideslip, indicating a properly sized tail and a stable aircraft. When Eos
flew horizontally with the throttle cut, it did not initially pitch up or down, further evidencing its
stability. The two-degree dihedral also demonstrated its self-stabilizing effect when Eos flew in very
low wind conditions. The large flaperons provided very sensitive controllability, enabling Eos to turn
very quickly when piloted by the autopilot and to hit waypoints that required rapid changes of
direction. This ability helped Eos avoid moving obstacles with ease.

Repairability & Durability Throughout the aircraft's life, various repairs were needed after test
flights. In one instance, radio interference resulted in the team's aircraft losing control and hitting
the ground. The left wing snapped in half, all ribs broke, the nosecone buckled, and the empennage
sheared off from the fuselage. Even though the left wing snapped, the team believes the wings would
have been in much worse shape if not for the breakaway point between the wing and the fuselage. This
point broke as intended and allowed the wings to snap from the fuselage, dissipating some of their
energy. Despite this seemingly devastating crash, the airframe team was able to rebuild Icarus that
night and get Icarus back in the sky the following morning. This accomplishment was due both to the
team's strong work ethic and to the modular design and repairability of the aircraft. The ribs were
easily re-cut and glued together, while the wings were repaired with a single lamination. The nosecone
was patched, unbuckled, and laminated within two hours, and the empennage did not need any repairs.
This one-day rebuild demonstrates how repairable the team's design is. Eos also proved to be extremely
durable, surviving many belly landings without
CUAir: Cornell University Unmanned Air Systems Team 11 of 20


3 FLIGHT SYSTEM Eos Unmanned Air System

a need for repair. When landing on rocks, no abrasions were inflicted on the fuselage, and even after a five-foot drop onto the ground, Eos sustained no damage.

Catapult Launch & Belly-Landing The Garolite-reinforced catapult connection holes, located at the bottom of the fuselage, interacted with the catapult perfectly, distributing its load to the rest of the fuselage. Across all flights, Eos did not experience any damage from the catapult. As described in the durability discussion above, Eos survived all belly landings with minimal abrasions.

3.2 Navigation System


The waypoint navigation and autonomy mission requirements both necessitate a navigation system. Autopilot systems are critical yet complex components, which makes developing a custom system prohibitively difficult for the Cornell team. However, many commercial and open-source systems can fulfill the mission requirements. These systems eliminate development time, but the number of choices requires time to identify, test, and fully integrate the best one. The following sections contain a trade study of available commercial and open-source systems, and a summary of the testing and integration of the ArduPilot PixHawk navigation system. The new Sense, Detect, & Avoid (SDA) element of the mission requires the autopilot ground station to integrate with custom interoperability software, which seemed troublesome with the proprietary Cloud Cap Piccolo system the Cornell team has used in previous years. Therefore, the team decided to pursue a more open solution with the potential to be well integrated with all UAS systems in the future.

3.2.1 Trade Study & System Selection

Table 2: Comparison of Navigation Systems [1, 2, 3, 4, 5, 6]

Piccolo II Kestrel osFlexPilot Paparazzi ArduPilot Mega


Waypoint Navigation X X X X X
Rolling Takeoff & Landing X
Support DGPS X X X [future]
Search Grid X
Expandable X X X
ITAR-Governed X X
Provide Support X X X
Ease of Transition (1-10) 10 7 7 2 4
Cost ($) 0 15000 6900 700 500

Both research and past experience informed the choice of navigation system. In previous years the team has used both the Kestrel and Piccolo II navigation systems. The Kestrel performed adequately but was difficult to use, did not support rolling takeoff and landing, and was not highly robust or reliable. The Piccolo II system performed far better: the autopilot achieved stable autonomous flight within a month of first use, and it supports autonomous rolling takeoff and landing, although the Cornell team had difficulty taking advantage of this feature in past years. However, the Piccolo system's code is not editable, and its application programming interface (API) is limited and often difficult to use. Thus, the team would be unable to easily interface with the navigation system, which is essentially required by the mission.
There are a few open-source autopilot options, most notably ArduPilot. ArduPilot is supported by hundreds of developers and an even larger community, in addition to official support from 3D Robotics, the primary producer of this open-source autopilot hardware. Furthermore, ArduPilot supports autonomous

takeoff and landing and many other features that will be valuable in future years. There are two main platforms capable of running ArduPlane, ArduPilot's fixed-wing control software: the ArduPilot Mega (APM) and the PixHawk flight system. The team chose the PixHawk since its more powerful processor will let the team expand the autopilot system in the future. The entire autopilot system is significantly less expensive than the Piccolo system, so unlike in years past, the team was able to replicate the system and create a test flight platform in addition to the competition flight platform. This extra test plane allowed for testing of risky maneuvers, such as landing, that could damage the competition platform. The test platform also allowed for immediate integration: the team could test autopilot systems while the competition platform was being built and integrated.
The entire autopilot system needed to integrate seamlessly with other software on the plane and on the ground. The team evaluated the available ground stations that support MAVLink, the communication protocol used by ArduPilot, including QGroundControl, APM Planner, and Mission Planner. These ground stations did not have a robust API and were often unreliable, crashing multiple times during testing. Therefore, the team decided to build on an existing open-source ground station called MAVProxy. MAVProxy is a command-line, console-based ground station written in Python that handles communication with the ArduPilot. Because MAVProxy is open source, the team could modify it and build custom modules. The team first created a robust web API that lets other systems, including the imagery system, easily access autopilot data. In addition, the team designed two web-based interfaces that connect to the API: a graphical user interface for viewing autopilot status and easily controlling the system, and an interface that lets judges view the status of the autopilot. The team also developed a custom module to connect to the interoperability system. The most important aspect of MAVProxy is its flexibility, which will enable the team to expand it in the future.

Figure 8: The main view for the ground station

3.2.2 Implementation & Integration


Tuning To prepare for an autonomous flight of Eos or Icarus, a model of the plane was built to fly in X-Plane, another flight simulator. X-Plane took the place of FlightGear in the simulator due to its easy to

use PlaneMaker software. The model was then tuned in the simulation by manually editing PID parameters. Once the tuning was perfected in simulation, the autopilot was installed in the aircraft.
After a manual takeoff, the autopilot would take over flight and enter AutoTune mode. Using the PID values from the simulation as a starting point, the autopilot tuned itself while the pilot flew aggressive maneuvers. AutoTune takes roughly ten to twenty minutes of flight time. Once AutoTune completed, the plane was landed manually. In subsequent flights, the navigation system, automatic takeoff, and automatic landing were tuned.

Flight Pattern All sections of the flight plan are well defined by the AUVSI SUAS Competition except the search grid. For this portion of the mission, a rectangular sweeping pattern is used that gradually moves down the length of the search grid. The desired spacing between passes is set by the total distance imaged along the roll axis, found to be 180 ft. This distance was computed as part of the distance-imaged-per-pixel calculation shown in Equations (3) and (4). Thus, a rectangular sweeping pattern with passes separated by at most 180 ft will be used.
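The sweep described above can be sketched as a small waypoint generator. The function below works in local coordinates relative to one grid corner for illustration; the actual system plans in GPS coordinates, and the function name and interface are assumptions.

```python
import math

def sweep_waypoints(width_ft, length_ft, max_spacing_ft=180.0):
    """Generate a rectangular sweep over a width x length grid with
    passes separated by at most max_spacing_ft, alternating direction
    on each pass. Returns (x, y) waypoints in feet."""
    # Enough passes that the spacing never exceeds the imaging limit.
    n_passes = math.ceil(width_ft / max_spacing_ft) + 1
    spacing = width_ft / (n_passes - 1)
    waypoints = []
    for i in range(n_passes):
        x = i * spacing
        if i % 2 == 0:
            waypoints += [(x, 0.0), (x, length_ft)]   # sweep "up"
        else:
            waypoints += [(x, length_ft), (x, 0.0)]   # sweep "down"
    return waypoints
```

For a 500 ft by 1000 ft grid this yields four passes roughly 167 ft apart, comfortably under the 180 ft limit.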

Communication All communication with the autopilot travels over a 900 MHz encrypted XBee paired radio link. The communication protocol, MAVLink, is highly robust to noise and is designed to operate over a slow connection to ensure a long-distance link. The XBee link has a rated range of up to 3 miles, although this range was not tested.

Failsafes Both failsafes are implemented in the autopilot. After 30 seconds of lost radio control signal, the plane changes to Return To Launch mode and stays in this mode for 3 minutes. If radio control is not regained after those 3 minutes, the plane performs aerodynamic termination. If the radio control link is regained at any point during the 3.5 minutes of failsafe behavior, the plane returns to the autonomous mission. Both failsafes have been tested on the ground; neither has been tested in the air.
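The timing logic above can be summarized as a small mode function. This is a sketch mirroring the described behavior, not the actual ArduPilot implementation.

```python
def failsafe_mode(seconds_since_rc_loss, rc_link_ok):
    """Mode selection for the RC-loss failsafe described above:
    RC regained at any point -> resume the autonomous mission;
    30 s without RC -> Return To Launch; 30 s + 3 min without RC
    -> aerodynamic termination."""
    if rc_link_ok:
        return "AUTO"        # link restored: resume the mission
    if seconds_since_rc_loss < 30:
        return "AUTO"        # grace period before failsafe engages
    if seconds_since_rc_loss < 30 + 180:
        return "RTL"         # Return To Launch for up to 3 minutes
    return "TERMINATE"       # aerodynamic termination
```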

3.2.3 Testing
The open-source nature of the ArduPilot system allows for changes to its software, and in some ways
actually requires them. Since the ArduPilot software is not a commercial product, the codebase contained
small bugs and was poorly documented in many places. The system, therefore, required extensive testing
and documentation in order to perform reliably. These issues were first addressed in a simulated environment
before being flight-tested on a real aircraft.
The beginning of the year was spent assembling a simulation suite that connected a simulated autopilot to FlightGear, a flight simulator. The simulation runs the same code as the physical autopilot and can simulate all processor and sensor behavior, such as registers, communication delays, and sensor noise. The team wrote a connection layer that passes servo values from the simulated autopilot to FlightGear, which simulates the flight dynamics. The connection layer then takes the attitude and GPS values from FlightGear and passes them back to the autopilot as simulated sensor values. The plane model used in the simulator was provided by the ArduPilot community and very closely resembles the team's autopilot test aircraft (a Bixler V3). This simulation system allowed the team to test and tune the aircraft before the initial flight.
Second, software was tested on a platform that was light and inexpensive, allowing for relatively safe crashes in the event of software issues. The test aircraft also let the autopilot team work with a real aircraft with flight characteristics similar to the competition plane while the mechanical team built and assembled the competition aircraft, Eos.

4 Ground System
The ground system for the Eos UAS is comprised of the equipment and software necessary for wireless
communication, image processing, in-flight re-tasking, and manual override of autonomous airborne systems.

The Eos ground system also includes a pneumatic catapult to provide consistent, reliable launches.

4.1 Catapult
The Eos UAS employs a catapult for takeoff and belly lands in order to facilitate fully autonomous takeoff and landing. Previous iterations of the Cornell team's UAS used fixed landing gear to perform takeoffs and landings from a runway. While the autopilot system is capable of autonomous rolling takeoff and landing, it proved unreliable at best. Therefore, the Cornell team decided to explore other takeoff and landing approaches, such as hand launching. A hand-launching approach suffers from the unreliability and inconsistency of the human operator, and it also limits the size of the UAV. A catapult launch was selected for its reliability, consistency, and proven effectiveness in commercial and military UAV systems. Additionally, catapult launch and belly landing allow the system to be tested even when ground conditions are unfavorable. The Cornell team designed, manufactured, and tested a portable, reliable, and safe linear catapult driven by a compressed-air piston-pulley propulsion system.

Figure 9: Rear of the catapult frame, also picturing the cart
Figure 10: Pneumatic Catapult
Figure 11: Front of the frame, also picturing the pulley, springs, and connection to the reservoir

The Eos catapult is composed of the cart, the rail, the pneumatic system, and the control system. The rail is made of two lengths of galvanized steel square tubing connected by an inner coupling piece. Inside the rail rests the cylinder for the piston, which converts the pressure from the compressed air into a force on the cart. On the rear end is the bracket for a safety pin that attaches to the back of the cart. On the front end are the support legs, shock-absorbing springs for the cart, and the pulley mechanism. Beside the catapult sit the reservoir and the catapult pressure control box (CPCB). The CPCB controls the pressurization of the reservoir and the firing of the catapult. The catapult remote control unit (CRCU) is connected by a CAT5 cable to the CPCB and displays information about the state of the catapult to the user. The CRCU also allows the user to control the pressurization and launch of the catapult from a distance.
During launch, the pneumatic system releases compressed air, which forces a piston rearward down the center of the rail, pulling a cable that loops around a pulley at the front and accelerates the cart and aircraft. Once the cart reaches the end of the rail, it is stopped by two large springs, and the arms that hold the aircraft are free to rotate forward, allowing the aircraft to continue moving straight.
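A rough work-energy estimate shows how launch speed scales with the design parameters. The bore, stroke, and launch mass below are illustrative assumptions (only the 110 psi operating pressure appears in this paper), and friction and air-expansion losses are ignored, so the result is an upper bound.

```python
import math

def launch_speed_mps(pressure_psi, bore_in, stroke_ft, mass_kg):
    """Idealized catapult exit speed: constant piston force P*A acting
    over the stroke, all work converted to kinetic energy."""
    PSI_TO_PA = 6894.76
    IN_TO_M = 0.0254
    FT_TO_M = 0.3048
    area_m2 = math.pi * (bore_in * IN_TO_M / 2.0) ** 2
    force_n = pressure_psi * PSI_TO_PA * area_m2   # constant-pressure idealization
    energy_j = force_n * stroke_ft * FT_TO_M       # work done over the stroke
    return math.sqrt(2.0 * energy_j / mass_kg)
```

With the paper's 110 psi and assumed values of a 2-inch bore, 8-foot stroke, and 8 kg cart-plus-aircraft, the estimate lands around 30 m/s, comfortably above a typical stall speed.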

4.1.1 Safety Features


The catapult incorporates a number of important features to ensure safety. Every component of the pneumatic system is rated to at least 200 psi, significantly higher than the operating pressure of 110 psi. A 175 psi relief valve is connected to the reservoir to prevent the system from approaching the maximum rating of its components. Additionally, the 12 V pump used to pressurize the reservoir can only reach 150 psi. An analog pressure gauge is mounted to the face of the pressure control box, along with a manual pressure release valve, so that even if all electronics fail, the pressure can be read and the reservoir safely depressurized. During typical operation, the CRCU allows pressurization, monitoring, and launch to be performed from a safe distance. The current battery voltage, safety pin status, target pressure, and reservoir pressure are all displayed on the CRCU's screen. Only once the target pressure is reached does the microprocessor stop the pressurization, after which the safety pin can be removed via a string. The covered switch then lights up to indicate that it is live and the catapult is ready to fire.
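The arming sequence above reduces to a small state function. This is a hypothetical sketch of the CPCB/CRCU interlock logic, not the actual firmware.

```python
def catapult_state(reservoir_psi, target_psi, pin_removed):
    """Arming states implied by the procedure above: pump until the
    target pressure is reached, then allow the safety pin to be
    pulled, then indicate the system is live."""
    if reservoir_psi < target_psi:
        return "PRESSURIZING"      # microprocessor keeps the pump running
    if not pin_removed:
        return "PRESSURIZED_SAFE"  # at pressure; pin still restrains the cart
    return "ARMED"                 # covered switch lights up; ready to fire
```

The key property of this ordering is that the pin can only be pulled after pumping has stopped, and the fire switch only goes live after the pin is out.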

4.1.2 Testing Results


Numerous incremental tests confirmed the safety features and operational ability of the catapult. First, the reservoir was pressure tested to 175 psi while filled with water to ensure that any failure would not be catastrophic. The safety pin was tested by firing the catapult at 150 psi with the pin installed; afterward, there were no signs of damage to either the pin or the catapult. Prior to launching a real aircraft, an old fuselage was outfitted with the proper mounting holes and test-launched. High-speed video confirmed that the proper velocities were achieved and that release occurred as designed.
Finally, a full takeoff was performed. The catapult successfully launched Icarus, which throttled up after leaving the cart. Upon landing, the aircraft and catapult were inspected, and no damage was found on either system. This field test also verified the transportability of the catapult system: setup and breakdown were consistently performed in under two minutes.

4.2 Communication System


The ground portion of the communications system allows software on the ground to communicate with the aircraft payload system to transfer imagery and send commands (for air drop and SRIC, for example). To achieve the high gain needed for high-bandwidth, long-range communication, the ground communications system uses three directional patch antennas. To be effective, these antennas are mounted on an antenna-pointing machine, which keeps the aircraft within the antennas' 60-degree beam. The antennas are connected to a NETGEAR R7000 router, which provides the 5.8 GHz 802.11n network to which the Bullet M5s in the aircraft connect. The data is then transferred over Gigabit Ethernet to a NETGEAR GS108T network switch, to which all ground computers responsible for manual target recognition and ground software are connected.
The antenna-pointing machine is controlled by an Arduino Mega microcontroller board using two H-bridges to drive DC motors mounted on the machine's pan and tilt axes. It receives azimuth and elevation angles from a script that reads the aircraft and ground station locations from the navigation ground station.
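The azimuth and elevation computation performed by that script can be sketched as follows. This uses a flat-earth approximation, which is adequate at mission ranges of a few miles; the function name and exact interface are assumptions.

```python
import math

def antenna_angles(ground_lat, ground_lon, ground_alt_m,
                   air_lat, air_lon, air_alt_m):
    """Azimuth (degrees clockwise from north) and elevation (degrees
    above the horizon) from the antenna station to the aircraft."""
    R = 6371000.0  # mean Earth radius, meters
    north = math.radians(air_lat - ground_lat) * R
    east = (math.radians(air_lon - ground_lon) * R
            * math.cos(math.radians(ground_lat)))
    up = air_alt_m - ground_alt_m
    azimuth = math.degrees(math.atan2(east, north)) % 360.0
    elevation = math.degrees(math.atan2(up, math.hypot(north, east)))
    return azimuth, elevation
```

An aircraft due east of the station at 100 m altitude and roughly 1.1 km range, for example, gives an azimuth of 90 degrees and an elevation of about 5 degrees.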

4.3 Targeting System


The Cornell team developed a distributed computer system to solve the challenges of the SUAS mission.
The system is designed to be tolerant of network failure, temporary power failure of the payload application,
temporary failure of the server, and permanent failure of any of the clients. It has been thoroughly tested
in these circumstances and has been shown to perform correctly.
The system is based on a server-client architecture and uses web technologies heavily. On the ground,
a single server handles web requests and interfaces with the persistent database, PostgreSQL. In the event
of power failure on the machine running the server, the machine simply needs to be restarted and it will

resume exactly where it left off. To accomplish this, the server is stateless. All internal state is offloaded
to PostgreSQL, which handles the complex persistence and recovery logic automatically. The ground server
can interface with any number of clients.
The client application is designed to run in a web browser using standard JavaScript, HTML, and CSS. This takes advantage of the ubiquity of the browser as an application platform: deploying the application on nearly any computer is as simple as opening a page. Each client enables its operator to perform the targeting task and, by interfacing with the server, to combine work done on multiple clients. The client is also capable of issuing commands to the payload application.

4.4 Automatic Detection & Classification


The algorithm for the automatic detection, localization, and classification (ADLC) task consists of four
stages. The first stage finds small sections of the image that are highly likely to contain targets. The second
step computes the confidence that a region is a target, and what color the target is. After this the target
shape and alphanumeric are determined. Finally, duplicate targets are removed and the highest confidence
targets are returned.
The region-of-interest algorithm uses a number of image filters to generate a grayscale image in which intensity indicates how interesting a pixel is. The first filter takes the difference between the hue of a pixel and the average hue of the pixels in a 51x51 box around it, highlighting pixels whose color differs from that of their neighbors. The second filter is identical, except that it operates on the saturation channel; this suppresses sections of the image with very uniform saturation, such as runways. The pixel-wise product of these two filtered images is taken, resulting in one grayscale image. Contours with a certain size and circularity are then found in this image, and their bounding boxes are passed to the next stage of the algorithm.
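The two filters can be sketched in a few lines of NumPy. The real pipeline presumably uses OpenCV's filtering primitives; the integral-image box mean below is a self-contained stand-in, and hue wraparound is ignored for brevity.

```python
import numpy as np

def box_mean(channel, k):
    """Mean of each pixel's (2k+1)x(2k+1) neighborhood, computed with
    an integral image (edge-padded so the output matches the input)."""
    p = np.pad(channel.astype(np.float64), k, mode="edge")
    s = p.cumsum(axis=0).cumsum(axis=1)
    s = np.pad(s, ((1, 0), (1, 0)))  # zero row/column for the box-sum trick
    n = 2 * k + 1
    total = s[n:, n:] - s[:-n, n:] - s[n:, :-n] + s[:-n, :-n]
    return total / (n * n)

def interest_map(hue, sat, k=25):
    """Pixel-wise product of the hue-difference and saturation-
    difference filters described above (k=25 gives the 51x51 box)."""
    hue_d = np.abs(hue.astype(np.float64) - box_mean(hue, k))
    sat_d = np.abs(sat.astype(np.float64) - box_mean(sat, k))
    return hue_d * sat_d
```

A small colored patch on a uniform background produces a strong response at the patch and zero response in the uniform regions, which is exactly the behavior the contour-finding stage relies on.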
The target classifier determines the confidence that each region is a target by first determining the color of each pixel in the region using the hue, saturation, lightness (HSL) color space. Then, for every color, the average distance in pixels to a number of concentric circles centered on the region is calculated. The vector of distances for each color is normalized to unit magnitude. The alphanumeric color is the color with the smallest normalized distance to one of the small concentric circles; the target color is likewise the color with the smallest normalized distance, but to a larger circle. The confidence that a region is a target is calculated from the normalized distances of the alphanumeric and target colors.
To determine the shape of a target, the region is first segmented using the K-means algorithm, which determines which pixels belong to the background, the target, and the alphanumeric. The algorithm then finds the corners and sides of the shape. A number of metrics, such as the number of parallel sides and the ratio of perimeter to area, are then calculated and compared against a list of known shape properties to determine the most likely shape.
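The metric-matching step might look like the toy classifier below. The specific metrics and thresholds are illustrative assumptions, not the team's actual property table.

```python
def classify_shape(num_sides, num_parallel_pairs, perimeter, area):
    """Match measured polygon metrics against known shape signatures.
    The isoperimetric ratio 4*pi*A/P^2 equals 1.0 for a circle and
    drops for pointier shapes, making it a convenient roundness test."""
    roundness = 4 * 3.141592653589793 * area / (perimeter ** 2)
    if roundness > 0.9:
        return "circle"
    if num_sides == 3:
        return "triangle"
    if num_sides == 4:
        return "rectangle" if num_parallel_pairs == 2 else "trapezoid"
    if num_sides == 5:
        return "pentagon"
    return "unknown"
```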
In the last stage, duplicate regions are merged and the six highest-confidence targets are returned. Regions with similar latitudes and longitudes are grouped together, keeping the information from the highest-confidence region in each group. The groups are then sorted by confidence and the six highest are returned.
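The merge-and-rank step can be sketched as below. The 50 m merge radius and the flat-earth distance approximation are assumptions for illustration.

```python
import math

def best_targets(sightings, merge_radius_m=50.0, n=6):
    """Merge sightings with nearby GPS locations and return the n
    highest-confidence targets. Each sighting is a dict with 'lat',
    'lon', and 'confidence' keys. Processing in descending confidence
    order means each group automatically keeps the information from
    its highest-confidence sighting."""
    groups = []
    for s in sorted(sightings, key=lambda s: -s["confidence"]):
        for g in groups:
            dlat = (s["lat"] - g["lat"]) * 111320.0  # meters per degree latitude
            dlon = (s["lon"] - g["lon"]) * 111320.0 * math.cos(math.radians(g["lat"]))
            if math.hypot(dlat, dlon) <= merge_radius_m:
                break  # duplicate of an existing, higher-confidence group
        else:
            groups.append(s)
    return groups[:n]
```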
The result of these components is a pipeline with dozens of tunable parameters that drastically affect the accuracy and precision of the algorithm. To find the optimal set of parameters, the team uses limited-memory BFGS (L-BFGS), a quasi-Newton optimization method. L-BFGS runs the algorithm on a test set of old flight images, estimating the gradient with respect to the parameters and adjusting them accordingly until a minimum in parameter space is found.
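The tuning loop can be expressed with SciPy's L-BFGS-B implementation, which estimates gradients by finite differences when none are supplied, matching the gradient-estimation approach described above. The quadratic stand-in score is purely illustrative; in the real system the score function would run the full ADLC pipeline over a set of old flight images.

```python
import numpy as np
from scipy.optimize import minimize

def tune_parameters(score_fn, x0):
    """Minimize a pipeline error score over its tunable parameters
    with limited-memory BFGS (numerical gradients)."""
    return minimize(score_fn, x0, method="L-BFGS-B").x

# Hypothetical stand-in score with its minimum at parameters (3, -2).
def demo_score(p):
    return (p[0] - 3.0) ** 2 + (p[1] + 2.0) ** 2
```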

4.5 Localization
In the Eos ground system, after the location of a target is identified in an image (whether through manual or automatic means), its GPS location is determined through a single localization algorithm. Though thresholds

for scoring are specified in the mission outline, the team did not design to them, instead simply attempting to devise a correct algorithm that produces good results once error is factored in. This consolidated localization algorithm estimates the GPS location of each target in three distinct steps. First, for each sighting of a target in an image, the location of the target relative to the aircraft is estimated by calculating a pointing vector from the aircraft to the target.
Each pointing vector is initially constructed pointing straight down and is then rotated to account for the camera's orientation and the target's location in the image. The rotation of the camera in world space is measured by an IMU mounted on the camera, and this state is captured by the gimbal controller at the same instant the camera is triggered. This ensures that the recorded position and orientation of the camera are as accurate as possible; any timing error would make this data incorrect. The implicit rotation from the camera's line of sight to the intended pointing vector is provided by the target's location in the image. In the second step, for each of the calculated pointing vectors, the GPS location of the target is estimated: from the aircraft location, the target location is derived via vector addition with the pointing vector. As a final step, the GPS location estimates from each sighting of a target are combined into a single final estimate: the data is cleaned of outliers and impossible points, and the average of the remaining location estimates is calculated.
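The three steps can be sketched as follows. The level-flight, flat-ground geometry (rather than a full 3-D rotation) and the one-standard-deviation outlier filter are simplifying assumptions for illustration, not the team's exact method.

```python
import math
import statistics

def target_offset_m(altitude_m, cam_pitch_deg, cam_roll_deg,
                    px_angle_x_deg, px_angle_y_deg):
    """Steps 1 and 2: tilt a straight-down pointing vector by the
    camera attitude plus the target's angular offset in the image,
    then scale by altitude to get the target's (north, east) offset
    in meters from the point directly below the aircraft."""
    pitch = math.radians(cam_pitch_deg + px_angle_y_deg)
    roll = math.radians(cam_roll_deg + px_angle_x_deg)
    return altitude_m * math.tan(pitch), altitude_m * math.tan(roll)

def combine_estimates(points):
    """Step 3: drop (lat, lon) estimates more than one standard
    deviation from the mean, then average the survivors."""
    lats = [p[0] for p in points]
    lons = [p[1] for p in points]
    mlat, mlon = statistics.mean(lats), statistics.mean(lons)
    slat = statistics.pstdev(lats) or 1e-12
    slon = statistics.pstdev(lons) or 1e-12
    kept = [p for p in points
            if abs(p[0] - mlat) <= slat and abs(p[1] - mlon) <= slon]
    kept = kept or points  # never discard every estimate
    return (statistics.mean([p[0] for p in kept]),
            statistics.mean([p[1] for p in kept]))
```

A camera pitched 45 degrees forward at 100 m altitude, for example, places the target 100 m ahead of the aircraft, and a single wildly wrong sighting is rejected before averaging.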

5 Mission Operation
Ten people operate the Eos UAS. First, there is a mission lead, who has no predefined task other than to coordinate the other operators and serve as a unified interface between the team and the evaluators. Next, there is a safety pilot who performs the manual takeoff, monitors the flight while under autopilot control, and performs the manual landing; he can take manual control at any time during the flight if necessary. By his side is the spotter, whose only role is to assist the safety pilot by adjusting trims and ensuring that the runway is clear, among other things. The remaining operators are seated at the ground station, each interfacing with a piece of the ground software. The autopilot operator is responsible for the autonomous navigation and flight portion of the mission and assigns flight plans and re-tasks the aircraft according to the mission requirements and requests from other operators. He is assisted by the autopilot assistant, who monitors the status of the autonomy software during re-tasking. For the imagery task, the Cornell team has an imagery lead who is responsible for changing camera settings, merging target sightings into logical targets, providing actionable intelligence, and delivering the target information to the evaluators at the end of the mission. The remaining four operators use the MDLC application to identify and tag targets in the imagery; they may also assist the imagery lead, from their own workstations, in completing any of his tasks. These operators also monitor automated tasks to ensure that the mission can still be completed in the event of a failure: one monitors the ADLC program, another monitors the antenna-tracking machine, a third monitors the completion of the SRIC task, and the last is responsible for arming the air drop system and executing the manual override should the automated drop fail.
The mission timeline consists of three phases: setup, execution, and teardown. Setting up the ground station involves placing the tables and chairs, setting up the individual computer nodes, connecting all network cables to switches and computers, connecting all power cables to power strips or uninterruptible power supplies, initializing the antenna tracker, and placing the sunshades on the tables to reduce glare on the computer monitors. The execution phase starts with the mission clock. The safety pilot completes a short preflight test to verify the operation of the payload components, manual aircraft control, and navigation system sensor readings. This preflight test takes around five minutes and helps to ensure mission success upon takeoff. The next step is a manual takeoff followed by 20 minutes of flight time.

5.1 Safety
Adequate safety precautions must be observed while testing and operating an aircraft, especially systems with some degree of autonomy. The Cornell team strongly emphasized safe operation of the flight vehicle through all phases of design, development, and flight operations to mitigate the risk of personal harm or property damage. The team has designed a system with redundancies, failsafes (as defined by the mission requirements), and appropriate factors of safety that significantly reduce potential risks during system operation. During flight operations, the safety pilot retains the ability to instantly regain full manual control of the aircraft at any time by toggling a switch on the pilot transmitter. The safety pilot is an AMA-licensed pilot with years of experience flying model aircraft. A spotter stands by the safety pilot at all times to observe the surrounding areas for other aircraft or hazards, maintain flight line safety, and communicate with the other ground system personnel over a handheld radio.

6 Summary
The Cornell University Team has designed the Eos Unmanned Aerial System to optimize mission perfor-
mance by creating subsystems that complement one another. Each system, as well as the whole UAS, was
implemented and thoroughly tested to ensure safe and reliable mission operation. Through thoughtful design
and testing, the team is confident that the Eos UAS would have been able to complete the following mission
tasks: manual takeoff and landing; autonomous flight and waypoint navigation; manual sense, detect, and
avoid; interoperability; target detection, localization, and classification; autonomous search; automatic tar-
get localization; actionable intelligence; off-axis target imagery and classification; emergent target re-tasking
and identification; simulated remote intelligence center interaction; and autonomous air drop release with
required drop accuracy.
Unfortunately, while perfecting tuning parameters shortly before team members left campus, the Eos
UAS entered an unrecoverable flat spin and crashed. As a result, the team will not be flying at the 2015
SUAS Competition.

References
[1] Cloud Cap Technology. Cloud Cap Technology documentation. http://www.cloudcaptech.com/piccolo_II.shtm#history, 2013.
[2] Lockheed Martin. Kestrel flight system. http://www.lockheedmartin.com/us/products/procerus/kestrel.html, 2013.
[3] Lockheed Martin. Product price sheet. http://www.lockheedmartin.com/content/dam/lockheed/data/ms2/documents/procerus/procerus_pricing_022713.pdf, October 2013.
[4] Airware. osFlexPilot. http://www.airware.com/products/osflexpilot/, 2013.
[5] Paparazzi. Paparazzi. http://paparazzi.enac.fr/wiki/Main_Page, March 2013.
[6] DIY Drones. ArduPlane. https://code.google.com/p/ardupilot-mega/, 2013.
[7] Daniel Raymer. Aircraft Design: A Conceptual Approach. American Institute of Aeronautics and Astronautics, Inc., Washington, D.C., 3rd edition, 1999.
[8] CUAir: Cornell University Unmanned Air Systems Team. Helios autonomous air system, May 2014.
[9] Mark Drela. XFOIL subsonic airfoil development system. http://web.mit.edu/drela/Public/web/xfoil/, April 2008.
[10] Itseez. OpenCV. http://opencv.org/, 2013.
[11] Thomas Davis. Linux Ethernet bonding driver. https://www.kernel.org/doc/Documentation/networking/bonding.txt, 2011.
[12] Ubiquiti Networks. Bullet M. http://dl.ubnt.com/datasheets/bulletm/bm_ds_web.pdf, 2011.
