
Infotech@Aerospace 2012 AIAA 2012-2417

19 - 21 June 2012, Garden Grove, California

Cyber Security for Aeronautical Networked Platforms –
What does it mean to me in commercial aviation design?
Chuck Royalty¹
The Boeing Company, Seattle, WA 98124

Cyber attacks and cyber security are in the world news every day. Systems controlled by
embedded computers (Cyber-Physical System or CPS) are increasingly featured as being at
risk, with concern continually expressed over the vulnerability of critical infrastructure to
damage and disruption from attacks over networks. Digital control systems in aircraft are
also CPS, and are likewise the subject of concern. Both FAA and EASA have created
regulations (Special Conditions) specifically to address such concerns, and RTCA and
EUROCAE have published process guidance. This guidance is expected ultimately to
establish standards of practice for regulators and the industry to develop and certify aircraft
for which network security is a requirement. However, the implications to project costs and
schedules, designs, and the practical aspects of implementation remain largely unknown to
aircraft system designers. This paper provides a practical understanding of how cyber
security impacts airplane computer system design, in the presence of existing safety,
development, and certification requirements and processes.

I. Introduction

Use of computers in aircraft systems is a long-standing example of computers as embedded systems. More
recently, the term “Cyber-Physical System” (CPS) was coined to refer to the use of computers to interact
with machines that control or physically interact with their environment.1 Air transportation systems and airplanes
are major CPS instantiations, and safety and security are central requirements for enabling high confidence CPS.2, 8
For the typical airplane system designer the potential effect of cyber security requirements on system design and
certification activities is particularly worrisome. Several important open questions remain to be answered for the
airplane system designer, including: What will security do to costs and schedules? Can one certify designs? How
does one determine, and show, that a design has the necessary security? How does one determine whether the
design needs security? For that matter:
• Why and when did security become part of safety-related systems?
• What aspect of my design or system makes regulators believe I need to address security?
• What is “cyber security” anyway?
• What causes my system to be exposed to cyber security threats?
• What does it mean to me as a system designer? To my requirements? To my design? To my
implementation?
• How would I even begin to address it?
• My system only has interfaces to other installed systems. Why would I need to worry about security?
• My system performs no security functions. Security isn’t an issue, right?
• I’m a system designer. Security is a software problem, right?
• I’m a software designer. Security is a system problem, right?
• I’m a complex hardware designer. Security isn’t my problem, right?
• I’m required to use ARP4754, DO-178B, DO-254 for my complex computer-based systems. How do I add
security to that?

This paper provides a pragmatic view on how cyber security impacts design and development of software-based
embedded systems in aircraft, specifically designs developed using DO-178B, DO-254, and ARP4754 or ARP4754A.
Starting with the premise that someone (say, a regulator) says that for a system design to be “safe” it must address
network or cyber security, the paper offers insight to help answer the questions listed above.

¹Associate Technical Fellow, Boeing Commercial Airplanes, P.O. Box 98124 M/C 02-98, Seattle, WA 98124

Copyright © 2012 by The Boeing Company. Published by the American Institute of Aeronautics and Astronautics, Inc., with permission.
The remainder of this paper is organized as follows. Section II motivates the need for cyber security of
aeronautical networked systems. Section III describes the considered operational environment for the networked
airplane and its computer systems. Section IV presents regulations that exist for safety and security for the
networked airplane. Section V presents the challenges in bridging safety and security considerations for airplane
computer systems. Section VI proposes an approach of introducing security considerations for airplane computer
systems as an access control requirement and function. Section VII provides an example of designing a safe and
secure airplane computer system. Section VIII presents conclusions.

II. The Need for Cyber Security of Aeronautical Networked Systems

A. Internet-Based Threats for Aeronautical Networked Platforms


Figure 1: Example networked embedded systems using the vulnerable Internet (systems for monitoring and control reach their users over the Internet, where threats also reside)

Fig. 1 illustrates networked, rather than isolated, systems that are increasingly connected to the Internet for various
critical, essential, or non-critical services and applications. These systems are not intended to communicate with
anyone except the people who are supposed to control them. Unfortunately, the Internet has well-known
vulnerabilities (e.g., identity spoofing). Access by unauthorized people, viruses, and malware are persistent threats
to IT systems, which are embedded everywhere from our homes to national critical infrastructures. Increasingly,
Internet-based threats impact these networked embedded systems. For example:
“After a security breach at the United States Chamber of Commerce last year, the Chamber discovered that its
office printer, and even a thermostat in a Chamber-owned apartment, had been communicating with an Internet
address in China.” (New York Times, January 22, 2012)
It is interesting to note that the thermostat was part of the HVAC system; and the HVAC system was part of the
building – and who knows what the system and component designer thought should be connected to each other and
to the Internet. That is a large capital asset to have strangers wandering around in. Had this been an airplane, what
should the designers have done? The remainder of this paper takes up that question.

B. Emergence of the Airplane as a Networked System


As long ago as the early 1990s, designs were being developed to allow airplanes to connect to external networks.
In the early 2000s, serious standards activity began on architecture concepts for networked aircraft. Security, in
particular, was seen as a required new aspect of airborne networks – both at the edge of the airplane and between
systems.
The AEEC Aircraft Data Networks and Security subcommittees (since combined into the Network Infrastructure
and Security (NIS) committee) have published specifications and reports that define and address issues relating to
the increased use of networks on aircraft, including security. Two important documents were published in 2005,
ARINC 664 Part 5, “Aircraft Data Network Part 5 – Network Domain Characteristics and Interconnection” and
ARINC 811, “Commercial Aircraft Information Security Concepts of Operation and Process Framework”.6, 7
Aircraft-to-ground network communications were considered in the most detail, but in particular the ADN
subcommittee noted that inter-system connections tend to be pervasive and also need to be considered with respect
to network security threats. If you develop or work on airborne computer systems and this is completely new
information, these documents provide important background. Certainly not everyone working in airplane systems
development needs to be a security expert, but every developer should have a basic understanding and appreciation
for the changes that have been occurring in airplane systems technology, along with the implications of those
changes. Hence the committees concluded that security needs to become ingrained in the system development
culture, just as safety is.

III. Safe and Secure Operational Environments of a Networked Airplane



Fig. 2 shows that aircraft are designed (and crew/servicing personnel trained) to adequately address and mitigate
environmentally created risks to safety. Part of that safety risk is posed by people, and part of the mitigation of that
risk is a physical perimeter established and maintained by airport authorities, operators and governments such that
many threats (people wanting to cause harm) outside the perimeter are prevented from reaching aircraft in order to
cause harm. This is the original and primary security model for aircraft. It is so ubiquitous that it often is taken for
granted while also not being formally and uniformly defined.
Note that mixing people with other aspects of the environment into Fig. 2 below is a bit like “mixing apples and
oranges” when most people think of safety vs security. However, as will be shown, there’s substantial commonality
in dealing with natural environmental threats and with people as threats.

Figure 2: A physical security model of aircraft (environmental threats such as lightning, pressure, vibration, temperature, ice, RF fields, acceleration, ionizing radiation, water, and communication signals surround the aircraft, while a physical perimeter separates the public from screened, trained personnel and screened, ticketed passengers)

With both types of threats, the airplane need be designed only to mitigate the threats it will be exposed to. If it
can be known or confidently asserted that there will be no exposure to a threat, then the design typically need not
address it. However, designers are not free to assume whatever they want about the airplane’s environment. In
most areas the environment is carefully defined for each relevant discipline. With respect to people and the potential
threats they represent, the airplane and systems design does not need to address the most severe of those threats
because there are general “external agreements” (enforced by regulations) that permit the airplane designer to rely
on them to a specific degree. Specifically, the external agreements are enforced within the physical perimeter so
that the capability of individuals to cause harm is limited.
As long as the design of the airplane generally matched this operational model, designers were fairly free to
ignore the security issues that emerged when the Internet became a reality. Once networking and Internet connectivity
entered the airplane design space, the situation changed, as will be discussed next. However, we note that the focus of this
paper will be on the airplane computer system design problem and not on airplane design.

A. Airplane Computer System and Its Environment
As shown in Fig. 3, within the airplane, systems are designed to work within a specific worst case environment,
again with a physical perimeter that serves to preclude exposure to some threats, or to reduce the severity of them
(e.g., temperature extremes). This system happens to be in the Electrical Equipment (EE) bay; note that passengers
(screened, ticketed people) do not appear within the inner area because the internal physical perimeter excludes them.

Figure 3: Potential threat model for an airplane computer system (the digital system sits within the physical perimeter, surrounded by environmental factors such as vibration, lightning, temperature, acceleration, cooling, electrical power, RF and ionizing radiation, and communication signals; network paths connect the public, the Internet threat, and passengers to systems inside the perimeter)

In Fig. 3, the blue lines, which connect the Public, Internet threat and Passengers to systems inside the Physical
Perimeter, illustrate a relatively newly-recognized threat to airplane systems that is specifically associated with IT
systems and network connections. People and computers attempt to compromise the operation of other computers
through electronic interfaces – both wired and wireless. Networks contribute to the issue by making it possible for
people and computers to relatively easily exchange messages with other computers. The blue ovals and lines in the
figure illustrate paths by which people who are physically isolated from the computers that need to be protected may
actually gain unauthorized access to them. At that point, an improperly designed airplane system may be
compromised by a remote entity.
It shouldn’t be forgotten that systems don’t work in isolation. They communicate with each other, with people
(such as flight and cabin crew) on board, and with off-board systems and people outside the airplane. A
compromised computer may serve as a means to compromise additional computers within its “reach”, in large part
because other people and systems assume that it would never attempt to attack them, and hence make no attempt to
design against that kind of unanticipated behavior.

IV. Regulations for Airplane Safety and Special Conditions for Networked Airplane Security
Aircraft designs are approved for sale and use in every country based on that country’s regulations. In the U.S.
the Code of Federal Regulations (CFR), Title 14, Part 25 addresses the aspects of design that are the focus of this
paper. Digital systems are typically required to show compliance to two regulations of particular interest, excerpted
below:

14 CFR 25.1301
(a) Each item of installed equipment must—…
(4) Function properly when installed.

14 CFR 25.1309
(b) The equipment, systems, and installations … must be designed to ensure that they perform their intended
functions under any foreseeable operating condition.
(d) … The analysis must consider—
(1) Possible modes of failure, including malfunctions and damage from external sources.

14 CFR 25.1301 and 25.1309 are two key regulations for systems. These two regulations say that systems must
work under all conditions in which they are expected to operate or to be exposed, and must fail in predictable,
understood ways. Failure must be anticipated and addressed – high impact failures must have a very low probability
of occurrence. Low impact failures may have a higher probability of occurrence. Essentially, failure modes and
mechanisms must be included in the design. This is the basis for the minimum standard for airplanes to be declared
“fit for use”. Figuring out what the failures are, how they impact the airplane, crew and passengers, and how likely
they are to occur is at the heart of system development. It affects the system architecture, impacts the work statement
to develop the systems, and ultimately determines if an airplane can be certified.
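To make the inverse relationship between severity and allowable likelihood concrete, the following sketch (in Python, purely illustrative) encodes the order-of-magnitude per-flight-hour probability targets commonly associated with 25.1309 guidance such as AC 25.1309-1A; the minor-category figure is a stand-in, and no such lookup substitutes for an actual safety assessment.

    # Illustrative only: order-of-magnitude probability targets per failure
    # condition severity, in the spirit of AC 25.1309-1A. A real safety
    # assessment involves far more than this lookup.
    PROBABILITY_BUDGET = {          # max average probability per flight hour
        "catastrophic": 1e-9,       # extremely improbable
        "hazardous":    1e-7,       # extremely remote
        "major":        1e-5,       # remote
        "minor":        1e-3,       # probable failures acceptable (stand-in value)
    }

    def meets_budget(severity: str, predicted_rate_per_fh: float) -> bool:
        """Check a predicted failure rate against the budget for its severity."""
        return predicted_rate_per_fh <= PROBABILITY_BUDGET[severity]

    # A major failure condition predicted at 1e-6 per flight hour fits its
    # 1e-5 budget; at 1e-4 it would not.
    assert meets_budget("major", 1e-6)
    assert not meets_budget("major", 1e-4)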
Several years ago, the FAA decided that the two regulations didn’t adequately anticipate threats to safety from
malicious access when networking began to be prevalent. Whether they did or not, there certainly was no guidance
as to how to show that exposure to malicious activity over networks was properly addressed to meet the
requirements of 25.1301 and 25.1309. As a result the FAA created new regulations, called Special Conditions, to
impose new certification requirements.3, 4 An analysis of those FAA requirements reveals that the safety objective
of the Special Conditions is effectively the same as 1301 and 1309, but specifically adds the requirement that
network security threats to systems must be included as a consideration for design and analysis. In essence, the
Special Conditions declare network security threats to be foreseeable in some system architectures. Hence, network,
or cyber, security became a subject of aircraft system certification. In doing so, some complications emerged as
discussed next.

V. Bridging Safety and Security of Airplane Computer Systems

A. Fundamental Gap between Safety and Security


The FAA’s primary mission, and especially the focus of regulations addressing aircraft certification, is safety:
“Our continuing mission is to provide the safest, most efficient aerospace system in the world”
(http://www.faa.gov). In pursuit of that mission, the FAA sets minimum standards for evaluating the design and
production of aircraft to determine whether they are fit for their intended use. These standards overwhelmingly
concentrate on the safety of the aircraft and occupants.
As a discipline, however, cyber security is concerned with a much broader range of impacts than physical injury
or damage, and the question of securing computer systems that may have physically hazardous failure modes is a
relatively immature aspect of cyber security. Cyber security as a discipline covers the entire spectrum of computer
usage, but by far the most experience and attention is on computer systems that process information unrelated to
physical hazards. Hence, although computer system security may have similar abstract objectives across all uses of
computers, the actual implementation issues may be very different (e.g., compare a desk calculator with a hydraulic
actuator), because domain-specific skills and experience dramatically affect the success of product development.
We have seen that aircraft development requires great attention to preventing physical harm from failures. In
fact, security as a discipline is also entirely focused on how to prevent harm from failures. The difference tends to
be in what causes are typically considered. In commercial aircraft product development and certification, relatively
little attention has historically been paid to intentional causes of hazards. Development guidance for the regulations,
in fact, explicitly excludes sabotage as a cause relative to compliance with 14 CFR 25.1309. Operational
requirements are the primary area in which defense against malicious activity is addressed. In cyber security as a
discipline, on the other hand, the primary focus tends to be intentional causes of failures. But the distinctions are
relative.
When we look at the overlap between the general safety concerns for aircraft systems and cyber security
concerns in other industries, we find one relatively small subject area – malicious activity, addressed by typical
cyber security processes, but not generally addressed by airplane design for the type certificate (see Fig. 4). In the
end, though, there’s one airplane, and it needs to address the impact of every aspect of operation. All of the
development activities need to produce a single result. If security and safety disciplines have overlap in their work
statements but have inconsistent scopes, this can lead to wasted work and, worse, potential gaps in the product.

Figure 4: Safety and security overlaps and differences (the safety assessment scope covers airworthiness effects from the environment, failures, human error, and external dependencies; typical IT certification and accreditation covers malicious activity; the two scopes overlap where malicious activity is foreseeable in, and not mitigated by, the operating environment)

Logically, it makes sense to consider how to extend the safety development and assessment process to
encompass that additional item when it’s necessary to do so. In particular, when the opportunity for malicious
computer activity is present, it is also foreseeable today that someone will attempt it (Fig. 5). Consider what happens
with this process, however. Security isn’t something that some small group does for everyone – rather everyone has
some role to play.

Figure 5: Malicious activity in airworthiness certification consideration (the combined safety/security assessment scope for cyber considers effects from foreseeable events: environment, failures, human error, external dependencies, and malicious activity, the last being foreseeable in some system architectures)

It’s worth noting here that the “environment” that is typically considered in safety and the “environment”
considered in security possess markedly different characteristics. In particular, the specific security threats of a
system environment are largely artificial (human origin) and intentional, while safety processes largely consider
natural and unintentional (“human error”) threats. Since intentional threats are goal oriented, they will change to
exploit vulnerabilities, weaknesses, and system errors as such things become known. Recognizing this distinction is
important to design and continued safe / secure operation, but is not specifically addressed in this paper outside of
the inferences that can be drawn from the discussion in sections VI and VII.
B. Challenges from Complex Airplanes
Historically, development standards for airplanes and systems didn’t explicitly consider cyber security, and the
question is often asked, “Why not use existing security development and assessment processes in addition to our
usual safety practices?” There is no single, and arguably no “right” answer. However, development of an airplane
generates enormous quantities of engineering data across many organizations – often in different companies. Our
development standards define processes and methods that require the collection of this data from wherever it’s
generated to support the analyses (see Fig. 6). This must be done because, while many systems and functions
contribute to safely flying the airplane, the loss or malfunction of a relatively small number of them can result in
catastrophe. So an enormous amount of effort goes into determining exactly what the effects of any particular
failure or malfunction, and of all likely combined failures and malfunctions, will be, and how severe (recall 14 CFR 25.1309).

Figure 6: Complex processes for safety-assured airplane development and assessment (engineering data is collected from the airplane, systems, and items levels to support the development and assessment processes; airplanes are very complex)

Security design and analysis tasks also look at data from top to bottom and potentially across all systems and
functions. The scope of security activities is the same as the scope covered by airplane development processes, and
includes many of the same failure considerations – which we’ll discuss later. So one obvious way of introducing
security considerations is to do so within the data and process framework that already covers that same scope,
particularly since the certification aspects related to security are primarily interested in one question: Is the airplane
safe when considering security threats?

C. Potential Common Aspects between Safety and Security


It is widely recognized that most product security vulnerabilities result from errors in requirements, design, and
implementation. The Department of Homeland Security provides a web site titled “Build Security In”. The Mission
page of that web site speaks of the need for software assurance:

“Because software is essential to the operation of the Nation’s critical infrastructure and it is
estimated that 90 percent of reported security incidents result from exploits against defects in the
design or code of software, ensuring the integrity of software is key to protecting the infrastructure
from threats and vulnerabilities, and reducing overall risk to cyber attacks.”

Thus, the assurance, or lack thereof, that systems operate properly is traceable to 90 percent of reported cyber
security incidents. (https://buildsecurityin.us-cert.gov/bsi/mission.html)

The elimination of errors in those areas is a key focus for aircraft system development as well. Both safety and
security rely on development producing:
• Correct, complete requirements, and
• Correct design and implementation.

That is, systems must satisfy their requirements and not perform unintended functions. Naturally, systems also
rely on correct operation, but correct operation relies on understanding what the designer assumed and requires. So it
turns out that development guidance for safety and development guidance for security have a lot in common,
although it’s often not expressed in the same way. But another aspect of 25.1309 emerges: How much assurance is
needed in any situation, and how does anyone know it’s been achieved? That’s addressed through a complex process
of architecture development, allocation, and analysis, by the assignment of assurance levels to functions, systems,
software, and hardware. This is the subject of SAE ARP 4754A, RTCA DO-178B (and C), and RTCA DO-254.
The assignment of assurance levels results in development processes that provide varying levels of confidence
that the requirements are correct and that they are correctly implemented. This confidence ranges from assurance
levels of none (level E) to very high (level A), depending on the effect that a failure or malfunction of the item in
question will have on flight safety. Note that a low level of assurance does not automatically mean that systems are
more failure prone. However, industry experience and regulatory requirements indicate that complex systems and
software require greater process rigor as the impact of their failure increases in order to assure that safety
requirements are satisfied. The implication is that, in general, systems and software developed to lower assurance
levels will fail more frequently than higher assurance ones, and that test and inspection alone are not sufficient to
assure that complex systems and software will be reliable. Processes developed specifically for security follow the
same principle.
Note the key separation of correctness as an objective from assurance of correctness. Correctness is always a
requirement. Correctness is always specified. This is true at all levels of assurance.
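The conventional pairing of worst-case failure condition with development assurance level can likewise be sketched as a lookup. The mapping below reflects the usual ARP4754A / DO-178B association of Catastrophic through No Safety Effect with levels A through E; the real assignment process also weighs architecture, redundancy, and independence, which the sketch ignores.

    # Illustrative only: the conventional pairing of an item's worst credible
    # failure condition with its development assurance level (DAL), following
    # ARP4754A / DO-178B convention. Real assignments also account for
    # architecture, redundancy, and independence, which this ignores.
    FAILURE_CONDITION_TO_DAL = {
        "catastrophic":     "A",  # highest process rigor
        "hazardous":        "B",
        "major":            "C",
        "minor":            "D",
        "no_safety_effect": "E",  # no assurance objectives apply
    }

    def assign_dal(worst_failure_condition: str) -> str:
        """Return the DAL conventionally paired with a failure condition."""
        return FAILURE_CONDITION_TO_DAL[worst_failure_condition]

    # In the Section VII example, L2 (Minor) carries level D and L3 (Major)
    # carries level C, matching this mapping.
    assert assign_dal("minor") == "D"
    assert assign_dal("major") == "C"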

VI. Cyber Security Considerations for Airplane Computer System Design: Access Control
Cyber security for CPS includes the ability to ensure that both data and the operational capabilities of the system
can only be accessed when authorized.5 This provides a basis for security of aircraft systems within the development
and certification process. Security of an airplane, hence, includes the ability to ensure that both data and the
operational capabilities of the aircraft can only be accessed when authorized; further, security of a system
installed in an airplane includes the ability to ensure that both data and the operational capabilities of the system can
only be accessed when authorized.
The next question is: what is “authorized” access with respect to airplane systems? Because aircraft undergo a
rigorous development process, the roles and responsibilities related to access to aircraft are also well-defined. This
includes interfaces, functionality, and operation, along with training, licensing, and regular auditing of personnel and
organizations involved in operation and maintenance. Therefore it’s relatively straightforward (if very lengthy) to
define what type of person or system is authorized to access every function and interface on an aircraft, what level of
training they require, what procedures they should follow, and what potential harm may occur from a failure to
properly perform any access.
In practice there are two modes of access: the most common mode requires a person or equipment to physically
contact the aircraft. The other mode, and the one of primary interest when networks become prevalent, is logical or
electronic access, and particularly that which requires no physical access, such as that gained through radio
frequency digital interfaces or software controlled packet switching networks.
Given that “authorized” access can be defined for every interface and function, if foreseeable operating conditions
include the possibility of electronic access by unauthorized parties, then requirements should address how that
access is adequately controlled or blocked, or is innocuous under all conditions.
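As a minimal sketch of what such a requirement can reduce to, consider an interface that honors requests only from an allow-list of authorized accessor roles. The interface and role names below are hypothetical, and authentication of a claimed role – the hard part in practice – is deliberately out of scope.

    # Minimal sketch of an access control decision for hypothetical
    # interfaces. Role and interface names are invented; verifying that a
    # requester genuinely holds the claimed role (authentication) is a
    # separate, harder problem not shown here.
    AUTHORIZED_ROLES = {
        "maintenance_port": {"licensed_mechanic"},
        "crew_terminal":    {"flight_crew", "cabin_crew"},
        "wireless_link":    set(),  # nothing authorized until authenticated
    }

    def access_decision(interface: str, claimed_role: str) -> bool:
        """Grant access only if the role is authorized on the interface;
        unknown interfaces default to deny."""
        return claimed_role in AUTHORIZED_ROLES.get(interface, set())

    assert access_decision("maintenance_port", "licensed_mechanic")
    assert not access_decision("wireless_link", "licensed_mechanic")
    assert not access_decision("unknown_port", "flight_crew")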
The question has been asked whether that is really all there is to security, and the answer is “yes,” unless you think that
makes it easy, in which case the answer is “no.” That is, awareness of the problem is necessary, but awareness
alone is not sufficient to address the problems. It is common for developers to create access controls that fail to
work properly because they don’t understand the technology they’re using or the threat that their system may face.
They may design a system with operational requirements that are impossible to meet. Knowing when one does not
have the experience to solve the problem one faces is a key part of design, although it is not addressed in this paper.
To summarize, to properly develop aircraft systems two elements are needed: The correct requirements and a
correct implementation. Confidence that each is correct is addressed in aircraft system development guidance as
noted in ARP4754A, DO-178B and C, and DO-254. However, we’re looking at a new kind of requirement, access
control, specifically because aircraft systems in the past have not traditionally performed access control functions.
Designing access controls, particularly logical controls, for electronic systems is not easy to do or to demonstrate.
The good news is that much of what plagues IT systems in security – errors in design, implementation,
configuration, operation, and maintenance – already has rigorous and proven means of being addressed in airplanes. But if
the requirements are wrong, the system won’t work properly, and the fact is that access control requirements are

hard to get right. If the implementation is flawed, the system won’t work properly. If the operation and
maintenance are incorrect, the system won’t work properly – but because that’s already true of every system and
function, security doesn’t particularly require a new development approach as much as good execution of our
processes beginning with the appropriate requirements.

VII. An Abstract Example of Safe and Secure Airplane Computer System Design
The example that follows is designed to illustrate some specific points about requirements, implementation,
development assurance, and considerations external to the design. It is an abstract theoretical example created in this
paper to provide insight into the complex design issues stated above. The purpose is to illustrate specific aspects of
access control design.
Figure 7: Example used to illustrate access control design for airplane embedded computer systems (three LRUs on point-to-point wiring: L1 with a wireless interface receiving communication signals, L2 with a portable computer interface, labeled Minor (D), and L3 labeled Major (C))

As shown in Fig. 7, the example consists of three Line Replaceable Units (LRU) – L1, L2, and L3. These LRUs
contain software and complex hardware. They are installed in an aircraft Electrical Equipment (EE) bay, which
means that for this design example they are protected from physical access by anyone other than authorized
mechanics. Airport security meets Transportation Security Administration (TSA) requirements. The labels “Minor”
and “Major” indicate the failure condition categories of the LRU, and the “(D)” and “(C)” indicate the development
assurance level of LRUs L2 and L3.
The LRUs are connected by communication wiring as shown in Fig. 7. The wiring is installed within the EE
bay. The communication protocol is bidirectional, but takes place only over the point-to-point wiring. These are not
networks. The wire on the left is for a connector, also located in the EE bay, to which portable test equipment, such
as a laptop, may be connected for maintenance. The symbol at the top of L1 is a wireless interface for which there is
no specified limit on communication range; for this example, the range is assumed to be hundreds of feet.
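For later reference, the example configuration can be captured in a small data model. The structure below is only a notational convenience for this paper's abstract example; it does not represent any real avionics architecture.

    # Notational model of the Fig. 7 example: three LRUs, their failure
    # condition categories and assurance levels, and the interfaces each
    # exposes. No real architecture is implied.
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Lru:
        name: str
        failure_condition: Optional[str]   # worst-case category, if assessed
        dal: Optional[str]                 # development assurance level
        interfaces: List[str] = field(default_factory=list)

    l1 = Lru("L1", None, None, ["wireless (no physical access control)", "wire to L2"])
    l2 = Lru("L2", "minor", "D", ["portable computer connector (EE bay)",
                                  "wire to L1", "wire to L3"])
    l3 = Lru("L3", "major", "C", ["wire to L2"])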

A. Risk
The objective of aircraft development and certification is the achievement of an acceptable level of risk. Risk is
generally defined as the likelihood of an event combined with the impact of the event. The certification process
for aircraft establishes maximum allowable values for risk – requiring assurance that the maximum likelihood of
increasingly severe safety events is below specific limits (see 14 CFR 25.1309).
As noted, this requirement applies to failures in any foreseeable operating condition for equipment covered by
25.1309. This example, of course, is specifically concerned with risk due to access control failures. As
discussed above, access control is a security function. The losses generally resulting from security failures are loss
of availability, loss of integrity, and loss of confidentiality. Any of these could relate to safety, but
loss of integrity and availability are the common risks to safety.
Access control always involves some system or person performing the control function as well as some external
entity (person, equipment, or signal) on which the control acts. When considering the various modes of failure
related specifically to access control functions, there are two general areas in which failure may occur: the function
of a system in the airplane and the actions of the party (person or equipment) attempting to gain access.

B. How Does Access Translate to Hazard?


Note that, just as you design a system to operate within a “worst case” environment, so it is with security. The
problem, of course, is that, in general, security threats are adaptive – they evolve as the attacker learns, becoming
more sophisticated, more effective, and often more severe. The airplane design, however, can at best change slowly. So
security design requirements (such as an access control mechanism) will be paired with requirements for the
operational environment. In general, the operational security environment must adapt to changing threats in order to
ensure that the airplane operates within its designed threat environment. When the required operational security
environment can no longer be assured, the airplane design must potentially be changed or removed from service.
Hence, it is important to document and communicate the airplane’s operational security requirements to the
operator.
Consider the operational requirements for the example. Based on TSA and FAA regulations, physical protection
and access control must be maintained on and around aircraft – in this example it is assumed that the aircraft is
operated by a commercial airline. The designer could arguably assume that this physical protection would always be
in place, and that nothing need be said about access to equipment or wiring in the EE bay, where this equipment is
located. However, the appropriate practice to ensure that the assumed design operational environment is satisfied
would be to communicate to the operator that security of the aircraft systems requires that physical access control be
assured. The aircraft is required by regulation to conform to its type design. The wiring is part of the type design,
and the wiring absolutely enforces access between systems. Hence, the “authorization” for access over, say, the L2 –
L3 interface is granted by the design, which is operationally assured by the airworthiness regulations. It may fail –
for instance, wires may break. But the regulatory requirements dictate that such threats are already considered and
addressed appropriately. Because there is a large regulatory framework which regulates airports and aircraft access
and operations, any design requirements that rely on such controls are expected to be reasonably reliable.
But how about the electronic test interface to L2 provided in the EE bay? Because it’s within the physical
perimeter, physical access control for the L2 interface is arguably as well assured as for any other equipment in that
area. However, that electronic interface will be connected by an authorized person to a portable computer. Portable
computers often malfunction without their owners’ knowledge through exposure to electronically transmitted
malicious software (e.g., infection by viruses). Such infections are difficult to prevent and may persist for long
periods without being discovered. Therefore, while it may be stated as a requirement to an operator that test
equipment must be properly configured and free of viruses or other malware, it is prudent to note that sometimes
operators will fail to do so. That is, the reliability of the accessor in this case is not guaranteed. The designer may
need to consider how such failures will impact the aircraft system and design it appropriately to detect the situation
or prevent undue harm from such a failure. In this case the person performing the access may be completely well
intentioned, but due to failures in complex systems he or she unintentionally exposes the aircraft to potential
intentional harm.
Finally, there is the wireless network interface to L1. This interface is exposed to access by equipment that is
outside the physical perimeter established around the aircraft. It may in fact not be possible to make the interface
inaccessible in service and still perform the desired system function. Therefore no physical access control prevents
access to this interface. When it is operational, the airplane system must be completely responsible for deciding
which access attempts to authorize and which to deny. This is an area that is very easy to design and operate
incorrectly – that is, to fail to properly consider all the requirements or understand the nature of the threat the
networked system is exposed to.
Consider just the simple failure impacts. There is some negative impact if access is incorrectly denied; there is
some negative impact if access is incorrectly granted. Absent any additional controls or limitations, access that is
incorrectly granted may result in deliberate misuse of the functions provided over the interface, with whatever
impact results from such improper use. One may assume that the affected system will experience the worst case
malfunction possible from inputs to that interface. Because L1 is connected to L2, failures of L1 will impact L2 –
again potentially to the most severe impact possible over the L1 – L2 interface. However, note that the example
declared that the safety assessment for L2 established that its worst case failure or malfunction is minor, while the
worst case failure or malfunction of L3 is major. The implication of properly designing these LRUs is that no
possible failure or malfunction of L2 can result in more than a minor failure condition for the aircraft. As noted
above, the L2-L3 interface is operationally assured by well-established airworthiness regulations, hence preventing
any unanticipated failures in L3 due to the designed interface with L2 as long as the appropriate evaluations are
performed as described in the next section. Overall, the example shows that while the designer may properly require
and assume that access within the EE bay will be fully controlled by operational procedures and security, it is
entirely foreseeable that the system will be exposed to malicious access attempts on an interface to which access
cannot be physically controlled, such as the wireless interface to L1.
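The exposure argument just made can be expressed as a simple reachability check over the example topology: starting from the one interface that physical security cannot protect, find every LRU whose declared worst-case category must now be shown to hold under deliberately malicious inputs. The sketch below merely restates the example's assumptions in executable form.

    # Reachability sketch for the Fig. 7 example: from the wireless entry
    # point, walk the designed point-to-point wiring and list each reachable
    # LRU whose declared worst-case category must be shown to hold even
    # under deliberately malicious inputs, not merely random faults.
    CONNECTED_TO = {"L1": {"L2"}, "L2": {"L1", "L3"}, "L3": {"L2"}}
    DECLARED_WORST_CASE = {"L2": "minor", "L3": "major"}  # from the example

    def reachable_from(entry: str) -> set:
        """LRUs reachable from an entry point over the designed wiring."""
        seen, frontier = set(), [entry]
        while frontier:
            lru = frontier.pop()
            if lru not in seen:
                seen.add(lru)
                frontier.extend(CONNECTED_TO.get(lru, ()))
        return seen

    for lru in sorted(reachable_from("L1")):
        bound = DECLARED_WORST_CASE.get(lru, "not assessed in the example")
        print(f"{lru}: verify bound '{bound}' holds under malicious inputs")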

C. What Does Security “add” to Development?


Pragmatically, most security considerations can be addressed as normal design assurance dependencies. It must
be remembered, however, that when security is considered, an LRU must not only be considered as reliable as its
assurance level indicates, but that its resistance to malicious inputs is also only that reliable. For instance, if a
system contains software, protection against unauthorized alteration of that software is only assured to the same level
as the system. It has long been a premise of software as a technology that software does not “fail” predictably. Yet many
system designers will assert that some failures are unlikely and consequently will assume that they need not protect
against them. However, if a designer assumes that failures may also involve intentional redesign of software that is
specifically intended to maliciously induce the failures that were deemed to be unlikely, then he/she will be less
likely to leave security vulnerabilities in systems. Fortunately, the basic considerations are relatively easy to express:
When evaluating a system in which the possibility of intentional misuse (due to access control failures) cannot
be precluded, and given a hazard assessment, are all worst case failures truly considered? Specifically, reviewing a
system hazard assessment for the possibility of intentional misuse:
1. Is any hazard more severe than the hazard assessment determined without considering misuse?
2. Considering misuse, is there any hazard that is missing from the assessment?
If the answer to either question is yes, then security failures may result in hazards that the system developer has
not considered, and security controls should be evaluated to determine if they adequately mitigate risks of
intentional harm due to access control failures.
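Those two review questions lend themselves to a mechanical comparison between a baseline hazard assessment and one redone under an assumption of misuse. The sketch below, using invented hazard identifiers and severities, flags exactly the two conditions described above.

    # Sketch of the two-question review: compare a baseline hazard assessment
    # against one performed assuming intentional misuse. Hazard names and
    # severities are invented placeholders.
    SEVERITY = ["no_effect", "minor", "major", "hazardous", "catastrophic"]

    def review(baseline: dict, with_misuse: dict) -> list:
        """Question 1: any hazard more severe under misuse?
        Question 2: any hazard present only when misuse is considered?"""
        findings = []
        for hazard, sev in with_misuse.items():
            if hazard not in baseline:
                findings.append(f"missing from baseline: {hazard}")
            elif SEVERITY.index(sev) > SEVERITY.index(baseline[hazard]):
                findings.append(f"more severe under misuse: {hazard}")
        return findings

    baseline = {"loss_of_function": "minor"}
    with_misuse = {"loss_of_function": "major",       # worse when induced deliberately
                   "misleading_output": "hazardous"}  # arises only from misuse
    for finding in review(baseline, with_misuse):
        print(finding)  # either finding triggers evaluation of security controls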

VIII. Conclusions
In this paper, we considered the increase in networked embedded computer systems in the modern aircraft. This
emerging breed of beneficial systems augments physical access control to the aircraft with airborne computer system
access control functions. The benefits of access over network interfaces of airborne computer systems, however,
come at the cost of exposure of potential vulnerabilities. The paper noted that access control decisions are
fundamentally a security function for mitigating vulnerability exposure, but these access control decisions are also
subject to failure and intentional disruption. We identified that a system making access control decisions has to be
evaluated for effectiveness, including failure impact and required reliability. The subsystems relying on a system
that makes access control decisions are also impacted. Furthermore, even if an existing system has no new
interfaces, the addition of airborne computer systems with network interfaces may require that the system be
evaluated for the impact of failures of the interconnected systems. When security becomes a requirement (e.g., a
Special Condition on an airplane type), there is potential for new system requirements and a re-evaluation of
independence assumptions; an increased statement of work because of the new compliance activities; and increased
certification risk. Evaluating the requirements and addressing the certification planning up front may prevent costly
redesign activities late in the project. Given that certification planning is done up front, proper design for cyber
security can potentially be handled within a normal development process.

IX. Acknowledgments
The author thanks Dr. Radhakrishna Sampigethaya for encouragement and significant assistance in editing and
producing this paper.

X. References
¹R. Poovendran, K. Sampigethaya, S. Gupta, I. Lee, K. V. Prasad, D. Corman, and J. Paunicka, “Special Issue on Cyber-Physical Systems [Scanning the Issue],” Proceedings of the IEEE, Vol. 100, No. 1, pp. 6–12, January 2012.
²R. Poovendran, K. Sampigethaya, et al., “Community report on the 2008 High Confidence Transportation CPS workshop,” http://www.ee.washington.edu/research/nsl/aar-cps/ [cited 29 May 2012].
³Federal Aviation Administration, 14 CFR Part 25, Special Conditions: Boeing Model 787-8 airplane; systems and data networks security – isolation or protection from unauthorized passenger domain systems access, [Docket No. NM364; Special Conditions No. 25-07-01-SC], Federal Register, Vol. 72, No. 71, 2007, http://edocket.access.gpo.gov/2007/pdf/E7-7065.pdf
⁴Federal Aviation Administration, 14 CFR Part 25, Special Conditions: Boeing Model 787-8 airplane; systems and data networks security – protection of airplane systems and data networks from unauthorized external access, [Docket No. NM365; Special Conditions No. 25-07-02-SC], Federal Register, Vol. 72, No. 72, 2007, http://edocket.access.gpo.gov/2007/pdf/07-1838.pdf
⁵Banerjee, A., et al., “Ensuring Safety, Security, and Sustainability of Mission-Critical Cyber-Physical Systems,” Proceedings of the IEEE, 2011.
⁶ARINC 664 Part 5, “Aircraft Data Network Part 5 – Network Domain Characteristics and Interconnection,” 2005.
⁷ARINC 811, “Commercial Aircraft Information Security Concepts of Operation and Process Framework,” 2005.
U.S. Department of Homeland Security, “Build Security In: Setting a higher standard for software assurance,” https://buildsecurityin.us-cert.gov/bsi/mission.html [cited 29 May 2012].
⁸K. Sampigethaya, R. Poovendran, S. Shetty, T. Davis, and C. Royalty, “Future e-enabled aircraft communications and security: The next twenty years and beyond,” Proceedings of the IEEE, Special Issue on Aerospace Communications and Networking in the Next Two Decades, Vol. 99, No. 11, pp. 2040–2055, December 2011.
