
Comments Due August 26, 2005 – 15:00 EDT

Please return comments using the Microsoft Word review facilities and supply suggested wordings.

Send to: Kim Caraway (caraway_kimberly@bah.com)
Copy: Sam Redwine (redwinst@jmu.edu)

Secure Software Assurance

Common Body of Knowledge for


Development, Sustainment, and
Acquisition
DRAFT

Workforce Education and Training Working Group


Department of Defense and Department of Homeland Security

Issued: 14 August 2005 – For Review and Comment Only


Comments Due: 26 August 2005 by 15:00 EDT (Friday)
Please use Microsoft Word Review Facility

Edited by Samuel T. Redwine, Jr.

Volume I in the Software Assurance Common Body of Knowledge Series


Software Assurance Initiative Common Body of Knowledge Series

Secure Software Assurance


Common Body of Knowledge for
Development, Sustainment, and
Acquisition

Software Assurance
Workforce Education and Training Working Group
US Departments of Defense and Homeland Security

Samuel T. Redwine, Jr., Editor


Software Assurance Common Body of Knowledge Series

National Cyber Security Division, Information Analysis and


Infrastructure Protection, Department of Homeland Security
Joe Jarzombek, PMP, Director for Software Assurance
joe.jarzombek@dhs.gov

Until this document is officially released to the public in non-draft form, permission is not granted to use all or portions of it.

Please always ensure proper acknowledgement is given.

Copyright © 2005 Samuel T. Redwine, Jr., Rusty O. Baldwin, Mary L. Polydys, and Daniel P.
Shoemaker

The editor and authors assert their moral rights regarding this document.

NO WARRANTY

THIS MATERIAL IS FURNISHED ON AN "AS-IS" BASIS. THE EDITORS, AUTHORS,


CONTRIBUTORS, COPYRIGHT HOLDERS, MEMBERS OF THE WORKING GROUP ON
WORKFORCE EDUCATION AND TRAINING, THEIR EMPLOYERS, THE UNITED STATES
GOVERNMENT AND OTHER SPONSORING ORGANIZATIONS, ALL OTHER ENTITIES
ASSOCIATED WITH THIS REPORT, AND ENTITIES AND PRODUCTS MENTIONED WITHIN
THE REPORT MAKE NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED,
AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR
PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE
OF THE MATERIAL. NO WARRANTY OF ANY KIND IS MADE WITH RESPECT TO
FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT.

Use of any trademarks in this report is not intended in any way to infringe on the rights of the
trademark holder.

<To conform to government requirements, the legalese will probably need to be changed.>


0 Foreword
The Department of Defense (DoD) and Department of Homeland Security (DHS) Software Assurance
Initiatives encompass software safety and security and combine a number of disciplines. Currently, the
Initiatives are concentrating on achieving and assuring security properties and functionality. The
Software Assurance Workforce Education and Training Working Group, composed of government,
industry, and academic members, is currently taking a first step towards achieving adequate US
education and training on software assurance by defining the additional body of knowledge necessary
to acquire, develop, and sustain secure software beyond that normally required to produce and assure
software where safety and security are not concerns.
This draft is for review and comment and is not to be redistributed to anyone who does not commit to
providing timely comments by COB Thursday, July 21. A version for general distribution is planned for
October 2005.
Note the upcoming April 18, 2006, Workshop on Secure Software Engineering Education and Training
(www.jmu.edu/iiia/wsseet). The co-located IEEE International Symposium on Secure Software
Engineering (www.jmu.edu/iiia/issse), March 13-15, and the Software Assurance Forum, March 16-17,
2006, provide opportunities for additional public interaction in this area.
Editor:
Samuel T. Redwine, Jr.
Authors:
Rusty O. Baldwin
Mary L. Polydys
Samuel T. Redwine, Jr.
Daniel P. Shoemaker
Additional contributors included … Ford Motor Company provided staff time that helped in the
production of the section on Post-Release, particularly its reference list.
The editor and authors want to thank Working Group members, particularly their co-authors, for the
many review comments received and the helpful issue discussions that occurred. In part, Sam
Redwine’s work was supported by …
Special thanks go to
• Joe Jarzombek, without whose leadership and support this report would not exist
• Nancy Mead and others at the Software Engineering Institute, and Gary McGraw and others at
Cigital who shared their efforts in this area

Working Group membership:

The Working Group’s life extends beyond the production of this report, and its goals remain the same
– to help create a US workforce capable of developing, sustaining, and acquiring secure software –
but its specific activities may vary.
Any corrections or comments regarding this report should be sent to Kim Caraway
(caraway_kimberly@bah.com) with a copy to Sam Redwine (redwinst@jmu.edu).


0 Preface
0.1 Introduction
This preface provides a history of the motivations and activities behind this report – the Secure
Software Common Body of Knowledge – and documents its design rationale. Those interested in the
substance rather than the rationale will find it useful to proceed directly to the Introduction.
The number and variety of attacks by persons and malicious software from outside organizations,
particularly via the Internet, are increasing rapidly, and the amount and consequences of insider attacks
remain serious. Over 90% of security incidents reported to the CERT Coordination Center result from
defects in software requirements, design, or code.
In 2003, under the leadership of Joe Jarzombek, then Deputy Director for Software Assurance,
Information Assurance Directorate, Office of the Assistant Secretary of Defense (Networks and
Information Integration), DoD launched a Software Assurance Initiative. In 2004, the Department of
Homeland Security joined in collaboration with this Initiative. In March 2005, Joe Jarzombek moved
to become Director for Software Assurance, National Cyber Security Division, Information Analysis
and Infrastructure Protection, Department of Homeland Security, but retains his leadership role in the
collaborative interagency Software Assurance Initiatives. This report was produced under his
leadership.
The Software Assurance Initiative issued an initial report in October 2004, and held three open,
jointly sponsored Software Assurance Forums and a number of individual working group (WG)
meetings. Among the working groups is one on education and training, with members from
government, industry, and academia. The Initiatives’ leadership has also made a number of
appearances at professional events to publicize the Initiatives and the problems they address, in
hopes of soliciting further collaboration.
Driven by awareness of the rampant worldwide explosion in exploitation of software vulnerabilities,
there is a growing demand for low-defect, secure software in both the defense and commercial
sectors. Current, commonplace software specification, design, implementation, and testing practices
provide users with software containing numerous defects and security vulnerabilities. Government and
industry need processes that effectively and efficiently acquire, develop, and sustain secure software;
means to justify confidence that these processes have been applied; and practitioners who are
motivated, disciplined, and proficient in their execution.

0.2 Initial Creation of a Secure Common Body of Knowledge


Currently, other aspects of the Initiative are concentrating on achieving and assuring security
properties and functionality.1 This includes not only software development but also the acquisition2
and sustainment processes. The Workforce Education and Training Working Group is addressing the
issues related to achieving adequate US education and training on software security, including skill
shortages within government and industry, and curriculum needs within universities, colleges, and
trade schools.

1. Such software must, of course, also be satisfactory in other aspects as well, such as usability and mission support.
2. Note that the US/UK/Australian Tri-Lateral SW Acquisition Initiative now has a SW Assurance work-strand.

The approach taken by the WG was to ask the following questions:

1. What are the engineering activities or aspects of activities that are relevant to achieving secure
software?
2. What knowledge is needed to perform these activities or aspects?

Initially, the subgroup addressing software development took the Software Engineering Body of
Knowledge (SWEBOK3) [Bourque and Dupuis 2004] as a starting point, as it describes the knowledge
required for the engineering of software. It was recognized that, from the perspective of obtaining
secure software, the SWEBOK lacks both breadth and depth. For example, the SWEBOK Guide is
described as the knowledge that a software engineer with a bachelor’s degree plus four years of
experience should possess. Thus, it clearly does not describe all that is needed to do a substantial
software project excellently, including producing secure software. Indeed, according to the National
Cyber Security Partnership Taskforce Report on Processes to Produce Secure Software [Redwine
2004], even the average Capability Maturity Model® (CMM®) Level 5 organization, with one defect
per KSLOC (thousand lines of source code), has at least ten times more defects than the minimum one
might contemplate as a starting point for producing secure software – and Level 5 organizations
include persons with capabilities considerably beyond a bachelor’s degree plus four years. Even where
an element of achieving secure software is covered, depth of coverage can be an issue.

The bottom line is that saying something is or is not in the SWEBOK Guide is not identical to saying
it is or is not in the body of knowledge for engineering secure software, or that it is covered
adequately. Yet in the absence of better alternatives, the SWEBOK Guide is being used as a starting
point for the body of knowledge for software security engineering.

The subgroups on Acquisition and Supply and on Sustainment and Operations had the even more
difficult problem of finding existing descriptions of their “unsecured” software bodies of knowledge.
The acquisition subgroup used federal acquisition documents, and the sustainment subgroup used the
Maintenance section of the SWEBOK Guide and relevant standards, as partial answers.

Additionally, the efforts to answer the question “What are the engineering activities or aspects of
activities that are relevant to achieving secure software?” benefited from a number of prior efforts:
• National Cyber Security Partnership Taskforce Report on Processes to Produce Secure
Software [Redwine 2004]

• Safety and Security Extensions for Integrated Capability Maturity Models [Ibrahim et al, 2004]

• IEEE Computer Society Software and Systems Engineering Standards Committee (S2ESC)
collection of IEEE standards

• ISO/IEC JTC1/SC7 WG9, which redefined its terms of reference to software and system assurance
(part of Systems Engineering – System Life Cycle Processes)

• ISO/IEC 15026, which addresses management of risk and assurance of safety, security, and
dependability within the context of system and software life cycles [ISO 15026]

• National Institute of Standards and Technology (NIST) FISMA Implementation Project

3. SWEBOK® is an official service mark of the IEEE.


• The Common Criteria, including the new version 3.0 issued in July 2005 [NIST 2005]

• The SafSec effort in the UK combining concern for safety and security [SafSec Introduction,
Guidance, Standards]

• A variety of safety-related standards efforts, since they address a number of issues shared by safety
and security, and the safety community has useful, relevant experience

Members of the WG include editors and authors from several of these efforts. Even with these prior
efforts, answering this question about relevant activities occupied much of the subgroups’ time for a
number of meetings. These meetings involved a number of brainstorming sessions, including a
one-and-a-half-day session with group support software and a professional facilitator. The WG also
benefits from the work of other Software Assurance Initiative Working Groups, such as the one on
Software Assurance Processes and Practices, which are also interested in the activities involved in
producing and assuring secure software.
The resulting lists needed to be consolidated under some set of categories. Among the candidate sets
of categories were those from:
• ISO/IEC 15288 – System Life Cycle Processes

• ISO/IEC 12207 – Software Life Cycle Processes

• ISO/IEC 14764 – Software Maintenance

• IEEE Computer Society – Guide to the Software Engineering Body of Knowledge

• ISO harmonization proposal for ISO/IEC 15288 and ISO/IEC 12207

• IEEE Std 15288-2004 – Adoption of ISO/IEC 15288:2002 System Life Cycle Processes

• IEEE Std 1220-1998 – Application and Management of the Systems Engineering Process

• ANSI/EIA 632-1998 – Processes for Engineering a System

• IEEE/EIA 12207.0-1996 – Industry Implementation of ISO/IEC 12207:1995

• IEEE/EIA 12207.1-1997 – Industry Implementation of ISO/IEC 12207:1995 – Life Cycle Data

• IEEE/EIA 12207.2-1997 – Industry Implementation of ISO/IEC 12207:1995 – Implementation
considerations

Early discussions revolved around using the first two of these, but later the last two gained support. In
the end, the WG picked a set of generic categories capable of being easily mapped to the categories of
a number of other standards, curricula, and body-of-knowledge efforts, to speed future work and
acceptance.
Answering the second question – moving from activities to required knowledge – is sometimes easy
and sometimes difficult. While one is often tempted to state that the knowledge is “the knowledge of
activity X,” this is usually unsatisfactory except in specific cases.
The WG has yet to decide to what extent the knowledge included in the body of knowledge it produces
will exclude knowledge identified by the second question but already identified by existing standards

for software engineering. In the end, the question of what to leave out or make only a passing mention
of was answered pragmatically by what was believed to be known (or at least known of) by most of
the members of the intended audiences for this report.
At its fifth meeting in June 2005, the WG began efforts to identify the knowledge needed for the
previously identified activities. Considerable effort is expected before a satisfactory knowledge set is
finally settled upon. Crucial to this effort is review by experts not intimately involved with arriving at
the original draft set.
The WG has benefited from the experiences of the SWEBOK project. For example, the WG has
established one or two authors for each section of its report following the SWEBOK experience that
showed that arrangements with more co-authors worked poorly.
The group decided that its initial output should have a level of exposition between that used by the
SWEBOK Guide – essentially all prose – and that used by the DoD information assurance standards,
Committee on National Security Systems (CNSS) Training Standards 4011 and 40124 – essentially
lists. Members of the WG felt that lists would be too sparse for an audience most of whom lacked prior
knowledge of the area, but that at the lowest level of detail lists were often adequate and more
amenable to production within the short timeframe available for its initial report.
Initially, the Workforce Education and Training Working Group, therefore, provides an inclusive list
of the “additional” knowledge needed to acquire, develop, and sustain secure software including
assurance (objective grounds for confidence) of its security properties and functionality. To better aid
industry, government, and academia in targeting their education and training curricula, as well as to
help self-study efforts, the WG may eventually need to provide indications of what knowledge is
needed by those in different roles.
The WG intends to ensure adequate coverage of requisite knowledge areas in contributing disciplines
to enable professionals in several disciplines – such as software engineering, systems engineering, and
program management – to identify and acquire competencies associated with secure software.
Because of this wide coverage and applicability, the intended product is officially called the
“Software Assurance Common Body of Knowledge.”

Samuel T. Redwine, Jr.


James Madison University, Harrisonburg, Va., July 2005

0.3 References
[Bourque and Dupuis 2004] Bourque, Pierre, and Robert Dupuis (Editors). Guide to the Software
Engineering Body of Knowledge. 2004 Edition. Los Alamitos, California: IEEE Computer Society,
February 2004.
[Ibrahim et al, 2004] Ibrahim, Linda, et al. Safety and Security Extensions for Integrated Capability
Maturity Models. Washington D.C.: United States Federal Aviation Administration, Sept. 2004.
[ISO 12207] ISO/IEC Std. 12207:1995, Information Technology - Software Life Cycle Processes,
International Standards Organization, 1995.
[ISO 12207b] ISO/IEC Std. 12207:1995/Amd.1:2002, Information Technology - Software Life Cycle
Processes (Amendment 1), International Standards Organization, 2002.

4. See http://www.cnss.gov/full-index.html

[ISO 14764] ISO/IEC Std. 14764:1999, Information Technology – Software Maintenance,
International Standards Organization, 1999.
[ISO 15026] ISO/IEC Std. 15026:1998, Information Technology - System and Software Integrity
Levels, International Standards Organization, 1998.
[ISO 15288] ISO/IEC Std. 15288:2002(E), Systems Engineering – System Life Cycle Processes,
International Standards Organization, 2002.
[NIST 2005] National Institute of Standards and Technology. Common Criteria v. 3.0, July 2005.
[Redwine 2004] Redwine, Samuel T., Jr., and Noopur Davis (Editors). Processes for Producing
Secure Software: Towards Secure Software. vols. I and II. Washington, D.C.: National Cyber Security
Partnership, 2004.
[SafSec Introduction] Why SafSec? SafSec: Integration of Safety and Security.
http://www.safsec.com/safsec_files/resources/50.6_Why_SafSec.pdf.
[SafSec Guidance] SafSec Methodology: Guidance Material. SafSec: Integration of Safety and
Security. http://www.safsec.com/safsec_files/resources/50_3_SafSec_Method_
Guidance_Material_2.6.pdf.
[SafSec Standard] SafSec Methodology: Standard. SafSec: Integration of Safety and Security.
http://www.safsec.com/safsec_files/resources/50_2_SafSec_Method_Standard_2.6.pdf
[US DoD 2004] U. S. Department of Defense. Interim Report on “Software Assurance: Mitigating
Software Risks in the DoD IT and National Security Systems”. Official Use Only, September 2004.


0 Table of Contents
0 Foreword
0 Preface
0.1 Introduction
0.2 Initial Creation of a Secure Common Body of Knowledge
0.3 References
0 Table of Contents
1 Introduction
1.1 Scope
1.2 Purpose
1.3 Related Areas
1.4 Document Organization
1.5 Common References
1.6 References
1.7 Further Reading
2 Nature of Dangers
2.1 Introduction
2.2 Dangerous Operational Events
2.3 Dangerous Effects
2.4 Attacks
2.5 Mistakes and Hazards
2.6 Conclusion
2.7 References
2.8 Further Reading
3 Fundamental Concepts and Principles
3.1 Introduction
3.2 Basic Concepts and Principles
3.3 Secure Software
3.4 Context
3.5 References
3.6 Further Readings
4 Development
4.1 Introduction
4.2 More on Fundamental Concepts and Principles
4.3 Ethics, Law, and Governance
4.4 Requirements for Secure Software
4.5 Secure Software Design
4.6 Secure Software Construction
4.7 Secure Software Verification, Validation, and Evaluation
4.8 Secure Software Tools and Methods
4.9 Secure Software Processes
4.10 Secure Software Project Management
5 Post-Release Assurance
5.1 Introduction
5.2 Operational Assurance (Sensing)
5.3 KA 2 Analysis
5.4 KA 3 Response Management (Responding)
5.5 KA 4 Infrastructure Assurance
6 Acquisition/Supply of Secure Software
6.1 More on Fundamental Concepts
6.2 Program Initiation and Planning
6.3 Contract Finalization and Management—Acquirer
6.4 References
7 Bibliography
8 Standards
1 Introduction
1.1 Scope
The Department of Defense (DoD) and the Department of Homeland Security (DHS) Software
Assurance Initiatives and efforts have encompassed software safety and security and combined a
number of disciplines. Currently, the efforts are concentrating on achieving and assuring security
properties and functionality. The Initiatives’ Workforce Education and Training Working Group,
composed of government, industry, and academic members, produced this document as a first step
towards achieving adequate US education and training in this area by defining the additional body
of knowledge needed to develop, sustain, and acquire either custom or off-the-shelf (OTS) secure
software, beyond that required to produce and assure software where safety and security are not
concerns.
The substantial costs of a vulnerability1 to software producers result from a number of activities –
initial testing, patching, remediation testing, and distribution – as well as from negative impact on
reputation. A recent study showed that among a set of major vendors, announcing a product
vulnerability was followed by an average 0.6% fall in stock price, or an $860 million fall in the
company's value.2 Thus, producers can suffer serious costs and consequences.

Increasingly, losses of confidential data result in identity theft and significant fraud losses to firms and
customers. Changes in laws and regulations now often result in such data losses becoming public. More
seriously, established firms are losing business, and at least one, ChoicePoint, may be forced out of
business. This follows ChoicePoint stock falling 20% in the period after the incident was disclosed.3

1. “Vulnerability: A flaw or weakness in a system's design, implementation, or operation and management that could be
exploited to violate the system's security policy.” SANS Glossary of Terms Used in Security and Intrusion Detection,
SANS Institute, 2001.
2. This article appeared in New Scientist magazine, issue of 25 June 2005, written by Celeste Biever
(http://www.newscientist.com). This and similar questions have been explored at the Workshops on Economics of
Information Security since 2002 (http://infosecon.net/workshop/index.php).
3. This incident was not necessarily the result of a software flaw.
The imperative nature of the problem is evidenced by the number and variety of attacks by persons and
malicious software from outside organizations, particularly via the Internet, while the number and
consequences of insider attacks remain serious. Over 90% of security incidents reported to the CERT
Coordination Center result from defects in software requirements, design, or code. Because security
problems involving computers and software are today frequent, widespread, and serious, and because
cyber security is an imperative concern for DHS and DoD, the choice was made to initially concentrate
Software Assurance Initiative efforts on security.
This report concentrates on the knowledge associated with producing and acquiring secure software. It
mentions only in passing physical, operational, communication, hardware, and personnel security.
These are important topics in cybersecurity but outside the scope of this report. Concentrating on
software still covers the bulk of the security vulnerabilities being exploited today – the ones in
software. While the public prominence of software security problems is a recent development,
software security has long been studied, and, while open questions remain, considerable bodies of
research and practices to address them exist.
1.1.1 Software Security Goals and Properties
The goal of software security is the preservation of security properties including confidentiality,
integrity, and availability (CIA). Confidentiality, preventing unauthorized disclosure, and integrity,
preventing unauthorized alteration, require a mechanism firmly establishing identities – authentication
– and one allowing only permitted actions – access control. Preserving availability is about preventing
unauthorized destruction or denial of access or service.
Other security properties include accountability – the ability to later establish all acts that occurred
and their related actors – and non-repudiation – having relevant actors unable to effectively deny that
an act occurred. Thus, security is a question both of system properties and of security mechanisms or
functionality to aid in preserving them.

Security is not just a question of security functionality; the properties desired must be shown to hold
wherever required throughout the system. Because security properties are emergent system properties,
security is an omnipresent issue throughout the software lifecycle. [McGraw 2003]
In addition to a software system’s preservation of these properties within its digital domain, it can
contribute to other systems, organizational, or societal security goals including:
• Establishing the authenticity of users and data
• Establishing accountability of users
• Providing usability including transparency to users to gain acceptance and resulting security
• Providing the abilities to deter and mislead attackers, detect attacks when they happen, alert
users when they occur, continue service, confine their damage, rapidly recover from them, and
repair software to prevent future attacks, as well as investigate to identify and convict the
attackers
As well as this ability to tolerate and recover from the effects of attacks, the ability of a system to employ
functionality is a risk management decision. Once decided upon, the required security properties need
to be explicitly defined. Neither in the physical world nor in software can security be absolutely
guaranteed. Thus, when this report speaks of “secure software” the actual meaning is “highly secure
software realizing – with justifiably high confidence but not guaranteeing absolutely – a substantial set
of explicit security properties and functionality including all those required for its intended usage.”
[Redwine 2004, p. 2] One can also state this in a negative way as “justifiably high confidence that no
software-based vulnerabilities exist that the system is not designed to tolerate”.
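To make the interplay of these mechanisms concrete, the following minimal sketch shows authentication, access control, and audit-based accountability working together. All names and data are hypothetical, and the simplistic password handling is for illustration only – a real system would use salted, iterated password hashing and a real policy store.

# Illustrative sketch only: hypothetical users, resources, and policy.
# Shows authentication (establishing identity), access control (allowing
# only permitted actions), and accountability (an audit trail of decisions).
import hashlib
import hmac
import logging

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("audit")  # accountability: log every decision

# NOTE: unsalted SHA-256 is for brevity; real systems use salted,
# iterated password hashing.
USERS = {"alice": hashlib.sha256(b"correct horse").hexdigest()}
PERMISSIONS = {("alice", "report.txt"): {"read"}}  # tiny access-control matrix

def authenticate(user: str, password: str) -> bool:
    """Firmly establish identity by checking the stored credential."""
    expected = USERS.get(user)
    supplied = hashlib.sha256(password.encode()).hexdigest()
    return expected is not None and hmac.compare_digest(expected, supplied)

def authorize(user: str, resource: str, action: str) -> bool:
    """Allow only permitted actions, and audit the decision either way."""
    allowed = action in PERMISSIONS.get((user, resource), set())
    audit_log.info("user=%s action=%s resource=%s allowed=%s",
                   user, action, resource, allowed)
    return allowed

if authenticate("alice", "correct horse") and authorize("alice", "report.txt", "read"):
    print("access granted")

The audit log supports accountability; combined with stronger mechanisms such as digital signatures, it can also support non-repudiation.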
1.1.2 Software Process
The first books enumerating steps to produce software appeared in the early 1960s – if not earlier.
Software process has been an active area of work in industry, research, and government ever since –
within this has been significant work on processes for high-dependability systems. Today, a plethora
of books contain general-purpose practices and processes. These range from lightweight processes
placing few requirements on developers to heavyweight ones that provide a high level of guidance,
discipline, and support. [Boehm 2003] Generally and not surprisingly, success in producing high-
dependability systems aimed at safety or security has been greater with software processes closer to
the heavyweight end of the spectrum and performed by highly skilled people.
Three things aid in reliably producing secure software: [Redwine 2004, p. 3]
1. An outstanding software process performed by skilled people
2. A sound mastery of the relevant security expertise, practices, and technology
3. The expert management required to ensure the resources, organization, motivation, and
discipline for success

Across these aspects, skilled people may be the highest-leverage element. Achieving these skills,
however, requires considerable knowledge beyond that already required for simply a good software
engineering process.
1.1.3 Secure Software Knowledge

The knowledge involved falls naturally into three categories: the nature of attacks, how to defend,
and the computing systems environment in which the conflict takes place. The report also concerns
itself with the acquisition of secure software. Thus the coverage includes
• Attack: entities, objectives, strategies, techniques, and effects
• Defense: roles, objectives, strategies, techniques, and how to develop and sustain software to
defend
• Arena: aspects of environment for operational software with security implications
• Acquisition of secure software
How to defend includes the entire secure software system lifecycle, with approaches and pitfalls
from concept through disposal, covering all aspects:
• Technical
• Managerial
• Work environment and support
The bulk of needed engineering knowledge remains unchanged throughout original development and
sustainment as well as whether the system is “acquired” or developed for in-house use. However,
sustainment and acquisition both have unique aspects. In addition, the acquisition phase has a large
impact on the success of the system, and much of the funds expended on software go toward
sustainment. Therefore, the initial work has been divided into five parts
1. Nature of Dangers – attacks and non-malicious acts in the arena of conflict that may threaten
security
2. Fundamental Concepts and Principles – needed knowledge across all areas
3. Development – engineering, management, and support directly involved in producing secure
software
4. Sustainment or Post-Release – unique considerations after initial deployment
5. Acquisition – how to successfully purchase or otherwise acquire secure software
The organization of this report reflects this division.
Because the knowledge for development is recorded elsewhere – in bodies of knowledge, de facto and
official standards, curricula standards, and textbooks – this document covers only the “difference”
between quality “insecure” software and secure software. Alternatively, one could say this report
presumes a baseline of currently generally accepted knowledge and, by identifying this additional
knowledge, creates a new baseline that subsumes it plus the knowledge needed for the development,
sustainment, and acquisition of secure software.
The additional secure software knowledge identified includes
• Knowledge for doing the most rigorous approaches including extensive use of formal methods
• Some knowledge relevant to less rigorous approaches
• An outline of the knowledge often relevant for dealing with legacy software deficient in
security

• Proven knowledge – although the degree of prior use may vary
• Knowledge currently useful or expected to be needed or useful in the near future, whether
proven in fielded systems or not
While this last bullet calls for well-informed judgments, in a rapidly changing field all compendiums
such as this must look ahead to avoid being outdated when published. On the other hand, experts
should find little here to surprise them, since much of the necessary knowledge has existed for over
15 years.
What is included in the set of activities related to “unsecured” software – and excluded as below the
bottom edge of this report’s concern – varies among the three parts: development, sustainment, and
acquisition. An approximation of development knowledge might be the sum of the
• Knowledge identified in the Guide to the Software Engineering Body of Knowledge
(SWEBOK) [Bourque and Dupuis 2004]
• The knowledge contained in the SW-CMMI document through Level 5 – while not a body of
knowledge, significant knowledge is in the document text itself
• Knowledge identified as required in the ACM/IEEE-CS undergraduate curriculum for
Software Engineering [ACM 2004]
Mere mention of a topic in these does not necessarily imply the topic is excluded. For example, when
security requirements exist, some topics already in the SWEBOK Guide may need to be carried to an
extreme or have additional aspects. The SWEBOK Guide is described as the knowledge that should be
possessed by a software engineer with a bachelor’s degree plus four years of experience. This clearly
does not include all that is needed to do a substantial software project excellently. Indeed, according to
the National Cyber Security Partnership Taskforce Report on Processes to Produce Secure Software
[Redwine 2004], even the average Capability Maturity Model® (CMM®) Level 5 organization, with
one defect per KSLOC (thousand lines of source code), has at least ten times more defects than the
minimum one might contemplate as a starting point for producing secure software – and Level 5
organizations include persons with capabilities considerably beyond a bachelor’s degree plus four years.
Depth of coverage is also an issue to be handled with care. For example, threat analysis is a major
concern when producing secure software, and, while mentioned in the SWEBOK Guide, it is
mentioned only in passing in one sentence within the Software Quality section, “For software in which
safety or security is important, techniques such as hazard analysis for safety or threat analysis for
security may be used to develop a planning activity which would identify where potential trouble spots
lie.” Clearly, while the SWEBOK Guide does list a number of relevant references, a software
assurance body of knowledge should contain a more detailed breakdown of the knowledge required to
do threat analysis.
Knowledge excluded from the sustainment area includes
• Knowledge identified in SWEBOK Maintenance chapter
• Knowledge contained in relevant sections of ISO 12207
• Knowledge in ISO/IEC Std. 14764 Information Technology – Software Maintenance
Knowledge presumed in the acquisition area includes the knowledge in official regulations, standards,
and guidelines, and in the curricula of the National Defense University or similar master’s-level
civilian universities – excluding those elements directly addressing safety or security.

1.2 Purpose
As a report produced by the Software Assurance Initiatives’ Workforce Education and Training
Working Group, this report is a needed preliminary step towards addressing the issues related to
achieving adequate US education and training on software security, including skill shortages within
government and industry and curriculum needs within universities, colleges, and trade schools. This
interim version’s primary goal is to obtain feedback from a wide community, including from first
attempts at usage, leading to a version 1.0.
While the ultimate goal is to improve the software development workforce, its members are not the
direct target audience of this report. Rather, its intended audiences and some of the document-related
goals for them include
• Experts – receive feedback: modifications or validation
• Educators – influence curriculum
• Trainers – extend/improve contents of training of current workforce
• Acquisition personnel – aid in acquiring (more) secure software
• Evaluators and testers – recognize any content beyond what is covered in their current
evaluations and consider its implications for evaluations of persons, organizations, and
products.
• Standards developers – encourage and facilitate inclusion of security-related items in standards
Of course, it may, nevertheless, be of use to those practitioners interested in self-education in secure
software engineering or acquisition of secure software and willing to read the references as well as
those interested in workforce policy.
You should review this report, consider what it means for you, and decide if you can use it in its
present form or transform it into a form useful to you. One might choose to use this report or decide to
wait until the Software Assurance Initiative (or others) expands its contents into a product more useful
for them – e.g., one including competences or mapped to an existing curriculum. In any case, feedback
from you is important to improving future versions of this report and additional Software Assurance
Initiative products.
Specific initial targets for influence include universities and training organizations willing to be trial
users or early adopters, as well as the content of the SWEBOK Guide.

1.3 Related Areas


While the term “software assurance” potentially refers to the assurance of any property or functionality
of software, the Initiative encompasses safety and security and integrates practices from the disciplines
shown in Figure 1, as well as others such as communications and computer engineering. Currently,
however, the Initiative is concentrating on achieving and assuring security properties and functionality,
while recognizing that secure software must, of course, also be satisfactory in other aspects, such as
usability and mission support. [IAD 2004]
As mentioned previously, the knowledge to develop, sustain, and acquire “unsecured” software is not
included in this report. In addition, a number of areas related to or included in the knowledge needed
to produce secure software are not included in detail. Some underlying fundamentals are simply
presumed, such as
• Discrete mathematics
• Probability theory
• Organizational studies

[Figure 1: Disciplines Contributing to Software Assurance – showing Software Assurance among
Information Assurance, Systems Engineering, Project Management, Software Acquisition, Software
Engineering, and Safety & Security]
Others more directly involved include
• Computer engineering
• Network security
• Personnel security
• Operational security
• Criminology, legal system, and law and regulatory enforcement
• Intelligence
• Counter-intelligence
• Military strategy and tactics
• Usability engineering
• Executive management
For these, elements that are directly relevant are noted, although sometimes briefly or at a high level of
abstraction. Two related areas have a number of aspects mentioned, but are not covered in their
entirety.
• Systems engineering
• Security engineering
In many situations, relevant concerns in secure software development, sustainment, and acquisitions
could easily be labeled “systems” concerns. With this in mind, this report sometimes uses the term
“software system”.
Finally, domain knowledge of the application area, be it operating systems or banking, is quite
important in practice but outside the scope of this report.
While a number of related areas of knowledge have been identified above and mentioned in the body
of this report, knowledge area descriptions in the body do not cover details of particular products or
operational activities since such coverage exists elsewhere. Examples of areas that are not addressed in
detail include
• Specific details of Windows, Unix, Linux, router, and telephone switching operating systems
• Static and dynamic routing tables, network operations, TCP/IP protocol operations, and routing
protocols as they relate to traffic flow on the Internet over access provided by common carriers
• The Java-oriented framework J2EE and Microsoft’s .NET
• Rules of evidence, search and seizure, and surveillance laws, and investigative methods and
procedures
While always recognizing the limits of their own expertise, persons producing secure software need to
be able to communicate with persons in all related disciplines as well as with stakeholders of many
kinds including particularly users and owners of assets to be protected, and the owners and operators
of the software and systems involved.

1.4 Document Organization


In addition to the Preface and this Introduction, this report contains five main parts, covering
respectively

1. Dangers – introduction to the nature of attacks and other dangers that software (secure or
otherwise) faces
2. Fundamental Concepts and Principles – needed knowledge across all areas
3. Development – engineering, management, and support directly involved in producing secure
software
4. Sustainment or Post-Release – unique considerations after initial deployment
5. Acquisition – how to successfully acquire secure software without developing it
These are followed by a bibliography that gathers the entries in the references and additional-reading
lists that appear at the end of each significant section, including each section within Development, and
the report closes with a list of standards. Choosing references posed some problems. With a few
exceptions, such as secure coding, secure software engineering and assurance do not have a set of
comprehensive or specialty textbooks that can be used as references. Though the field started in the
1960s, it suffered from a lack of commercial interest during the 1990s and early 2000s. For example,
possibly the last comprehensive introductory text on highly secure software systems was [Gasser 1988].

Thus today, many of the relevant references pre-date 1990 or appear in conferences, workshops,
technical journals, or even less accessible places. Efforts were made to provide good recent references,
but many old (by high-tech standards) references are still the relevant ones, and those expecting
knowledge packaged in easily assimilated, practitioner-oriented materials will sometimes be
disappointed. Despite these problems, ten common references are listed at the end of this Introduction
and used where possible across all the sections, partially reducing the number of items users need to
acquire.
After reading the section on Fundamental Concepts and Principles, readers especially interested in
one of the three major topical areas – Development, Post-Release Sustainment, and Acquisition – may
go directly to their major topic of interest. They may find, however, that to avoid excessive duplication
each section also references other parts. The report ends with a combined list of references and
additional reading.
Finally, readers desiring a deeper understanding of the motivations and history leading to this report
can find information in the report’s Preface.

1.5 Common References


The ten common references for use across all sections are listed below. Seven of these are available
for free, including three of the books.
1.5.1.1 Common References Spanning All Sections
[Avizienis 2004] Avizienis, Algirdas, Jean-Claude Laprie, Brian Randell, and Carl Landwehr. “Basic
Concepts and Taxonomy of Dependable and Secure Computing,” IEEE Transactions on Dependable
and Secure Computing, vol. 1, no. 1, Jan.-Mar. 2004, pp. 11-33.
[Bishop 2003] Bishop, Matt. Computer Security: Art and Science, Addison-Wesley, 2003.
[Bourque and Dupuis 2004] Bourque, Pierre, and Robert Dupuis (Editors). Guide to the Software
Engineering Body of Knowledge. 2004 Edition. Los Alamitos, California: IEEE Computer Society,
Feb. 16, 2004.
[Gasser 1988] Gasser, M. Building a Secure Computer System. Van Nostrand Reinhold, 1988.
[Ibrahim et al, 2004] Ibrahim, Linda, et al. Safety and Security Extensions for Integrated Capability
Maturity Models. Washington D.C.: United States Federal Aviation Administration, Sept. 2004.
[Meier 2004] Meier, J.D., Alex Mackman, Srinath Vasireddy, Michael Dunner, Ray Escamilla, and
Anandha Murukan. Improving Web Application Security: Threats and Countermeasures. Microsoft,
2004. http://download.microsoft.com/download/d/8/c/d8c02f31-64af-438c-a9f4-
e31acb8e3333/Threats_Countermeasures.pdf
[Redwine 2004] Redwine, Samuel T., Jr., and Noopur Davis (Editors). Processes for Producing
Secure Software: Towards Secure Software. vols. I and II. Washington, D.C.: National Cyber Security
Partnership, 2004.
[Sommerville 2004] Sommerville, I. Software Engineering. 7th ed. Pearson Education, 2004.
[Viega 2005] Viega, J. The CLASP Application Security Process. Secure Software, 2005.
[Whittaker 2004] Whittaker, J. A. and H. H. Thompson. How to Break Software Security: Effective
Techniques for Security Testing. Pearson Education, 2004.

1.6 References
[Bourque and Dupuis 2004] Bourque, Pierre, and Robert Dupuis (Editors). Guide to the Software
Engineering Body of Knowledge. 2004 Edition. Los Alamitos, California: IEEE Computer Society,
Feb. 16, 2004.
[ISO 14764] ISO/IEC Std. 14764:1999, Information Technology – Software Maintenance,
International Standards Organization, 1999.
[McGraw 2003] McGraw, Gary E. “On the Horizon: The DIMACS Workshop on Software Security”,
IEEE Security and Privacy. IEEE, March/April 2003.
[Redwine 2004] Redwine, Samuel T., Jr., and Noopur Davis (Editors). Processes for Producing Secure
Software: Towards Secure Software. Vols. I and II. National Cyber Security Partnership, 2004.

1.7 Further Reading


[Anderson 2001] Anderson, Ross J., Security Engineering: A Guide to Building Dependable
Distributed Systems. John Wiley and Sons, 2001.

2 Nature of Dangers
2.1 Introduction

Because without them security would, of course, not be such a serious concern, dangers – particularly
attacks – will frequently appear in this report and in aspects of the knowledge needed to develop,
sustain, and acquire (more) secure software. This section provides an introduction and the necessary
background to understand software security. The knowledge identified in this section is relevant to
persons developing, sustaining, and acquiring secure software.

The definition and subtleties of security will be explored at length in the following sections. For the
purpose of beginning to understand the dangers being addressed, the reader needs to understand that
security is often spoken of as a composite of three attributes – confidentiality, integrity, and
availability. Security often requires the simultaneous existence of 1) availability for authorized actions
only, 2) confidentiality, and 3) integrity, with absence of “unauthorized” system alterations.
[Avizienis 2004, p. 13]

Security is not as simple as these sound. Neither confidentiality nor integrity of assets can be achieved
unless identities can be firmly established – authentication – and only authorized actions allowed –
access control. Even if an action is considered legitimate when it occurs, the properties of later being
able to establish all acts that occurred and their related actors, and of having relevant actors unable to
effectively deny an act occurred, are known as accountability and non-repudiation. [Landwehr 2001]

2.2 Dangerous Operational Events

A large variety of technical events might result in bad security-related outcomes. These become events
that the software may try to prevent and react to.

Microsoft has a list, based on the acronym STRIDE, of dangerous attacker actions:
• Spoofing identity
• Tampering with data
• Repudiation
• Information disclosure
• Denial of service
• Elevation of privilege
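Each STRIDE category can be read as a threat to one of the security properties or mechanisms introduced above. The sketch below makes that correspondence explicit; the mapping is a common informal reading added for illustration, not an official definition from this report.

# A common, informal mapping of STRIDE attacker actions to the security
# property or mechanism each primarily threatens (illustrative, not an
# official definition).
STRIDE = {
    "Spoofing identity":      "authentication",
    "Tampering with data":    "integrity",
    "Repudiation":            "non-repudiation / accountability",
    "Information disclosure": "confidentiality",
    "Denial of service":      "availability",
    "Elevation of privilege": "authorization (access control)",
}

for action, threatened in STRIDE.items():
    print(f"{action:24s} -> threatens {threatened}")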
However, to gain an adequate appreciation of the kinds of dangerous events, one might want a longer,
more explanatory list. A number of sometimes-overlapping examples of such events are [MILS PP]
• An attacker
o Eavesdrops on, or otherwise captures, data being transferred across a network.
o Gains unauthorized access to information or resources by impersonating an authorized user.
o Compromises the integrity of information by its unauthorized modification or destruction.
o Performs unauthorized actions resulting in an undetected compromise of assets.
o Observes an entity’s multiple uses of resources or services and, by linking these uses, deduces
confidential information.
o Observes a legitimate use when the user wishes their use of that resource or service to be kept
confidential.
• An authorized user
o Accesses information or resources without permission from the person who owns, or is
responsible for, them.
o Abuses, or unintentionally performs, authorized actions resulting in an undetected compromise
of the assets.
o Consumes shared resources and compromises the ability of other authorized users to access or
use those resources.
o Intentionally or accidentally observes stored information he or she is not authorized to see.
o Intentionally or accidentally transmits sensitive information to users not authorized to see it.
o Unwittingly introduces a virus compromising assets.
• A user may participate in the transfer of information (either as originator or recipient) and then
subsequently deny having done so.
• The integrity of information may be compromised due to errors – by user or hardware, or during
transmission.
• Administrators or other privileged users compromise assets by careless, willfully negligent, or
hostile actions.
• Human mistake, or failure of software, hardware, or power, results in abrupt interruption of
operation and loss or corruption of security-critical data.
• Security-critical parts are subjected to physical attack, compromising security.

Protecting from or reacting to all the events listed above will seldom fall to a single piece of software,
but any could be relevant to a given product. Nevertheless, the list provides insight into the kinds of
compromising events that may occur.

2.3 Dangerous Effects

Dangerous effects include disclosure, impersonation, successful deniability, contamination, and
destruction or denial of access:
• Disclosure
• Contamination
• Deception
• Disruption
• Delay
• Denial
• Repudiation
• Destruction
• Usurpation

Elevation of privilege – increasing the attacker’s authorized access and actions – is a common
intermediate step.

Microsoft offers a suggestive list, based on the acronym DREAD, as a memory aid for use when
estimating risk and damage:
• Discoverability: by adversary
• Reproducibility: as successful exploit
• Exploitability: effort and expense to mount attack
• Affected users
• Damage potential
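In practice, the DREAD factors are often turned into a rough numeric risk score by rating each factor on a fixed scale and averaging. The sketch below illustrates this; the 1-10 scale, the equal weighting, and the example ratings are illustrative assumptions, not prescriptions from this report.

# Minimal sketch of DREAD-style risk scoring. The 1-10 scale, the equal
# weighting of factors, and the example ratings are illustrative
# assumptions, not values prescribed by this report.
from statistics import mean

def dread_score(damage, reproducibility, exploitability,
                affected_users, discoverability):
    """Return the mean of the five DREAD ratings (each 1 to 10)."""
    ratings = [damage, reproducibility, exploitability,
               affected_users, discoverability]
    assert all(1 <= r <= 10 for r in ratings), "ratings must be 1-10"
    return mean(ratings)

# Hypothetical example: an easily found, easily repeated injection flaw.
score = dread_score(damage=8, reproducibility=9, exploitability=7,
                    affected_users=9, discoverability=6)
print(f"DREAD score: {score:.1f} of 10")  # higher means riskier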
Consideration of consequences and risk management will occur in many sections of this report; they
drive many of the decisions related to secure software.

The consequences of attacks can be severe. The Introduction noted the monetary costs to developers
and operators of vulnerabilities and of successful attacks disclosing personal information. National
defense can be severely affected as well. Every day, financial institutions move immense amounts of
money electronically, and large amounts of cargo are transported using computer-generated guidance
and documents. In both cases, poor software security could result (and has resulted) in large exposures
to loss.

2.4 Attacks

This subsection covers a variety of information about attacks and attackers – their motivations,
methods, and consequences. First, attackers and their motivations are discussed, followed by
consideration of the different attacks across the lifecycle, ending with a special subsection on
malicious software or “malware”.
2.4.1 Kinds of Attackers and Motivations
Attackers could have any of a number of motivations, such as
• Notoriety
• Acceptance
• Ego
• Financial gain
• Activism
• Revenge
• Espionage
• Information warfare
Particular kinds of attackers, however, tend to have certain motivations. Some categories of attackers and their typical motivations are
Amateur hackers: recreation, reputation, sense of belonging, and learning
Disgruntled individual: revenge, favorable divorce, whistle blowing, sabotage, stalking, psychological damage, and sympathy
Individuals or small groups of criminals: money, including credit card fraud and identity theft
Social protestors: publicity, hindering and disruption, and social or political change
Commercial competitor: competitive intelligence, industrial espionage, recruitment, subversion, commercial advantage or damage, tacit collusion, and misinformation
Organized crime: money, including fraud, extortion, blackmail, theft, and identity theft; recruitment, corruption, and subversion; intimidation and influence, including extortion and blackmail; intelligence on politics, law enforcement, criminal competition, and opportunities and risks for criminal activities; and industrial espionage
Nation state: intelligence and counter-intelligence, economic espionage, training, preparation for information warfare, misinformation, sabotage, law enforcement and deterrence, political influence, and general hindering and disruption
Thus, no shortage exists of attackers and motivations. This list, however, is not exhaustive, and attackers vary in their capabilities, resources, intentions, persistence, and degree of risk aversion. They also may be outsiders, or someone inside or having a relationship with the individual or organization that is the target of the attack.

2.4.2 Attacks during Lifecycle
Attacks can occur during the development of the software system; its deployment and sustainment; operation and use; and loss or disposal.
Attacks during development might include attacks where the attacker
• Changes product from outside
o Electronic intrusion
o Physical intrusion (combined with electronic)
• Changes product from inside
o Inserts agent
o Corrupts someone already in place
• Changes or disrupts development process
o E.g. failure to run or to report a test
o Categorizes a vulnerability's defect report as not a vulnerability
• Insider
During deployment an attacker might
• Change product or update after approval and before distribution
• Usurp or alter means of distribution
• Change product at user site before or during installation
Attacks during operation and use were the implicit subject of the Dangerous Operational Events section. One danger not mentioned there was disclosure of a vulnerability to attackers or the public, either by an insider or by someone unrelated to the development, sustainment, or operation of the secure software system.

2.4.3 Malware
A number of means or paths of attack exist, including
• Intrusion: the subject of much of this section
• External or perimeter effects: acts that occur outside or at the defense perimeter but nevertheless have a damaging effect; the most common one is denial of service from overload of a resource
• Insider: a person with existing authorization uses it to compromise security, possibly including illegitimately increasing authorizations
• Subversion: changing the (process or) product so as to provide a means to compromise security
• Malware: software placed to aid in compromising security
All of these but the last have been addressed – at greater or lesser length – in this section. Malware, however, presents substantial dangers. Categories of malware include
• Viruses: delivery mechanism for malicious code or for a denial of service attack
• Worms: self-propagating delivery mechanism for malicious code or for a denial of service attack
• Trojans/back doors: provide remote access to a system through a back door or open port
• Time or logic bombs: weaken or destroy systems under certain conditions, such as at a certain time or upon receipt of a certain packet
• Intentional software defects: maliciously altered software producing incorrect data, providing access, or having other effects
Three additional non-malicious categories are closely related to malicious code and may have similar effects
• Unintentional software defects: these can have the same effects as malware; currently the most common source of vulnerabilities
• Intentional extra functionality: particularly unused/unadvertised functionality
• Easter eggs: code placed in software for the amusement of developers or users
Malicious code is a common phenomenon, with, for example, many new viruses each month. While all of these categories are important, the two that most impact the development of secure software are the needs not to include either intentional or unintentional vulnerabilities in the product.

2.5 Mistakes and Hazards
Availability and other security properties can be affected in a variety of non-malicious ways. Items for consideration include [MoD DefStan 00-56 Part 2/3 2004, page 31]
• Systematic and random failures
• Credible failures arising from normal and abnormal use in all operational situations
• Scenarios, including consequential credible failures (accident sequences)
• Predictable misuse and erroneous operation
• Common cause and common mode failures
• Interactions between systems, sub-systems, or components
• The operating environment
• Mechanical, electro-magnetic, and thermal energy and emissions
• Kinetic or explosive events
• Chemical, nuclear, and biological substances damaging equipment, operators, or users, or denying areas
• Procedural, managerial, and human factors activities and ergonomics
• Storage, maintenance, transportation, disposal, and other such activities
On analysis many of these may not result in a security requirement, but some may.

2.6 Conclusion
Security is about dealing with certain dangers and the preservation of certain properties, such as confidentiality and integrity, in the face of attacks, mistakes, and mishaps.
2.7 References
[Chirillo 2002] Chirillo, John. Hack Attacks Revealed: A Complete Reference for UNIX, Windows, and Linux with Custom Security Toolset, Wiley Publishing, Inc., 2002.
[Evans 2005] Evans, S., and J. Wallner. "Risk-Based Security Engineering Through the Eyes of the Adversary," Proc. 6th Ann. IEEE Systems, Man and Cybernetics Information Assurance Workshop (IAW 05), IEEE CS Press, 2005, pp. 158-165.
[Flickenger 2003] Flickenger, Rob. Wireless Hacks, O'Reilly and Associates, Inc., 2003.
[Meier 2004] Meier, J.D., Alex Mackman, Srinath Vasireddy, Michael Dunner, Ray Escamilla, and Anandha Murukan. Improving Web Application Security: Threats and Countermeasures, Microsoft, 2004. http://download.microsoft.com/download/d/8/c/d8c02f31-64af-438c-a9f4-e31acb8e3333/Threats_Countermeasures.pdf
[Redwine 2004] Redwine, Samuel T., Jr., and Noopur Davis (Editors). Processes for Producing Secure Software: Towards Secure Software, vols. I and II. Washington, D.C.: National Cyber Security Partnership, 2004.
[Skoudis 2002] Skoudis, Ed. Counter Hack: A Step-by-Step Guide to Computer Attacks and Effective Defenses, Prentice Hall, 2002.
[Viega 2005] Viega, J. The CLASP Application Security Process. Secure Software, 2005.
[Whittaker and Thompson 2004] Whittaker, James, and Herbert Thompson. How to Break Software Security, Pearson Education, Inc., 2004.

2.8 Further Reading
[Hoglund 2004] Hoglund, Greg, and Gary McGraw. Exploiting Software: How to Break Code. Addison-Wesley, 2004.
[Howard 2005] Howard, Michael, David LeBlanc, and John Viega. 19 Deadly Sins of Software Security, McGraw-Hill Osborne Media, 1st edition, 2005.
3 Fundamental Concepts and Principles

3.1 Introduction
Those developing secure software need to know a number of terms, concepts, and principles that span multiple activities and roles and underlie or are prerequisite for understanding multiple sections within Development – and some activities in sustainment and acquisition. This section addresses them.

3.2 Basic Concepts and Principles

3.2.1 Assurance
Even if one could do it, producing or buying a perfectly secure product would not be enough to allow one to rationally act like it is. One would also need justified confidence that it was perfect. Even with products having degrees of quality less than perfection, one needs the appropriate justified confidence, backed by arguments and evidence, to act on this fact concerning their degree of quality.
Assurance is about justified confidence. [Footnote 1: Trust is related to confidence, but in life who or what one trusts is often without explicit justification. Trust is conditional – whom one trusts, when, in what situation, and how much to do what. In addition, the entity trusted may or may not be trustworthy. One prepares for violations of trust and enforces accountability. An entity's history of behavior is often reflected in his or her reputation. Particularly in ecommerce, mechanisms exist for publicizing each buyer's and seller's accountability history and reputation.] It can be about actions taken to provide a basis for justified confidence in either how something is done or evaluations; about the arguments and evidence that justify confidence; or about the degree of justified confidence that someone or some organization possesses.
Assurance could relate to any characteristic of software, but this report primarily addresses software system security assurance.

3.2.2 Dependability
Dependability is a qualitative "umbrella" term. Avizienis et al. offer a concise set of definitions for the properties they put forth as involved in dependability, including security [Avizienis 2004].
"As developed over the past three decades, dependability is an integrating concept that encompasses the following attributes:
• Reliability: continuity of correct service.
• Safety: absence of catastrophic consequences on the user(s) and the environment.
• Maintainability: ability to undergo modifications and repairs. …
• Integrity: absence of improper system alterations.
• Availability: readiness for service. [Footnote 2: Availability may include availability to share.]
When addressing security, an additional attribute has great prominence,
• Confidentiality, i.e., the absence of unauthorized disclosure of information."

3.2.3 Security
"Security is a composite of the last three of these attributes – confidentiality, integrity, and availability. Security often requires the simultaneous existence of 1) availability for authorized actions only, 2) confidentiality, and 3) integrity with 'improper' meaning 'unauthorized'." [Avizienis 2004, p. 13] [Footnote 3: Another definition of security is, "All aspects related to defining, achieving, and maintaining confidentiality, integrity, availability, accountability, authenticity, and reliability" [ISO/IEC 13335-1].]
Security is not as simple as this sounds. Neither confidentiality nor integrity can be achieved unless identities can be firmly established – authenticated – and only allowable actions permitted – access control. Even if an action is
considered legitimate when it occurs, the properties of later being able to establish all acts that occurred and their related actors, and of having relevant actors unable to effectively deny that an act occurred, are known as accountability and non-repudiation. [Landwehr 2001]
Security properties are emergent properties of the system. While necessary, security functionality alone does not make a secure system, because security being a systems property implies, for example, that the individual pieces of security functionality, such as password verifiers, cannot be bypassed anywhere.

3.2.4 Common Terms and Their Relationships
Figure 2: Security Concepts and Relationships (Source: Common Criteria)
Threatening entities, agents, or attackers may possess or be postulated to possess certain capabilities, resources, and intentions, creating threats. Threats utilize vulnerabilities in the system to perform their attacks. Adversaries use specific kinds of attacks or "exploits" to take advantage of particular vulnerabilities in the system. Systems may have countermeasures to reduce certain vulnerabilities. See Figure 2 for relationships among these terms.

3.2.5 Stakeholders
Software system security-related development decisions can have a number of stakeholders who impose constraints or have interests involved. These include
• National (possibly multiple nations) and international laws, regulations, treaties, and agreements
• Specific communities (such as government systems or the banking industry)
• Integrators of the software system into a larger product (e.g. OEMs or enterprise-wide application developers)
• Authorized units within an organization
• Policy owners (e.g. security, personnel, procurement, and marketing policies)
• Acquisition and usage decision makers (including RFP writers and issuers)
• System owners
• Asset owners
• Owners' customers and suppliers
• System administrators
• End users
• Standards bodies
• Software certifiers and system accreditors
• Software developers
• Entities about whom the system contains information
• The general public
Not mentioned in this list is an important stakeholder who strongly influences decisions: the attacker. A given development effort may have more or fewer of these stakeholders involved, but it is the potential attackers and loss sufferers who make security a concern.

3.2.6 Asset
An asset is anything of value to a stakeholder, particularly to its owner or an attacker, but also to society or to the entity about whom data may relate. Secure software developers must identify assets needing protection and for how long. Assets may be categorized by attributes related to their confidentiality (e.g. Top Secret, Secret, Confidential, or Unclassified); by their integrity level (e.g. accurate and up to date versus old and
with unknown accuracy); or by criticalness of availability or acceptable level of unavailability (e.g. unavailable at most one minute in every 30 days).
Normally, information may flow from lower confidentiality domains to higher confidentiality ones, and from higher integrity ones to lower integrity ones. Flowing the other way, however, requires explicit actions to downgrade the confidentiality level (e.g. declassification) or upgrade the integrity level (e.g. validation of input).
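A minimal sketch of such a flow rule follows. The ordered level names and the default-deny treatment of downward flows are illustrative assumptions, not a prescribed mechanism.

    # Illustrative sketch of a confidentiality flow check: data may flow
    # from a lower level to a higher one, but flowing downward requires
    # an explicit downgrade decision (e.g. declassification).
    LEVELS = ["Unclassified", "Confidential", "Secret", "Top Secret"]

    def may_flow(source_level: str, destination_level: str) -> bool:
        """Allow upward (or same-level) confidentiality flows only."""
        return LEVELS.index(source_level) <= LEVELS.index(destination_level)

    assert may_flow("Confidential", "Secret")        # upward: allowed
    assert not may_flow("Top Secret", "Secret")      # downward: needs downgrade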
3.2.7 Information Assurance
The term "information assurance" is often used as
1. A catch-all term for all that is done to assure security of information
2. The level of assurance, or justifiable confidence, one has in that security

3.2.8 Assurance Case
An assurance case [Footnote 4: Sometimes called an "assurance argument;" in this report the term "assurance argument," or just "argument," is used for the arguments that connect the evidence to the assurance conclusions.] is: "A reasoned, auditable argument created to support the contention that a defined system will satisfy the … requirements." UK MoD DefStand 00-42 [Ministry of Defence 2003b, section 4.1]
In our case, these requirements are the security requirements but, while not covered herein, a combined assurance case could include security and safety [SafSec Standard] [Footnote 5: SafSec is an example of secure software development benefiting from software safety, which has more successful recent experience with producing high-confidence software than have software producers where security was a concern.] as well as possibly other related requirements. The assurance case's argument must, of course, be supported by evidence that is identified with parts of the assurance case. Such evidence might come in many forms, including results from mathematical proof checkers, analyses, reviews, and tests. (See the Verification, Validation, and Evaluation section for further information.)
The assurance case addresses the possibilities and risks identified as obstacles to having a secure software system. The assurance case provides significant stakeholders with a basis for their justified degree of confidence and decision making. To convince them successfully, risks they perceive need to be addressed – if only to show their non-existence – whether developers believe them to be genuine or not.
Starting with the initial concept and requirements, the assurance case subsequently includes experienced or postulated possibilities and risks; avoidance and mitigation strategies; and an assurance argument referring to associated and supporting evidence from design and construction activities, verification activities, tests and trials, etc., that may eventually include in-service and field data as appropriate. Modifications in the software system and its security requirements cause changes to the assurance case.
The assurance case provides a structure of goals and sub-goals (or claims and sub-claims) with evidence and connecting arguments that show the achievement of the top-level goal(s) (or claim(s)). The [SafSec Standard] and [SafSec Guidance] documents address this structuring of the assurance case. Other sources include [Ministry of Defence 2003b] and [Howell 2003].
The two potential purposes of assurance cases are to show that the
1. Security requirements are valid, e.g. will result in meeting real-world intentions and expectations
2. System will meet its security requirements
Depending on its criticality and risk, the first may or may not be the subject of a formalized assurance case; the second should always be.
In the second, the top goal(s) must include the security properties required, is likely to be the software system's security policy including the security properties, and could be the entire set of security-related requirements for the system. One example is the Common Criteria v. 3.0, where the top goals are nominally composed of the combination of required security functionality and properties (e.g. that this functionality cannot be bypassed).
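To make the goal/sub-goal structure concrete, the sketch below models a fragment of such a tree as plain data. The claim texts and evidence names are invented for illustration; real assurance cases use much richer argument notations.

    # Illustrative sketch: an assurance case as a tree of claims, each
    # supported by sub-claims and/or named items of evidence.
    from dataclasses import dataclass, field

    @dataclass
    class Claim:
        text: str
        evidence: list[str] = field(default_factory=list)
        subclaims: list["Claim"] = field(default_factory=list)

        def supported(self) -> bool:
            """A claim is supported if it has evidence or sub-claims,
            and all of its sub-claims are themselves supported."""
            return bool(self.evidence or self.subclaims) and \
                all(c.supported() for c in self.subclaims)

    top = Claim(
        "Security functionality cannot be bypassed",
        subclaims=[
            Claim("All access is mediated", evidence=["design review R-12"]),
            Claim("No covert channels above 1 bit/s", evidence=["analysis A-3"]),
        ],
    )
    print(top.supported())  # True once every leaf claim has evidence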
As a living, top-level control document, an assurance case needs to be managed, with its status reported periodically, including progress in arguments and linked evidence. It remains with the software system throughout its life through disposal. The assurance case is, therefore, a progressively expanding body of evidence built up during development, and it responds as required to all changes during development and sustainment.
The activities related to creating and evaluating the assurance case are, obviously, also identified in the project plans. Finally, a description of the proposed assurance case would normally appear in a proposal document during acquisition.

3.2.9 Security Certifications and Accreditations
A number of certifications for software system security and accreditations of operational systems (software, hardware, procedures, and people) exist. Historically, these have been applied primarily to government systems.

3.2.9.1 Common Criteria Certification
The Common Criteria standard contains
• Enumeration of security functionality (e.g. authentication, logging)
• Evaluation Assurance Levels (EAL) 1 (low) through 7 (high), calling for increasing levels of documentation, formality, and testing
• Methods to be used by those doing the evaluations and certification
Associated with the Common Criteria are a set of Protection Profiles – standard minimal security requirements, usually lists of required functionality – for a number of kinds of software.
Historically, the Common Criteria and associated Protection Profiles have identified security-oriented functionality to be required of systems. [Common Criteria v.3.0 Part 2] The Common Criteria now also calls for self-protection and non-bypassability of security functionality. [Common Criteria v.3.0 Part 3, page 96-101]
This certification process first uses an authorized laboratory to do an assessment, followed by a government validation of any recommended certification. [Pieto-Diaz 2003??] [??]

3.2.9.2 FIPS 140 Certification of Cryptographic Software
The US government National Institute of Standards and Technology (NIST) performs certifications of cryptographic software to standards set by Federal Information Processing Standard (FIPS) 140. This has become a de facto standard for cryptographic software, considered essential by almost all knowledgeable users of such software.

3.2.9.3 BITS Certification
The Financial Services Roundtable BITS certification program aims at achieving a set of security functionality suitable for banking and financial services, and a certification process simpler than that of the Common Criteria. See http://www.bitsinfo.org/c_certification.html.

3.2.9.4 System Accreditation
The US DoD accreditation process is governed by DoD Instruction 5200.40, DoD Information Technology Security Certification and Accreditation Process (DITSCAP), supplemented by the DoD 8510.1-M Application Manual.
On the civilian side of the US government, the National Institute of Standards and Technology has produced NIST Special Publication 800-37, Guide for the Security Certification and Accreditation of Federal Information Systems. [Ross R. 2004]
Internationally, ISO/IEC 17799:2005, Information technology – Code of Practice for Information Security Management, combined with UK standard BS7799-2:2002 (anticipated to become ISO/IEC 27001), forms a basis for an Information Security Management System (ISMS) certification (sic) of an operational system.

3.2.10 Saltzer and Schroeder Principles
Saltzer and Schroeder published their list of ten principles in 1975, and they have become classics. [Saltzer and Schroeder 1975] Everyone
involved in any way with secure software systems should be aware of them.
• Economy of Mechanism – design simplicity
• Fail-safe defaults – deny access unless explicitly authorized
• Complete mediation – check every access
• Open design – visible for review and analysis
• Separation of privilege – a system should not grant permission based on only a single condition
• Least privilege – only grant those privileges needed to complete a task, and only for the duration of the task
• Least common mechanism – avoid shared mechanisms; e.g. executing code used to access resources should not be shared across users
• Psychological acceptability – ease of use and operation
• Work factor – cost commensurate with risks
• Recording of compromises – trails of evidence
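As a small illustration of two of these principles together – fail-safe defaults and complete mediation – the sketch below routes every read through one check that denies unless an explicit grant exists. The permission table and object names are invented for illustration.

    # Illustrative sketch of fail-safe defaults and complete mediation:
    # every access goes through check_access, and absence of an explicit
    # grant means denial.
    GRANTS = {("alice", "payroll.db"): {"read"}}   # hypothetical policy table

    def check_access(user: str, obj: str, action: str) -> bool:
        """Deny unless the (user, object) pair explicitly grants action."""
        return action in GRANTS.get((user, obj), set())

    def read_object(user: str, obj: str) -> str:
        # Complete mediation: no code path reads an object without this check.
        if not check_access(user, obj, "read"):
            raise PermissionError(f"{user} may not read {obj}")
        return f"contents of {obj}"

    print(read_object("alice", "payroll.db"))   # permitted
    # read_object("mallory", "payroll.db")      # raises PermissionError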
3.2.11 Security is about Conflict
Because many security-related issues and activities concern a conflict pitting the system and those aiding in its defense against those who are attacking or may attack in the future, one needs to bring to the situation all that one can of what is known about conflicts. Below are some of the key concepts.

3.2.11.1 Intelligent and Malicious Adversary
The threats faced may come from a number of sources:
• Insiders, e.g. disgruntled or defrauding
• Amateur hackers from outside
• Customers or suppliers
• Individuals or small groups of criminals
• Industrial espionage
• Organized crime
• Social or political protestors
• Terrorists
• Nation states
Software security involves intelligent, skilled adversaries. When possession, damage, or denial of assets is highly valuable to someone else, considerable skill and resources could be brought to bear. When poor software security makes these actions relatively easy and risk free, even lower-value assets become attractive.
One cannot simply use a probabilistic approach to one's analyses because, for example, serious, intelligent opponents tend to attack where and when you least expect them – where your estimate of the probability of such an attack is relatively low.
Where judging risks is difficult, one possible approach is to make them at least not unacceptable (tolerable) and "as low as reasonably practicable" (ALARP). [Footnote 6: ALARP is a significant concept in UK law, and an excellent engineering-oriented discussion of it appears in Annex B of DEF STAND 00-56 Part 2 [Ministry of Defence 2004b].] In employing the ALARP approach, judgments about what to do are based on the cost-benefit of techniques – not on total budget. However, additional techniques or effort are not employed once risks have achieved acceptable (not just tolerable) levels. This approach is attractive from an engineering viewpoint, but the amount of benefit cannot always be adequately established.

3.2.11.2 Security is a System, Organizational, and Societal Problem
Security is not merely a software concern, and mistaken decisions can result from confining attention to only software. Attempts to break security are often part of a larger conflict such as business competition, crime and law enforcement, social protest, political rivalry, or conflicts among nation states, e.g. espionage.

3.2.11.3 Measure-Countermeasure
Despite designers' positive approach to assuring protection, in practice one also engages in a software measure-countermeasure cycle between offense and defense.
3.2.11.4 Learn and Adapt
The attack and defense of software-intensive systems is not a routine situation; it is a serious conflict situation with serious adversaries, such as industrial espionage by competitors, organized crime, terrorist organizations, and nation-states. Analyses of past and current experience improve adaptation of tactics and strategy.

3.3 Secure Software
This section covers several specific areas of fundamental knowledge about secure software required across development, sustainment, and acquisition. [Gasser 1988, Chapters 1-4] [Bishop 2003, Chapters 1-2]

3.3.1 Security Policies
Security policies for a system specify legitimate, secure states (or sometimes histories) of the system – and what are not. In part, security policies for software systems devolve from societal (e.g. laws and regulations), organizational, and larger encompassing system policies, as well as from the effects of interfacing systems. The software system's security policy combines all of these, as well as the particular policies needed for the specific software system.
Security policies address areas such as
• Data protection
• Identification and authentication
• Communication security
• Accountability
• Self-protection of software
• Underlying functionality such as random number generation and time stamping
The security policy for a system often includes elements stated in terms of access control policies. A number of conceptual access control policies exist. [Bishop 2003, p. 381-406] These policies can be used individually or in combination and may have a variety of mechanisms to support them. These generic access control policies include
• Discretionary access control
• Mandatory access control
• Role based access control
• Workflow access control
• Chinese wall access policy [??]
• High latency with provisions/prerequisites and obligations (e.g. credit card transactions)
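A minimal sketch of the role based variant follows; the roles and permissions are invented for illustration, and real mechanisms add sessions, role hierarchies, and constraints.

    # Illustrative sketch of role based access control: permissions attach
    # to roles, and users acquire permissions only through assigned roles.
    ROLE_PERMISSIONS = {
        "teller":  {"read_account", "post_deposit"},
        "auditor": {"read_account", "read_audit_log"},
    }
    USER_ROLES = {"pat": {"teller"}, "kim": {"auditor"}}

    def permitted(user: str, permission: str) -> bool:
        return any(permission in ROLE_PERMISSIONS.get(role, set())
                   for role in USER_ROLES.get(user, set()))

    assert permitted("pat", "post_deposit")
    assert not permitted("pat", "read_audit_log")   # not granted via any role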
3.3.2 Security Functionality
Certain kinds of functionality are usually required in secure systems. Among the most common are
• Identity management of users
• Access control mechanisms
• Cryptographic functionality
• Logging and auditing
• Controlling data flows (e.g. in and out)
• Security administration
• Facilities for secure installation and updating

3.3.3 General Architectural Concepts
General architectural concepts include:
• Systems may support one or multiple security levels or divisions
• Systems should detect legitimate and illegitimate actions, recording both, and report, block, or hinder illegitimate actions: e.g., notification (e.g., alarm), audit log, firewall
• Tolerance may exist for some bad events, including security violations: e.g. intrusion tolerance, fault tolerance, robustness (input tolerance), and disaster recovery. This may include doing the following for security violations
o Forecasting
o Detection
o Notification
o Damage isolation or confinement
o Continuing service, although possibly degraded
o Diagnosis of vulnerability
o Repair
o Recovery of the system to a legitimate state
o Recording
o Tactics that adapt to attackers' actions
Finally, while not ultimately to be depended on, deception may be used for the purposes of aiding concealment, intelligence gathering, and wasting an attacker's resources.

3.3.4 Secure Software Development Activities
This report presumes knowledge of "unsecured" software engineering activities. For secure software development, however, a number of activities are new or significantly modified from "unsecured" development. In practice these vary depending on whether a heavyweight development process is used [Hall 2002] [NSA 2002, Chapter 3], a lightweight one [Viega 2005] [Lipner 2005a] [Howard 2002] [Meier 2004], or one especially for the problems of legacy software [Viega 2005, p. 40] [Meier 2004, p. xxx]. Generally, the kinds of activities done in heavyweight processes are a modest superset of lightweight processes, but with many performed with considerably more rigor. The activities used remedially for legacy software consist of about half of the lightweight ones. Note that legacy software may eventually require more serious rework, such as major rearchitecting.
Some activities used by one or more of these processes are
• Performing threat analysis
• Developing security policy
• Developing a formal top-level specification
• Developing a security test plan
• Developing the system based on the top-level specification, providing an assurance case including assurance arguments and evidence that
o Design agrees with top-level specification and security policy
o Code corresponds to design
• Ensuring coding avoids potential vulnerability-producing practices
• Testing the system for security
• Providing a covert channel analysis
• Deploying securely
• Certifying software
• Accrediting operational system
Furthermore, production is performed in a secure environment with the appropriate controls.
Some activities on the list are commonly performed primarily for systems with high security requirements, e.g. covert channel analysis. This list does not reflect the full impact of security on activities. In practice, performance of virtually every activity is affected, including, for example, consideration of security in all technical reviews.

3.3.5 Tradeoffs
Tradeoffs exist between security and efficiency, speed, and usability. For some systems other significant tradeoffs with security may also exist. Design decisions may exacerbate or ease these tradeoffs. For example, innovative user interface design may ease security's impact on usability. [Cranor 2005]

3.3.6 Cryptographic Specialties
At least three types of cryptographic expertise are essential. These are
1. Producing software to encrypt, decrypt, sign, etc.
2. Cryptographic protocols
3. Cryptographic issues in system design
All of these are areas of specialization. Experience shows the first two contain excessive risks if performed by non-specialists. [Footnote 7: While presumably produced by specialists and ready for certification, the US Federal government certifiers of cryptographic software report that substantial fractions of submittals have serious problems and 90% have some problem. (CCUF II 2005)] As a result, knowledge for producing cryptographic software was deemed out of the scope of this report.
Experience shows the third may also have dangers if done by non-experts. The security problems of early versions of the IEEE 802.11 wireless protocol are an example.
Developers of secure software necessarily need significant knowledge of the third. In situations new to the developer, however, expert help can reduce risks, as does review by experts.

3.3.7 Cryptography Concepts
Basic cryptographic concepts include [Anderson 2001, p. ??] [Bishop 2003, Chapter ??]
• Symmetric key encryption: the same key is used to encrypt and decrypt
• Public key encryption: a publicly known key is used to encrypt and a secret key is used to decrypt
• Key management: methods used to originate, distribute, and revoke keys, and to identify their possessors
• Hashing: applying a difficult-to-reverse function
• Non-repudiation: being unable to deny performing an action, e.g. sending a message, given evidence in the message sent
• Secret sharing: a method of splitting a secret among parties [Footnote 8: "Secret sharing" is actually secret splitting in such a way that one must know t out of n parts to know the secret and, if one knows fewer parts, one knows nothing about the secret.]
Cryptographic techniques can preserve confidentiality and integrity – detecting integrity problems – as well as ensure authentication and non-repudiation. These are capabilities also provided by access control and audit logging. Thus, they can sometimes substitute for access control (e.g. during network transfer of data, where access control over eavesdropping may be impossible) or serve as a second line of defense.
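As a small, hedged illustration of the hashing and shared-key ideas above, the sketch below uses Python's standard hashlib and hmac modules; the message and the freshly generated key are invented example data.

    # Illustrative sketch: a hash detects accidental change; a keyed HMAC
    # additionally detects tampering by anyone who lacks the shared key.
    import hashlib, hmac, secrets

    message = b"transfer 100 to account 7"     # invented example data

    fingerprint = hashlib.sha256(message).hexdigest()   # integrity check value

    key = secrets.token_bytes(32)               # shared symmetric key
    tag = hmac.new(key, message, hashlib.sha256).digest()

    # A receiver holding the same key recomputes and compares in constant time.
    assert hmac.compare_digest(tag, hmac.new(key, message, hashlib.sha256).digest())

Note that a shared-key HMAC cannot by itself provide non-repudiation, since either holder of the key could have produced the tag; asymmetric signatures are used for that purpose.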
3.4 Context
Several items of contextual knowledge have already been mentioned, such as laws and regulations and outside stakeholders, but two others require noting.

3.4.1 Domain Knowledge
This report cannot, of course, enumerate the knowledge needed in all possible fields in which secure software is needed. Nevertheless, adequate domain knowledge is required of developers, sustainers, and acquirers for proper evaluation of risks and production of a suitable secure system.

3.4.2 Industry Situation
Most software runs in systems of multiple machines networked together. In network security today, a basic assumption often is that software is insecure. Given this, the question becomes: what can we do about network security? On the other hand, this report's common body of knowledge relates to achieving (more) secure software. One needs to be aware of the current industry range of practice in secure software engineering and, in particular, the common pitfalls and the state of practice reflected in any off-the-shelf software one is considering.

3.4.2.1 Sources Exist for Information about Known Vulnerabilities and Exploits
The current status of the general ongoing computer security conflict is available from a number of sources, such as books (more organized and better for initial learning but less up-to-date), articles, vendors, services, and databases. For examples, see [Whittaker 2004] and [Hoglund 2004], US-CERT, and the CVE vulnerabilities list maintained by The MITRE Corp.
In addition, instances of vulnerabilities can be tracked through the websites of product vendors (for their own products) and of vendors of information assurance products and services, who have general lists, as well as SecurityFocus's Bugtraq and the SANS Institute.

3.5 References
[Anderson 2001] Anderson, Ross J. Security Engineering: A Guide to Building Dependable Distributed Systems. John Wiley and Sons, 2001.
[Avizienis 2004] Avizienis, Algirdas, Jean-Claude Laprie, Brian Randell, and Carl Landwehr. "Basic Concepts and Taxonomy of Dependable and Secure Computing," IEEE Transactions on Dependable and Secure Computing, vol. 1, no. 1, Jan.-Mar. 2004, pp. 11-33.
[Bishop 2003] Bishop, Matt. Computer Security: Art and Science, Addison-Wesley, 2003.
[Bourque and Dupuis 2004] Bourque, Pierre, and Robert Dupuis (Editors). Guide to the Software Engineering Body of Knowledge, 2004 Edition. Los Alamitos, California: IEEE Computer Society, Feb. 16, 2004.
[Gasser 1988] Gasser, M. Building a Secure Computer System. Van Nostrand Reinhold, 1988.
[Hall 2002] Hall, Anthony, and Rodrick Chapman. "Correctness by Construction: Developing a Commercial Secure System," IEEE Software, vol. 19, no. 1, Jan./Feb. 2002, pp. 18-25.
[Hoglund 2004] Hoglund, Greg, and Gary McGraw. Exploiting Software: How to Break Code. Addison-Wesley, 2004.
[Howard 2002] Howard, Michael, and David C. LeBlanc. Writing Secure Code, 2nd ed., Microsoft Press, 2002.
[Ibrahim et al, 2004] Ibrahim, Linda, et al. Safety and Security Extensions for Integrated Capability Maturity Models. Washington D.C.: United States Federal Aviation Administration, Sept. 2004.
[Landwehr 2001] Landwehr, Carl. "Computer Security," IJIS, vol. 1, 2001, pp. 3-13.
[Lipner 2005a] Lipner, Steve, and Michael Howard. The Trustworthy Computing Security Development Lifecycle, Microsoft, 2005. http://msdn.microsoft.com/security/default.aspx?pull=/library/en-us/dnsecure/html/sdl.asp#sdl2_topic8?_r=1
[Ministry of Defence 2003b] Ministry of Defence. Defence Standard 00-42 Issue 2, Reliability and Maintainability (R&M) Assurance Guidance, Part 3: R&M Case, 6 June 2003.
[Ministry of Defence 2004b] Ministry of Defence. Interim Defence Standard 00-56, Safety Management Requirements for Defence Systems, Part 2: Guidance on Establishing a Means of Complying with Part 1, 17 December 2004.
[NSA 2002] National Security Agency. The Information Systems Security Engineering Process (IATF) v3.1, 2002.
[Redwine 2004] Redwine, Samuel T., Jr., and Noopur Davis (Editors). Processes for Producing Secure Software: Towards Secure Software, vols. I and II. Washington, D.C.: National Cyber Security Partnership, 2004.
[SafSec Standard] "SafSec Methodology: Standard." SafSec: Integration of Safety and Security. http://www.safsec.com/safsec_files/resources/50_2_SafSec_Method_Standard_2.6.pdf
[Sommerville 2004] Sommerville, I. Software Engineering, 7th ed. Pearson Education, 2004.
[Viega 2005] Viega, J. The CLASP Application Security Process. Secure Software, 2005.
[Whittaker 2004] Whittaker, J. A., and H. H. Thompson. How to Break Software Security: Effective Techniques for Security Testing. Pearson Education, 2004.

3.6 Further Readings
[Avizienis 2004] Avizienis, Algirdas, Jean-Claude Laprie, Brian Randell, and Carl Landwehr. "Basic Concepts and Taxonomy of Dependable and Secure Computing," IEEE Transactions on Dependable and Secure Computing, vol. 1, no. 1, Jan.-Mar. 2004, pp. 11-33.
[Clark and Wilson 1987] Clark, David D., and David R. Wilson. "A Comparison of Commercial and Military Computer Security Policies," Proc. of the 1987 IEEE Symposium on Security and Privacy, IEEE, 1987, pp. 184-196.
[NRC 1999] Committee on Information Systems Trustworthiness. Trust in Cyberspace. Computer Science and Telecommunications Board, National Research Council, 1999.
4 Development
4.1 Introduction
This section on Development encompasses the production of software through initial deployment and
in doing so contains some areas that are used for both original development and post-release
sustainment. Because of the iterative nature of many software production efforts, the boundary
between “Development” including assurance and Post-Release Assurance is not always clearly drawn.
To maintain coherence, some overlap is inevitable. This is kept reasonable by not having in-depth treatment
of a sub-area in both sections, but rather having one section reference the other for details. This may
result in some switching back and forth by readers but avoids excessive duplication or conflict. (To
allow, however, later editorial decisions to be made with full information, for those topics where
boundary questions are currently not fully resolved, this early draft may contain more overlap than the
final version.)
This section contains subsections covering the “additional” knowledge needed to produce secure
software:
• More on Fundamental Concepts and Principles
• Ethics, Law, and Governance
• Requirements
• Design
• Construction
• Verification, Validation, and Evaluation
• Tools and Methods
• Process
• Management
First, fundamental concepts and principles are elaborated – this time intended solely for those involved
in development of secure software. Because these provide one set of inputs to requirements, a short
section addressing laws, regulations, policies, and ethics related to secure software systems precedes
the sections on requirements; design; construction; and verification, validation, and evaluation
activities. Some aspects of these are further detailed in a tools and methods section, before a process
section covers their arrangement, introduction, and improvement. A final section addresses differences
in managing secure software developments. Within a larger context, these sections cover the additional
technical, managerial, and supporting activities and interactions related to producing secure software.
4.2 More on Fundamental Concepts and Principles
This section adds development-related concepts, principles, and details to those in the prior fundamentals section, which spanned development, sustainment, and acquisition.

4.2.1 Security Properties
Progressively more in-depth treatments of security properties are available in [Landwehr 2001], [Redwine 2005a], and [Bishop 2003]. [Avizienis 2004] contains information on characterization and categorization.

4.2.1.1 Confidentiality
Computing-related confidentiality topics include access control, encryption, traffic analysis, covert channels, anonymity, hiding, and intellectual property rights protection.

4.2.1.1.1 Traffic Analysis
The main issues in traffic analysis are detectability and analyzability. Factors include concealment of origin and destination of communications and the level of traffic loads or the non-randomness of their variations. [Pfleeger and Pfleeger 2003, p. 410, 453] [Redwine 2005a, p. 11]

4.2.1.1.2 Covert Channels
Covert channels – abnormal means of communication – use timing of overt messages, storage locations not normally used, or availability of resources to convey messages covertly. Covert communication channels are measured by the bit rate that they can carry. [Bishop 2003, p. 462-469] [Redwine 2005a, p. 11]

4.2.1.1.3 Anonymity
Anonymity concerns concealing one's identity, activity, attributes, relationships, and possibly existence. Issues include concealing the identity that particular data relates to and who is communicating with whom, including determining that the same (but unidentified) entity is involved in exchanging the messages – linkage.
Desired or required privacy [Footnote 1: including protection from cyberstalking] is a motivation for confidentiality, anonymity, and non-existence of data, as well as for avoiding some potential effects of lack of data integrity or accuracy, e.g. falsehoods damaging reputation.

4.2.1.2 Integrity
To maintain system integrity one needs to keep the system in legitimate states or conditions. "Legitimate" must be specified – an integrity security policy could be conditional. For example, it might allow the system to enter illegitimate states during a transaction, as long as it returns to a legitimate state at the end of the transaction. Two key sub-problems within integrity are
• Is something unchanged?
• Were all of the implemented changes authorized?
Checking that data is unchanged can only have meaning in terms of the question, "Since when?" In practice this usually means that one must also query: "In whose possession?" (This possession may or may not be at a specified time.)
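A minimal sketch of the "is it unchanged since a known point?" check follows: record a SHA-256 digest while the data is in trusted hands and compare later. The example contents are invented for illustration.

    # Illustrative sketch: detect whether data changed since a baseline
    # by comparing SHA-256 digests recorded at two points in time.
    import hashlib

    def digest(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    baseline = digest(b"config-v1 contents")   # recorded while in trusted hands
    received = b"config-v1 contents"           # data as later presented
    if digest(received) != baseline:
        print("integrity check failed: changed since baseline")
    else:
        print("unchanged since baseline")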
Kinds of items where proper privileges and authorization can be of concern include
• Creating
• Viewing
• Changing
• Executing
• Communicating
• Deleting/destroying
In discussing integrity-related change authorizations, changes commonly concern
• Credentials
• Privileges
• Data
• Software (possibly considered data)
• The point(s) or paths of execution
4.2.1.2.1 Identification and Authentication
To authenticate is to verify identity. Authentication may involve something one knows or possesses, a context or location, or an inherent characteristic. Authentication is a different problem from establishing an initial identification, which may be done by the entity claiming an identity, by inference, or by recognition.

4.2.1.2.2 Non-repudiation
Non-repudiation involves taking responsibility for one's actions. One should not be able to disclaim one's deed after the fact or deny an event related to oneself – for example, being unable to repudiate being the sender, authorizer, or receiver of a message. Several means of achieving non-repudiation involve cryptographic signatures. ISO/IEC 13888, Information technology – Security techniques – Non-repudiation, addresses both symmetric and asymmetric techniques.
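As one hedged sketch of the asymmetric approach, a digital signature supports non-repudiation because only the holder of the private key could have produced it. The example below assumes the third-party Python "cryptography" package is available, and the message content is invented.

    # Illustrative sketch: an Ed25519 signature binds a message to the
    # holder of the private key, supporting non-repudiation of origin.
    # Assumes the third-party "cryptography" package.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    message = b"authorize transfer 42"          # invented example message
    signature = private_key.sign(message)

    try:
        public_key.verify(signature, message)   # raises if forged or altered
        print("signature valid: signer cannot plausibly deny signing")
    except InvalidSignature:
        print("signature invalid")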
4.2.1.3 Availability
Along with reliability, engineering for availability has a long history in computing. Many traditional approaches and means of prediction exist, but all under conditions of non-maliciousness. This is also true of the traditional area of disaster recovery. As with all security properties, achieving a specified level of availability is a more difficult problem when maliciousness must be considered. Some of the old approaches and almost all the means of calculation no longer work.
Denial of service attacks from outside – particularly distributed ones originating from many computers simultaneously – can be difficult to successfully overcome. Local attacks that attempt to take over, exhaust, or destroy resources are also a threat. Any mechanism designed to deny illegitimate access can tempt attackers to discover a way to use it to deny legitimate access.
Speed of repair or recovery can sometimes be a factor in availability. Lessons can be learned from these traditional areas.

4.2.2 More on Software Security Principles
The list below (except for "Defense in Depth") follows the principles proposed by [Saltzer and Schroeder 1975] and liberally quotes edited selections from that text. [Viega and McGraw 2002] has a particularly accessible discussion of them. These principles have relevance throughout secure software development, including requirements; design; construction; and verification, validation, and evaluation.

4.2.2.1 Defense in Depth
This principle ensures that an attacker must compromise more than one protection mechanism to successfully exploit a system. This raises the cost of an attack and may dissuade an attacker from continuing the attack.

4.2.2.2 Least Privilege
Every entity in the system should operate using the least set of privileges necessary to complete the job. This principle limits the damage that results from an accident or error. It also reduces the number of potential interactions among privileged programs, so that unintentional, unwanted, or improper uses of privilege are less likely to occur.
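A common operating-system illustration of this principle is a daemon that performs its privileged setup and then permanently drops to an unprivileged account. The sketch below assumes a POSIX system, and the account name and setup step are invented for illustration.

    # Illustrative sketch (POSIX): do privileged setup, then permanently
    # drop to an unprivileged account for the rest of the process's life.
    import os, pwd

    def drop_privileges(username: str = "svc-app") -> None:  # invented account
        account = pwd.getpwnam(username)
        os.setgid(account.pw_gid)   # set group first, while still privileged
        os.setuid(account.pw_uid)   # after this, root privileges are gone

    # bind_privileged_port()   # hypothetical privileged setup step
    # drop_privileges()
    # serve_requests()         # runs with least privilege from here on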
4.2.2.3 Fail-Safe Defaults
This principle calls for basing access decisions on permission rather than exclusion. Thus, the default situation is lack of access, and the protection scheme identifies conditions under which access is permitted. To be conservative, a design must be based on arguments stating why objects should be accessible, rather than why they should not.

4.2.2.4 Economy of Mechanism
"Keep the design as simple and small as possible" applies to any aspect of a system, but it deserves emphasis for protection mechanisms, since design and implementation errors that result in unwanted access paths may not be noticed during normal use.

4.2.2.5 Complete Mediation
Every access to every object must be checked for its authorization. This principle, when systematically applied, is the primary underpinning of the protection system and implies that a foolproof method of identifying the source of every request must be devised. It also requires that design proposals to allow access by remembering the result of an authority check be examined skeptically.

4.2.2.6 Open Design
Security mechanisms should not depend on the ignorance of potential attackers, but rather on the possession of specific, more easily protected, keys or passwords. This permits the mechanisms to be examined by a number of reviewers without concern that the review may itself compromise the safeguards.
The principle of open design is not universally accepted. The notion that the mechanism not depend on ignorance is generally accepted, but some would argue that its design should remain secret, since a secret design may have the additional advantage of significantly raising the price of penetration.

4.2.2.7 Separation of Privilege
A protection mechanism that requires two keys to unlock it is more robust and flexible than one that allows access to the presenter of a single key. By requiring two keys, no single accident, deception, or breach of trust is sufficient to compromise the protected information.
Redundancy is also used in the traditional "separation of duties" in human financial processes, e.g. the person who fills out a check and the person who signs it are two different people.

4.2.2.8 Least Common Mechanism
Minimize the security mechanisms common to more than one user or depended on by all users or levels of sensitivity. Every shared mechanism represents a potential information path between users and must be designed with great care to ensure it does not unintentionally compromise security.

4.2.2.9 Psychological Acceptability
It is essential that the human interface be designed for ease of use, so that users routinely and automatically apply the protection mechanisms correctly.

4.2.3 System Security Policies
Areas potentially covered by software system security policies include
• Confidentiality, integrity, and availability policies
• Identification and authentication
• Non-repudiation
• Cryptography
• Access policies
• Accountability and forensics policies
System security policies may be layered from the more general to the more (application) specific. The mechanisms supporting policy enforcement may also be layered, for example, with a bottom separation layer providing information flow and data isolation facilities so higher layers can define and enforce policies themselves, such as the top layer's application-specific policies.
Policies may exist separately across a distributed system, requiring their composition or reconciliation.
Organizations may have security policies that are not emergent properties but requirements to use a certain mechanism, technique, or product. While these, of course, must also be conformed to by the system, they do not present the same design and verification difficulties as do such properties as confidentiality or non-bypassability.

4.2.4 Specification Properties
At the technical level a software system security property might be formally specified as a
• Safety property: restriction on allowed system states
• Liveness property: states that must be reached; required progress or accomplishment
Often it is also convenient to state some security specifications in terms of information flow constraints or requirements, including
• Access control mechanisms and policies
• Restrictions on information communicated
• Separation properties
• Covert channel limitations
These properties can be stated as conditions that must be true of the system. [Footnote 2: If specified formally, this can allow static analysis of designs and code, potentially adding creditable assurance evidence.]
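To make the safety/liveness distinction concrete, the sketch below treats a safety property as an invariant that must hold in every state reached and a liveness property as something that must eventually happen. The toy session trace and its fields are invented for illustration.

    # Illustrative sketch: a safety property holds in every state reached,
    # while a liveness property must eventually be satisfied by the trace.
    states = [
        {"authenticated": False, "granted": False},   # invented session trace
        {"authenticated": True,  "granted": False},
        {"authenticated": True,  "granted": True},
    ]

    def safety(state) -> bool:
        # Safety: access is never granted to an unauthenticated subject.
        return not state["granted"] or state["authenticated"]

    def liveness(trace) -> bool:
        # Liveness: an authorized request is eventually granted.
        return any(s["granted"] for s in trace)

    assert all(safety(s) for s in states)   # restriction on allowed states
    assert liveness(states)                 # required progress is achieved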
4.2.5 Secure Software Development Activities
A number of activities overlap with, and some are different from, "unsecured" development. Those that are new or changed for secure software include [NSA 2002, Chapter 3] [Howard 2002]
• Perform threat analysis and assure its quality
• Adequately identify and characterize possible threats
• Develop security policy that meets user, organizational, and societal requirements
• Develop a formal top-level specification
• Show top-level specification agrees with security policy
• Develop a security test plan
• Develop an implementation of the system based on the top-level specification, providing an assurance case with argument and evidence (preferably including proofs) that
o Design
– Agrees with top-level specification and security policy
– Contains only items called for in top-level specification
– Lacks vulnerabilities to identified threats
o Code
– Corresponds to design and security policy
– Contains only items called for in design
– Lacks vulnerabilities to identified threats
• Perform penetration testing and test security functionality
• Provide a covert channel analysis
• Perform ongoing monitoring of security threats, needs, and environment
• Perform changes securely, maintaining conformance to – possibly revised – security requirements while continuing to provide a complete assurance case
• Provide assurance case including arguments and evidence showing the software system meets security requirements
• Deploy securely
o Initially
o Updates
• Certify software
• Accredit operational system
• Provide for secure
o Support services
o Outsider reporting of vulnerabilities
• All of this must be
o Included in planning and tracking
o Done in a secure environment operated securely, preferably providing counter-intelligence and forensic support
o Possessed of an adequate set of trustworthy tools that have been received, installed, stored, updated, and executed securely
o Performed by trustworthy people
o Placed under appropriate controls against inside and outside mistakes, mishaps, and malicious actions
o Accompanied by adequate assurance that these are true
• Collect and analyze security-related measurements
• Improve security-related
o Processes
o Personnel-related activities and skills
o Artifact quality
A large number of these and other necessary activities could also have been motivated merely by strong desires for suitable and correct
products and services. Security needs, however, 4.2.6.5 Deception
provide even more motivations as well as new While deception should never be relied on
requirements for production and products. exclusively, it has useful purposes such as
Because of this, some effective organizations gathering intelligence and misleading attackers
establish processes across the organization that so as to deter or confuse them or to cause
integrate security concerns and reuse them. wasting of their time and resources.
4.2.6 General Architectural Concepts

4.2.6.1 Multiple Levels
Several schemes exist for multi-level or multi-lateral security.
• Multi-Level Security (MLS): both a generic term and a name sometimes given to a specific scheme where all accesses are checked by a single software trusted computing base
• Multiple Independent Levels of Security (MILS): See [Vanfleet 2005]
• Multiple Single Levels of Security (MSLS): no intermixture
[Neumann 1995] discusses a number of approaches.

4.2.6.2 Detect and Report, Block, or Hinder
Detection of an error state or an attempted or actual security policy violation is needed before action (e.g. notification (e.g. alarm) or recording in an audit log). Related mechanisms deriving from network security include firewalls, intrusion detection systems, and intrusion prevention systems.

4.2.6.3 Tolerance
Intrusion tolerance, fault tolerance, and robustness (input tolerance) are important concepts.

4.2.6.4 Adaptive Response
Run time adaptive defense can include moving processing elsewhere, switching to another copy of data, and changing levels of trust of objects and subjects, including changing access rights because of behavior. Intrusion tolerance can be aided by adaptive defenses.

4.2.6.5 Deception
While deception should never be relied on exclusively, it has useful purposes such as gathering intelligence and misleading attackers so as to deter or confuse them or to cause wasting of their time and resources.

4.2.6.6 Damage Confinement and Isolation
If error states or damage occur, one mitigation approach is to limit the potential damage by localizing it or reducing its size or effects. This may be done by detecting errors and ensuring that correct values, rather than the errors, propagate to the rest of the system or externally. It can also be done, for example, by removing valuable assets from online accessibility.
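As an illustration of confining errors rather than letting them propagate, the following minimal sketch (in Python; the record values and validity rule are hypothetical, not from this document) routes suspect values to a quarantine instead of passing them onward:

    # Detect errors and pass on only known-good values; quarantine the rest
    # so any damage stays localized. The validity rule is purely illustrative.
    def confine(records, is_valid, quarantine):
        passed = []
        for record in records:
            (passed if is_valid(record) else quarantine).append(record)
        return passed

    quarantine = []
    clean = confine([10, -3, 42], lambda r: 0 <= r <= 100, quarantine)
    print(clean, quarantine)  # [10, 42] [-3]

Quarantined items can then be logged and examined without further exposure to the rest of the system.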
4.2.7 Safety and Security
Software safety has more successful recent experience with producing high-confidence software than have software producers where security was a concern. Today, security is a concern for most software as software has become central to the functioning of organizations and much of it is directly or indirectly exposed to the Internet or to insider attack as well as to subversion during development, deployment, and updating. The safety community's experience provides lessons for software security practitioners, but the engineering safety problem differs from the security one in a critical way – presumed non-existence of maliciousness.

4.2.7.1 Probability versus Possibility
The patterns of occurrences of "natural" events relevant to safety are described probabilistically for engineering purposes. The probability of a natural event contrasts with the need for concern for the possibility of an intelligent, malicious action. This distinction is central to the difference between facing safety hazards versus security threats.

4.2.7.2 Combining Safety and Security
When both are required, a number of areas are candidates for partially combining safety and security engineering concerns including
• Goals
• Solutions
• Activities
• Assurance case
Goals/claims
Assurance arguments
Evidence
• Evaluations
The SafSec effort provides guidance on how to do this. [SafSec Standard] [SafSec Guidance]
4.2.8 Assurance Case
Tradeoffs of other qualities or functionality against security can be highlighted by issues arising from the assurance arguments for security and the different qualities.
An assurance case will be more complete if it includes consideration of possibilities and risk values that are
• Known, ensuring none are overlooked
• Known unknowns
• Unknown unknowns, whose possibility should be considered

4.2.9 More on Computer Security as Conflict
The attack and defense of software-intensive systems is not a routine situation; it is a serious conflict situation with serious adversaries, such as competitors committing industrial espionage, criminal organizations, terrorist groups, and nation states. One should not forget what is known about how to be successful in conflicts. [Hunt 2005b] While it is difficult to state the principles of conflict in a brief manner, some principles exist covering such areas as exploiting the arenas in which the conflict occurs, time, quality, defining success, and hindering adversaries.
Currently, reducing the attackers' relative capabilities and increasing systems' resilience dominate many approaches. Anonymity of attackers has led to an asymmetric situation where defenders must defend everywhere at all times while attackers can choose the time and place of their attacks. Hopefully, reducing anonymity – in the not too distant future for outsiders and today for insiders – and thereby making deterrence more effective will somewhat calm the current riot of illicit behavior.
Specifically in secure software development, the social norms and ethics of members of the development team and organization as well as suppliers deserve serious attention. Insiders constitute one of the most dangerous populations of potential attackers, and communicating effectively with them is important in avoiding subversion or other damage.

4.2.10 References
[Avizienis 2004] Avizienis, Algirdas, Jean-Claude Laprie, Brian Randell, and Carl Landwehr. "Basic Concepts and Taxonomy of Dependable and Secure Computing," IEEE Transactions on Dependable and Secure Computing, vol. 1, no. 1, Jan.-Mar. 2004, pp. 11-33.
[Bishop 2003] Bishop, Matt. Computer Security: Art and Science, Addison-Wesley, 2003.
[Gasser] Gasser, M. Building a Secure Computer System. Van Nostrand Reinhold, 1988.
[Hoglund 2004] Hoglund, Greg, and Gary McGraw. Exploiting Software: How to Break Code. Addison-Wesley, 2004.
[Howard 2002] Howard, Michael, and David C. LeBlanc. Writing Secure Code, 2nd ed., Microsoft Press, 2002.
[Ibrahim et al, 2004] Ibrahim, Linda, et al. Safety and Security Extensions for Integrated Capability Maturity Models. Washington D.C.: United States Federal Aviation Administration, Sept. 2004.
[ISO/IEC 13888] ISO/IEC 13888 Information technology — Security techniques — Non-repudiation.
[Landwehr 2001] Landwehr, Carl. "Computer Security," IJIS, vol. 1, 2001, pp. 3-13.
[Ministry of Defence 2003b] Ministry of Defence. Defence Standard 00-42 Issue 2 Reliability and Maintainability (R&M) Assurance Guidance Part 3 R&M Case. 6 June 2003.
[NSA 2002] National Security Agency, The Information Systems Security Engineering Process (IATF) v3.1. 2002.
[Redwine 2004] Redwine, Samuel T., Jr., and Noopur Davis (Editors). Processes for Producing Secure Software: Towards Secure Software. vols. I and II. Washington D.C.: National Cyber Security Partnership, 2004.
[Viega] Viega, J. The CLASP Application Security Process. Secure Software, 2005.
[Whittaker] Whittaker, J. A. and H. H. Thompson. How to Break Software Security: Effective Techniques for Security Testing. Pearson Education, 2004.

4.2.11 Further Readings
[Bourque and Dupuis 2004] Bourque, Pierre, and Robert Dupuis (Editors). Guide to the Software Engineering Body of Knowledge. 2004 Edition. Los Alamitos, California: IEEE Computer Society, February 16, 2004.
[Clark and Wilson 1987] Clark, David D. and David R. Wilson. "A Comparison of Commercial and Military Computer Security Policies," Proc. of the 1987 IEEE Symposium on Security and Privacy, IEEE, 1987, pp. 184-196.
[Sommerville 2004] Sommerville, I. Software Engineering. 7th ed. Pearson Education, 2004.
4.3 Ethics, Law, and Governance
4.3.1 Scope
This section addresses the ethics, laws, regulations, and standards peculiar to the development of secure software. Given the relatively recent interest in the security of software, much of the knowledge in this section is in a formative stage. Often it is derived from work in a larger domain. For example, no code of ethics has been proposed specifically for secure software developers, but there are ethical codes for information security professionals, computing professionals, and software engineers [Bynum and Rogerson 2004].
Legal and regulatory knowledge in the domain of secure software development tends to focus on issues such as privacy, intellectual property, and liability. Although many laws addressing computer crime have been enacted within the last two decades, these laws do not involve the development of secure software and are therefore not included in this section.
Several standards, including international standards, addressing security are important in the development of secure software. Standards that address the evaluation of secure systems, such as the Common Criteria, are covered in a separate section.

4.3.2 Ethics
In defining the ethical body of knowledge that applies to secure software development, we adopt the view of ethics as a "… branch of professional ethics, which is concerned primarily with standards of practice and codes of conduct of computing professionals…" [Bynum 2001]. Furthermore, a computer professional is "… anyone involved in the design and development of computer artifacts ..." [Gotterbarn 1991].
Codes of conduct and statements of ethics embody much of the ethical knowledge in the development of secure software. Example codes include the Institute of Electrical and Electronics Engineers Code of Ethics, The Software Engineering Code of Ethics and Professional Practice, the Association of Computing Machinery Code of Ethics and Professional Conduct, the British Computer Society Code of Conduct [Bynum and Rogerson 2004], and the International Information Systems Security Certification Consortium (ISC)2 Code of Ethics [ISC 2005].
The ethics and codes of conduct share common elements: (1) acting in the public interest, (2) duty to clients insofar as it is consistent with the public interest, (3) honesty and integrity in the practice of the profession, and (4) maintaining competence in the profession.

4.3.3 Law
The primary legal issues surrounding the development of secure code include intellectual property and its associated copyrights, patents, trade secrets, and trademarks. Principles of privacy and of private and corporate liability, including the so-called "downstream" liability, are beginning to be codified in law, although these principles are often spread through various legislative acts [Bosworth and Kabay 2002], [Smedinghoff 2003], [Smedinghoff 2004]. The most explicit statement of corporate liability in the domain of secure information architectures includes the requirement for "internal control structure and procedures for financial reporting" in the Sarbanes-Oxley Act of 2002. Important liability principles in this area include the prudent man rule, due care, and due diligence. A summary of laws and executive orders pertaining to computer security is contained in [Moteff 2004].

4.3.4 Governance: Regulatory Policy and Guidance
Regulatory policies and guidance are created by government agencies to set performance levels and compel certain behaviors in corporations or industries [Harris 2003]. Privacy regulations have the most far-reaching impact on the development of secure software. Privacy protections safeguard the personal information of individuals from misuse by governments or corporations. Privacy principles include the lawful use of personal information, the accuracy of that information, disclosure, consent, and secure transmission of that information.
4.3.4.1 Policy
A policy is a statement by an authoritative body intended to determine present and future decisions, actions, or behaviors. It is by nature general and requires specific standards, guidelines, and procedures to be implemented in a particular situation. Policies are of three types: regulatory, advisory, and informative [Harris 2003]. Regulatory policies are mandatory and carry the force of law. Advisory policies encourage adherence to certain standards and are enforceable, while informative policies are intended to make known a particular policy and are not enforceable.
Security policies are often formulated in response to the requirements of law. In the US these include the following federal laws.

4.3.4.1.1 Laws
• Health Insurance Portability and Accountability Act – HIPAA
• The Sarbanes-Oxley Act of 2002 requires companies to implement extensive corporate governance policies, procedures, and tools to prevent, respond to, and report fraudulent activity within the company. Effective self-policing requires companies to have the ability to acquire, search, and preserve electronic data relating to fraudulent activity within the organization.
• The Gramm-Leach-Bliley Act requires financial institutions to safeguard customer information as well as detect, prevent, and respond to information security incidents.
• California SB 1386 (California Civil Code § 1798.82) addresses how a company responds to a breach, and has important features based on cooperation with law enforcement and prompt notification to affected customers.
• Federal Information Security Management Act (FISMA). FISMA mandates that federal agencies must maintain an incident response capability.
Other state and federal laws and regulations may apply. This issue, in most cases, deserves attention from legal counsel.

4.3.4.2 Standards and Guidance
Standards are more specific statements of behavior intended to implement a policy or policies. Guidance, on the other hand, consists of suggestions on how one might implement a standard. International standards on information security include the Information Security Management standard ISO/IEC 17799. Other standards and guidance on computer security include the National Institute of Standards and Technology (NIST) 800 series special publications.
4.3.4.3 References
[Bosworth and Kabay 2002] Bosworth, Seymour, and Kabay, M., eds., Computer Security Handbook, 4th Edition, John Wiley and Sons, 2002.
[Bourque and Dupuis 2004] Bourque, Pierre, and Robert Dupuis (Editors). Guide to the Software Engineering Body of Knowledge. 2004 Edition. Los Alamitos, California: IEEE Computer Society, February 16, 2004.
[Bynum 2001] Bynum, Terrell, "Computer Ethics: Basic Concepts and Historical Overview," The Stanford Encyclopedia of Philosophy (Winter 2001 Edition), Edward N. Zalta (ed.), <http://plato.stanford.edu/archives/win2001/entries/ethics-computer/>.
[Bynum and Rogerson 2004] Bynum, Terrell (ed.) and Rogerson, Simon, Computer Ethics and Professional Responsibility: Introductory Text and Readings, Blackwell Publishing, 2004.
[Flickenger 2003] Flickenger, Rob, Wireless Hacks, O'Reilly and Associates, Inc., 2003.
[Gotterbarn 1991] Gotterbarn, Donald, "Computer Ethics: Responsibility Regained," National Forum, Vol. 71, pp. 26-31, 1991.
[Harris 2003] Harris, Shon, All-in-One CISSP Certification, McGraw-Hill Inc., 2003.
[Howard and LeBlanc 2003] Howard, Michael and LeBlanc, David, Writing Secure Code, 2nd Edition, Microsoft Press, 2003.
[ISC 2005] International Information Systems Security Certification Consortium, Code of Ethics, <http://www.isc2.org>.
[Krutz and Vines 2003] Krutz, Ronald and Vines, Russell, The CISSP Prep Guide, John Wiley and Sons, 2003.
[Moteff 2004] Moteff, John, Computer Security: A Summary of Selected Federal Laws, Executive Orders, and Presidential Directives (Order Code RL32357), Congressional Research Service, April 16, 2004.
[Pfleeger 1997] Pfleeger, Charles, Security in Computing, Prentice Hall PTR, 1997.
[Smedinghoff 2003] Smedinghoff, Thomas J., "The Developing U.S. Legal Standard for Cybersecurity," Baker and McKenzie, Chicago, <www.bakernet.com/ecommerce/us%20cybersecurity%20standards.pdf>, May 2003.
[Smedinghoff 2004] Smedinghoff, Thomas J., "Trends in the Law of Information Security," World Data Protection Report, The Bureau of National Affairs, Inc., Vol. 4, No. 2, August 2004.

4.3.5 Additional Readings
Matt Bishop. Computer Security: Art and Science, Addison-Wesley, 2003.
I. Sommerville. Software Engineering. 7th ed. Pearson Education, 2004.
4.4 Requirements for Secure Software
4.4.1 Scope
It is widely acknowledged within the software industry that software engineering projects are critically vulnerable when requirements activities are performed poorly. Software requirements document the needs and constraints placed on a software product that contribute to the solution of a real-world problem. [Kotonya 2000]
The Secure Software Requirements Knowledge Area addresses knowledge needed in the requirements process for secure software and, while mentioning some of that knowledge for context, presumes the knowledge to do requirements for "unsecured" software. This includes knowledge covering interacting with stakeholders; establishing problem scope; identifying needs, measures of merit, and alternatives other than for security or safety; analyses of technical, economic, and social feasibility; and specifying system (external) attributes, properties, behavior, and its interactions with its environment as well as its environment's relevant attributes, properties, and behaviors (excluding threats).
Requirements are in large part about the system's environment and the system's interactions with it to achieve the required effects – and avoid others. Secure software requirements analysts and designers pay special attention to threats in the software system's environment and to the protection of assets. Traceability, formality, and rigor facilitate correctness and assurance throughout development.
Security requirements contain elements related to availability; integrity including authentication and non-repudiation; and confidentiality including anonymity, privacy, and intellectual property rights. These issues can be accompanied by a number of associated ones. Requirements identify not only what must or should be true about the software system for purposes of directing the producers of the system, but thereby also identify what must be shown by the assurance case – evidence tied together by an assurance argument.
Validation of software requirements and verification of consistency among representations are also of concern and are addressed in part here and in part in the Verification, Validation, and Evaluation section. This section addresses knowledge needed in the requirements process, the establishment of requirements needs and constraints, their analysis, specification of the system's external behavior, and several special areas.

4.4.2 Requirements for a Solution
At the beginning of requirements activities, generally no one individual or group knows the full requirements. Plus, different individuals and groups will have conflicting thoughts. A process involving and/or considering many stakeholders usually evolves to an agreed understanding of current and potential security and other needs in the light of possible security solution approaches.
To the extent their systems share the same environment, organizations have found it effective to create "standard" inputs to security requirements, such as threat identification and security policies reflecting the union of external and organizational constraints and the organizations' standard procedures. These standard inputs can then be tailored as needed for particular software systems.
[ISO-15448], [S4EC], [Moffett and Nuseibeh 2003] and with some redundancy [Moffett 2004] provide general guidance. [NSA 2004] and [??] provide suggestive examples of security requirements.

4.4.2.1 Traceability
Assurance is impossible without a way to trace among requirements artifacts. Requirements also need to be stated in unique, single units to ease traceability to design, code, tests, and other products. Software cannot be seen in isolation, so traceability must also be upwards, to any system requirements.
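A minimal sketch of such traceability (in Python; the requirement and artifact identifiers are hypothetical) records each uniquely numbered requirement's links downward to design, code, and tests so that gaps can be detected mechanically:

    # Each uniquely stated requirement maps to the artifacts that realize it;
    # identifiers here are invented for illustration.
    trace = {
        "SR-12": {"design": ["D-4.2"], "code": ["auth/login.c"], "test": ["T-88"]},
        "SR-13": {"design": ["D-4.3"], "code": [], "test": []},
    }
    for req, links in trace.items():
        for kind in ("design", "code", "test"):
            if not links[kind]:
                print(f"{req}: no {kind} artifact traces to this requirement")

Upward traceability to system requirements can be recorded the same way, with the mapping inverted.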
4.4.2.2 Identify Stakeholder Security-related Requirements
Interactions with stakeholders result in a list of needs and preferences for security, including privacy and intellectual property protection. These interactions may also identify standards and guidelines that relate to the application from which requirements can be derived. (See Ethics, Law, and Governance section.) These may directly relate to the particular assets involved.
Of course, initial requests may or may not remain unchanged in the agreed-upon requirements.

4.4.2.3 Asset Protection Needs
In this area developers need to know goals, means, and processes for
• Identification of information and capabilities that might be within scope – whether the solution eventually puts them in digital form or not
• Estimating potential for damage from private and public disclosure, contamination, reduced access/capability, or loss of asset
The latter may be reflected by categorizing data or systems by sensitivity, as sketched below. [Radack 2005] [CJCSM 6510.01 2004, Enclosure C Appendix E] [DoDI S-3600.2]
Assets may need to be protected in many situations across the life of the asset and any copies of it. Information assets may be located at various times in a number of places, each of which could imply security requirements including ones for:
• File system security-relevant facilities
• Database security
• Hardware protection
• Communications security
• Backup, archives, and recovery including business continuity
• Media used for mobility, e.g. memory sticks and CDs
• Location and movement during computation including
o Primary memory
o Transit
o Registers
o Caching
o Virtual memory paging
• Startup including attempting to start (and operate) in a potentially hostile or damaged environment and the need for recovery
• Shutdown including waiting for garbage collection and physical memory to be given to another process
• Logs
• Mobile computers and devices
• Disposed or reassigned equipment and media
• Lost or stolen computers, media, and devices
The assets to be protected usually can be usefully thought of as being of two kinds, implying needs for
• Data protection
• Software protection
o Avoid loss of intellectual property
o Avoid giving adversary knowledge, for example, insight into a commercial process or military weapons system
Special asset-related considerations generating requirements that will be covered below include
• Mistakes, abuse, or failure to follow accepted procedures/use rules
• Threat entities' capabilities to detect deception and hiding of assets
• Help ensure cost-effective and timely certification of system security and operational system accreditation
• Aid administration, operation, and use in secure fashions
In addition, environments where the software and assets reside may have varying levels of security.
See also Threat Analysis Processes subsection.
4.4.2.4 Threat Analysis
One needs to look at software from the viewpoint of the different relevant threatening entities. Their decisions and capabilities will determine the security protection needs of assets, in combination with the defensive stakeholders' valuation of the assets and the consequences of successful exploits.
The threat-analysis process must involve owners of the assets and other stakeholders who might be affected by consequences of an incident. Inside attackers come from the user population, so one must consider users from both offensive and defensive perspectives. [Swiderski and Snyder 2004] [Meier 2004] [Meier 2005a]
Identifying, analyzing, and forecasting threat entities or categories take expertise that may only be available from experts. Validation of the threat analysis should involve stakeholders and expert review.

4.4.2.4.1 Identification of Threat Entities
Identifying, analyzing, and forecasting threat entities or categories take expertise that may only be available from experts, but a number of resources exist, e.g. [Meier 2005b]. Intentions and resources of threats need to be established. This includes their evaluation of the value of possessing, disclosing, contaminating, and denying the assets. Insider and outsider threats must be identified. For better results, validation of the threat analysis should involve stakeholders and expert review, including by experts in methods of attack.

4.4.2.4.2 Threats' Capabilities
Capabilities of interest include both current and future capabilities. The rate and directions in which threat entities will improve their capabilities are therefore of interest. [Swiderski and Snyder 2004]

4.4.2.4.2.1 Skill
Identify or postulate skills, competence, and tools, and the implications of their possession.

4.4.2.4.2.2 Resources
Identify or postulate attacker resources, initial access, and persistence.

4.4.2.4.2.3 Added Capabilities from Partial Success
Requirements for tolerance and defense in depth can be derived by considering the effects of possible partial successes by adversaries and the added potential these provide them. The ability to specify partial successes should evolve with the design. Thus, security requirements continue to be created in the course of design activities.

4.4.2.4.2.4 Misuse Cases
Constructing and using misuse cases appropriately can aid in understanding the concrete nature of threat. One needs to be aware of the limitations resulting from their being partial examples.

4.4.2.4.2.5 Attack Trees
Attack trees have the attacker goal at the top and branches that identify alternate (or) or combined (achieve subgoal A and subgoal B) ways that allow achievement of the goal or subgoal. These can provide a graphical way to portray potential attack paths that can then be investigated. [Swiderski 2004]
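The AND/OR structure lends itself to direct representation. The following minimal sketch (Python; the goals and feasibility judgments are invented for illustration, not drawn from this document) evaluates whether any path through a small attack tree is feasible:

    # An OR node is feasible if any child is; an AND node only if all are.
    # Leaf feasibility reflects an analyst's judgment.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Node:
        goal: str
        kind: str = "leaf"              # "leaf", "and", or "or"
        feasible: bool = False          # used for leaves only
        children: List["Node"] = field(default_factory=list)

        def evaluate(self) -> bool:
            if self.kind == "leaf":
                return self.feasible
            results = [child.evaluate() for child in self.children]
            return all(results) if self.kind == "and" else any(results)

    root = Node("Read confidential file", "or", children=[
        Node("Steal backup media", feasible=True),
        Node("Bypass access control", "and", children=[
            Node("Obtain valid account", feasible=True),
            Node("Escalate privileges", feasible=False),
        ]),
    ])
    print(root.evaluate())  # True: the backup-media branch alone suffices

Each feasible path found this way identifies an attack path warranting investigation and possible countermeasures.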
4.4.2.4.2.6 Physical Access and Tampering
If the attacker has physical access to the system, then additional possibilities arise. These threats should be explicitly documented as concerns or as non-concerns – assumed not to exist. Common cases of possessing physical access include insider attacks and stolen portable computers. Other approaches include surreptitious entry into work or equipment areas, and subversion of the equipment maintenance process.
Physical access by an attacker can raise many difficulties for a defender. Physical access may aid an attacker in authenticating if, for example, an IP address is used as part of authentication or a user has their password written down somewhere near their computer. A hard disk might be removed and studied using forensic techniques, thereby bypassing software access control. Techniques to address this last problem are generally based on cryptology.
Tampering comes in a number of forms. Those concerning the integrity and self-protection of software and assets are covered elsewhere. Tampering with the physical equipment, however, is a source of additional threats. These threats must be addressed or explicitly assumed to be nonexistent in a particular situation. [Atallah, Bryant and Sytz 2005]

4.4.2.4.3 Threat Entities' Intentions
In large part, security is about the issues raised by real or potential malicious intentions, so the practical importance of intentions is not surprising. In practice, they are critical, particularly for entities that are not currently exercising their capabilities to cause maximum harm. Also in practice, one extends different degrees of responsibilities, privileges, and trust to different entities based, in part, on judgments about their intentions as well as on risk estimates and resource allocation decisions.
Some advice: "When in doubt, one should not trust in others' intentions (or competence) at all."
Motivations drive intentions. The attackers' motivations result from such factors as the value they place on the asset – knowledge, use, or denial – and their willingness to take the risks involved, including discovery and punishment. Motivations may be individual, organizational, or even second hand, as in mercenaries hired for industrial espionage or organized crime extorting experts to perform attacks for them.
A template for a threat modeling document is available from Microsoft. [Meier 2005c]
4.4.2.5 Interface and Environment Requirements
Many systems must interface with outside software whose security characteristics are uncertain or run on an operating system or other infrastructure whose security is known to be questionable. The system then may have difficult requirements to be secure in an insecure environment. External systems or domains not under trusted control should be considered potentially hostile entities. Connections to such external systems or domains must be analyzed, and hostile actions originating from these entities countered. These call for careful risk analysis as well as sound technical solutions.
What are the risks to the owners or operators of the system and/or the risks related to the assets needing protection? If the system is a commercial one with effective legal protection based on a given authentication or non-repudiation involved in the commercial interactions, then fiscal risks from these transactions may be able to be minimized with limited worry about the system's insecure environment. Other situations can be significantly more difficult. Requirements need to address the need for both an acceptable level of risk and technical feasibility.
Integrity issues always exist for entering data, and integrity checks are normally a requirement. These checks include ones for acceptability of data type and size, value possible in the real world, likely or plausible, consistency internally and with other data, proof of origin and legitimacy of origin, and lack of corruption or interference during transit. Checks for non-repudiation may also be relevant. A sketch of such checks follows.
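As a minimal illustration (Python; the field, bounds, and rules are hypothetical), checks on one entered value might cover type, size, and real-world plausibility:

    # Illustrative integrity checks for a single entered field: acceptable
    # type, acceptable size, and a value possible in the real world.
    def check_age_field(raw: str) -> int:
        if not raw.isdigit():
            raise ValueError("age must be numeric")        # data type
        if len(raw) > 3:
            raise ValueError("age field too long")         # size
        age = int(raw)
        if not 0 < age < 130:
            raise ValueError("age not plausible")          # real-world value
        return age

    print(check_age_field("42"))

Cross-field consistency checks and proof-of-origin checks (e.g. verifying a digital signature) would sit alongside such per-field checks.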
Security-inspired requirements on the nature and attributes of computing hardware, infrastructure, or other externally available services must be explicitly recorded as requirements or assumptions and assured. In some cases these will be the explicit responsibility of others in the organization, but their accomplishment and/or existence still need to be assured.
The chief security concerns about effects on the environment are to
• Not cause security problems for systems in the environment
• Not output any information it should not
• Not output any information in a form it should not, e.g. unencrypted
4.4.2.6 Usability Requirements
Usability requirements are important not only to reduce user mistakes, but also to ensure efficient and effective usage and acceptance. Security can be systematically bypassed by users if it places a perceived unacceptable impediment between them and accomplishing their tasks.
Process and task analyses and reengineering are the starting point for achieving these objectives. Beyond using sound user-system interaction requirements and design methods throughout [Schneiderman], certain general interaction concerns have been particularized to security. [Yee 2004] identifies a small set of general needs and principles, and many more detailed issues are identified in [Cranor 2005] and [Garfinkel 2005]. A series of research-oriented workshops exists in this burgeoning research area that addresses practical issues. [SOUPS 2005]

4.4.2.7 Requirements for Reliability
Reliability of a system depends on the distributions of inputs or the patterns of use. A seldom executed part of the software could have faults with little or no impact on reliability. Although, strictly speaking, reliability is not a security property, preserving the integrity and availability of software and data is somewhat futile if, for lack of reliability, the system delivers wrong answers.
For these reasons, and because of the resulting economy, reliability's assurance case is a natural candidate to include in the same assurance case with security.

4.4.2.8 Requirements for Availability, Tolerance, and Sustainability
Developers need to know techniques that can improve tolerance to security violations. Robustness, resilience, and automatic adaptation and recovery are possible parts of the security-influenced requirements. Separation of duties and mechanism might be required, as might avoidance of a single point of total vulnerability, analogous to the avoidance of a single point of failure. As an additional example, one might require secret sharing (splitting) across machines for added confidentiality, as sketched below. See also subsection on Added [Threat] Capabilities from Partial Success and the Design section.
Continued availability of service and business continuity including disaster recovery also generates requirements.
Denial of service attacks can make it difficult to continue to service legitimate users. Requirements can be set for the size and nature of attacks and the amount of degradation of service that are acceptable, tolerable, and unacceptable. These must, however, be technically and economically feasible, and limits on need may be set by attacks that would in any case deny service or passage upstream in the paths of requests or downstream for responses.
Maintainability affects availability and tolerance.
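As an illustration of the secret splitting mentioned in 4.4.2.8, the following minimal sketch (Python; illustrative only – a production system would use a vetted scheme such as Shamir's secret sharing) XOR-splits a secret into two shares, each useless alone:

    import os

    secret = b"attack at dawn"
    share_a = os.urandom(len(secret))                        # keep on machine A
    share_b = bytes(x ^ y for x, y in zip(secret, share_a))  # keep on machine B
    # Either share alone is statistically independent of the secret;
    # recombining both recovers it.
    recovered = bytes(x ^ y for x, y in zip(share_a, share_b))
    assert recovered == secret

Compromise of a single machine then reveals nothing, supporting the avoidance of a single point of total vulnerability.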
4.4.2.9 Requirements for Maintainability
Software with unrepaired vulnerabilities is insecure software. Software that has yet to be changed to meet new security requirements is also insecure.

4.4.2.10 Requirements for Deception
While it is certainly not desirable to rely on obfuscation, nevertheless one may employ deception and hiding. To be effective, one should consider requirements' merits regarding the measures attackers might take to detect, overcome, or exploit one's deceptions, as well as additional requirements deriving therefrom. For concrete examples, see the Design subsection on Deception.

4.4.2.11 Requirements for Validatability, Verifiability, and Evaluatability
Concerns here include
• Validatability
• Verifiability
• Controllability
• Observability
Particularly, one must address the needs to provide evidence for the assurance case on conformance to security requirements, policy, and correctness of security functionality.
Requirements may also be generated by other security-related aspects including
• Requirements to ease development and evolution of the assurance case
• Requirements to help ensure cost-effective and timely certification of software system security
• Requirements to help ensure cost-effective and timely accreditation of operational systems
If certification of the software system or accreditation of operational situations are goals, then the requirements for this should be considered and, whenever relevant, included when establishing requirements. See the Verification, Validation, and Evaluation section.
Testability for security or other concerns may lead to desires for additional observability, but no means should be added that would violate security or unduly increase vulnerability. Desirable additional controllability also should be carefully assured as not producing security problems.

4.4.2.12 Certification Requirements
A number of certification requirements may apply, particularly to government systems.

4.4.2.12.1 Common Criteria Requirements
Associated with the Common Criteria are a set of Protection Profiles – standard minimal security requirements – for a number of kinds of software. These can be a required input to establishing security requirements or provide interesting examples.
Historically, the Common Criteria and associated Protection Profiles have identified security-oriented functionality to be required of systems. [Common Criteria v.3.0 Part 2] The Common Criteria now calls for self-protection and non-bypassability of security functionality. [Common Criteria v.3.0 Part 3, page 96] These two properties, of course, change the straightforward functionality requirements of prior versions of the Common Criteria into software system security property requirements.

4.4.2.12.2 Sensitive Compartmented Information
Protecting Sensitive Compartmented Information (SCI) [DoDI S-3600.2] within the US government is covered by [DCID 6/3 2000]. The combination of security safeguards and procedures used for SCI information systems shall assure compliance with DCID 6/3, NSA/CSS Manual 130-1 [NSA 1990], and [DIAM 50-4 1997]. [JDCSISSS 2001, p. i] The JDCSISSS is a technical supplement to both the NSA/CSS Manual 130-1 and DIAM 50-4 and provides procedural guidance for the protection, use, management, and dissemination of SCI.

4.4.2.12.3 BITS Certification
The Financial Services Round Table BITS certification program aims at achieving a set of security functionality suitable for banking and financial services and a certification process simpler than that of the Common Criteria. See http://www.bitsinfo.org/c_certification.html.
Two ISO technical reports exist related to financial services:
• ISO/TR 13569 Banking and related financial services – Information security guidelines
• ISO/TR 17944:2002(E) Banking – Security and other financial services – Framework for security in financial systems
The latter includes definitions, lists of applicable standards and ISO technical reports, and mention of items lacking ISO standards in 2002 when it was published.

4.4.2.13 System Accreditation Requirements
The DoD accreditation process is governed by DoD Instruction 5200.40, DoD Information Technology Security Certification and Accreditation Process (DITSCAP), supplemented by the DoD 8510.1-M Application Manual.
On the civilian side of the US government, the National Institute of Standards and Technology has produced NIST Special Publication 800-37, Guide for the Security Certification and Accreditation of Federal Information Systems. [Ross R. 2004]
Internationally, ISO/IEC 17799:2005 Information technology – Code of Practice for Information Security Management combined with BS7799-2:2002 (anticipated to become ISO/IEC 27001) form a basis for an Information Security Management System (ISMS) certification (sic) of an operational system.
The implications for a software system’s (or even larger scope) issue, not just a software
requirements deriving from an intention for the one.
software to be certified or be in an operational
4.4.3.2 Feasibility Analysis
system that requires accreditation need to be
carefully identified. Certain security requirements may make the
proposed effort of low feasibility or infeasible
4.4.3 Requirements Analysis from technical, economic, organizational,
Requirements analyses are intimately involved regulatory, marketing, or societal viewpoints.
in the requirements discovery process, as they Feasibility analyses address these issues.
are relevant to the evolving decision process that 4.4.3.3 Tradeoff Analysis
results in agreed-upon requirements.
Additional security may impact usability,
4.4.3.1 Risk Analysis performance, or other characteristics so as to
For security risks, the probabilistic approach make the system less desirable. A tradeoff might
embedded in the usual risk analysis cannot be exist between more security, for example,
based on technical characteristics of the system. multiple authentications rather than single, and
One can, in low and possibly medium risk acceptability, calling for analysis to resolve the
situations, make some probabilistic estimates problem.
based on humans and organizations that are Tradeoffs also occur with many activities such
potential threats. In high threat situations, the as establish the security policy and specifying
attacks are essentially assured eventual success if external behavior. This last is a product design
even a few vulnerabilities exist. This makes the activity and, as such, can be fraught with
chance of a successful attack a certainty for all tradeoffs including ones among qualities.
but the extremely well-designed and rigorously [Kazman 2000] [Kazman 2002] [Prasad 1998]
implemented systems. One should presume that [Despotou 2004]
these vulnerabilities will be found by a skilled,
persistent foe.
4.4.3.4 Analysis for Security
Requirements Conflicts
Analysis of possibilities and risk also addresses
the effect of attacks. The effects are not just first Security requirements conflicts can derive from
order such as disclosure or corruption of data but differing needs or viewpoints, or from differing
what effects this has on organizational mission policy models, capabilities, or inconsistent
success3, reputation, stock price, recovery and possibilities of configurations or settings among
repair costs, etc. A risk index (occurrence x components of the system.
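A minimal sketch of such a risk index (Python; the assets and 1-5 ratings are hypothetical, and real programs would substitute their own scales) ranks concerns by occurrence times effect:

    # asset: (estimated occurrence rating, effect on mission if realized)
    threats = {
        "customer database": (2, 5),
        "public web pages": (4, 2),
        "build environment": (1, 5),
    }
    ranked = sorted(threats.items(),
                    key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
    for asset, (occurrence, effect) in ranked:
        print(f"{asset}: risk index = {occurrence * effect}")

The ranking can then inform where specification, design, and assurance resources are concentrated.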
Of course, software is not the only path to success for an attacker. Social engineering and subversion may often be part of the easiest attacking path. Ultimately, security is a systems (or even larger scope) issue, not just a software one.

4.4.3.2 Feasibility Analysis
Certain security requirements may make the proposed effort of low feasibility or infeasible from technical, economic, organizational, regulatory, marketing, or societal viewpoints. Feasibility analyses address these issues.

4.4.3.3 Tradeoff Analysis
Additional security may impact usability, performance, or other characteristics so as to make the system less desirable. A tradeoff might exist between more security – for example, multiple authentications rather than single – and acceptability, calling for analysis to resolve the problem.
Tradeoffs also occur with many activities, such as establishing the security policy and specifying external behavior. This last is a product design activity and, as such, can be fraught with tradeoffs including ones among qualities. [Kazman 2000] [Kazman 2002] [Prasad 1998] [Despotou 2004]

4.4.3.4 Analysis for Security Requirements Conflicts
Security requirements conflicts can derive from differing needs or viewpoints, or from differing policy models, capabilities, or inconsistent possibilities of configurations or settings among components of the system.

3 In the US DoD this could include mission impact assessment classified in accordance with DOD 5200.1-R and DOD Instruction 3600.2.
4.4.4 Specification

4.4.4.1 Document Assumptions
Producers must identify and document the assumptions on which their results depend. These are part of the requirements and must have agreement and approval. Generally, weaker assumptions are better.
The following types of assumption should be included: [ISO-15448, p. 10]
• Aspects relating to the intended usage
• Environmental (e.g. physical) protection of any part
• Connectivity aspects (e.g. a firewall being configured as the only network connection between a private network and a hostile network)
• Personnel aspects (e.g. the types of user roles anticipated, their general responsibilities, and the degree of trust assumed to be placed in those users)
• Operational dependencies, for example on infrastructure [Meier 2004, p. 101-105]
• Development environment dependencies, for example correctness assumptions about tools
Assumptions include statements predicting bad qualities or events as well as good ones. Some assumptions may be stated conditionally or probabilistically. Rationales for the assumptions should also be documented, reviewed, and approved. One key question is, "Are the assumptions appropriate for the environments in which the software system is intended to operate – now and in the germane future?" A related goal to show the answer is yes may belong in the assurance case.

4.4.4.2 Specify Security Policy
The software system security policy is part of the software system requirements, placing constraints on system behavior. It can include policies related to confidentiality, integrity, availability, non-repudiation, and authentication. It is derived from higher level policies, laws and regulations, threats, the nature of the resources being protected, or other sources. Care needs to be taken, as parts of the requirements may reflect needs for flexibility for current or future user specification of a variety of security policies. [ISO-15448]
Specifying the properties that must be preserved separately from the specification of the external behavior of the functionality of the software system allows the latter to be explicitly verified to be consistent with the property constraints. In practice, what is labeled a security policy may have, in addition to these property constraints, a number of requirements on usages of components or techniques, e.g. cryptographic techniques.

Typical Organizational Security Policy Areas [NIST FIPS-200 Draft 2005]
• Access control
• Awareness and training
• Audit and accountability
• Certification, accreditation, and security assessments
• Configuration management
• Contingency planning
• Identification and authentication
• Incident response
• Maintenance
• Media protection
• Physical and environmental protection
• Planning
• Personnel security
• Risk assessment
• Systems and services acquisition
• System and communications protection
• System and information integrity

4.4.4.2.1 Mapping Security Requirements to System Security Policy
The security protection needs of assets and the identified threats (existing, future, concrete, or postulated) must be mapped into a stakeholder-understandable description of the security policy to govern the software system. The security properties portion of this software system security policy is then stated using formal notation as a basis for verifying specification, design, and code compliance. The other portions of the policy need to be put in at least a semi-formal or structured form suitable for verifying and evaluating compliance to them in the specifications, design, code, and possibly elsewhere.
Documented traceability is required to assure the first description of the policy agrees with what was intended and that the formal properties portion of the policy, or other restating of portions of the policy, agrees with its validated less formal statement. Where possible, revalidation may also be appropriate.
4.4.4.2.2 Informally Specify
As with any critical requirement, the first step is creation of a carefully expressed, reviewed, and approved informal or preferably semi-formal policy specification understandable by stakeholders and traced to its justifications.
See the Design section for design options that may become an implicit or explicit context, such as kinds of access policies, in which terms some security policies are stated. [ISO-15448]
In order to avoid excessive rework of the formal statement, this user understandable description must first be analyzed and validated.

4.4.4.2.3 Formally Specify
A clear, fully documented mapping must exist between the informal or semi-formal expression of the policy and the formal mathematical one. To ease the demonstration that external behavior and design specifications are consistent with the security policy's constraints, the formal policy specification needs to be made in notation or terms allowing reasoning about their consistency. The obvious choice may be using the same notation.
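As a toy illustration of a policy constraint stated in checkable form (Python; the levels and rule are hypothetical and far simpler than a real formal specification), a confidentiality property might be expressed as a predicate that every granted access must satisfy:

    # A hypothetical dominance rule: an access may be granted only when the
    # subject's clearance is at least the object's classification.
    LEVELS = {"public": 0, "sensitive": 1, "secret": 2}

    def satisfies_policy(accesses):
        """accesses: iterable of (clearance, classification, granted) tuples."""
        return all(LEVELS[clr] >= LEVELS[cls]
                   for clr, cls, granted in accesses if granted)

    log = [("secret", "sensitive", True), ("public", "secret", False)]
    assert satisfies_policy(log)

Stating the property separately from the behavior specification, as described above, is what allows such a check – or, with formal notation, a proof – to be applied to specifications, designs, and code.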
4.4.4.3 Security Functionality Requirements
The Security Functionality subsections in the Fundamentals and Design sections enumerate functionalities. [SCC Part 2 2002] provides perhaps the most extensive catalog of security functionality, while the most official is in [Common Criteria v.3.0 Part 2]. The requirements question is to select from the possible functionalities those that are needed for preserving the security properties of interest, are needed for the security policy, and are appropriate to the dangers and risks. [ISO-15448], [CJCSM 6510.01 2004], [S4EC], [Moffett and Nuseibeh 2003] and with some redundancy [Moffett 2004] all provide some guidance. The Common Criteria community has a number of Protection Profiles including lists of functionality for different kinds of software products. The other certification schemes may also make statements about functionality.
Organizations may have enterprise architectures calling for certain functionality. In addition, laws, regulations, standards, and policies may also have requirements.
Security functionality specifics and locations can be strongly affected by the degree of assurance required, compatibility requirements, and architectural decisions as well as development tool choices and decisions to purchase or reuse.
Security functionality behavior visible as external behavior of the software system must be sufficient to provide conformance to the system's security policy.
4.4.4.4 Specification of External Behavior
Designing the external behavior of a software system is, just as with "unsecured" software and its required non-functional requirements, often a difficult process. [Chung 1999] Also similarly, the use of possibly abstract system state values eases writing the specification. In security concerns, however, these state values – such as where something is stored and whether or not it is encrypted – can be a central concern.
While the specification of external behavior is traditionally labeled a requirements activity, it is a product design activity, and much of what is in the Design section is potentially relevant here as well. See the Design section. Remember that – as in "unsecured" systems – while external behavior may be recorded separately from internal design, the problems are intertwined and humans have legitimate trouble thinking of only one of them without the other.
Skills (and methods/tools) are needed for constructing an appropriate (unambiguous, complete, etc.) specification, including formal specification of the constraints generated by security property requirements, e.g. [Sommerville 2004, Chapters 9 and 10] [Hall 2002] [Sommerville 1997].
The external behavior specification of the software system is the bedrock by which assurance is first gained that the system will be consistent with the required formal security policy and other relevant portions of the security policy. The remainder of development will be based on it, including its consistency with the security policy. The specification and design could be consistent with the security policy by
• Having mechanisms allowing a range of user/operator specification of security policies
• Identifying the configurations or settings for the required policy
The external specification must be traceable to identified needs, statements of relevant analyses, and design rationales. Of course, in addition to consistency with security policy requirements, the software system must also perform its intended purposes and provide acceptable usability, performance, and other qualities.

4.4.5 Requirements Validation
In addition to verifying the consistency among requirements artifacts, one needs to ensure they reflect a system that will work as intended under operational conditions and provide the desired benefits. As for all software systems, secure software efforts must also validate security requirements, including the security policy. Validation involves producers and the external stakeholders.
Producers and stakeholders may prioritize and perform tradeoff studies on security and privacy requirements based on application release or schedule. Producers and relevant stakeholders should explicitly justify any exclusion of requirements deriving from laws, regulations, or standards/guides or explicitly identified by stakeholders.
Developers of secure software must know the relevant validation techniques and their applicability. See the Verification, Validation, and Evaluation and the Tools and Methods sections.

4.4.6 Assurance Case
The construction of the assurance case begins during conception and requirements – and any related acquisition activities – and continues throughout development and sustainment. The activities related to creating and evaluating the assurance case are also identified in the project plans. Finally, note that the description of the proposed assurance case, which would normally appear in a proposal document during acquisition, must also be based on adequate understanding of requirements.
Requirements and assumptions play a central role in the assurance case. Making the case that the system developed meets the requirements is the main purpose of the assurance case. An assurance case could also address whether the requirements and assumptions are suitable for the expectations, intentions, environment, and external requirements.
See Verification, Validation, and Evaluation.

4.4.7 References
[Atallah, Bryant and Sytz 2005] Atallah, Mikhail, Bryant, Eric, and Sytz, Martin, A Survey of Anti-Tamper Technologies, Crosstalk – The Journal of Defense Software Engineering, Nov. 2004.
[Avizienis 2004] Avizienis, Algirdas, Jean-Claude Laprie, Brian Randell, and Carl Landwehr. "Basic Concepts and Taxonomy of Dependable and Secure Computing," IEEE Transactions on Dependable and Secure Computing, vol. 1, no. 1, Jan.-Mar. 2004, pp. 11-33.
[Bishop 2003] Bishop, Matt. Computer Security: Art and Science, Addison-Wesley, 2003.
[Bourque and Dupuis 2004] Bourque, Pierre, and Robert Dupuis (Editors). Guide to the Software Engineering Body of Knowledge. 2004 Edition. Los Alamitos, California: IEEE Computer Society, Feb. 16, 2004.
[Chaves et al, 2005] Chaves, C. H. P. C., L. H. Franco, and A. Montest. "Honeynet Maintenance Procedures and Tools," Proc. 6th IEEE Systems, Man and Cybernetics Information Assurance Workshop (IAW 05), IEEE CS Press, 2005, pp. 252-257.
[Chung 1999] Chung, Lawrence, et al. Non-Functional Requirements in Software Engineering, Kluwer, 1999.
[Cranor 2005] Cranor, Lorrie, and Simson Garfinkel. Security and Usability: Designing Secure Systems that People Can Use. O'Reilly, 2005.
[CJCSM 6510.01 2004] CJCSM 6510.01 Defense-In-Depth: Information Assurance (IA) and Computer Network Defense (CND), DoD Joint Staff, 2004 (Official Use Only).
[DIAM 50-4 1997] DIAM 50-4 Security of Compartmented Computer Operations. Department of Defense (DoD) Intelligence Information System (DODIIS) Information Systems Security (INFOSEC) Program, 30 April 1997.
[DITSCAP 1997] DoD Instruction 5200.40, DoD Information Technology Security Certification and Accreditation Process (DITSCAP). December 30, 1997.
[DoD8510.1-M 2000] DoD 8510.1-M Application Manual. July 2000.
[DoDI S-3600.2] DOD Instruction S-3600.2 Information Operations (IO) Security Classification Guidance, 6 August 1998.
[Eeles 2004] Eeles, Peter, Appendix C: Sample Architectural Requirements Questionnaire, IBM, 30 Apr. 2004. http://www-128.ibm.com/developerworks/rational/library/4710.html
[Evans 2005] Evans, S. and J. Wallner. "Risk-Based Security Engineering Through the Eyes of the Adversary," Proc. 6th Ann. IEEE Systems, Man and Cybernetics Information Assurance Workshop (IAW 05), IEEE CS Press, 2005, pp. 158-165.
[Fowler 1995] Fowler, C. A., & Nesbit, R. F. "Tactical deception in air-land warfare," Journal of Electronic Defense, vol. 18, no. 6, June 1995, pp. 37-44 & 76-79.
[Garfinkel 2005] Garfinkel, Simson L. Design Principles and Patterns for Computer Systems that are Simultaneously Secure and Usable, PhD Thesis, MIT, 2005. http://www.simson.net/thesis/
[Gasser] Gasser, M. Building a Secure Computer System. Van Nostrand Reinhold, 1988.
[Graff and van Wyk 2003] Graff, Mark G. and Kenneth R. Van Wyk. Secure Coding: Principles and Practices. O'Reilly & Associates, 2003.
[Hall 2002] Hall, Anthony and Roderick Chapman. "Correctness by Construction: Developing a Commercial Secure System," IEEE Software, vol. 19, no. 1, Jan./Feb. 2002, pp. 18-25.
[Holz and Raynal 2005] Holz, T. and F. Raynal. "Detecting Honeypots and Other Suspicious Environments," Proc. 6th Ann. IEEE Systems, Man and Cybernetics Information Assurance Workshop (IAW 05), IEEE CS Press, 2005, pp. 29-36.
[Honeynet 2002] The Honeynet Project. Know Your Enemy. Addison-Wesley, 2002.
[Howard 2002] Howard, Michael, and David C. LeBlanc. Writing Secure Code, 2nd ed., Microsoft Press, 2002.
[Hunt et al, 2005] Hunt, C., J. R. Bowes, and D. Gardner. "Net Force Maneuver," Proc. 6th Ann. IEEE Systems, Man and Cybernetics Information Assurance Workshop (IAW 05), IEEE CS Press, 2005, pp. 419-423.
[Ibrahim et al, 2004] Ibrahim, Linda, et al. Safety and Security Extensions for Integrated Capability Maturity Models. Washington D.C.: United States Federal Aviation Administration, September 2004.
[ISO-15448] ISO/IEC TR 15446:2004, Information technology - Security techniques - Guide for the production of Protection Profiles and Security Targets, JTC1/SC27 Technical Report, International Standards Organization, 2004.
[ISO/TR 13569] ISO/TR 13569 Banking and related financial services — Information security guidelines.
[ISO/TR 17944] ISO/TR 17944:2002(E) Banking – Security and other financial services – Framework for security in financial systems, ISO, 2002.
[JDCSISSS 2001] Joint DoDIIS/Cryptologic SCI Information Systems Security Standards. DIA DoDIIS Information Assurance (IA) Program, Official Use Only, 31 March 2001.
[Kazman 2000] Kazman, R., Klein, M., Clements, P. ATAM: Method for Architecture Evaluation. Technical Report CMU/SEI-2000-TR-004. Software Engineering Institute, Carnegie Mellon University, 2000.
[Kazman 2002] Kazman, R., Asundi, J., Klein, M., Making Architecture Design Decisions: An Economic Approach. SEI-2002-TR-035. Software Engineering Institute, Carnegie Mellon University, 2002.
[Kotonya 2000] Kotonya, G. and I. Sommerville, Requirements Engineering: Processes and Techniques, John Wiley & Sons, 2000.
[Meier 2004] Meier, J.D., Alex Mackman, Srinath Vasireddy, Michael Dunner, Ray Escamilla, and Anandha Murukan, Improving Web Application Security: Threats and Countermeasures, Microsoft, 2004. http://download.microsoft.com/download/d/8/c/d8c02f31-64af-438c-a9f4-e31acb8e3333/Threats_Countermeasures.pdf
[Meier 2005a] Meier, J.D., Alex Mackman, Blaine Wastell, Threat Modeling Web Applications, Microsoft Corporation, May 2005. http://msdn.microsoft.com/security/default.aspx?pull=/library/en-us/dnpag2/html/tmwa.asp
[Meier 2005b] Meier, J.D., Alex Mackman, Blaine Wastell, Cheat Sheet: Web Application Security Frame, Microsoft Corporation, May 2005. http://msdn.microsoft.com/security/default.aspx?pull=/library/en-us/dnpag2/html/tmwacheatsheet.asp?_r=1
[Meier 2005c] Meier, J.D., Alex Mackman, Blaine Wastell, Template Sample: Web Application Threat Model, Microsoft Corporation, May 2005. http://msdn.microsoft.com/security/default.aspx?pull=/library/en-us/dnpag2/html/tmwatemplatesample.asp?_r=1
[Moffett 2004] Moffett, Jonathan D., Charles B. Haley, and Bashar Nuseibeh, Core Security Requirements Artefacts, Security Requirements Group, The Open University, UK, 2004.
[Moffett and Nuseibeh 2003] Moffett, Jonathan D. and Bashar A. Nuseibeh. A Framework for Security Requirements Engineering. Report YCS 368, Department of Computer Science, University of York, 2003.
[Neumann 2000] Neumann, P. G. Practical architectures for survivable systems and networks. Technical report, Final Report, Phase Two, Project 1688, SRI International, Menlo Park, California, 2000.
[Neumann 2003] Neumann, P. G. Principled Assuredly Trustworthy Composable Architectures (Draft), Dec. 2003.
[NSA 1990] National Security Agency, NSA/CSS Manual 130-1, Operational Computer Security, October 1990.
[NSA 2002] National Security Agency, The Information Systems Security Engineering Process (IATF) v3.1., 2002.
[Prasad 1998] Prasad, D., Dependable Systems Integration using Measurement Theory and Decision Analysis. PhD Thesis, Department of Computer Science, University of York, UK, 1998.
[Redwine 2005a] Redwine, Samuel T., Jr. Dependability Properties: Enumeration of Primitives and Combinations with Observations on their Measurement. Commonwealth Information Security Center Technical Report CISC-TR-2004-001.
[Redwine 2005b] Redwine, Samuel T., Jr., Principles for Secure Software: A Compilation of Lists. Commonwealth Information Security Center Technical Report CISC-TR-2005-002.
[Radack 2005] Radack, Shirley, editor. "Standards for Security Categorization of Federal Information and Information Systems." Federal Information Processing Standard (FIPS) 199, July 10, 2005.
[Redwine 2004] Redwine, Samuel T., Jr., and Noopur Davis (Editors). Processes for Producing Secure Software: Towards Secure Software. vols. I and II. Washington D.C.: National Cyber Security Partnership, 2004.
[Ross 2005] Ross, Ron, et al. "Recommended Security Controls for Federal Information Systems," NIST Special Publication 800-53, Feb. 2005.
[S4EC] System Security, Survivability, and Safety Engineering Criteria (S4EC) Project, www.s4ec.org
[SafSec Guidance] "SafSec Methodology: Guidance Material." SafSec: Integration of Safety and Security. http://www.safsec.com/safsec_files/resources/50_3_SafSec_Method_Guidance_Material_2.6.pdf
[SafSec Introduction] "Why SafSec?" SafSec: Integration of Safety and Security. http://www.safsec.com/safsec_files/resources/50.6_Why_SafSec.pdf
[SafSec Standard] "SafSec Methodology: Standard." SafSec: Integration of Safety and Security. http://www.safsec.com/safsec_files/resources/50_2_SafSec_Method_Standard_2.6.pdf
[SCC Part 2 2002] Information Assurance Engineering Division SPAWAR System Center Charleston, Security Certification Criteria for Information Assurance Enabled Systems and Infrastructures Part 2: Functional Certification Criteria, Version 0.2, Information Assurance Engineering Division SPAWAR System Center Charleston, 1 August 2002. http://www.s4ec.org/scc_part2_ver0-21.pdf
[SDI 1992] Department of Defense Strategic Defense Initiative Organization. Trusted Software Development Methodology, SDI-S-SD-91-000007, vol. 1, 17 June 1992.
[Swiderski 2004] Swiderski, F. and Snyder, W. Threat Modeling. Microsoft Press, 2004.
[Viega 2005] Viega, J. The CLASP Application Security Process. Secure Software, 2005.
[Whittaker] Whittaker, J. A. and H. H. Thompson. How to Break Software Security: Effective Techniques for Security Testing. Pearson Education, 2004.
[Yee 2004] Ka-Ping Yee. "Aligning security and usability." Security & Privacy Magazine, 2: 48-55, Sept.-Oct. 2004.

4.4.8 Additional Reading
[Boudra 1993] Boudra, P., Jr. Report on rules of system composition: Principles of secure system design. Technical Report, National Security Agency, Information Security Systems Organization, Office of Infosec Systems Engineering, I9 Technical Report 1-93, Library No. S-240, 330, March 1993.
[Davis 1993] Davis, A.M. Software Requirements: Objects, Functions and States, Prentice Hall, 1993.
[Goguen 1993] J. Goguen and C. Linde, "Techniques for Requirements Elicitation," International Symposium on Requirements Engineering, 1993.
[IEEE830-98] IEEE Std 830-1998, IEEE Recommended Practice for Software Requirements Specifications, IEEE, 1998.
[FIPS 188] FIPS 188 Standard Security Labels for Information Transfer, September 1994.
[Flechais 2003] Flechais, I., Sasse, M. A., and Hailes, S. M. "Bringing security home: a process for developing secure and usable systems." In Proceedings of the 2003 Workshop on New Security Paradigms (Ascona, Switzerland, August 18-21, 2003). C. F. Hempelmann and V. Raskin, Eds. NSPW '03. ACM Press, New York, NY, 49-57.
[Open Group] Open Group. Security Design Patterns (SDP) Technical Guide v.1, April 2004.
[Rowe 2004] Rowe, Neil C. "Designing Good Deceptions in Defense of Information Systems," ACSAC 2004. http://www.acsac.org/2004/abstracts/36.html
[Thompson 2005] Thompson, H. H. and S. G. Chase. The Software Vulnerability Guide. Charles River Media, 2005.
[Gutmann 2004] Gutmann, P. Cryptographic Security Architecture: Design and Verification. Springer-Verlag, 2004.
[Manadhata and Wing 2004] Manadhata, P. and J. M. Wing. "Measuring A System's Attack Surface," http://www-2.cs.cmu.edu/afs/cs/project/calder/www/tr04-102.pdf, CMU-TR-04-102, January 2004.
[Howard] Howard, M., J. Pincus, and J. Wing. "Measuring relative attack surfaces." To appear in: Proceedings of the Workshop on Advanced Developments in Software and Systems Security.
[Giorgini 2004] P. Giorgini, F. Massacci, J. Mylopoulos, and N. Zannone. "Requirements Engineering meets Trust Management: Model, Methodology, and Reasoning," Proc. of the 2nd Int. Conf. on Trust Management (iTrust), 2004.
[Stoneburner et al, 2004] Stoneburner, Gary, Hayden, Clark, and Feringa, Alexis. Engineering Principles for Information Technology Security (A Baseline for Achieving Security). NIST Special Publication 800-27 Rev A, June 2004.
[Wyk and McGraw 2005] van Wyk, Kenneth and Gary McGraw, After the Launch: Security for App Deployment and Operations. Presentation at Software Security Summit, April 2005.
[Zwicky et al, 2000] Zwicky, Elizabeth D.,
[Karger et al, 1990] Karger, Paul A., Mary Ellen Zurko, Douglas W. Benin, Andrew H. Mason, and Clifford E. Kahn. "A VMM Security Kernel
for the VAX Architecture,” 1990 IEEE Simon Cooper, and D. Brent Chapman. Building
Symposium on Security and Privacy, IEEE, Internet Firewalls (2nd ed.), O'Reilly, 2000.
1990. [Radack 2005] Radack, Shirley, editor.
[NIST 2005] The National Institute of Standards “Standards for Security Categorization of
and Technology. Common Criteria v. 3.0, July, Federal Information and Information Systems.
2005. Federal Information Processing Standard (FIPS)
[Saltzer and Schroeder 1975] Saltzer, J. H. and 199, July 10, 2005.
M. D. Schroeder. “The protection of information [Ross 2005] Ross, Ron et al. “Recommended
in computer systems,” Proceedings of the IEEE, Security Controls for Federal Information
vol. 63, no. 9, 1975, pp. 1278-1308. Systems,” NIST Special Publication 800-53,
[Robertson 1999] S. Robertson and J. Robertson, Feb. 2005.
Mastering the Requirements Process, Addison- [Barker 2004] Baker, William C. “Guide for
Wesley, 1999. Mapping Types of Information and Information
[Sommerville 1997] I. Sommerville and P. Systems to Security Categories,” NIST Special
Sawyer, Requirements Engineering: A Good Publication 800-60, June 2004.
Practice Guide, John Wiley & Sons, 1997 [HMAC 2002] “The Keyed-Hash Message
[Schell 2005] Roger Schell, Keynote Talk, Authentication Code (HMAC)”, FIPS 198,
International Workshop on Information March 2002.
Assurance, March 24, 2005.
[US DoD 1996] Joint Chiefs of Staff, DoD JP 3-58 Joint Doctrine for Military Deception, 31 May,
1996.
[SHS 2002] “Secure Hash Standard (SHS)”, FIPS 180-2, August 2002.
[US Army 2003] US Army, Field Manual (FM) 3-13: Information Operations: Doctrine, Tactics,
Techniques, and Procedures, 28th Nov., 2003. (particularly Chapter 4 on Deception)
.
4.5 Secure Software Design
4.5.1 Scope • Defending against outside attack of design
Software design is the software engineering infrastructure and artifacts – outsiders
activity taking the products of the requirements • Creating an assurance case with arguments
activity and decomposing the system at possibly and evidence that address intelligent,
multiple levels into components and their malicious actions as well as security relevant
interactions. Design produces random misfortunes
• Description of the software’s internal And one that might also exist in high-confidence
structure that will serve as the basis for its system for other purposes
construction • Minimizing what must be trusted
• Rationale for the design decisions These differences along the differing goals
• Assurance case related to design including raised by security make designing secure
argument and evidence for the design’s software a different kind of problem for
conformance to external behavior designers.
specifications, the systems security policy,
and other relevant constraints
4.5.2 Design Goals
Designers of secure software systems can have a
Design normally includes descriptions of the
number of distinctive goals including: [Gutmann
architecture, components, interfaces, and other
2004, Chapters 1-4] [Redwine 2005a]
characteristics of a system or component.
[Gasser] provides an in introduction to design • Design to defend perfectly then assume this
for secure software systems, and coverage of defense will fail and design to defend after
web applications design is provided by [Meier initial security violation(s)
2004, Chapter 4]. • Have a design that is open to and eases
Traceability must be monitored between relevant verification, validation, and evaluation
elements of requirements, such as the external • Ease creation and maintenance of assurance
behavior specification and the architecture, as case
well as continue throughout design. This
• Architecturally eliminate possibilities for
traceability is also reflected in the assurance
violations – particularly of information flow
case.
policies
Design must produce the detail required by the
construction activity. • Do not cause security problems for systems
in environment
Design of secure software systems is similar to
design of any extremely high-confidence • Have sound
software but with such additional concerns as o Authentication
• Consistency with a security policy o Authorization and access control
• Incorporating security functionality o Administrative controllability
• Ensuring design does what the specification o Comprehensive accountability
calls for and nothing else o Tamper resistence
• Employing separation of mechanisms, • Design so system does what specification
duties, and privileges as well as least calls for and nothing else
privilege
• Design to tolerate violations (See section on
• Negating the subversion or maliciousness of Confine Damage and Increase Resilience.)
designers or other personnel – insiders
• Design for survivability [Ellison 2003]
• Make weak assumptions security. A separate trusted computing base can
• Do not ultimately rely on obfuscation, but isolate security policy and localize reasoning and
exploit deception and hiding assurance arguments. This does not mean that
the trusted software must be monolithic.
• Help ensure cost-effective and timely [Vanfleet 2005] To help simplify and minimize
certification of software system security and what must be trustworthy, one can minimize
accreditation of operational system functionality included in trusted parts in a
4.5.3 Principles and Guidelines for number of ways:
Designing Secure Software4 • Exclude non-security relevant functionality
This subsection employees and adds to the • Virtualize roles of hardware and device
principles in the Fundamental Concepts and drivers
Principles sections. • Separate policy and mechanism
4.5.3.1 General Design Principles for • Localize or constrain dependencies
Secure Software Systems • Maintain minimal retained state
Developers need to know secure software design o Do not keep startup processes or state
principles and how they are employed in design. after startup, or normal running ones
They also need to know ways to recognize during shutdown [Schell 2005]
whether the principles were used in a design and o Attackers cannot compromise
evaluate designs and proposed changes including information that is not in the system
improvements. (Also see Fundamental Concepts
and Principles sections.) • Minimize support for functionality, which is
outside the trusted software component(s)
Design includes using abstraction and
decomposing the system using architecture and If a program retains minimal state, it’s harder for
constraints to facilitate quality design and attackers to find information or to use disallowed
achievement of security requirements including actions.
facilitating evolution of required functionality Among the approaches to modularity, layering is
and security policy to address future user needs a crucial information hiding mechanism in
particularly changing threats. secure systems. Layered and distributed
Architecturally one needs to a sound security protection can be effective. Object orientation
architecture amendable to assurance arguments has merit – but beware the difficulties of
including formal proofs. To ease production of assuring the correctness of inheritance and
accompanying assurance case for security polymorphism as well as generics. Indeed, one
preserving correctness of compositions, take care may need to avoid these unless one has adequate
with composability at all levels of detail. [Bishop means of analysis and assurance for their use.
2003 ??] [Neumann 2003] Achieving these using Eliminating or reducing the possibilities of
known security techniques can aid confidence. certain kinds of violations is done by separation
Information hiding and encapsulation are • Least common mechanism – avoid shared
established general design principles, but mechanisms
minimizing and simplifying the portion of the • Separation of duties
software that must be trusted is crucial to
• Separation of privilege
4
This section draws heavily on an existing larger compilation
from a number of sources in [Redwine 2005b]. Portions of
• Separation of roles
these principles were first collected in [SDI 1992] or • Separation of domains
[Neumann 2003] as well as [Stoneburner et al, 2004] and are
described in more detail there. Discussions of some appear in • Constrained dependency
[Howard and LeBlanc 2003], [Viega and McGraw 2002]
particularly accessible, and [Bishop 2003].
Where possible within design constraints and o Especially important when COTS
assuring analyzable results, security products are used.
functionality might best be based on sound open • Limit or contain vulnerabilities’ impacts
standards aiding evolution, portability, and
interoperability – as well as assurance. • Design for survivability [Ellison 2003]
Modularity and separation enable one to • Be able to recover from system failure in any
state
• Defend in depth
o Be able to recover from failure during
• Have diversity in defenses [Zwicky, et al.]
recovery (applies recursively)
o Possible lack of variety can be indicated
• Choose safe default actions and values
not just by being identical but by
common heritage of software, common • Self-limit program consumption of resources
involvement of persons or practices, o Attempting to exhaust system resources
common settings, and common (e.g., memory, processing time.) is a
components common attack. Add capabilities into the
o To be most effective, combine program to prevent overusing system
[overlapping] physical, procedural, and resources
software security [Moffett and Nuseibeh] • Make sure it’s possible to reconstruct events
Defense in depth’s success is more likely if the o Helps focus response and reconstitution
defenses are diverse in nature. efforts to those areas that are most in
Designers should avoid exposing vulnerabilities need.
in any system state including startup, operation, o Organizations should establish “detect
shutdown, exception handling, system failure, and respond” capabilities, manage single
updating, recovery, and loss or disposal. points of failure in their systems, and
Treating a design as holistically as is practicable implement a reporting and response
avoids an unreasonable combination of strategy.
seemingly reasonable local design decisions. To o Supports forensics and incident
maintain the same level of assurance, one must investigations.
reuse designs and components only if known to
o Record secure audit logs and facilitate
be secure in the fashion required for this system.
periodical review to ensure system
4.5.3.2 Damage Confinement and resources are functioning, reconstruction
Resilience possible, and identify unauthorized or
A designer cannot depend on a defense being abuse
perfect. One response to this is to design and • System should have a well-defined status
operate a secure software system to after failure, either to a secure failure state or
• Limit damage via a recovery procedure to a known secure
state
• Be resilient in response to events
o Rollback
The aim is to make the system resistant to attack,
limit damage, and have it detect and recover o Fail forward
rapidly when attacks do occur. This can be o Compensate
accomplished in a variety of related ways: • Degrade Gracefully
• Build in appropriate levels of fault tolerance o Software should not just fail when a
• Eliminate “weak links” damaging problem occurs but should
• Implement layered security (no single point continue to run, at least in a restricted
of vulnerability). way
• Fail securely • Best practice suggests it is better to have
• Fail Safely several administrators with limited access to
security resources rather than one person
o Fail “open”: If patient “sensor” software
with "super user" permissions.
happens to fail, should it turn off the life
support machine? No! • Identify the roles/responsibilities that, for
security purposes, should remain separate.
o Fail “closed”: If a firewall dies, should
the server’s network connection be • Reduce the number of input/output points –
shutdown? Yes! the attack surface [Howard, Pincus, and
Wing 2003] [Manadhata and Wing 2004]
• Provide assurance that the system is, and
shall continue to be, resilient in the face of • Do not implement unnecessary security
expected threats mechanisms. Unneeded security mechanisms
add complexity and are potential sources of
o Documentation of the specific and
additional vulnerabilities.
evolving threats is important
• Ensure proper security at system shutdown
• To extent possible, isolate publicly and disposal
accessible systems from mission critical
resources (e.g., data, processes, etc.). o Although a system may be powered
down, critical information still resides on
o Physical isolation: no physical
the system and could be retrieved by an
connection exists between an unauthorized user or organization.
organization’s public access information
resources and the organization’s critical o At the end of a system’s life-cycle,
information. procedures must be implemented to
ensure system hard drives, volatile
o Logical isolation: layers of security
memory, and other media are purged to
services and mechanisms should be an acceptable level and do not retain
established between public systems and residual information.
secure systems responsible for protecting
mission-critical resources • The system does what the specification calls
for and nothing else
• Use boundary mechanisms and guardian
mechanisms to separate computing systems Identify and prevent common errors and
and network infrastructures to control the vulnerabilities. Many errors reoccur with
flow of information, access across network disturbing regularity, e.g.
boundaries, and enforce proper separation of • buffer overflows
user groups. • format string errors
4.5.3.3 Vulnerability Reduction • failing to check input for validity
A number of techniques, issues, and benefits • programs/processes being given excessive
result from carefully analyzing proposed designs privileges
and identifying and reducing vulnerabilities. Examples sources for lists of vulnerability
[Swiderski and Snyder 2004] categories and instances include books and
• Helps ensure cost-effective and timely websites. See in Development More on
certification of system security features. Fundamental Concepts and Principles and
• Deny access unless explicitly authorized Secure Software Construction sections.
• Deploy with safe initial defaults Separation of duties and mechanism might aid
avoidance of a “single point of total
• Check every access vulnerability”, analogous to the avoidance of a
• Implement least privilege. “single point of failure”. As an additional
example, one might require secret sharing physically the information and respective
(splitting) across machines for added information systems processing the data.
confidentiality or redundancy across machines as
well as for improved availability and integrity. 4.5.4 Documentation of Design
Of course, secret sharing (splitting) increases the Assumptions
minimum number of machines that must The list of assumptions made primarily about the
accessible to access the secret. software systems environment is one of the
4.5.3.4 Viewpoints and Issues products of the requirements activity. These may
change as the design develops, hopefully, by
Some viewpoints and issues are special for discovering design approaches that allow
secure software design. These normally include making them fewer or weaker. They may also
• Design with the enemy in mind grow if dependencies are introduced by the
o Subversion is the attack mode of choice design – for example via reuse of software
e.g., subvert people, processes, component whose assurance case derives in part
procedures, testing, repair, tools, from assertions or evidence from supplier. See
infrastructure, protocol, or systems Requirements subsection on Assumptions for
o Understand and enforce the chain of trust more discussion of contents.
o Don’t invoke untrusted programs from 4.5.4.1 Environment
within trusted ones. Assumptions about the environment and its
• Test any proposed design decision against interfaces to the software system can aid in
policy and ease of assurance simplification of design and assurance
arguments, but also can provide bases of attack.
• Be aware of composition difficulties for
security properties Development environment and operational
environment assumptions should be
• Design with the network in mind documented.
o Implement security through a
4.5.4.2 Internal
combination of measures distributed
physically and logically. Assumptions made about items inside the
software system could be the result of
o Associate all elements with the security
assumptions about the environment or internal.
service they provide.
Miss identified internal “assumptions” could be
• Cross domain issues conditions guaranteed to parts of the system by
o Authenticate users and processes to parts of the system and thereby not assumptions
ensure appropriate access control but something that must be provided by the
decisions both within and across software system design.
domains.
4.5.5 Documention of Design
o Level of trust is always an issue when
Decisions and Rationales
dealing with cross-domain interactions.
Document design decisions made for security
o Formulate security measures to address
reasons and ensure traceability of design items
multiple overlapping information and decisions to requirements and code.
domains. Traceability is essential. Rationale should often
o An efficient and cost-effective security provide arguments or evidence for assurance
capability should be able to enforce case.
multiple security policies to protect
multiple information domains without
the need to (excessively) separate
4.5.6 Software Reuse confidence resilience and recovery may be more
acceptable. Consider simply not producing an
Several general reuse guidelines are compiled in
insecure OTS-based system containing any high-
[Redwine 2005b].
value assets or for purposes or uses with
• Reused software that includes flaws will significant security risks.
undermine the trust in the entire software
For OTS-based systems, an excellent, thorough
system.
assurance case is crucial.
• Deal with risks when not feasible to require Subcontracting or outsourcing can reduces one’s
all reused software to meet the level of intimate visibility and control, causing increased
developed software uncertainties and risks. (See Acquisition
• There may be little control over software section.)
fixes and patches. Contracting or outsourcing production (or
• Prototype Software Reuse production support) does not remove the same
o One should reuse a prototype in the final security requirements as would be hold for in-
product only if it will be possible to house production or the need to provide the same
enhance the prototype software or higher quality assurance case – including
traceability, documentation, and arguments and accompanying evidence. If this is
evidence to adhere to the level of trust a portion of a larger system, then its assurance
required for the overall assurance case. case must integrate properly with other
assurance arguments in the remainder of the
o Code analysis and testing should also be
larger assurance case. The same is, of course,
performed to add assurance. true of any suppliers they in turn use that could
With rare exceptions, composing systems from affect security – furthermore, in the worst case,
COTS, GOTS, or open source components is for the entire supply chain.
fraught with security problems and existing
vulnerabilities. One need only read the long lists 4.5.7 Security Architectures
of known defects and vulnerabilities for common Security architectures need to address all the
COTS and open source software to realize some normal concerns of high-dependability plus
of the dangers (even if most known ones are those particular to preserving the required
fixed). confidentiality, integrity, and availability under
Here even an emphasis on attack.
• Performing adequate component analysis There are many architectures. Secure
prior to acquisition architectural styles or style elements include
• Analyses of the frequently awkward issues • Reference monitors
in composition along • Systems high
• Testing and assessment of resulting • Multiple Independent Levels of Security
composite system (MILS)
can usually provide high confidence only if the • Multiple Single Levels of Security (MSLS)
individual components deserve high confidence.
• Distributed access control
Identifying alternatives and performing trade-off
studies on software components is critical. • Layered
If “forced” to use normal OTS software, • Tolerant – to (partial) attacker success –
consider other ways of avoiding and sharing including “self healing” approaches
risks in addition to defense in depth, damage • Adaptive distributed reconfiguration
confinement, and resilience. When integrity is responses to attacks [MAFTIA]
the only issue, relying primarily on high-
• Compartmentalization via
o Virtual machines One particular problem is that information
o Separation via encryption already encrypted – say to gain the benefits of
end-to-end encryption – cannot be directly
o Physical separation inspected at the boundary. Providing cross-
o Separation except at point of use domain solutions has become a niche area.
o Filters, guardians, and firewalls
4.5.8 Security Functionality
Peter Neumann in [Neumann 1995] and
Areas of knowledge needed include
[Neumann 2000] discusses a number of relevant
architectural styles. (See Damage Confinement • Kinds of security functionality [Common
and Resilience section.) Criteria v.3.0 2005, Part 2]

4.5.7.1 Generic Access Control • Design of security functionalities


Policies o Individually and in combination
There are a number of access control concepts, o Potential conflicts
kinds policies, and issues. [Bishop 2003, p. 381- o Centralized and decentralized security
406] Policies can be used individually or in functionality
combination. These include o Planning for evolution of security
• Access control matrix functionality
• Generic access control policies • Availability, characteristics, and appropriate
o Discretionary access control use of OTS or reusable security functionality
o Mandatory access control The Common Criteria now call for self-
protection and non-bypassability of security
o Role based access control
functionality. [Common Criteria v.3.0 Part 3,
o Workflow access control page 96-101] This, of course, changes
o Chinese wall access policy requirements from the easier functionality
o High latency with existence requirements of prior versions of the
Common Criteria into ones correctly requiring
provisions/prerequisites and obligations
(e.g. credit card transactions) system security properties.
One area of functionality to the designer is
• Access control process and mechanisms
access control.
• Access control in distributed
systems/databases 4.5.8.1 Identity Management
• Disclosure by inference An identity may represent an actual user or a
process or data with its own identity
• Potential exploitation of access control to
Use identities authenticated by (possibly
create possibilities for disclosure by covert
multiple) means sufficient for the threat
channels
environment and asset value to
4.5.7.2 Cross-Domain Control • Maintain accountability and traceability of a
Information crossing the boundary between user.
security domains with different policies (or in • Assign specific rights to an individual user
practice ownership) raise potential problems for or process.
both the originating domain (Should be allowed
to leave? Does it need to be encrypted first?) and • Provide for non-repudiation.
the receiving one (Can it be trusted? How should • Enforce access control decisions.
it be handled?). Guards may be placed on one or • Establish the identity of a peer in a secure
both sides of a boundary. communications path.
• Prevent unauthorized users from accordance with the originator’s policy. Thus,
masquerading as an authorized user. PACLs are suitable for implementing an
originator-controlled access policy (ORCON).
4.5.8.2 Access Control Mechanisms
Access conflicts can arise if a user possesses
The protection state of a system is defined by the multiple access permissions to an entity. For
rights that can be asserted over entities in that example, if by virtue of a user’s identity,
system [Bishop 2003]. The ability of a system to read/write access is granted and by virtue of the
grant, deny, or modify those rights constitute the same users group membership, only read access
process of access control within a system. The is granted, which access permissions take
various methods used to control access to precedence? Typically, if a type of access is
entities is the subject of this section. The access held by a user by whatever means, that type of
control matrix is a very general framework for access can be exercised. However, this does not
describing a systems protection state [Bishop have to be the case.
2003]. Users form the rows of the matrix, while
the entities in the system are the columns. 4.5.8.2.2 Capabilities
Entries in a particular cell of the matrix describe In contrast to ACLs whereby access rights are
the access the user from that row has over the associated with the controlled entity, capabilities
entity from that column. are associated with users. Capabilities describe
A particular access control method can be the objects a user is ‘capable’ of accessing and
classified as either mandatory or discretionary. the manner of that access. In an access control
A mandatory access control scheme is enforced matrix, capabilities are a row-wise partition of
by the system and cannot be altered or the matrix.
overridden by users of that system. Often these
4.5.8.2.3 Locks and Keys
schemes are rule-based and implement
organizational security policies. Discretionary Locks and keys can be thought of as a
access controls, on the other hand, are optional combination of an ACL and a capability. The
controls set by system users with the appropriate lock is information associated with the protected
rights to the entity. entity, while the key is information associated
with or possessed by one or more users [Bishop
4.5.8.2.1 Access Control Lists 2003]. Sometimes cryptographic secret-sharing
Access control lists (ACL) are typically identity algorithms are used to prevent access to an entity
based mechanisms associated with the controlled unless a given number of users agree to that
entity. In the access control matrix described access (e.g., the Shamir secret sharing scheme).
above, ACLs are a column-wise partition of the
4.5.8.2.4 Ring-based Access Control
matrix. An identity may be that of a particular
user of the system, but can also be more general. Ring-based access control generalizes the
That is, the identity may be held by several users concept of a super-user and an ordinary user
of the system in the form of group membership used in many operating systems [Bishop 2003]
or a role being played by a user. to an arbitrary number of levels. Entities within
a system operate within a certain ring level and
Procedure-based access control is a form of a
must take certain system or policy directed
trusted interface mechanism. To access an
actions to access entities in other rings.
entity, users must not only have been granted
permission, but must also use a specified 4.5.8.2.5 Revocation
procedure to accomplish that access. While revocation of access privileges may seem
A propagated access control list (PACL) to be a simple task, it can become quite difficult
provides the creator of an entity control over the to administer. For example, ensuring revocation
use of that entity [Bishop 2003]. Any user when users and entities are distributed across
allowed access to the object, may only do so in domains, systems, or networks is problematic.
Further complicating matters is whether rights See Fundamentals section for more discussion.
granted to, say, User B by User A, should be
revoked if User A rights are revoked. 4.5.12 Specify Configurations
Specify software configurations and settings that
4.5.9 Frameworks reduce vulnerabilities and facilitate security can
Many design efforts occur within a given have significant impact.
framework such Sun’s J2EE [??] or • Specify Database Security Configuration(s)
Microsoft’s .net [Meier 2004, Part III]. When [Viega 2005, p. 79]. [Thuraisingham ??]
this true, designers need a thorough knowledge
of the framework’s security philosophy and • Specify Operational Configuration(s) [Viega
features as well as its strengths and weaknesses. 2005, p. 98-99]
In addition, knowledge of the approaches taken This includes resource-based security
by the major frameworks and the experiences configurations and contents of the operational
with and analyses of them can contribute to all security guide. The Center for Internet Security
secure software designers understanding of the issues a number of configuration setting guides
field. for common software.

4.5.10 Design Patterns for Secure 4.5.13 Methods for Tolerance and
Software Recovery
A standards-oriented organization, the Object While generally prevention is to be preferred,
Management Group, has proposed several design designers of secure systems cannot assume
patterns for secure software. [Open Group 2004] preventive or proactive measures will always
succeed. The field of fault tolerance and more
4.5.11 Proper Use of Encryption and recently the associated field of intrusion
Encryption Protocols tolerance have developed a number of
In this area, experts should be consulted before techniques. Recovery techniques have a long
establishing organizational or project decisions. history as well including disaster recovery. A
To choose an encryption scheme or protocol system can undertake a number of kinds of
properly, one must know the existing standards activities related to tolerance of errors or
(official and de facto), their characteristics, violations of correctness or legitimacy. Design
(strengths and weaknesses), and their future approaches exist to [Pulium] [Jalote ??]
prospects. [NIST SP 800-53, Final, Appendix F]; • Forecasting violations
[FIPS 140], and [FIPS 180] • Detection of violations, possible violations,
Computational difficulty is the fundamental and non violating activity
measure of encryption merit, and knowledge • Notification and Warning
should be possessed of the computational
requirements to break, as well as to use, different • Recording usually via logs
forms of • Damage isolation or confinement
• Encryption and decryption • Continuing service although possibly
• Calculating and reversing hashes degraded
Hashing is an example of an area without • Diagnosis of cause of violation
adequate theoretical foundations where • Repair of fault or vulnerability
algorithms have been shown to be weaker than • Recovery of the system to a legitimate state
thought. Such areas deserve careful treatment
and concern for potential future problems. Only • Tactics that adapt to attacking actions.
cryptographic software certified to [FIPS 140] One might also warn or characterize, investigate
(or official equivalent) should be used. root cause or causer, analyze, and/or learn and
improve. Together these actions attempt to make • Deception realism should be tailored to
things right after something is found to be needs of the setting.
wrong. As a bonus, these actions may also set • Deception should be imaginative and
the stage for prevention of reoccurrences. creative.
See Confine Damage and Increase Resilience
subsection for related ideas. 4.5.15.3 Particular Techniques for
Deception
4.5.14 Software Protection Techniques include such approaches as
A number of schemes exist to try to enforce honeynets, decoys, disinformation, and virtual
intellectual properties, e.g. licenses files, or views of system [Holz 2005] [Cheves 2005]
avoid understanding or reverse engineering by [Hunt 2005]. [Rowe 2004a] [Cohen 2001]
adversaries. [Atallah, Bryant and Sytz 2005] The [Honeynet 2002]
US Department of Defense has a Software
Protection Initiative. [Hughes SPI??] 4.5.16 Forensic Support
Techniques used in relation to malware may also The first basic support for forensics is audit logs
illuminate a number of techniques. [Szor 2005] of actions and accesses. These can record more
data or less data, but in a secure system their
4.5.15 Deception and Diversion integrity and, depending on requirements, their
Design knowledge includes purposes, principles, confidentiality must be protected. Confidential
and techniques of deception. A brief introduction data needs to be protected in logs as in regular
with motivations appears in [Hunt 2005]. storage. In addition, designs must avoid allowing
the exhaustion of log storage space to become a
4.5.15.1 Purposes of Deception
form of attack.
Purposes of deception by the defense include
Other forensic support includes support for
• Intelligence gathering identifying suspects and investigating insiders
• Diversion of illegitimate attention and and outsiders. For insiders where the identity of
resources the user may be known, recognition of use in an
unusual fashion could be enough automated
• Effects outside the system such as attracting
support to help identify suspects.
others’ attention, good or bad publicity, or
aiding in law enforcement or mission 4.5.17 User Interface Design
accomplishment
Because achieving a design providing usable
• Deceptions may also sometimes be used to security may need to derive from a basic
help conceal other deceptions reconceptualization of the solution, user
4.5.15.2 Principles of Deception interface design’s beginning is during process
and task analyses or possibly reengineering.
Rowe expands on the implications for information
Lists of principles in [Yee 2004] and [Garfinkel
security deceptions of six principles suggested by
2005] extent sound user-system interaction
Fowler and Nesbitt [Fowler and Nesbitt 1995]
[Rowe 2004b] design methods to cover security issues. [Cranor
2005] has design ideas including a number
Deception should reinforce enemy expectations. concerning privacy. A series of research-oriented
• Deception should have realistic timing and workshops exists in this burgeoning research
duration. area that may address practical issues. [SOUPS
• Deception should be integrated with 2005]
operations.
4.5.18 Assurance Case for Design
• Deception should be coordinated with The arguments and evidence related to design
concealment of true intentions. are central to the assurance case. [SafSec
Standard] [SafSec Guidance] These arguments relevant expertise and legitimate stakeholder
necessitate assurance mappings between system interests. [Meier 2004, Chapter 5 and]
descriptions. This can have implications about Formalized techniques exists for reviews include
how to structure the assurance case. [Kelly 2003] a scenario based [Bass 2001] one created for
See Verification, Validation, and Evaluation architecture reviews. Reviews including security
section. issues are essential at all levels of design.
Design-related portions of the assurance case
4.5.19 Design for Easier Modification should be reviewed as well. Since the best
of Assurance Argument after results occur when one develops much of the
Software Change design assurance case along with the design,
When the security-relevant software changes, these parts may be best reviewed together.
changes in the assurance case are required – if Use of checklists can help.[Meier 2004]
not in the argument, then certainly in the [Ramachandran 02]
evidence. Most obviously, new testing and See the Verification, Validation, and Evaluation
results are necessary. section.
The structuring of the design and of the
assurance case can ease these modifications. 4.5.22 References
Designs with clean separations among parts may [Atallah, Bryant and Sytz 2005] Atallah,
allow localization of assurance arguments and Mikhail, Bryant, Eric, and Styz, Martin, A
evidence facilitating change. Survey of Anti-Tamper Technologies, Crosstalk –
The Journal of Defense Software Engineering,
4.5.20 Secure Design Process and Nov 2004.
Methods [Avizienis 2004] Avizienis, Algirdas, Jean-
The design process needs to go hand-in-hand Claude Laprie, Brian Randell, and Carl
with construction of the assurance argument. Landwehr. “Basic Concepts and Taxonomy of
Design components need to be traceable to Dependable and Secure Computing,” IEEE
requirements, and their composition yield the Transactions on Dependable and Secure
behavior called for in the external behavior Computing, vol. 1, no. 1, Jan.-Mar. 2004, pp. 11-
specification. Designers need to be 33.
knowledgeable about similar systems and known [Bass 2001] Bass, L.; Klein, M.; & Moreno, G.
techniques. Nevertheless, the creative element Applicability of General Scenarios to the
must be exercised as well. Architecture Tradeoff Analysis Method
Multiple alternatives should be considered with a (CMU/SEI-2001-TR-014, ADA396098).
well-structured decision process. Architectural Pittsburgh, PA: Software Engineering Institute,
coherence must be preserved and architectural Carnegie Mellon University, 2001.
constraints never violated. [Bishop 2003] Bishop, Matt. Computer Security:
Formal notations or combined formal and semi- Art and Practice, Addison-Wesley, 2003.
formal notations (e.g. UMLsec [Jürjens 2004] [Boudra 1993] P. Boudra, Jr. Report on rules of
[Jürjens 2005]) and methods exist to address system composition: Principles of secure system
security. design. Technical Report, National Security
For more information on the design process see Agency, Information Security Systems
the Tools and Methods and Process sections. Organization, Office of Infosec Systems
Engineering, I9 Technical Report 1-93, Library
4.5.21 Design Reviews for Security No. S-240, 330, Mar., 1993.
Designs need to be open to verification, [Chaves et al, 2005] Chaves, C. H. P. C., L. H.
validation, and evaluation. Reviews should be Franco and A. Montest. “Honetnet Maintenance
performed by multiple persons in each area of Procedures and Tools,” Proc. 6th IEEE Systems,
Man and Cybernetics Information Assurance [Holz and Ravnal 2005] Holz, T. and F. Ravnal,
Workshop (IAW 05), IEEE CS Press, 2005, “Detecting Honeypots and Other Suspicious
pp.252-257. Environments,” Proc. 6th Ann. IEEE Systems,
[Cohen 2001] Cohen, Fred, Dave Lambert, Man and Cybernetics Information Assurance
Charles Preston, Nina Berry, Corbin Stewart, Workshop (IAW 05), IEEE CS Press, 2005,
and Eric Thomas``A Framework for Deception'', pp.29-36.
Final Report IFIP-TC11, 2001 [Honeynet 2002] The Honeynet Project. Know
[Cranor 2005] Cranor, Lorrie, and Your Enemy. Addison-Wesley, 2002.
Simson Garfinkel. Security and Usability: [Howard 2002] Howard, Michael, and David C.
Designing Secure Systems that People Can Use. LeBlanc. Writing Secure Code, 2nd ed.,
O’Reilly, 2005 Microsoft Press, 2002.
[Ellison 2003] Ellison, Robert J., and Andrew P. [Howard] Howard, M., J. Pincus and J. Wing.
Moore. Trustworthy Refinement Through “Measuring relative attack surfaces,”
Intrusion-Aware Design (TRIAD). Technical Proceedings of the Workshop on Advanced
Report CMU/SEI-2003-TR-002. Software Developments in Software and Systems Security.
Engineering Institute, October 2002 Revised 2003. available as CMU-TR-03-169, August
March 2003 2003
[Evans 2005] Evans, S. and J. Wallner, “Risk- [Hughes n. d.] Hughes, Jeff, and Martin R. Stytz,
Based Security Engineering Through the Eyes of Advancing Software Security– The Software
the Adversary,” Proc. 6th Ann. IEEE Systems, Protection Initiative
Man and Cybernetics Information Assurance http://www.preemptive.com/documentation/SPI_
Workshop (IAW 05), IEEE CS Press, 2005, software_Protection_Initative.pdf
pp.158-165. [Hunt et al, 2005] Hunt, C., J. R. Bowes and D.
[Fowler 1995] Fowler, C. A., & Nesbit, R. F. Gardner. Net Force Maneuver,” Proc. 6th Ann.
“Tactical deception in air-land warfare,” Journal IEEE Systems, Man and Cybernetics
of Electronic Defense, vol. 18, no. 6, June 1995, Information Assurance Workshop (IAW ‘05),
pp. 37-44 & 76-79. IEEE CS Press, 2005, pp.419-423.
[Garfinkel 2005] Garfinkel, Simson L. Design [Ibrahim et al, 2004] Ibrahim, Linda, et al.
Principles and Patterns for Computer Systems Safety and Security Extensions for Integrated
that are Simultaneously Secure and Usable, PhD Capability Maturity Models. Washington D.C.:
Thesis MIT, 2005 http://www.simson.net/thesis/ United States Federal Aviation Administration,
[Gasser] Gasser, M. Building a Secure Computer Sept. 2004.
System. Van Nostrand Reinhold, 1988. [Jalote ??] Jalote, Pankaj. Fault Tolerance in
[Graff and van Wyk 2003] Graff, Mark G. and Distributed Systems
Kenneth R. Van Wyk. Secure Coding: [Jürjens 2004] Jürjens, Jan, Secure Systems
Principles and Practices. O'Reilly & Associates, Development with UML, Springer-Verlag 2004
June 2003. [Jürjens 2005] Jürjens, Jan, “Sound Methods and
[Gutmann 2004] Gutmann, P. Cryptographic Effective Tools for Model-based Security
Security Architecture: Design and Verification. Engineering with UML,” 27th International
Springer-Verlag, 2004. Conference on Software Engineering, St.Louis,
[Hall 2002] Hall, Anthony and Rodrick Missouri, USA; 15 - 21 May 2005
Chapman. “Correctness by Construction: [Karger et al, 1990] Karger, Paul A., Mary Ellen
Developing a Commercial Secure System,” Zurko, Douglas W. Benin, Andrew H. Mason,
IEEE Software, vol. 19, no. 1, Jan./Feb. 2002, and Clifford E. Kahn. “A VMM Security Kernel
pp.18-25. for the VAX Architecture,”Proc. of the 1990
IEEE Symposium on Security and Privacy, [Parnas 1985] Parnas, D. L. and Weiss, D. M.
IEEE, 1990, pp.2-19. “Active design reviews: principles and
[Kelly 2003] Kelly, T. P. Managing Complex practices.” In Proceedings of the 8th
Safety Cases Department of Computer Science international Conference on Software
University of York, 2003. Engineering (London, England, August 28 - 30,
www.cs.york.ac.uk/~tpk/sss03.pdf 1985). International Conference on Software
Engineering. IEEE Computer Society Press, p.
[Manadhata and Wing 2004] Manadhata, P. and
132-136. 1985.
J. M. Wing. Measuring A System's Attack
Surfacehttp://www- [Pullum 2001] Pullum, L. L. Software Fault
2.cs.cmu.edu/afs/cs/project/calder/www/tr04- Tolerance, Artech House, 2001.
102.pdf, CMU-TR-04-102, Jan. 2004. [Neumann 2003] Neumann, P.G. Principled
[Meier 2004] Meier, J.D., Alex Mackman, Assuredly Trustworthy Composable
Srinath Vasireddy, Michael Dunner, Ray Architectures (Draft), Dec., 2003.
Escamilla, and Anandha Murukan, Improving [Pullum ??] Laura L. Pullum. Software Fault
Web Application Security: Threats and Tolerance: Achievement and Assessment
Countermeasures, Microsoft, 2004 Strategies, ??
http://download.microsoft.com/download/d/8/c/d [Ramachandran 02] Ramachandran, J.
8c02f31-64af-438c-a9f4- Designing Security Architecture Solutions. New
e31acb8e3333/Threats_Countermeasures.pdf York: John Wiley & Sons, 2002.
[Moffett and Nuseibeh 2003] Moffett, Jonathan [Redwine 2004] Redwine, Samuel T., Jr., and
D. and Bashar A. Nuseibeh. A Framework for Noopur Davis (Editors). Processes for
Security Requirements Engineering. Report YCS Producing Secure Software: Towards Secure
368, Department of Computer Science, Software. vols. I and II. Washington, D.C.:
University of York, 2003. National Cyber Security Partnership, 2004.
[Neumann 1995] Neumann, Peter G. [Redwine 2005a] Redwine, Samuel T., Jr.
Architectures and Formal Representations for Dependability Properties: Enumeration of
Secure Systems, Final Report SRI Project 6401, Primitives and Combinations with Observations
October 2, 1995 on their Measurement. Commonwealth
[Neumann 2000] Neumann, P. G. Practical Information Security Center Technical Report
architectures for survivable systems and CISC-TR-2004-001.
networks. Technical report, Final Report, Phase [Redwine 2005b] Samuel T. Redwine, Jr.,
Two, Project 1688, SRI International, Menlo Principles for Secure Software: A Compilation
Park, California, 2000. of Lists. Commonwealth Information Security
[Neumann 2003] Neumann, P.G. Principled Center Technical Report CISC-TR-2005-002.
Assuredly Trustworthy Composable [Rowe 2004a] Rowe, Neil C. “Designing Good
Architectures (Draft), Dec., 2003. Deceptions in Defense of Information Systems,”
[NIST 2005] The National Institute of Standards ACSAC 2004.
and Technology. Common Criteria v. 3.0, July, http://www.acsac.org/2004/abstracts/36.html.
2005. [Rowe 2004b] Rowe, N., & Rothstein, H., "Two
[NSA 2002] National Security Agency, The Taxonomies of Deception for Attacks on
Information Systems Security Engineering Information Systems," Journal of
Process (IATF) v3.1. 2002. Information Warfare, Vol. 3, No. 2, July
[Open Group 2004] Open Group. Security 2004, pp. 27-39.
Design Patterns (SDP) Technical Guide v.1, [SafSec Guidance]“SafSec Methodology:
April 2004. Guidance Material.” SafSec: Integration of
Safety and Security.
http://www.safsec.com/safsec_files/resources/50 Los Alamitos, California: IEEE Computer
_3_SafSec_Method_Guidance_Material_2.6.pdf Society, Feb. 16, 2004.
[SafSec Introduction] “Why SafSec?” SafSec: [HMAC 2002] “The Keyed-Hash Message
Integration of Safety and Security. Authentication Code (HMAC)”, FIPS 198,
http://www.safsec.com/safsec_files/resources/50 March 2002.
.6_Why_SafSec.pdf [Mantel 2002] Mantel, Heiko, “On the
[SafSec Standard] “SafSec Methodology: Composition of Secure Systems,” IEEE
Standard.” SafSec: Integration of Safety and Symposium on Security and Privacy, p. 88, 2002
Security. [Radack 2005] Radack, Shirley, editor.
http://www.safsec.com/safsec_files/resources/50 “Standards for Security Categorization of
_2_SafSec_Method_Standard_2.6.pdf Federal Information and Information Systems.
[SDI 1992] Department of Defense Strategic Federal Information Processing Standard (FIPS)
Defense Initiative Organization, Trusted 199, July 10, 2005.
Software Development Methodology, SDI-S-SD- [Ross 2005] Ross, Ron et al. “Recommended
91-000007, vol. 1, 17 June 1992. Security Controls for Federal Information
[Swiderski 2004] Swiderski, F. and Snyder, W. Systems,” NIST Special Publication 800-53,
Threat Modeling. Microsoft Press, 2004. Feb. 2005.
[Thompson 2005] Thompson, H. H. and S. G. [Saltzer and Schroeder 1975] Saltzer, J. H. and
Chase. The Software Vulnerability Guide. M. D. Schroeder. “The protection of information
Charles River Media, 2005. in computer systems,” Proceedings of the IEEE,
[Thuraisingham] Thuraisingham, Bhavani, vol. 63, no. 9, 1975, pp. 1278-1308.
Database and Applications Security ??. [Schell 2005] Schell, Roger. Keynote Talk,
[Viega 2005] Viega, J. The CLASP Application International Workshop on Information
Security Process. Secure Software, 2005. Assurance, Mar. 24, 2005
[Whittaker] Whittaker, J. A. and H. H. [SHS 2002] “Secure Hash Standard (SHS)”,
Thompson. How to Break Software Security: FIPS 180-2, August 2002.
Effective Techniques for Security Testing. [Sommerville 2004] Sommerville, I. Software
Pearson Education, 2004. Engineering. 7th ed. Pearson Education, 2004.
[Wyk and McGraw 2005] van Wyk, Kenneth [Stoneburner et al, 2004] Stoneburner, Gary,
and Gary McGraw. After the Launch: Security Hayden, Clark, and Feringa, Alexis.
for App Deployment and Operations. Engineering Principles for Information
Presentation at Software Security Summit, April Technology Security (A Baseline for Achieving
2005. Security).NIST Special Publication 800-27 Rev
[Yee 2004] Ka-Ping Yee. “Aligning security and A, June 2004.
usability.” Security & Privacy Magazine, 2: [US Army 2003] US Army, Field Manual (FM)
48–55, Sept–Oct 2004. 3-13: Information Operations: Doctrine,
Tactics, Techniques, and Procedures, 28th Nov.
4.5.23 Additional Reading 2003. (particularly Chapter 4 on Deception)
[Barker 2004] Barker, William C. “Guide for [US DoD 1996] Joint Chiefs of Staff, DoD JP 3-
Mapping Types of Information and Information 58 Joint Doctrine for Military Deception, 31
Systems to Security Categories,” NIST Special May, 1996.
Publication 800-60, June 2004. [Zwicky et al, 2000] Zwicky, Elizabeth D.,
[Bourque and Dupuis 2004] Bourque, Pierre, and Simon Cooper, and D. Brent Chapman. Building
Robert Dupuis (Editors). Guide to the Software Internet Firewalls (2nd Ed.), O'Reilly, 2000
Engineering Body of Knowledge. 2004 Edition.
4.6 Secure Software Construction
Scope vulnerabilities in software [Viega and McGraw
2002]. Buffer overrun occurs when a program
Secure software construction creates working, reads or writes outside the bounds of a storage
meaningful, secure software through coding, buffer. If the overrun occurs when reading, a
verification, unit testing, integration testing, and data leak may result or the program may
debugging [IEEE 2004]. To construct a secure malfunction due to incorrect input. If the overrun
software system, then, more specific security occur during writing, the program may again
knowledge is required from a software engineer, malfunction or data may be corrupted. Buffer
programmer, or analyst. It has been said that we overruns are often exploited to inject attack code
have long known how to build secure software; into a program or to induce a denial of service
we simply don’t act on what we know. This attack. Buffer overruns can be exploited whether
section summarizes the knowledge and storage is statically allocated (i.e., at compile
techniques required to produce secure software. time) or dynamically allocated (i.e., in the stack
4.6.1 Construction of Code or heap). Although specific exploitation
techniques for buffer overruns are many and
4.6.1.1 Common Vulnerabilities varied, prevention, detection and protection
Some vulnerabilities in software systems occur mechanisms are well-known and can be found in
with such great frequency they have been various references [Howard and LeBlanc 2003],
deemed “common vulnerabilities.” Rather than [Goertzel and Goguen 2005], [Viega and
attempt to list specific vulnerabilities, this McGraw 2002], [Viega and Messier 2003].
section classifies vulnerabilities by the manner in 4.6.1.1.2 Resource Exhaustion
which the security of a system is compromised
[Meunier 2004] [Howard 2005] [Whittaker and Resource exhaustion is possible whenever there
Thompson 2004]. are finite resources and unrestricted access to
those resources. When done with malicious
However, it is quite important to know where to intent, the exhaustion of a resource may be
find specific information on vulnerabilities, intended to deny access to the resource itself, to
exploits, and patches for application software evade traceability or accountability, or to serve
and systems. Some sources of such information as a distraction to increase the probability of
include NIST’s ICAT database, the Secunia succeeding in a separate attack.
database (www.secunia.com), and US-CERT
(www.us-cert.gov). A recognized source for Resource exhaustion “enablers” include
standard naming of vulnerabilities is the [Meunier 2004] poor resource management due
MITRE’s Common Vulnerabilities and to design or implementation errors, expensive
Exposures dictionary (cve.mitre.org) tasks such as encryption, and coding errors such
as memory leaks that become vulnerabilities.
Although not addressed below, there are Protocols and algorithms which allow
numerous references that cover application, anonymous or unauthenticated resource
network, and operating system-specific allocation and “amplification” services such as
vulnerabilities. Some of these include [Skoudis broadcasts and other distribution mechanisms
2002], [Graff and Wyk 2003], [Howard and can also be used to consume resources.
LeBlanc 2003], [Koziol et al. 2004], [Chirillo Asymmetric attacks are a form of amplification.
2002], [Flickenger 2003], [Wheeler 2003], The cost to the attacker to request a service is
[Whittaker and Thompson 2004]. much less than the cost of the service provider to
4.6.1.1.1 Buffer Overrun respond. Thus, the attacker can easily
Buffer overrun, also known as buffer overflow, overwhelm a service provider with service
is arguably one of the most common requests.
4.6.1.1.3 Operating Environment
The operating environment a process executes in is a source of many vulnerabilities. This environment includes [Meunier 2004] the file system, the user interface, the operating system (OS), user accounts, services provided by the OS or other applications, and environment variables. Many aspects of the environment are under the control of untrusted users and therefore should be sanitized or validated prior to use. For example, most systems provide the user a capability to specify which directories are to be searched when a request for a file is submitted to the system.

Most systems provide services to create temporary files for use by processes and applications. Some provide world-readable and -writable directories for these temporary files as well as services to generate unique names for temporary files. Since temporary directories are available to all and the temporary file names provided by a system, though unique, are typically predictable, care must be taken when creating and using temporary files and directories. A secure temporary file has the following properties [Howard and LeBlanc 2003] [Viega and McGraw 2002]: it resides in a directory accessible only to the process that created it, it has a unique, difficult-to-guess name, and access controls are set to prevent unauthorized access.
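As a hedged POSIX C sketch of these properties, mkstemp below both generates a hard-to-guess name and atomically creates and opens the file (with mode 0600 on modern implementations). The directory shown is only an example; a directory private to the creating process, as the text requires, would be better still.

    #include <stdlib.h>

    /* Returns an open descriptor for a freshly created temporary file,
       or -1 on failure.  The XXXXXX portion is replaced by mkstemp. */
    int make_temp_file(void)
    {
        char path[] = "/tmp/exampleXXXXXX";   /* example directory only */
        return mkstemp(path);
    }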
4.6.1.1.4 Race Conditions
When the behavior of a system depends on the order in which critical events occur, a race condition is said to exist. The most prominent example of a race condition in secure programming occurs in the so-called time-of-check, time-of-use (TOCTOU) scenario. For example, suppose a file is created (using default system access permissions) using a system call and a subsequent call places more restrictive access controls on the file. Between the two calls, a malicious process can change the contents of the file or even replace it with a different file.

Although race conditions are commonly used to exploit file systems, any situation where multiple processes contend for resources is subject to them. Techniques to prevent race conditions usually require operating system support. The general strategy to prevent race conditions is to ensure that access and configuration of the resource appear to be one atomic operation, even when multiple operations and disjoint periods of time are needed to complete the "atomic" operation.
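The file-system TOCTOU pattern can be made concrete with the following hedged C fragment; the path and intent are hypothetical.

    #include <unistd.h>
    #include <fcntl.h>

    int open_if_writable(const char *path)
    {
        /* Time of check ... */
        if (access(path, W_OK) != 0)
            return -1;
        /* ... time of use.  Between the two calls an attacker can
           replace 'path', e.g. with a symbolic link, so the file
           opened need not be the file that was checked. */
        return open(path, O_WRONLY);
    }

Consistent with the atomicity strategy above, a common remedy is to open first and then interrogate the already-open descriptor (for example with fstat), so the check and the use refer to the same object.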
4.6.1.1.5 Canonical Form
In the context of computer security, canonical form is the fundamental representation of symbols that can be interpreted without ambiguity. For example, suppose the canonical form of a file name on a file system must begin with a drive letter. Then "\my_file.doc" would not be in canonical form; the location of the file is ambiguous, and depending on the number of drives in the system, there may be many files called "\my_file.doc". Canonicalization is the process of resolving a set of symbols to their standard, or canonical, form.

Most resources on a system are named so a person using the system can easily remember them. The system itself, however, rarely uses this name to refer to the resource. Rather, the system interprets the name and translates it to the internal representation the system uses. A fundamental security axiom with regard to names is "Do not make any security decision based on the name of a resource… [Howard and LeBlanc 2003]" Vulnerabilities associated with the interpretation of non-canonical names span every level of the OSI network model and include making decisions based on Internet Protocol addresses, short forms of directory names (i.e., "dot-dot" attacks), uninterpreted symbolic file system links, and named network resources and services.

4.6.1.1.6 Violations of Trust
Trust can be defined as accepting the risk that an entity which can harm you will not do so. Bishop [Bishop 2003] measures trust by using evidence of trustworthiness: "An entity is trustworthy if there is sufficient credible evidence leading one to believe that the system will meet a set of given requirements."
Common vulnerabilities related to trust span both definitions. Format strings in 'C' print statements require arguments, and a compiler, in a sense, trusts that the programmer has included the correct number; this is trust in the first sense. Trust in the sense of Bishop requires evidence of trustworthiness due to the methods used to build a system or metrics gathered during testing. The authentication of an entity claiming a certain identity is an example of this type of trust. Cross-site scripting (XSS) and SQL code injection are vulnerabilities that can lead to exploits resulting from the violation of trust.

4.6.1.2 Using Security Principles in Secure Coding
A number of security principles for software systems have been covered in the prior sections. The usefulness of these principles continues during secure coding. Several authors have discussed security principles while emphasizing coding. [Viega and McGraw 2002] [Howard and LeBlanc 2003] They do not all cover the same set but have readable introductions and discussions of those they do cover. Also see [Bishop 2003].

4.6.1.3 Secure Coding Practices
This section introduces specific practices that are followed by writers of any type of secure code. Indeed, the practices described below are prudent whether or not the code is intended to be secure; however, they are essential for any code claiming some level of security. Code includes traditional procedural languages such as 'C' and 'Java', shell scripts, database queries using languages such as SQL, and web-based programming such as HTML.

4.6.1.3.1 Input Validation
Input validation is fundamental to any program claiming security. Characterizations of its importance range from the understated "Handle data with caution." [Graff and Wyk 2003] to "All input is evil!" [Howard and LeBlanc 2003]. Identifying sources of input is the critical first step in validating input. Due to the wide range of input a program can accept, it is difficult to be precise on specific validation techniques; however, the principles and techniques below are fairly general and widely applicable.

4.6.1.3.1.1 Input cleansing
Part of input validation includes bounds and type checking. This may be classified as a kind of input cleansing, but the scope of input cleansing is much broader. Input cleansing can be simply defined as the process of discarding input that does not conform to what is expected. While simple to state, the definition presumes a model of expected input is known or has been defined. Based on this input model, a "white list" of acceptable input can be specified or an algorithm devised that recognizes acceptable input. The common practice of specifying a "black list" of unacceptable input is strongly discouraged. Experience has confirmed intuition again and again: the number of bad inputs for a given program far exceeds the number of valid inputs. Therefore, validation must be approached from the perspective of determining "good" input rather than detecting "bad" input.

In many languages, such as SQL, HTTP, or even 'C' printf functions, it is quite easy to insert instructions into a program for subsequent execution via input mechanisms. This is the so-called "code injection" exploit. The fundamental problem is that code and data occupy the same input channel, and the program accepting the input must determine which is which. If code (data) can be crafted to appear as data (code), normal security controls can be bypassed and the system exploited. The remedy for this vulnerability is to separate the code input channel from the data input channel in some enforceable manner.

Input should not be declared valid until its semantic meaning (i.e., its meaning within the context of the program) has been determined. Accomplishing this entails completely resolving the various meta-characters, encodings, and any link indirections of the input into its most fundamental representation (cf. Canonical Form). Only at this point can the semantic meaning be determined and the proper security checks be performed.
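A hedged C illustration of the white-list approach follows; the accepted characters and length limit are hypothetical, since a real input model comes from the program's specification.

    #include <ctype.h>
    #include <string.h>

    /* Accept the input only if it matches the expected model:
       non-empty, bounded length, and alphanumerics plus '.', '_', '-'. */
    int is_valid_name(const char *s, size_t max_len)
    {
        size_t n = strlen(s);
        if (n == 0 || n > max_len)
            return 0;                          /* bounds check first */
        for (size_t i = 0; i < n; i++) {
            unsigned char c = (unsigned char)s[i];
            if (!isalnum(c) && c != '.' && c != '_' && c != '-')
                return 0;                      /* not on the white list */
        }
        return 1;
    }

For file names, the canonicalization step described above could precede this check, for example by resolving the path with POSIX realpath so "dot-dot" forms and symbolic links are reduced to one fundamental representation.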
The following has been proposed as an order for proper input validation [Meunier 2004]:
1. Interpret and render all character encodings to a common form,
2. Cleanse the input, using several passes if necessary,
3. Validate the type, range, and format of the input,
4. Validate the semantics of the input.

4.6.1.3.2 Preventing Buffer Overflow
There are four basic strategies to eliminate buffer overflow vulnerabilities [Goertzel and Goguen 2005]. The first is simply to identify and avoid the various language-specific vulnerabilities, constructs, and runtime libraries while coding. Validating input to ensure it does not exploit these language-specific vulnerabilities will, if used, prevent a significant number of buffer overflow attacks.

The second strategy is to use non-executable memory locations. This prevents attackers from injecting code via buffer overflow attacks. However, it requires operating system and perhaps hardware support. Furthermore, although it will be effective against direct code injection attacks, even with non-executable memory there are ways to induce execution of code.

The third strategy uses (or modifies) the process runtime environment to perform bounds checking on all data accesses. This will have a significant, and perhaps intolerable, impact on performance.

Finally, the compiler can perform integrity checks prior to dereferencing a pointer at a significantly lower performance penalty. This, however, does not prevent more sophisticated buffer overflow attacks.
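A small C sketch of the first strategy, replacing an unbounded read with a bounded one and rejecting over-long input; the record size is hypothetical.

    #include <stdio.h>
    #include <string.h>

    #define MAX_RECORD 128

    /* fgets, unlike the unboundable gets, never writes past the buffer. */
    int read_record(char out[MAX_RECORD], FILE *in)
    {
        if (fgets(out, MAX_RECORD, in) == NULL)
            return 0;
        /* No newline and not at end of file means the line was longer
           than the buffer: reject it rather than silently truncating. */
        if (strchr(out, '\n') == NULL && !feof(in))
            return 0;
        return 1;
    }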
4.6.1.3.3 Anti-Tamper Technologies
Tampering occurs when an attacker modifies a program or data such that it "continue[s] to operate in a seemingly unaffected manner, but on corrupted data or in a corrupted state [Goertzel and Goguen 2005]." Anti-tamper technology can be classified using the following categories [Atallah, Bryant and Sytz 2005]: (1) hardware-based protections, (2) wrappers, (3) obfuscation, (4) watermarking, and (5) guards.

Hardware-based protection is often based on a trusted processor. In one protection scenario, a trusted processor verifies the integrity of the other hardware devices in the system upon boot-up and perhaps stores cryptographic keys, or other means of verifying the trustworthiness of software that will be executing on the system, in trusted memory [Atallah, Bryant and Sytz 2005]. This type of anti-tamper protection will likely be suitable for only the most critical security aspects of a system. "A less drastic protection … also involves hardware, but is more lightweight such as a smart card or physically secure token. These lightweight … techniques usually require that the hardware be present for the software to run, to have certain functionality, to access a media file, etc. Defeating this kind of protection usually requires working around the need for the hardware rather than duplicating the hardware [Atallah, Bryant and Sytz 2005]."

Wrappers are a common method of incorporating "new" technology or behavior into legacy code or software libraries by intercepting calls to the legacy code and enhancing the characteristics of the legacy software within the wrapper code [Birman 1996]. In the context of anti-tamper techniques, an encryption wrapper can be used to protect part (or all) of an active process's instructions and data in memory. Only the portion being executed is decrypted, and the decryption takes place as "close" as possible to the item's use [Atallah, Bryant and Sytz 2005].

The object of obfuscation is to render software resistant to analysis or reverse engineering. Obfuscation is added to software at various levels, including the source and object code levels. Within source or object code, obfuscation techniques include altering program layout, control structures, data encoding, and data aggregation. It is generally recognized that relying on obfuscation as the sole means of software protection is poor practice. However, the value of using obfuscation in conjunction with other anti-tamper techniques is recognized as well.
Watermarking "embeds information into software in a manner that makes it hard to remove by an adversary without damaging the software's functionality [Atallah, Bryant and Sytz 2005]". In contrast to watermarks in steganography, anti-tamper watermarks need not be hidden, as they have a deterrent effect [Atallah, Bryant and Sytz 2005].

"A specific type of watermarking is fingerprinting, which embeds a unique message in each instance of the software for traitor tracing [Atallah, Bryant and Sytz 2005]." This type of anti-tamper technique is sometimes called code-signing as well.

Software guards are used to detect tampering during execution. "[A] guard's response when it detects tampering is flexible and can range from a mild response to the most disruption of normal program execution through injection of run-time errors […] Generally, it is better for a guard's reaction to be delayed rather than to occur immediately upon detection so that tracing the reaction back to its true cause is as difficult as possible and consumes a great deal of the attacker's time [Atallah, Bryant and Sytz 2005]."
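The flavor of a guard can be suggested with a toy C checksum check. Everything here (the protected region, the recorded value, the immediate response) is hypothetical; a production guard would use stronger checks, hidden placement, and the delayed reaction recommended above.

    #include <stdint.h>
    #include <stddef.h>

    /* FNV-1a hash over the protected bytes. */
    static uint32_t checksum(const unsigned char *p, size_t n)
    {
        uint32_t h = 2166136261u;
        while (n--) {
            h ^= *p++;
            h *= 16777619u;
        }
        return h;
    }

    extern const unsigned char protected_region[];   /* hypothetical */
    extern const size_t protected_region_len;        /* hypothetical */
    #define EXPECTED 0x5A3C9D2Bu                     /* recorded at build time */

    int guard_check(void)
    {
        return checksum(protected_region, protected_region_len) == EXPECTED;
    }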
A reference monitor is a tamperproof, trusted access or interface point that mediates access to objects within a system [Bishop 2003]. Thus, a reference monitor can be considered a type of guard as well, albeit not an anti-tamper guard as discussed above. Critical services or portions of code are often secured using some type of reference monitor.

4.6.1.3.4 Secure Coding Standards
An essential element of secure software construction is enforcing particular coding standards. This has several benefits. It ensures programmers use a uniform set of security approaches, which can be selected based on the requirements and suitability of the project rather than the programmers' familiarity or preference. This, in turn, increases the maintainability of the code. Finally, it reduces the techniques available for malicious developers to subvert the code.

There are numerous references available both on-line and in print with secure coding guidelines, best practices, suggestions, and case studies [Birman 1996], [Goertzel and Goguen 2005], [Graff and Wyk 2003], [Howard and LeBlanc 2003], [Viega and Messier 2003], [Wheeler 2003]. Many companies have internal secure coding standards. However, there seems to be a lack of public standards as such for secure programming. The community would benefit from internationally recognized secure coding standards for common programming languages. Use of such coding standards may greatly reduce the number of false positives produced by vulnerability-seeking static analysis tools – see the Verification, Validation, and Evaluation section.

4.6.1.4 Language Selection
"The single most important technology choice most software projects face is which programming language (or set of languages) to use for implementation [Viega and McGraw 2002]." The term language is not limited to the traditional procedural programming languages like C, C++, and Java, but also includes query languages, shell scripting languages, and other compiled or interpreted instructions a software system may employ. The security implications of this choice extend to other technologies as well, including the host operating system and distributed system services such as the Common Object Request Broker Architecture (CORBA), remote procedure call (RPC), and Java's Remote Method Invocation (RMI) service.

Specific security characteristics of languages and services are readily accessible [Wheeler 2003], [Viega and Messier 2003], [Viega and McGraw 2002]. In general, however, the following language characteristics are desirable: strong typing, safe typing, single entry/exit procedures, passing parameters by value (not by reference), and the restricted use or elimination of pointers.

4.6.1.5 Language Subsets
These language characteristics and the requirement for analyzability lead to the use of language subsets. While organizations' coding standards commonly exclude some dangerous language features, concerns for security and high
assurance have led to excluding significant portions of programming languages. Analyzability was the driving force behind defining the SPARK subset of Ada [Barnes 2003] and was a factor in defining the SmartCard subset of Java.

4.6.1.6 Annotations and Add-ons
While good practices for commenting code are well established, security and correctness concerns may be addressed in part by annotations that specify pre- and post-conditions, invariants including class invariants, and information flow constraints. [Barnes 2003] [Leavens 2005]

Proof-carrying code has been advocated, but it has not come into use.
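In the absence of annotation languages such as SPARK's or JML's, a rough C approximation of such annotations uses runtime-checked assertions. This sketch is illustrative only; the real value of the annotations cited above is that they can be analyzed statically.

    #include <assert.h>
    #include <stddef.h>

    /* Pre-conditions: table is non-null and index < len.
       Post-condition: the returned value is read from within table. */
    int get_element(const int *table, size_t len, size_t index)
    {
        assert(table != NULL);       /* pre-condition */
        assert(index < len);         /* pre-condition */
        return table[index];
    }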
4.6.2 Construction of User Aids
Security needs to be included in user aids such as manuals and help facilities. If user-visible security is significant or has changed significantly, users will benefit from the inclusion of motivations and an explicit "user mental model". Kinds of user aids that include security-oriented material are
• Documentation
  o Normal documentation includes security aspects and use
  o Operational Security Guide [Viega 2005, p. 33, 98-00]
• Training
• Online user aids, e.g. help facility
• User support
  o Help desk
  o Online support

4.6.3 Secure Release
In addition to sound configuration and release management, achieving secure releases requires secure configuration management facilities and cryptographic signing of artifacts or other means to ensure their proper origin and their end-to-end integrity when distributed internally or externally. End-to-end integrity can be achieved by cryptographic signing that starts with the individuals producing product components and extends through deployment of the combined product, as well as possibly further through integration elsewhere with other products before final delivery to the customer or user.

A number of technical techniques have been tried to protect distributed software as intellectual property. Software protection is also a national security concern. These issues were covered above under anti-tampering.

4.6.4 References
[Anderson 2001] Anderson, Ross, Security Engineering, John Wiley and Sons, Inc., 2001.
[Atallah, Bryant and Sytz 2005] Atallah, Mikhail, Bryant, Eric, and Stytz, Martin, A Survey of Anti-Tamper Technologies, Crosstalk – The Journal of Defense Software Engineering, Nov 2004.
[Barnes 2003] Barnes, John, High Integrity Software: The SPARK Approach to Safety and Security, Addison Wesley, 2003.
[Birman 1996] Birman, Kenneth, Building Secure and Reliable Network Applications, Manning Publications, Inc., 1996.
[Bishop 2003] Bishop, Matt, Computer Security: Art and Science, Addison Wesley, 2003.
[Bosworth and Kabay 2002] Bosworth, Seymour and Kabay, M., eds., Computer Security Handbook, 4th Edition, John Wiley and Sons, 2002.
[Chirillo 2002] Chirillo, John, Hack Attacks Revealed: A Complete Reference for UNIX, Windows, and Linux with Custom Security Toolset, Wiley Publishing, Inc., 2002.
[Flickenger 2003] Flickenger, Rob, Wireless Hacks, O'Reilly and Associates, Inc., 2003.
[Goertzel and Goguen 2005] Goertzel, Karen and Goguen, Alice, et al., Application Developer's Guide to Security-Enhancing the Software Development Lifecycle (DRAFT), UNPUBLISHED DRAFT, June 2005.
[Graff and Wyk 2003] Graff, Mark and Wyk, Kenneth, Secure Coding Principles and Practices, O'Reilly and Associates, Inc., 2003.
[Howard 2005] Howard, Michael, LeBlanc, David, and Viega, John, 19 Deadly Sins of Software Security, McGraw-Hill Osborne Media, 1st edition, 2005.
[Howard and LeBlanc 2003] Howard, Michael and LeBlanc, David, Writing Secure Code, 2nd Edition, Microsoft Press, 2003.
[IEEE 2004] Guide to the Software Engineering Body of Knowledge, 2004 Version, IEEE Computer Society, 2004.
[Koziol et al. 2004] Koziol, Jack, et al., The Shellcoder's Handbook: Discovering and Exploiting Security Holes, Wiley Publishing, Inc., 2004.
[Leavens 2005] Leavens, Gary T., Yoonsik Cheon, Curtis Clifton, Clyde Ruby, and David R. Cok, How the design of JML accommodates both runtime assertion checking and formal verification, Science of Computer Programming, Volume 55, pages 185-205, Elsevier, 2005.
[Meunier 2004] Meunier, Pascal, Secure Programming Educational Material, Purdue University, http://www.cerias.purdue.edu/homes/pmeunier/secprog/, 2004.
[Pfleeger 1997] Pfleeger, Charles, Security in Computing, Prentice Hall PTR, 1997.
[Saltzer and Schroeder 1975] Saltzer, Jerome and Schroeder, Michael, The Protection of Information in Computer Systems, Proceedings of the IEEE, Vol. 63, No. 9, pp. 1278-1308, Sep. 1975.
[Skoudis 2002] Skoudis, Ed, Counter Hack: A Step-by-step Guide to Computer Attacks and Effective Defenses, Prentice Hall, 2002.
[Viega and McGraw 2002] Viega, John and McGraw, Gary, Building Secure Software, Addison-Wesley, 2002.
[Viega and Messier 2003] Viega, John and Messier, Matt, Secure Programming Cookbook, O'Reilly and Associates, Inc., 2003.
[Viega 2005] Viega, J., The CLASP Application Security Process, Secure Software, 2005.
[Wheeler 2003] Wheeler, David, Secure Programming for Linux and Unix HOWTO v3.010, www.dwheeler.com/secure-programs/Secure-Programs-HOWTO.pdf, 3 March 2003.
[Whitman and Mattord 2005] Whitman, Michael and Mattord, Herbert, Principles of Information Security, 2nd Edition, Thomson Course Technology, 2005.
[Whittaker and Thompson 2004] Whittaker, J. A. and Thompson, H. H., How to Break Software Security: Effective Techniques for Security Testing, Pearson Education, 2004.

4.6.5 Additional Reading
Avizienis, Algirdas, Jean-Claude Laprie, Brian Randell, and Carl Landwehr, Basic Concepts and Taxonomy of Dependable and Secure Computing, IEEE Transactions on Dependable and Secure Computing, Vol. 1, No. 1, IEEE, January-March 2004.
Bishop, Matt, Computer Security: Art and Science, Addison-Wesley, 2003.
Bourque, Pierre, and Robert Dupuis (Editors), Guide to the Software Engineering Body of Knowledge, 2004 Edition, Los Alamitos, California: IEEE Computer Society, February 16, 2004.
Gasser, M., Building a Secure Computer System, Van Nostrand Reinhold, 1988.
Ibrahim, Linda, et al., Safety and Security Extensions for Integrated Capability Maturity Models, Washington D.C.: United States Federal Aviation Administration, September 2004.
Redwine, Samuel T., Jr., and Noopur Davis (Editors), Processes for Producing Secure Software: Towards Secure Software, Volumes I and II, Washington D.C.: National Cyber Security Partnership, 2004.
Sommerville, I., Software Engineering, 7th ed., Pearson Education, 2004.
4.7 Secure Software Verification, Validation, and Evaluation
4.7.1 Scope
The methods used for verification, validation, and evaluation of original development, of reused and OTS software, and of changes can be static or dynamic. The primary dynamic techniques are simulation and testing. [Bourque and Dupuis 2004] states, "Testing is an activity performed for evaluating product quality, and for improving it, by identifying defects and problems.

"Software testing consists of the dynamic verification of the behavior of a program on a finite set of test cases, suitably selected from the usually infinite executions domain, against the expected behavior."

Static techniques may involve direct reviews or automated model checking. The term "formal" is used here and elsewhere to mean mathematically based. See the Tools and Methods section for a number of relevant techniques and tools.

While the primary initial purpose for using a technique or method may be product improvement, all the techniques used should contribute to the assurance case for the software system. Those improving the development process and the developers' organizations, such as process audits and improvements, contribute indirectly.

This section first addresses assurance case concerns and then addresses a number of issues and techniques related to secure software engineering verification, validation, and evaluation.

4.7.2 Assurance Case
As mentioned in the Fundamentals section quoting from MoD DefStan 00-42 Part 3, an Assurance Case is: "A reasoned, auditable argument created to support the contention that a defined system will satisfy the … requirements." These, in our case, are the security requirements, but a combined assurance case could be made for security and safety or other properties. Where these have significant impacts on the required security properties, justification exists for the security assurance case to include not only confidentiality, integrity, availability, and similar requirements but also reliability and security-related usability and sustainability. Separate assurance cases, however, may be more straightforward. [Despotou 2004]

Best developed concurrently with the software itself, starting with the initial statement of requirements, the assurance case subsequently includes identified perceived and actual risks, avoidance and mitigation strategies, and an assurance argument that refers to associated and supporting evidence. This may include evidence and data from requirements, design, construction, verification, validation, and evaluation; management; and other sources. Evidence concerning quality results from reviews, analyses, tests, trials, in-service and field experience, process fidelity, standards conformance results, personnel qualification records, and elsewhere. The assurance case also records any changes to the case, including changes required by changes in the software system and its threat environment. Thus it is a top-level control document, normally summarized periodically through the issuance of Assurance Case Reports with arguments linked to the evidence that record progress. It remains with the software system throughout its life through disposal. The Assurance Case is, therefore, a progressively expanding body of evidence during development and responds as required to all changes during development and sustainment. [Ministry of Defence 2003b, p. 5]

As a reasoned, auditable argument created to support the contention that a defined software system satisfies the relevant requirements, the assurance case provides an audit trail of the engineering considerations, from requirements to full evidence of compliance. It provides the justification of why certain activities have been undertaken and how they are judged successful. It is initiated at the concept stage, progressively revised during a system life cycle to be up-to-date and available, and with its status continually
tracked. It is typically summarized in Assurance Case Reports at predefined milestones.

The assurance argument provides a structured set of claims (or goals to assure) that progressively justify through argument – show, demonstrate, perform causal analysis, prove, quantify – their super-ordinate claim or goal. This element of the case may contain details of the initial (justified) requirement and reasons for the proposed solution.

Each goal or claim may be achieved either directly or by goal breakdown. In general, all of the things needing to be achieved by a goal are carried down and allocated to one or more of its subgoals, and hence, once the decision has been made to use goal breakdown, the higher-level goal does not need to be referred to again. For this reason, many of the requirements stated in one goal are repeated in its subgoals, sometimes verbatim, sometimes slightly enhanced.

To achieve a goal directly, a number of things must be argued. The SafSec Standard divides the ways of arguing or showing into seven non-exclusive ways and calls these "frameworks". Most importantly, compliance with each framework means meeting certain standards, such as the organizational roles being defined, including standard ones. In addition, however, the frameworks are one categorization of the rationales for an argument, although several of them must usually be included in the arguments and evidence of a single claim or goal. [SafSec Standard 2004, page 19-24] The arguments are:
1. Organizational: the goal is achieved by some organization.
2. Procedural: certain actions have been carried out.
3. Unified Risk Management: this process justifies the acceptability of residual risk against the agreed and documented top-level dependability objectives – particularly relevant to assurance arguments.
4. Risk Directed Design: "document a justification for achievement, by the system, of each residual risk; and document a justification that the evidence of achievement of risk reduction is appropriate for the level and importance of the risk reduction."
5. Modular Certification and Technical Modularity: organizational or system interfaces, particularly with external systems, need the "other" side of the interface to justifiably have the assured qualities claimed. "Module boundaries shall match the organizational boundaries."
6. Evidence: requirements have been established for the recording, handling, and characteristics of evidence to be used.
7. Evaluation/Assessment: the project shall document a means of demonstrating the achievement, by the system, of each residual risk to a level of confidence appropriate for that risk, obtain agreements on evaluations and assessments among the parties involved, and carry them out successfully (as determined by evaluation/assessment results).

"In order to demonstrate that a system or module meets its dependability objectives, evidence for the dependability argument is prepared and presented. Evidence shall be permanent, traceable and managed in such a way as to provide later readers confidence in its source, contents and validity." [SafSec Standard 2004, page 8]

When judging the credence to be given evidence, its relevance, visibility, traceability, and quality are crucial factors. Therefore, one must necessarily confirm that the evidence is generated, managed, validated, and used within the constraints of acceptable practices and controls. It must also achieve the objectives claimed in the assurance argument. [MoD DefStan 00-42 Part 3, section 9.1] The body of evidence can become quite large and for usability probably needs to be organized by some evidence framework. The structure of the assurance case can make it easier or harder to create, understand, and modify. [Kelly 2003]

As the principal means by which the "justifiably high confidence" can be explicitly obtained, the
assurance case is central to the verification, validation, and evaluation of secure software. It should not only give coherent confidence to developers, sustainers, and acquirers, but also be directly usable by others including certifiers and accreditors.

Activities involved might include
• Create top-level assurance goal from requirements
• Establish structure of argument with subordinate goals or claims including their relationships
• Create portions of assurance argument tailored for level of assurance desired
• Compile portions of argument and supporting evidence
• Verify
• Validate
• Analyze
• Establish level of confidence
• Use as input to certification

Several steps that contribute to the assurance case are mentioned in More Fundamental Concepts and Principles in Development.
• Perform threat analysis and assure its quality
• Provide assurance case showing that top-level specification agrees with security policy
• Develop a security test plan
• Develop an implementation of the system based on the top-level specification, providing an assurance case with argument and evidence (preferably including proofs) that it does so
• Assure design
  o Agrees with top-level specification and security policy
  o Contains only items called for in top-level specification
• Assure code corresponds to design and security policy and contains only items called for in design
• Perform penetration testing and test security functionality
• Provide a covert channel analysis
• Perform ongoing monitoring of security threats, needs, and environment
• Perform changes securely, maintaining conformance to – possibly revised – security requirements while continuing to provide a complete assurance case
• Deploy securely

These by no means exhaust the list of software security assurance related activities one can expect to do. Indeed, most of the activities and knowledge mentioned in this section can be expected to contribute to the assurance case. While less often mentioned in standards, several other factors can affect one's assurance concerning software, including [Redwine 2004, p. 36] [SCC ??]
• The quality and history of the people who produced it
• The characteristics and history of the kind of process used to produce it
• The quality of the environment in which it was produced
• Data on the quality and fidelity of use of the production process for this piece of software
• The realism of the assumptions made
• Characteristics of the software design including the extent to which it implements defense in depth and tolerance
• Results of reviews, tests, and analyses of it
• Data on the execution history of the software itself – particularly for post-update assurance

While certification authorities may not always consider everything relevant, every aspect having potentially significant consequences for meeting security requirements or for the confidence of key stakeholders has a place in a full assurance case along with its related evidence.
4.7.3 Ensure Proper Version
Of course, one needs sound configuration and release management, but one also needs secure configuration management facilities and cryptographic signing of artifacts or other means to ensure their proper origin and their end-to-end integrity when distributed internally or externally. A process review tracing the use and non-use of cryptographic signatures could start with the individuals producing them and extend all the way to execution or other external use.

4.7.4 Testing
4.7.4.1 Test Process
Many of the considerations in testing secure software are those of any quality testing process, for example, assuring the correct version is tested. Several, however, have differences. Differences occur in activities such as
• Independent Verification and Validation: its independence adds credibility to its results for some stakeholders and brings added expertise. See the Independent Verification and Validation subsection.
• Unit Testing: one high-end heavyweight producer has reported that measurement has shown little contribution by unit test to defect discovery and has tried eliminating it. [Praxis ??] No lightweight process user has reported similar results.
• Test Planning and Preparation: these must include security-related testing and a facility where potentially dangerous malicious actions can be imitated without fear of impacting others. Incorporating test results in the assurance case is eased by planning the tests to fill certain places requiring evidence within the assurance case's structured argument.
• Testing for Functionality Not in Specifications: this is important to help establish that the software not only meets the specification but does nothing else.
• Conducting Tests: testing needs to be ethical as well as efficient and effective.
• Termination Criteria: these are unclear for testing such as penetration testing. If testing is restricted to technical attacks (e.g. no social engineering), then the interval between successes by a quality red team might be used. But how long without a break-in would be acceptable? Today, red teams seem to succeed in attacks against almost all software producers' software, with the limit often set by the willingness to continue paying the expensive red team. [Howard 2002, p. 48] gives the firm advice, "Do not ship with known exploitable vulnerabilities," which he reports has worked for Microsoft.
• Certification Testing: this is part of a certification process such as FIPS-140 certification of cryptographic software, virus software certification by ICSA Labs, or the Common Criteria evaluation process.
• Change and Regression Testing: testing vulnerability fixes can need to be extensive because security properties are emergent properties. Any ability to confine the effort depends on the roles different tests play in the assurance case. Just as the structures of the design and the assurance case can help in gaining assurance originally, structuring can potentially aid in restricting the amount of rework required in modifying the assurance case, including work in testing.
• Measurement: security metrics are a difficult area, as is discussed in the Secure Software Measurement subsection, and metrics attempting to measure the additional security assurance justified by a particular test result suffer the same difficulty. Successful attacks have the clear result that the system is not secure, but what confidence is justified after fixing the identified vulnerability?
• Reporting and Incorporating in Assurance Argument

While testing is essential in developing secure software, test-driven development, where specifications are replaced by tests (specification by example), is not viable, as it does not provide a serious approach to specifying such properties as non-bypassability.
4.7.4.2 Test Techniques
A number of testing techniques either are unique to security or have special uses for security testing. [Bishop 2003, p. 533-540]

4.7.4.2.1 Attack-Oriented Testing
Attack testing involves having persons try to break the software. For security this is called penetration testing and aims to violate specified or expected security, usually by imitating techniques used by real-world malicious attackers. [Whitaker 2004] [Flickenger 2003] The teams that perform these tests are often called red teams and generally include dedicated specialists.

Clearly, the tests seek vulnerabilities and likely vulnerabilities, i.e., weaknesses potentially exploitable to cause harm even though testing time was not spent to prove it. One additional question to be addressed is how successful any defensive deceptions employed have been.

While often following game plans and scripts, this testing also often has an exploratory testing aspect. Exploratory testing is defined in the SWEBOK Guide (p. 5-5) as simultaneous learning, test design, and test execution; that is, the tests are not defined in advance in an established test plan, but are dynamically designed, executed, and modified.

The difficulties with deciding when to stop and what the results mean were discussed under the test process.
4.7.4.2.2 Brute Force and Random Testing
Advocates exist for testing with large volumes of structured or unstructured random data with the aim of exposing faults or vulnerabilities. When the frequently used criterion for failure is an abort or crash, failing these tests is a sign of quite poor software. [Redwine 2004]

Random testing to compare the performance of a new version to a different one can, however, be useful. Finally, testing with appropriate random distributions of input can be used to predict reliability and availability in the non-security context.
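A minimal random-testing harness in C might look like the following hedged sketch, using the abort-or-crash failure criterion discussed above; parse_message is a hypothetical function under test.

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    extern int parse_message(const unsigned char *buf, size_t len); /* hypothetical */

    int main(void)
    {
        unsigned char buf[1024];
        srand((unsigned)time(NULL));
        for (long trial = 0; trial < 100000; trial++) {
            size_t len = (size_t)(rand() % (int)sizeof(buf));
            for (size_t i = 0; i < len; i++)
                buf[i] = (unsigned char)(rand() % 256);
            /* A crash or abort here, caught by the harness runner,
               is the failure criterion. */
            (void)parse_message(buf, len);
        }
        puts("completed without crashing");
        return 0;
    }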
4.7.4.2.3 Fault- and Vulnerability-Oriented Testing
Testing designed to discover certain kinds of faults is known to be useful. Similarly, testing aimed at a certain class of vulnerabilities can be useful, particularly if one or a few kinds of faults underlie them.

While fault tolerance can mitigate the effects of faults, the characteristic of low defects is often discussed as a prerequisite for secure software, and the existence of increasing numbers of defects is usually believed to increase the chance of the existence of vulnerabilities. [Redwine 2004] Some dispute, however, exists about this [Lipner 2005b], and conclusive, generalizable evidence is hard to create.

4.7.4.2.4 Security Fault Injection Testing
To test the conditions where the attacker has had partial success or where part of the system has malfunctioned or ceased to exist or respond, one needs to inject faults in code or errors in data – or possibly to do so via hardware faults.

An attacker in control of part of the system would attempt to exploit its capabilities to achieve further success. A malfunctioning part of the system might be created deliberately by an attacker to achieve favorable results.

In a common case, attackers could attempt fault injection (modification) of client software, including webpages under their control, in such a way as to cause security violations. This can be explored by injecting such faults in an attempt to test the possibility.

Finally, an issue may exist in the possibility of failure by some element in the system's environment, especially its infrastructure, which can be tested by fault injection in the environment.

4.7.4.2.5 Using Formal Methods to Support Testing
Formal methods provide a formal specification of expected behavior that can be used to aid in generating tests and, most powerfully, as a test oracle to automatically check whether the behavior resulting from a test meets the specification.
4.7.4.2.5.1 Test Generation
Specifications can be analyzed for a characterization of the input space and tests created heuristically (e.g. partition or boundary value tests). State-machine-based specifications may permit model checking techniques to select test cases.

4.7.4.2.5.2 Test Oracle
Checking code derived from formal specifications can potentially be used to check test results for correctness at any level of aggregation, but for security the special aspect is checking whether emergent system properties are violated as well as the correctness of security functionality. Most logic-based techniques presume that the specification relates directly to system state and state changes, and not to the history of the system. Test oracles based on state machines may also check state sequences.
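As a hedged sketch of an oracle in C, a post-condition taken from a formal specification is checked mechanically after each execution of a hypothetical withdraw operation; the names and condition are illustrative only.

    #include <assert.h>

    extern long withdraw(long balance, long amount);   /* hypothetical */

    /* Oracle derived from the specification: when the pre-condition
       0 <= amount <= balance holds, the result equals balance - amount
       and is never negative. */
    void check_withdraw(long balance, long amount)
    {
        if (amount >= 0 && amount <= balance) {
            long result = withdraw(balance, amount);
            assert(result == balance - amount);
            assert(result >= 0);
        }
    }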
4.7.4.2.6 Usability Testing
Do security aspects affect usability and acceptability negatively, and, if so, how much?

4.7.4.2.7 Performance Testing
How much do development decisions or product aspects affected by the security requirements slow performance?

4.7.4.2.8 Compliance Testing
Testing to demonstrate compliance with standards can be relevant for those standards relating to security.

4.7.4.2.9 Reliability Testing
Reliability testing's probabilistic approach is not appropriate for confidentiality and integrity but is one input to availability analysis. Whether a software system produces correct results despite being under attack is a reliability issue, but it is also one of the central issues in security testing in general.

4.7.4.2.10 Evolvability or Maintainability Testing
Evolvability or maintainability testing could involve postulating changes and having them performed while measuring the time, effort, and mistakes made. The testing would include making any needed changes to the assurance case. Among other uses, maintainability forecasts are inputs to availability analysis.

4.7.4.2.11 Operational Testing
The goal of operational testing is to investigate and assure successful use in a real-world environment. Operational testing may include testing from early in development to after the system is in use. Kinds of operational tests include:
• Alpha and beta testing
• Field testing
• Operational test and evaluation
• Parallel operation with the old version
• History collection of actual use after release
Security aspects need to be involved in all of these.

4.7.5 Dynamic Analysis
Dynamic analyses, as opposed to dynamic testing, often do not actually execute the software. However, some do.

4.7.5.1 Simulations
Simulation models of the system might be used to help verify that it conforms to the security policy or has resistance to certain kinds of denial of service attacks.

4.7.5.2 Prototypes
Prototypes intended to answer security-related questions play a normal prototype role. Prototypes that are intended to evolve into part of the system need to be developed so the same level of assurance case can be created for them as for the remainder of the system. Building the assurance case (or possibly a prototype thereof) during prototype development might be wise just as it is in regular development.

4.7.5.3 Mental Executions
Mental execution of software has a long history and can have utility for cheaply and quickly
illuminating sample execution paths, but its credibility as verification evidence is limited.

4.7.5.4 Dynamic Identification of Assertions and Slices
Automatically "guessing" pre- and post-conditions or invariants by observing many executions could be useful to provide candidates for existing software, but would be poor practice when developing new software. Such candidates cannot be uncritically depended upon as being the proper conditions for correctness.

Static analyses to determine a slice of a program that contains all that is relevant to some action or variable can be difficult and computationally intensive. Estimating slices by observing a number of program executions may help heuristically but generally cannot be depended upon to be complete.

4.7.6 Static Analysis
Static analysis methods relevant to security include formal methods and static code analysis.

4.7.6.1 Formal Analysis and Verification
The most powerful technical methods for assuring system descriptions' consistency or correctness are mathematically-based formal methods. Most formal techniques in use are based on either logic or model checking. See the Methods and Tools section.

Formal methods can be used for checking the existence or consistency of security-related software properties, within and across artifacts. Formal methods can show consistency among formal specifications of secure software:
• Security Policy
• Specification
• Design
• Code

Code is, of course, a formal notation as well. Some approaches lack rigorous automated support for showing correspondence between design and code.

For designs, techniques are available for both sequential and concurrent structures; examples include [Hall 2003] [Jürjens 2005]. Experience has shown that humans are bad at analyzing potential concurrent behaviors and that automated analysis is essential to ensure desirable properties.

Use of formal methods is frequent in production of software to the highest assurance levels.

4.7.6.1.1 Static Code Analysis
For code in particular programming languages or subsets of languages, automated help exists to show
• Correctness (e.g. partial correctness)
• Lack of exceptions
• Lack of potential vulnerabilities
• Information flow
• Concurrency abnormalities

Static analysis tools that scan source code for coding practices that can result in vulnerabilities are reported to give many false positives. It is not clear that this is the case, however, when coding standards are followed that forbid or discourage these practices. Conformance to the bulk of such coding standards can be verified by automated style checkers.
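As a hedged illustration of the kind of practice such scanners flag, consider the classic format-string case in C; many tools would warn on the first call but not the second.

    #include <stdio.h>

    void log_user_string(const char *user_input)
    {
        /* Flagged: user data used as the format string permits
           format-string attacks such as "%n" writes. */
        printf(user_input);

        /* Unflagged remedy: a constant format string. */
        printf("%s", user_input);
    }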
4.7.6.2 Informal Analysis, Verification, and Validation
4.7.6.2.1 Reviews and Audits
Reviews are prime examples of an informal but essential and effective technique. Knowledge exists concerning reviews with a variety of characteristics. Primary differences exist in the concerns addressed during reviews within secure software projects – concern for:
1. Achieving security properties and policies
2. Doing only what the specification calls for
3. Avoiding software security pitfalls
4. Identifying possible attacker behaviors and consequences

Reviews, however, can be used to address all the topics mentioned under testing, often more cost effectively, and should therefore normally be used before significant investment in testing. Security-oriented checklists exist for use in
various software system reviews; for example, [Meier 2004] has a number of them.

Formalized techniques exist for reviews, including scenario-based reviews [Bass 2001], inspections, and reviews based on assurances of merit by experts rather than identification of faults [Parnas 1985] [Laitenberger n. d.]. The last, called active or perspective-based reviews, might provide output in an attractive form for the assurance case.

In addition to reviews of security-oriented artifacts and concern for security as part of normal review activities, specialized security reviews and security audits may occur, as may legal or regulatory-oriented reviews. For full effectiveness, some of these reviews require special security-oriented expertise. For example, validation of an original threat analysis should involve threat experts.

All engineering artifacts for trusted software – requirements, design, code, and other artifacts – must be reviewed by the authors' peers and, as in instances mentioned above, by other reviewers as needed. Good practice would, of course, call for this to be true for untrusted software as well. Indeed, without reviews, adequate assurance that software does not need to be trusted may not be possible.

Special or additional rounds of review or more reviewers may be employed as measures to add assurance evidence. Readiness reviews may occur before the security certification or accreditation activities.

Finally, regular reviews of the assurance case are necessary to ensure its quality. These reviews can benefit from involving
• Parties (or their surrogates) whose assurance is the goal of the assurance case
• Developers of the assurance case
• Persons who fully understand the evidence
• Persons able to rigorously judge assurance arguments

Reviews are possibly the most cost-effective verification technique and should be planned for and performed.

4.7.7 Usability Analysis
In addition to or in combination with usability testing, a fuller analysis can be done, for example [Whitten 1999].

4.7.8 Verification and Validation of User Aids
Because security needs to be treated correctly and effectively in user aids such as manuals and help facilities, these need to be reviewed and probably included in usability testing. Kinds of user aids covered to ensure the existence and correctness of security-oriented material include
• Documentation
  o Normal documentation covers security aspects and use
  o Additional operational security guide
• Training
• Online user aids, e.g. help facility
• User support
  o Help desk
  o Online support

4.7.9 Secure Software Measurement
The ultimate measure-of-merit question may be, "How much harm will be or has been caused by known and unknown security violations or vulnerabilities or other impacts of security and how security was embodied in the software system involving it?"5 Even if one can make decisions on all the possible variations in defining the ultimate measure of merit, as indeed one would have to, prediction or forecasting faces tremendous difficulties, among them the truth that no intellectually honest way exists to calculate risks from the technical characteristics of the software.

5 Should we include future and variant versions as well as software using work products, methods, techniques, or ideas from this effort or product? What counts as harm? How insignificant a harm can be safely ignored? What about offsetting benefits, say in fewer accidents or reduced security spending for physical security?
This does not mean, however, that useful engineering measurements cannot be made, analyzed, and used to improve. Often these measures are relative ones – a change in one direction, say becoming smaller, is known to be better, but "How much better?" is not answerable in terms of future experience in the field. Likewise, for an aspect of development, using a particular practice may be known to be better than using no practice at all, or one practice may be known to be better than another. In addition, doing an activity more skillfully is almost always better.

A somewhat similar problem exists with software reliability prediction: knowing the defect density of software does not allow (except at the extremes) much to be said intellectually honestly about the software's reliability in the field, as defects may or may not ever be exercised. Since we know having fewer defects is better, this has not kept us from using and benefiting from density of discovered defects as an engineering measure.

The same holds for counting vulnerabilities. Also as with defects, vulnerabilities can be ranked by severity and categorized for analysis and improvement. To continue the analogy, coding reviews have traditionally identified violations of the organization's coding standards without concern for correctness. With security, coding practices that are in some circumstances potential bases for vulnerabilities can often simply be forbidden by the coding standards, and violations can be counted in reviews or by automatic style checkers.

Reviews of other artifacts can support similar measurements, root cause analyses, and improvements. Counts of vulnerabilities created and discovered in each activity should be useful for analysis just as with other kinds of defects.

Measures of "attack surface" have been proposed. A crude one is how many ports are open. A more sophisticated measurement is described in [Manadhata and Wing 2004] and [Howard 2003]. These also are relative measures where less is better.

For projects aiming at less than highly secure software, reasonable commonality exists among the lightweight processes that have been developed, so organizations trying to improve in that range can make comfortable decisions about improvement. Counting the number of activities adopted and measuring projects' training and fidelity in following them are process measurements.

In comparing high-end processes, benefit can be gained not only from the limited number of such secure systems about which information is available but from the wider experience and arguments that have been conducted in the area of highly safe software.6 Most noticeably at the high end, and regardless of process, well-led, highly intelligent, skilled, and motivated people can make a big difference over merely good ones. These elements also offer opportunities for measurements.

6 One confounding factor is that a number of such projects have been government procurements where contractors or groups of contractors have "pushed back" and possibly less than highly knowledgeable government negotiators have retreated. In other software areas governments have been said to retreat from such arguments as, "We cannot hire enough programmers knowing <vastly superior programming language> and, if we have to train them and bring them up to speed, your whole system's tight schedule will slip." Of course, history seems to indicate that major systems have significant slippage deriving from multiple causes.

Microsoft has adopted a set of metrics to aid in producing secure software. [Lipner 2005a] reports, "These metrics range from training coverage for engineering staff (at the beginning of the development lifecycle) to the rate of discovered vulnerabilities in software that has been released to customers.

"Microsoft has devised a set of security metrics that product teams can use to monitor their success in implementing the SDL. These metrics address team implementation of the SDL from threat modeling through code review and security testing to the security of the software presented for Final Security Review (FSR). As these metrics are implemented over time, they should allow teams to track their own performance (improving, level, or deteriorating) as well as their performance in comparison to other teams. Aggregate metrics will be reported
to senior product team management and Microsoft Executives on a regular basis."

Thus, while theoretically software security measurement ultimately rests on a slippery foundation, practically much that is useful can be done.

4.7.10 Third-Party Verification and Validation and Evaluation
Among the relevant third-party activities are independent security testing, independent verification and validation, certification of the software system, and accreditation of an operational system.

4.7.10.1 Independent Verification and Validation
The phrase "Independent Verification and Validation" implies verification and validation done by a separate organization – not just a different team within the project. For software intended to claim some degree of security, an option exists to do concurrent certification. This is the seemingly preferable but rare way to do many certifications, including Common Criteria ones.

Even if this is done, however, independent performance of some verification and validation activities can make sense, possibly to bring added resources and expertise as well as independence. Independent security testing is possibly the most common and is available from a number of sources.

Contracting out development of the assurance case – as opposed to aid and review – would appear to have the same downside that outsourcing preparation of documentation for Common Criteria certification has: it does not influence and improve development or the product.

4.7.10.2 Software Certification
The Common Criteria standard is the most widely referred-to official (ISO) standard regarding software security. Others also exist, such as FIPS-140 certification of cryptographic software, virus software certification by ICSA Labs, and BITS in the financial sector. Protecting Sensitive Compartmented Information (SCI) within the US government is covered by [DCID 6/3 2000]. See Requirements section. [Bishop 2003, Chapter 21]

Each of these has its own evaluation process for certification. In the US, the Common Criteria has a two-step process: first, evaluation by an approved laboratory, and then validation by the government. Validation of EAL levels 1-4 is done by NIAP (National Information Assurance Partnership) and that of EAL 5-7 generally by the NSA. [NIST 2005, Part 4]

The Common Criteria is an international standard, and a number of countries have a reciprocity agreement for EAL 1-4, so opportunity exists at this level to consider laboratories internationally.

4.7.10.3 System Accreditation
The US DoD has created the DoD Information Technology Security Certification and Accreditation Process (DITSCAP). [DoD 8510.1-M] On the civilian side of the US government, the National Institute of Standards and Technology has produced NIST Special Publication 800-37, Guide for the Security Certification and Accreditation of Federal Information Systems. [Ross R. 2004]

ISO/IEC 17799:2005, Information technology – Code of Practice for Information Security Management, published 15 June 2005, combined with BS7799-2:2002 (anticipated to become ISO/IEC 27001), forms a basis for an Information Security Management System (ISMS) certification. More than 1300 certifications have been issued worldwide7. See Requirements section.

7 ISMS Users Group, http://www.xisec.com/ These certifications were clearly to the old version of ISO 17799.

Ultimately, secure software's goal is to be part of a secure system providing secure services. This larger goal can have a significant impact on requirements; design; and verification, validation, and evaluation.

4.7.11 Assurance for Tools
Tools that have been subverted or have "naturally" occurring vulnerabilities are potential sources of problems in products and updates.
Automation generation of an artifact can also The system may contain software that is trusted
give confidence if confidence exists in the and software that is untrusted. For software,
generation tool. For example, this is often the which is firmly established as not needing to be
case for compilers. But an assurance case is trusted, the dependability properties of interest
needed for tools as well as the primary products. tend toward traditional, probabilistic ones, and
In the assurance of automated support for formal therefore methods such as input-distribution-
methods, software to check proofs are much based reliability testing may remain adequate.
simpler than software to create proofs and are [ISO/IEC 15443 Part 3 2004] provides a number
therefore easier to gain confidence in. . of comparisons of official assurance or
If adequate assurance cannot be gained, then certification processes. These include ones based
options include hardening the development on ISO 13335, ISO/IEC 15408, ISO 17799,
network possibly separating it from all other ISO/IEC 21827, ISO/IEC 15408, TCMM,
networks. X/OPEN, and FIPS-140-2.
Finally, laws, regulations, policies, and standards
4.7.12 Selecting among VV&E may call for the use of certain techniques. While
Techniques laws and regulations may require compliance,
Much of the process of selecting among following some standards may be discretionary.
techniques for dealing with VV&E of secure On this point and the general problem of
software is unchanged. Special care can aid in selection, a relevant piece of advice appears in
ensuring that no aspect of security is covered by DEF STAN 00-56 [Ministry of Defence 2005b,
just one technique or only a few instances Part 2 p. 70] that begins by mentioning the
examination. concept of making risk “as low as reasonably
The importance of quality requirements and practicable” ALARP.8
design may be even more pronounced for “Because the ALARP criteria should be
security. The benefits of prevention and determined by the cost-effectiveness of the
discovery of problems while still relatively technique rather than the availability of the
inexpensive to correct remains true for security budget, the techniques recommended or
problems. That dangers and risk management mandated in authoritative sources (e.g.
drive much of the decision making on VV&E international or national standards) of
techniques and amounts of schedule, effort, and relevant good practice should be applied
costs devoted to them remains unchanged. whenever the costs are not grossly
Estimating the dangers and risks takes different disproportionate to the benefits. Where there
techniques and can result in less certain results. is a wide range of potential measures and
See Project Management section. techniques that could be applied, a consistent
set should be adopted and the choice should
From a development process viewpoint, one be justified in the [Assurance] Case. The
question often asked is how likely is a defect or selection should preferably be based on the
possibly more importantly a hard to discover and collection of data about the performance of
fix defect likely to occur if we do a development specific methods and techniques in similar
activity using certain techniques. When an applications, but as such data is difficult to
activity is straightforward and the performers obtain, it will probably be necessary to
and reviewers have previously proven justify the selection using expert judgement
consistently proficient in their roles, then careful (sic).”
performance and review may be adequate. If it is
less straightforward or the organization’s track
record is poorer, then more formal approaches to
performance and VV&E would be called for. 8
ALARP is a significant concept in UK law, and an excellent
engineering-oriented discussion of it appears in Annex B of
DEF STAND 00-56 Part 2
4.7.13 References
[Barnes 2003] Barnes, John. High Integrity Software: The SPARK Approach to Safety and Security. Addison Wesley, 2003.
[Bishop 2003] Bishop, Matt. Computer Security: Art and Science. Addison-Wesley, 2003.
[Bourque and Dupuis 2004] Bourque, Pierre, and Robert Dupuis (Editors). Guide to the Software Engineering Body of Knowledge. 2004 Edition. Los Alamitos, California: IEEE Computer Society, February 16, 2004.
[DCID 6/3 2000] Director Central Intelligence. Protecting Sensitive Compartmented Information within Information Systems (DCID 6/3) Manual. 24 May 2000.
[Despotou 2004] Despotou, Georgios, and Tim Kelly. “Extending the Safety Case Concept to Address Dependability,” Proceedings of the 22nd International System Safety Conference, pp. 645-654, 2004.
[DoD 8510.1-M] DoD 8510.1-M, DoD Information Technology Security Certification and Accreditation Process (DITSCAP) Application Manual, July 31, 2000.
[Du 1998] Du, Wenliang, and Aditya P. Mathur. “Vulnerability Testing of Software System Using Fault Injection.” COAST, Purdue University, 1998.
[Flickenger 2003] Flickenger, Rob. Wireless Hacks. O’Reilly and Associates, Inc., 2003.
[Gasser] Gasser, M. Building a Secure Computer System. Van Nostrand Reinhold, 1988.
[Howard 2003] Howard, M., J. Pincus, and J. Wing. “Measuring relative attack surfaces,” Proceedings of the Workshop on Advanced Developments in Software and Systems Security, 2003. Available as CMU-TR-03-169, August 2003.
[Howell] Howell, C. Assurance Cases for Security Workshop (follow-on workshop of the 2004 Symposium on Dependable Systems and Networks), Arlington, Virginia, June 13-15, 2005.
[Huang 2004] Huang, Y., F. Yu, C. Hang, C. Tsai, D. Lee, and S. Kuo. “Securing web application code by static analysis and runtime protection.” In Proceedings of the 13th International Conference on World Wide Web (New York, NY, USA, May 17-20, 2004), WWW ’04. ACM Press, New York, NY, pp. 40-52.
[Ibrahim et al, 2004] Ibrahim, Linda, et al. Safety and Security Extensions for Integrated Capability Maturity Models. Washington, D.C.: United States Federal Aviation Administration, September 2004.
[ISO/IEC 15443 Part 3 2004] ISO/IEC 4th WD 15443-3, IT security techniques – A framework for IT security assurance – Part 3: Analysis of assurance methods. ISO, 2004.
[Jürjens 2004] Jürjens, Jan. Secure Systems Development with UML. Springer-Verlag, 2004.
[Jürjens 2005] Jürjens, Jan. “Sound Methods and Effective Tools for Model-based Security Engineering with UML,” 27th International Conference on Software Engineering, St. Louis, Missouri, USA, 15-21 May 2005.
[Kelly 2003] Kelly, T. P. Managing Complex Safety Cases. Department of Computer Science, University of York, 2003. www.cs.york.ac.uk/~tpk/sss03.pdf
[Laitenberger n. d.] Laitenberger, Oliver. The Perspective-based Inspection Approach. Fraunhofer Institut Experimentelles Software Engineering, Kaiserslautern, n.d. http://www.tol.oulu.fi/projects/tarjous/perspective.pdf
[Lipner 2005a] Lipner, Steve, and Michael Howard. The Trustworthy Computing Security Development Lifecycle. Microsoft, 2005. http://msdn.microsoft.com/security/default.aspx?pull=/library/en-us/dnsecure/html/sdl.asp#sdl2_topic8?_r=1
[Lipner 2005b] Personal conversation with SR, July 13, 2005.
[Manadhata and Wing 2004] Manadhata, P., and J. M. Wing. Measuring a System’s Attack Surface. http://www-2.cs.cmu.edu/afs/cs/project/calder/www/tr04-102.pdf, CMU-TR-04-102, January 2004.
[Meier 2004] Meier, J.D., Alex Mackman, Srinath Vasireddy, Michael Dunner, Ray Escamilla, and Anandha Murukan. Improving Web Application Security: Threats and Countermeasures. Microsoft, 2004. http://download.microsoft.com/download/d/8/c/d8c02f31-64af-438c-a9f4-e31acb8e3333/Threats_Countermeasures.pdf
[Ministry of Defence 2003b] Ministry of Defence. Defence Standard 00-42 Issue 2, Reliability and Maintainability (R&M) Assurance Guidance, Part 3: R&M Case, 6 June 2003.
[Ministry of Defence 2004b] Ministry of Defence. Interim Defence Standard 00-56, Safety Management Requirements for Defence Systems, Part 2: Guidance on Establishing a Means of Complying with Part 1, 17 December 2004.
[NIST 2005] The National Institute of Standards and Technology. Common Criteria v. 3.0, July 2005.
[NIST SP 800-40] NIST SP 800-40, Procedures for Handling Security Patches. NIST, September 2002.
[Parnas 1985] Parnas, D. L., and Weiss, D. M. “Active design reviews: principles and practices.” Proceedings of the 8th International Conference on Software Engineering (London, England, August 28-30, 1985). IEEE Computer Society Press, pp. 132-136, 1985.
[Redwine 2004] Redwine, Samuel T., Jr., and Noopur Davis (Editors). Processes for Producing Secure Software: Towards Secure Software. Vols. I and II. Washington, D.C.: National Cyber Security Partnership, 2004.
[Ross R. 2004] Ross, Ron, Marianne Swanson, Gary Stoneburner, Stu Katzke, and Arnold Johnson. NIST Special Publication 800-37, Guide for the Security Certification and Accreditation of Federal Information Systems, May 2004.
[SafSec Guidance] “SafSec Methodology: Guidance Material.” SafSec: Integration of Safety and Security. http://www.safsec.com/safsec_files/resources/50_3_SafSec_Method_Guidance_Material_2.6.pdf
[SafSec Standard] “SafSec Methodology: Standard.” SafSec: Integration of Safety and Security. http://www.safsec.com/safsec_files/resources/50_2_SafSec_Method_Standard_2.6.pdf
[Sommerville 2004] Sommerville, I. Software Engineering. 7th ed. Pearson Education, 2004.
[Viega 2000] Viega, John, et al. “Statically Scanning Java Code: Finding Security Vulnerabilities,” IEEE Software, vol. 17, no. 5, Sept./Oct. 2000, pp. 68-74.
[Viega] Viega, J. The CLASP Application Security Process. Secure Software, 2005.
[Whittaker] Whittaker, J. A., and H. H. Thompson. How to Break Software Security: Effective Techniques for Security Testing. Pearson Education, 2004.
[Whitten 1999] Whitten, A., and Tygar, J. D. “Why Johnny Can’t Encrypt: A Usability Evaluation of PGP 5.0,” Proc. Ninth USENIX Security Symposium, 1999.

4.7.14 Further Reading
[Avizienis 2004] Avizienis, Algirdas, Jean-Claude Laprie, Brian Randell, and Carl Landwehr. “Basic Concepts and Taxonomy of Dependable and Secure Computing,” IEEE Transactions on Dependable and Secure Computing, vol. 1, no. 1, Jan.-Mar. 2004, pp. 11-33.
[Chirillo 2002] Chirillo, John. Hack Attacks Revealed: A Complete Reference for UNIX, Windows, and Linux with Custom Security Toolset. Wiley Publishing, Inc., 2002.
[CSTB 2004] Committee on Certifiably Dependable Software Systems. Summary of a Workshop on Software Certification and Dependability. National Academies Computer Science and Telecommunications Board, National Academies Press, 2004.
[Herrmann 2001] Herrmann, Debra S. A Practical Guide to Security Engineering and Information Assurance. Auerbach, 2001.
[Merkow and Breithaupt 2004] Merkow, Mark S., and Jim Breithaupt. Computer Security Assurance. Thomson Delmar Learning, 2004.
[Ministry of Defence 2003b] MoD 00-56 Interim Defence Standard, Safety Management Requirements for Defence Systems, Part 1, 2005. (Sections 6-11 are analogous to what one might say about security and threats instead of safety and hazards.)
4.8 Secure Software Tools and Methods
4.8.1 Scope
Secure software development tools and methods assist a developer in the creation of secure software and encourage a specific disciplined development approach [Abran and Moore 2004]. In this section, representative tools and methodologies are presented and discussed. From the outset it should be recognized that there is no “silver bullet” tool or technique that can deliver absolutely secure code. Each tool or technique has a particular purpose and a particular scope of application. The section is not comprehensive in its level of detail, but it does intend to cover the various types of tools and methods that can be used to produce secure software.

4.8.2 Formal Methods
“ ‘Formal Methods’ refers to mathematically rigorous techniques and tools for the specification, design and verification of software and hardware systems. … ‘[M]athematically rigorous’ means that the specifications used in formal methods are well-formed statements in a mathematical logic and that the formal verifications are rigorous deductions in that logic [Langley 2005].” Methods for the formal verification of security can be classified according to how security properties are specified [Sinclair 2005] or by the manner in which the security properties are established [Bishop 2003].
Axiomatic languages specify security properties as logical assertions that are shown to be true or false. Axiomatic language notations include the Vienna Development Method (VDM) and Zed (Z). These types of languages are also often used to specify abstract models of security systems due to their ability to model system state. State transition languages, on the other hand, not only model system state but explicitly model the conditions under which state transitions occur. State transition notation systems include state diagrams, message sequence charts, communicating sequential processes (CSP), and the calculus of communicating systems (CCS). Algebraic languages implicitly define security properties using equivalence relations to specify functionality; languages include the OBJ and Larch families of languages. Temporal logic languages model timed or ordered sequences of events; languages of this class of formalism include Petri nets, Temporal Logic with time Series (TLS), and state diagrams [Sinclair 2005].
Techniques for establishing security properties fall into two broad categories: proof-based techniques or model-based techniques [Bishop 2003]. Proof-based techniques are inductive and attempt to show that from a set of initial conditions, a particular set of conclusions will logically follow. Examples of theorem provers include Another Logical Framework (ALF), Higher Order Logic (HOL), Prototype Verification System (PVS), Naval Research Laboratory Protocol Analyzer (NPA), and Theorem Proving System (TPS).
Model-based techniques “establish how well a specification of a system meets a set of properties [Bishop 2003].” Model checkers typically perform an exhaustive search over a finite state space [Clarke and Wing 1996]. Examples of model checkers include Concurrency Workbench (CWB), Spin, Symbolic Model Verifier (SMV), and METAFrame.

4.8.3 Semi-Formal Methods
Semi-formal methods, sometimes known as lightweight formal methods, often “emphasize the use of graphical representations” [Vaughn and George 2003]. Semi-formal techniques possess some formalism but cannot be used to verify or validate most system characteristics. Another difference is that semi-formal techniques usually provide methodological guidance (since that is where these methods tend to be found), whereas formal techniques do not [Alexander 1995]. Data Flow Diagrams (DFD) and the Unified Modeling Language (UML) can be considered semi-formal methods. Some tools combine semi-formal and formal elements, most notably extensions to UML. [Jürjens 2004] [Jürjens 2005]
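To give a concrete feel for the axiomatic, state-based style described above, here is a minimal hypothetical schema in Z-flavored LaTeX notation; the state space and invariant are invented for illustration and are not drawn from any cited standard. It constrains an access-control state so that accesses are granted only to known users and only when policy permits:

    \begin{array}{l}
    \textbf{AccessState} \\
    \quad users : \mathbb{P}\,USER \\
    \quad granted,\ permitted : USER \leftrightarrow OBJECT \\
    \textbf{where} \\
    \quad \operatorname{dom} granted \subseteq users \\
    \quad granted \subseteq permitted
    \end{array}

Under such a specification, each operation that changes the state carries a proof obligation, dischargeable by hand or with a theorem prover of the kind listed above, that these two predicates are preserved.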
4.8.4 Compilers
Most procedural language compilers will support some capability for compile-time checking of language-based security issues. Using the appropriate compiler invocation flags, compilers can perform strict type checking and “flag and eliminate code constructs and errors with security implications, such as pointer and array access semantics that could generate memory access errors, and bounds checking of memory references to detect and prevent buffer overflow vulnerabilities on stacks and (sometimes) heaps [Goertzel and Goguen 2005].”
Many C/C++ compilers can determine whether format strings used in printf or scanf statements are being used properly, as well as ensuring that the proper number of arguments is being used. Since these vulnerabilities are the source of many code injection and buffer overflow attacks, such checking should always be enabled.
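As a hedged illustration of what such compile-time format-string checking catches (the function names here are invented; with GCC, for example, warnings in the -Wall/-Wformat family flag the unsafe call):

    #include <stdio.h>

    /* BAD: user_input itself is used as the format string; input such
       as "%x %x %n" can read or corrupt memory. Format-string-aware
       compilers warn that the format is not a string literal. */
    void echo_bad(const char *user_input)
    {
        printf(user_input);
    }

    /* GOOD: a constant format string; the input is treated only as data. */
    void echo_good(const char *user_input)
    {
        printf("%s", user_input);
    }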
“Code should never be compiled with debugging options when producing the production binary executable of an application or component. The reason for this is that some very popular commercial operating systems have been reported to contain a critical flaw that enables an attacker to exploit the operating system’s standard, documented debug interface … to gain control of programs accessed over the network. [Goertzel and Goguen 2005]”
Similar to “safe” compilers, there are also versions of safe runtime libraries. These libraries contain versions of system calls or language-specific functions that have been hardened against vulnerabilities present in the normal runtime libraries.
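For example, a hardened replacement for an unbounded string copy might look like the following minimal sketch (the function and its parameters are invented for illustration; snprintf is the standard C bounded formatting call):

    #include <stdio.h>
    #include <stddef.h>

    /* Copies an untrusted string into a fixed-size buffer.
       dst_len is the full size of dst, including room for the NUL. */
    void copy_name(char *dst, size_t dst_len, const char *src)
    {
        /* strcpy(dst, src) would overflow dst for a long src;
           snprintf truncates to fit and always NUL-terminates. */
        snprintf(dst, dst_len, "%s", src);
    }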
Restricted execution and secure application environments can be used to improve security [Goertzel and Goguen 2005]. The sandboxing technique used most notably by Java is an example of a restricted execution environment. In the sandbox model, trusted code is given access to all resources subject to normal security policy. Untrusted code, like downloaded Java applets for example, only gets access to the resources made available in the sandbox. Secure application frameworks use a different technique. A secure application framework provides functionality through framework-provided libraries, applications, and other built-in resources [Goertzel and Goguen 2005]. To use the services provided by the framework, applications invoke the service as prescribed by the particular framework being used. Prominent examples of these types of security services include secure sockets layer/transport layer security (SSL/TLS) services [Rescorla 2001], cryptographic libraries, HTTPS, and the .NET Windows security services.

4.8.5 Static Analysis
Static analysis provides useful information about the complexity, structure, style, and other directly observable characteristics of code. While most often associated with source code, Java bytecode and other binary files can also be statically analyzed. Furthermore, static analysis may offer evidence of potential problems, either as a result of inadequate programming and design approaches or as a result of a malicious code insertion. The following paragraphs describe the types of things static analysis tools can detect; they draw heavily from and paraphrase [NIST 2005]. Readers are encouraged to consult [NIST 2005] for further information as well as a list of some of the current analysis tools and development suites available.
Source code scanning tools identify software vulnerabilities during development or after deployment. These tools detect coding flaws by examining the source code rather than scanning a target application at run time. Because they have access to application source code, static source code scanners can identify a host of security issues including range and type errors, calls to vulnerable library functions, buffer overflow possibilities, and unvalidated input. Runtime environmental problems can also be detected, including relative path vulnerabilities and possible sources of information leaks.
Depending on the capability of the tool, scanners can identify synchronization and timing errors such as covert timing channels and time-of-check, time-of-use race conditions. Protocol errors, such as failure to follow the chain of trust in certificate validation or key exchange without authentication, can be detected, as well as general logic errors and other flaws.
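The time-of-check, time-of-use flaw mentioned above typically looks like the following sketch (POSIX calls; the function names are invented for illustration):

    #include <fcntl.h>
    #include <unistd.h>

    /* BAD: time-of-check (access) and time-of-use (open) are separate
       system calls; in the window between them an attacker may replace
       the file, e.g. with a symbolic link to a sensitive file. */
    int open_for_append_bad(const char *path)
    {
        if (access(path, W_OK) != 0)
            return -1;
        return open(path, O_WRONLY | O_APPEND);  /* race window above */
    }

    /* Better: skip the separate check, refuse to follow symbolic
       links, and let the kernel enforce the permission decision. */
    int open_for_append_better(const char *path)
    {
        return open(path, O_WRONLY | O_APPEND | O_NOFOLLOW);
    }

A static scanner flags the check-then-use pair itself, which is why such defects are reported even when no exploit has yet been demonstrated.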
While static Byte Code Scanners are used in a similar fashion to source code scanners, they detect vulnerabilities primarily through pattern recognition. They can determine, for instance, whether a method exposes its internal representation or whether a finalizer should be protected.
Static Binary Code Scanners detect vulnerabilities through disassembly and pattern recognition. Binary code scanners examine the compiled object code directly and as a result can factor in any vulnerabilities created by the compiler itself. Furthermore, library function code (not available to a source code scanner) can be examined as well. Binary code scanners analyze binary signatures within executables for things like SQL injection, cross-site scripting, buffer overflows, and missing format strings.
Database Scanners are used specifically to identify vulnerabilities in database applications. In addition to performing “external” functions like password cracking, the tools also examine the internal configuration of the database for possible exploitable vulnerabilities. Database scanning tools discover vulnerabilities by checking for password adequacy, default account vulnerabilities, role permissions, and unauthorized object owners.
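As a hedged illustration of the SQL injection flaws that both binary and database scanners look for, the following sketch contrasts string-built SQL with a parameterized statement; it assumes SQLite’s C API purely for illustration, and the table and function names are invented:

    #include <sqlite3.h>

    /* Looks a user up by name. Building the SQL by concatenating
       'name' (e.g. with sprintf) would let input such as
           x' OR '1'='1
       alter the query; binding it as a parameter keeps it pure data. */
    int find_user(sqlite3 *db, const char *name)
    {
        sqlite3_stmt *stmt;
        int found = 0;

        if (sqlite3_prepare_v2(db,
                "SELECT id FROM users WHERE name = ?;",
                -1, &stmt, NULL) != SQLITE_OK)
            return -1;

        sqlite3_bind_text(stmt, 1, name, -1, SQLITE_TRANSIENT);
        if (sqlite3_step(stmt) == SQLITE_ROW)
            found = 1;                 /* at least one matching row */
        sqlite3_finalize(stmt);
        return found;
    }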
4.8.6 Dynamic Analysis
Dynamic analysis observes entities in operation. Network scanners can detect vulnerabilities in operating systems, applications, servers, and mobile code that manifest some type of observable network behavior such as generating network traffic. These types of scanners typically look for known vulnerabilities, but heuristics for detecting previously unobserved malicious behavior are not unknown.
Web Application Scanners focus specifically on web applications and perform field manipulation and cookie poisoning functions. In contrast, Web Services Scanners analyze web-based services and perform things like XML validation of received messages.
4.8.7 Development Tool Suites
Development software can benefit the production of secure software. A typical software requirements phase produces requirements documents that can be examined by automated tools. However, a suite of widely-used, peer-reviewed software requirements and design analysis tools does not yet exist. Whatever software is used, it should support determining whether the specifications are:
• complete
• consistent
• correct
• modifiable
• ranked or rated
• traceable
• unambiguous
• understandable
• verifiable
Suites covering the entire lifecycle are available from IBM Rational and by composing suites from multiple vendors.

4.8.8 References
[Alexander 1995] Alexander, Perry. “Best of Both Worlds: Combining Formal and Semi-formal Methods in Software Engineering,” IEEE Potentials, December/January 1995.
[Anderson 2001] Anderson, Ross. Security Engineering. John Wiley and Sons, Inc., 2001.
[Barnes 2003] Barnes, John. High Integrity Software: The SPARK Approach to Safety and Security. Addison Wesley, 2003.
[Bishop 2003] Bishop, Matt. Computer Security: Art and Science. Addison Wesley, 2003.
[Bosworth and Kabay 2002] Bosworth, Seymour, and M. Kabay (eds.). Computer Security Handbook, 4th Edition. John Wiley and Sons, 2002.
[Chirillo 2002] Chirillo, John. Hack Attacks Revealed: A Complete Reference for UNIX, Windows, and Linux with Custom Security Toolset. Wiley Publishing, Inc., 2002.
[Clarke and Wing 1996] Clarke, Edmund, and Jeannette Wing. “Formal Methods: State of the Art and Future Directions,” ACM Computing Surveys, Vol. 28, No. 4, December 1996.
[Flickenger 2003] Flickenger, Rob. Wireless Hacks. O’Reilly and Associates, Inc., 2003.
[Goertzel and Goguen 2005] Goertzel, Karen, and Alice Goguen, et al. Application Developer’s Guide to Security-Enhancing the Software Development Lifecycle (DRAFT), UNPUBLISHED, May 2005.
[Graff and Wyk 2003] Graff, Mark, and Kenneth van Wyk. Secure Coding: Principles and Practices. O’Reilly and Associates, Inc., 2003.
[Howard and LeBlanc 2003] Howard, Michael, and David LeBlanc. Writing Secure Code, 2nd Edition. Microsoft Press, 2003.
[Jürjens 2004] Jürjens, Jan. Secure Systems Development with UML. Springer-Verlag, 2004.
[Jürjens 2005] Jürjens, Jan. “Sound Methods and Effective Tools for Model-based Security Engineering with UML,” 27th International Conference on Software Engineering, St. Louis, Missouri, USA, 15-21 May 2005.
[Koziol et al. 2004] Koziol, Jack, Dave Litchfield, Dave Aitel, Chris Anley, Sinan Eren, Neel Mehta, and Riley Hassell. The Shellcoder’s Handbook: Discovering and Exploiting Security Holes. Wiley Publishing, Inc., 2004.
[Langley 2005] Langley Formal Methods. shemesh.larc.nasa.gov/fm/fm-what.html, 2005.
[Meunier 2004] Meunier, Pascal. Secure Programming Educational Material. Purdue University, http://www.cerias.purdue.edu/homes/pmeunier/secprog/, 2004.
[NIST 2005] National Institute of Standards and Technology, Software Diagnostics and Conformance Testing Division. //samate.nist.gov. Accessed 8 Aug 2005.
[Rescorla 2001] Rescorla, Eric. SSL and TLS: Designing and Building Secure Systems. Addison-Wesley, 2001.
[Sinclair 2005] Sinclair, David. “Introduction to Formal Methods,” Course Notes, www.computing.dcu.ie/~davids/courses/CA548, 2005.
[Skoudis 2002] Skoudis, Ed. Counter Hack: A Step-by-step Guide to Computer Attacks and Effective Defenses. Prentice Hall, 2002.
[Vaughn and George 2003] Vaughn, Rayford, and Vinu George. “Application of Lightweight Formal Methods in Requirement Engineering,” The Journal of Defense Software Engineering, January 2003.
[Viega and McGraw 2002] Viega, John, and Gary McGraw. Building Secure Software. Addison-Wesley, 2002.
[Viega and Messier 2003] Viega, John, and Matt Messier. Secure Programming Cookbook. O’Reilly and Associates, Inc., 2003.
[Whitman and Mattord 2005] Whitman, Michael, and Herbert Mattord. Principles of Information Security, 2nd Edition. Thomson Course Technology, 2005.
[Whittaker and Thompson 2004] Whittaker, James, and Herbert Thompson. How to Break Software Security. Pearson Education, Inc., 2004.

4.8.9 Additional Readings
Bishop, Matt. Computer Security: Art and Science. Addison-Wesley, 2003.
Bourque, Pierre, and Robert Dupuis (Editors). Guide to the Software Engineering Body of Knowledge. 2004 Edition. Los Alamitos, California: IEEE Computer Society, February 16, 2004.
Ibrahim, Linda, et al. Safety and Security Extensions for Integrated Capability Maturity Models. Washington, D.C.: United States Federal Aviation Administration, September 2004.
Redwine, Samuel T., Jr., and Noopur Davis (Editors). Processes for Producing Secure Software: Towards Secure Software. Volumes I and II. Washington, D.C.: National Cyber Security Partnership, 2004.
Sommerville, I. Software Engineering. 7th ed. Pearson Education, 2004.
Viega, J. The CLASP Application Security Process. Secure Software, 2005.
4.9 Secure Software Processes
This Secure Software Processes subsection has two themes:
• The process arrangements for technical and managerial activities within the secure software lifecycle
• Definition, implementation, assessment, measurement, management, change, and improvement of these secure software processes
The processes described cover a major part or all of secure software acquisition, development, sustainment, and retirement. They vary from ones based around mathematical formalism to those merely trying to make modest improvements to a legacy product.
This report presumes knowledge of software engineering activities and processes that do not have security or safety as a concern. A number of activities are new or significantly modified for secure software. The other sections of this report cover these changes in activities. This section emphasizes the changes in the overall lifecycle process by describing several classes and examples of secure software processes.
In practice these vary across a wide range, but they will be grouped here into three categories:
• Heavyweight development processes [Hall 2002] [NSA 2002, Chapter 3],
• Lightweight processes [Viega 2005] [Lipner 2005a] [Howard 2002] [Meier 2004], and
• Processes especially for the problems of legacy software [Viega 2005, p. 40] [Meier 2004, p. lxxxi].
Generally, the kinds of activities done in heavyweight processes are a modest superset of lightweight processes but with many performed with considerably more rigor. The activities used remedially for legacy software consist of about half of the lightweight ones. Note that legacy software may eventually require more serious rework such as major rearchitecting.
Introducing secure software-oriented changes in processes within ongoing organizations and projects – development and sustainment – has all the usual problems and solutions of organizational change and software process improvement.

4.9.1 Heavyweight Processes
Heavyweight processes are often built around extensive use of reviews and formal methods.
[Figure 3: Formal Methods Assurance (Dashed) – Modified from a diagram by Praxis Critical Systems. Labels legible in the original include Security Policy, Security Properties, Specification, System Test, and Proofs; only the caption and labels survived extraction.]
Formal notations used include Z, ACT2, extensions to UML such as UMLsec, and code. [Hall] [Moore] [Jürjens 2004] [Barnes 2003]
Figure 3 shows an example of how formal techniques might be used to increase assurance by adding evidence to the assurance case in the form of proofs. Use of proofs does not lead to completely eliminating testing. Demonstrating that the system handles all possible input – the functions are total – and that they are functions – e.g. deterministic – aids in showing security properties. Not shown explicitly are proofs needed to deal with concurrency.
One of the three processes mentioned in [Redwine 2004] as foundations for producing secure software uses fully explicit formal methods – Correctness-by-Construction. The two other processes – Cleanroom and TSP-Secure – use mathematically based methods such as state machines but with more reliance on reviews and informal proofs and less on formal proofs.
At least one exception exists to heavyweight processes using formal methods. While aimed at safety, the aviation industry’s heavyweight process based on DO-178B uses extensive systematic testing, not formal methods. Its unmodified effectiveness for security has not been accepted by DoD for the newest US military aircraft.

4.9.2 Lightweight Processes
Lightweight secure software processes may or may not be used with processes having CMM levels. A CMM-style addition has been proposed for safety and security and incorporated into the FAA’s iCMM but not the SEI’s CMMI. [Ibrahim et al, 2004]
Microsoft’s Secure Development Lifecycle is an example of a lightweight process that builds on a number of existing processes of varying degrees of restrictiveness. [Lipner 2005a] Covering engineer education, metrics, and accountability, and presuming a centralized security group, it incorporates:
• Planning and Requirements
o Assign “security buddy” from central group
o Incorporate security activities into plans
o Identify key security objectives
o Consider security feature requirements
o Consider the need to comply with industry standards and certification processes such as the Common Criteria
o Consider how the security features and assurance measures of the software will integrate with other software likely to be used together with it
• Design
o Define security architecture and design guidelines
o Document the elements of the software attack surface
o Conduct threat modeling
o Define supplemental ship criteria for security
• Implementation
o Apply coding and testing standards
o Apply security-testing tools, including fuzzing tools (see the sketch following this list)
o Apply static-analysis code scanning tools
o Conduct code reviews
• Verification
o While the software is undergoing beta testing, the product team conducts a “security push” that includes security code reviews beyond those completed in the implementation phase as well as focused security testing
o Final Security Review or FSR
• Support
o Prepare to evaluate reports of vulnerabilities and release security advisories and updates when appropriate
o Conduct a post-mortem of reported vulnerabilities and take action as necessary
o Improve tools and processes to avoid similar future vulnerabilities
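As a minimal illustration of the “fuzzing tools” item above – a hedged sketch in which the parser stub, iteration count, and buffer size are all invented – a random-input driver can look like the following:

    #include <stdio.h>
    #include <stdlib.h>
    #include <stddef.h>

    /* Stand-in for the input-parsing routine under test;
       replace with the project's real parser. */
    static int parse_record(const char *buf, size_t len)
    {
        return (len > 0 && buf[0] != '\0') ? 0 : -1;
    }

    int main(void)
    {
        enum { RUNS = 100000, MAX_LEN = 512 };
        static char buf[MAX_LEN];

        srand(1);                  /* fixed seed so failures reproduce */
        for (int run = 0; run < RUNS; run++) {
            size_t len = (size_t)(rand() % MAX_LEN);
            for (size_t i = 0; i < len; i++)
                buf[i] = (char)(rand() % 256);
            /* A crash, hang, or assertion failure here pinpoints input
               the parser mishandles; production fuzzers also mutate
               valid inputs and track code coverage. */
            parse_record(buf, len);
        }
        puts("fuzzing run complete");
        return 0;
    }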
CLASP adds 30 activities to existing software processes without explicit concern for the previously existing processes. [Viega 2005] These activities are:
• Institute security awareness program
• Monitor security metrics
• Manage certification process
• Specify operational environment
• Identify global security policy
• Identify user roles and requirements
• Detail misuse cases
• Perform security analysis of requirements
• Document security design assumptions
• Specify resource-based security properties
• Apply security principles to design
• Research and assess security solutions
• Build information labeling scheme
• Design UI for security functionality
• Annotate class designs with security properties
• Perform security functionality usability testing
• Manage System Security Authorization Agreement
• Specify database security configuration
• Perform security analysis of system design
• Integrate security analysis into build process
• Implement and elaborate resource policies
• Implement interface contracts
• Perform software security fault injection testing
• Address reported security issues
• Perform source-level security review
• Identify and implement security tests
• Verify security attributes of resources
• Perform code signing
• Build operational security guide
• Manage security issue disclosure process
[McGraw 2005] takes an artifact-oriented approach and proposes seven “touchpoints” to add to existing processes. These are:
1. Abuse cases
2. Good security requirements
3. Code review
4. Risk analysis is a necessity
5. Penetration testing is also useful, especially if an architectural risk analysis is driving the tests
6. Testing security functionality with standard functional testing techniques, and risk-based security testing based on attack patterns
7. Monitoring software behavior is an essential defensive technique; knowledge gained by understanding attacks and exploits should be cycled back into software development
He also provides a bonus touchpoint: external analysis (outside the design team) is often a necessity when it comes to security. He also makes the point that at the design and architecture level, a system must be coherent and present a unified security front.
[Figure 4: Activities and Applied Knowledge [Barnum 2005]. The figure arrays the touchpoints – abuse cases, security requirements, risk analysis, external review, risk-based security tests, static analysis (tools), penetration testing, and security breaks – over the artifacts they apply to: requirements and use cases, design, test plans, code, test results, and field feedback.]
Lightweight processes have found favor in industry as a way to produce more secure software. Unlike some heavyweight processes, their ability to produce highly secure software is not established.
4.9.3 Legacy Upgrade Processes
In practice, the lightweight processes discussed in the prior subsection have been applied to legacy software. Two authors suggest processes or activities especially for approaching the problems of legacy software: [Viega 2005, p. 40] and, to a much lesser extent, [Meier 2004, p. lxxxi].
Viega lists a variant of the CLASP activities in a “Legacy Roadmap”. These are:
• Institute security awareness program
• Specify operational environment
• Identify resources and trust boundaries
• Document security relevant requirements
• Identify attack surface
• Perform security analysis of system requirements and design (threat modeling)
• Address reported security issues
• Perform source-level security review
• Identify, implement, and perform security tests
• Verify security attributes of resources
• Build operational security guide
• Manage security issue disclosure process
[Meier 2004, p. lxxxi] states, “Threat modeling and security assessment (specifically the code review and deployment review …) apply when you build new Web applications or when you review existing applications.”
See the Introducing Secure Software Engineering Processes subsection below.
4.9.4 Concern for Developmental Security
Subversion must be overcome by verification and validation with levels of independence, skill, and intensity sufficient to reduce the risk of the inclusion of malicious code in the product (or malicious elements in non-code artifacts) to an acceptable level. This may include automated analysis and testing, but currently only human reviewers possess the intelligence needed.
The concept of separation of duties is a central one in countering maliciousness. This could extend to thorough reviews at all levels, including review of the “final” version resulting in final approvals by other than the author, accompanied by configuration management ensuring the version reviewed and approved is the version delivered.
Verification, validation, and evaluation activities should be controlled to prevent deliberate overlooking or mislabeling of defects.
Operations that touch the code, such as doing builds or preparing releases, must also be controlled and verified.

4.9.5 Improving Processes for Developing Secure Software
Introducing changes in software engineering processes to deal with the added requirements for secure software resembles making other software process changes. Organizational change and software process improvement are fields in which a substantial body of knowledge and expertise exist; [Redwine 2004, Chapter 6] lists a number of references. The general characteristics of good process definitions have also been enumerated [ISO 15288] [Redwine 2004, Chapters 3 and 5]. These are presumed knowledge. The subsections below enumerate some special considerations for secure software.

4.9.5.1 Introducing Secure Software Engineering Processes
Readiness for these changes involves motivation, general familiarity, and the particular skills required for the changes being introduced. Security awareness training has frequently been used to initially address the first two. [Viega 2005] [Lipner and Howard 2003??] While sharing some features with the computer security awareness training given to users and system administrators, it also provides an introduction to security-related virtues and sins in software, including common kinds of exploits – and possibly a demonstrated breaking of the software product of the project or team receiving the training.
Change is facilitated by early changes showing immediate payoff. These might include changing configuration settings to leave only universally used, essential services available by default; eliminating uniform default passwords; and ensuring all external inputs are validated.
What to do first, however, may be driven by an expert security audit of the product. [Klocwork n.d.] [Viega 2005] While this may require obtaining outside expert services, the properly explained results can directly motivate not only changes to the product but changes to the process. Some process changes have first been done in a form aimed at finding all the items of certain kinds needing changes – for example, problems deriving from unauthenticated users – but can be motivated by current vulnerabilities and avoiding similar problems in the future. Repeated auditing could be used to motivate yet more changes. [Klocwork n.d.]

4.9.5.2 Improving Secure Software Engineering Processes
Special considerations in secure software engineering process improvement include driving forces and the mathematical skills required.
Newly identified or exploited threats, exploits, and vulnerabilities may drive changes in processes to counter them. These may be identified by internal reviews, tests, or audits, or may arise in the external world. The latter may be particular to the organization’s product(s) or a general development. Scanning the environment for relevant items can provide early warnings of possible future problems.
Driven by security concerns, one major software vendor has adopted the policy of increasing the absolute requirements on its projects’ software engineering processes every six months.
The highest EAL levels of the Common Criteria and much practice in high-confidence systems call for the use of mathematically based formal methods. These may require a limited amount of knowledge of discrete mathematics – often centered on set theory and predicate logic – but may require sophistication in performing proofs for some on the project.
Except possibly for a few approaches, the mathematics involved is much less sophisticated than in other fields of engineering – albeit discrete rather than continuous mathematics.9 Nevertheless, many US developers and managers appear to possess a reluctance to learn (in the majority of cases, probably relearn) the relatively simple mathematics involved so as to apply it to software. As importantly, organizations may possess a reluctance to pay for the training and learning. One does see, however, an increased acceptance of particular techniques such as formally expressed “contracts” for invocations. [Myers 84??]
While use of formal methods may be motivated by concerns for security, their development has been driven as much by concerns for correctness and safety. A mixed history exists concerning their introduction and use in industry. [??] Events related to the industrial use of formal methods include the decade-old annual International Workshop on Formal Methods for Industrial Critical Systems (FMICS),10 and a related industry association is the Formal Techniques Industry Association (ForTIA).

9 Remember this report does not address developing cryptographic software.
10 See http://www.inrialpes.fr/vasy/fmics/
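To make the “contracts” for invocations mentioned above concrete, here is a minimal hedged sketch in C using runtime assertions (the function and its conditions are invented for illustration; notations such as SPARK or Eiffel-style contracts express the same obligations so they can be checked statically or by dedicated tools):

    #include <assert.h>
    #include <stddef.h>

    /* Contract: the caller must supply a non-NULL buffer of at least
       n bytes (precondition); on return every byte equals the fill
       value (postcondition, spot-checked here at the boundaries). */
    void fill_buffer(unsigned char *buf, size_t n, unsigned char value)
    {
        assert(buf != NULL && n > 0);                    /* precondition */

        for (size_t i = 0; i < n; i++)
            buf[i] = value;

        assert(buf[0] == value && buf[n - 1] == value);  /* postcondition */
    }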
4.9.6 References
[Barnes 2003] Barnes, John. High Integrity Software: The SPARK Approach to Safety and Security. Addison Wesley, 2003.
[Barnum 2005] Barnum, Sean, and Gary McGraw. “Knowledge for Software Security,” IEEE Security and Privacy, vol. 3, no. 2, pp. 74-78, March/April 2005.
[Boehm] Boehm, Barry, and Richard Turner. Balancing Agility and Discipline: A Guide for the Perplexed. Addison-Wesley, 2003.
[Howard 2003] Howard, M., and S. Lipner. “Inside the Windows Security Push,” IEEE Security & Privacy, vol. 1, no. 1, 2003, pp. 57-61.
[Humphrey 2000] Humphrey, Watts S. Introduction to the Team Software Process. Reading, MA: Addison Wesley, 2000.
[Ibrahim et al, 2004] Ibrahim, Linda, et al. Safety and Security Extensions for Integrated Capability Maturity Models. Washington, D.C.: United States Federal Aviation Administration, September 2004.
[Jürjens 2004] Jürjens, Jan. Secure Systems Development with UML. Springer-Verlag, 2004.
[Jürjens 2005] Jürjens, Jan. “Sound Methods and Effective Tools for Model-based Security Engineering with UML,” 27th International Conference on Software Engineering, St. Louis, Missouri, USA, 15-21 May 2005.
[King] King, Steve, Jonathan Hammond, Rod Chapman, and Andy Pryor. “Is Proof More Cost-Effective Than Testing?” IEEE Transactions on Software Engineering, Vol. 26, No. 8, August 2000.
[Linger 1994] Linger, Richard. “Cleanroom Process Model,” IEEE Software, IEEE Computer Society, March 1994.
[Lipner 2005a] Lipner, Steve, and Michael Howard. The Trustworthy Computing Security Development Lifecycle. Microsoft, 2005. http://msdn.microsoft.com/security/default.aspx?pull=/library/en-us/dnsecure/html/sdl.asp#sdl2_topic8?_r=1
[McGraw 2004] McGraw, Gary. “Software Security,” IEEE Security and Privacy, March 2004.
[McGraw 2005] McGraw, Gary. “The 7 Touchpoints of Secure Software,” Software Development, September 2005.
[Meier 2004] Meier, J.D., Alex Mackman, Srinath Vasireddy, Michael Dunner, Ray Escamilla, and Anandha Murukan. Improving Web Application Security: Threats and Countermeasures. Microsoft, 2004. http://download.microsoft.com/download/d/8/c/d8c02f31-64af-438c-a9f4-e31acb8e3333/Threats_Countermeasures.pdf
[Mills] Mills, H., and R. Linger. “Cleanroom Software Engineering,” Encyclopedia of Software Engineering, 2nd ed. (J. Marciniak, ed.), John Wiley & Sons, New York, 2002.
[Prowell] Prowell, S., C. Trammell, R. Linger, and J. Poore. Cleanroom Software Engineering: Technology and Process. Addison Wesley, Reading, MA, 1999.
[Redwine 2004] Redwine, Samuel T., Jr., and Noopur Davis (Editors). Processes for Producing Secure Software: Towards Secure Software. Vols. I and II. Washington, D.C.: National Cyber Security Partnership, 2004.
[Rogers] Rogers, Everett. Diffusion of Innovations. Free Press, 1995.

4.9.7 Further Readings
[Jacky 1996] Jacky, Jonathan. The Way of Z: Practical Programming with Formal Methods. Cambridge University Press, 1996.
[Mead 2003] Mead, Nancy R. “Lifecycle Models for High Assurance Systems,” Proc. of Software Engineering for High Assurance Systems: Synergies between Process, Product, and Profiling (SEHAS 2003), Software Engineering Institute, 2003, p. 33. Available: http://www.sei.cmu.edu/community/sehas-workshop/.
[Saitta 2005] Saitta, Paul, Brenda Larcom, and Michael Eddington. “Trike v.1 Methodology Document [Draft],” 2005/06/20. http://www.hhhh.org/trike/papers/Trike_v1_Methodology_Document-draft.pdf
4.10 Secure Software Project Management
4.10.1 Scope
Secure software management is the “systematic, disciplined, and quantified” application of management activity, including the “planning, coordinating, measuring, monitoring, controlling, and reporting” that ensures the software being developed conforms to security policies and meets security requirements [Abran and Moore 2004].
See the More Concepts and Principles subsection in the Development section.

4.10.2 Start Up
See the Processes section.

4.10.3 Scoping Project

4.10.4 Project Risk Management

4.10.5 Economic Trades

4.10.6 Selecting a Secure Software Process

4.10.7 Security Management
Secure software would best be developed in an environment with at least the level of security required of the product.

4.10.7.1 Personnel Management
People who are intelligent, highly skilled, and knowledgeable about secure software may be hard to find and recruit, and require careful management to ensure retention. In addition, care needs to be taken to avoid personnel security problems.
A management task that is always important – having the proper skills on the project – is even more important and difficult for secure software projects. A central group with in-depth security expertise may help stretch this scarce resource across all the projects in need. [Lipner 2005a]
Vigilance and proper processes can reduce the chances of successful subversion. This means using separation of duties and privileges as well as least privilege. The concept of separation of duties could extend to thorough reviews at all levels, including review of the “final” version resulting in final approvals by other than the author, accompanied by configuration management ensuring the version reviewed and approved is the version delivered.

4.10.7.2 Development Work Environment
The work environment needs not only to be secure but also to contain the proper tools and equipment for the approach being taken to secure software. Also see the Secure Software Tools and Methods section.
Physical security is also a concern, as good physical security is essential to maintaining information security.
Commercial needs or customer requirements may call for a higher than normal level of operational security.

4.10.7.3 Managing Contractors for Security
See the Acquisition section.

4.10.7.4 Managing Outsourced Services for Security
See the Acquisition section.

4.10.8 Assuring Security Level of Software Shipped
One of the key differences in managing secure software development is the increased rigor of the final assessment and approval process. Additional reviews and testing may be used. [Lipner 2005a] In particular, some development organizations adhere to a strict policy of never shipping with a known vulnerability.

4.10.9 References
[Lipner 2005a] Lipner, Steve, and Michael Howard. The Trustworthy Computing Security Development Lifecycle. Microsoft, 2005. http://msdn.microsoft.com/security/default.aspx?pull=/library/en-us/dnsecure/html/sdl.asp#sdl2_topic8?_r=1

4.10.10 Additional Readings
[Feathers 2005] Feathers, M.C. Working Effectively with Legacy Code. Prentice Hall, 2005.
5 Post-Release Assurance
5.1 Introduction
Post-release assurance involves processes that sustain the capability of software or software-intensive systems to continue to satisfy their intended purpose. From a security standpoint, the environment in which software and software-intensive systems exist is ever changing. Therefore, confidence in the continuing security (confidentiality, integrity, availability, etc.) of software and of the information that it maintains is renewed on a constant basis. That process of renewal takes place throughout the useful lifecycle of any item of software or software-intensive system.
Post-release assurance activities can be proactive or reactive. Proactive activities include such things as the identification of threats and vulnerabilities; the creation, assessment, and optimization of security solutions within a generalized security architecture; and the implementation of controls to protect the software and the information it maintains. Reactive activities include such things as responding to threats (potential events), intrusions (external or internal), and violations.
Sensing, analysis, and responding operations underwrite both proactive and reactive security responses. Sensing activities include monitoring, testing, and assessment activities to identify security intrusions (e.g. a break-in), violations (e.g. an inappropriate access), and vulnerabilities (a weakness). Analysis activities include the activities undertaken to analyze identified exposures and vulnerabilities in order to understand their risk and impact. Responding activities include formal authorization for a selected remediation option, monitoring and assurance of the change process, and assurance of correct re-integration of the altered code.
In proactive assurance, decision makers have the option of authorizing preventive, perfective, or adaptive actions. In reactive assurance, decision makers have the option of authorizing corrective or emergency actions. In both proactive and reactive assurance, executive management determines the level of resources by matching the surety of the level of defense required to the degree of acceptable risk. That is, environments with the highest risk should have a high surety of defense.
Corrective change involves the identification and removal of vulnerabilities and the correction of actual errors (e.g. the security features were not implemented as specified). Preventive change involves the identification and detection of latent vulnerabilities (e.g. the software or software-intensive system is found to be vulnerable to a particular class of intrusion). Perfective change involves the improvement of performance, dependability, and maintainability (e.g. a better single logon solution is available which is more secure). Adaptive change involves adaptation to a new or upgraded environment (e.g. a new operating system with enhanced security functionality is available). Emergency change involves unscheduled corrective action (e.g. an intrusion or violation has taken place).
During normal operation, post-release assurance activities are undertaken to monitor the system’s capability to deliver services, identify and record problems for analysis, take the appropriate corrective, adaptive, perfective, or preventive action, and confirm the restored capability. Post-release assurance activities also encompass the migration and retirement of the software. The process ends for any given item when the software is retired (ISO/IEC 15288 – IEEE 12207.0).

5.2 Operational Assurance (Sensing)
Operational assurance encompasses a set of assurance activities that exist within the larger context of the Operations primary process (e.g. ISO 12207 section 5.4 – Operation Process). Operational assurance involves the use of defined policies, procedures, tools, and standards to monitor and review the software or software-intensive systems in their operating environment. This is done to identify and resolve security and control vulnerabilities and violations.
Monitoring is an ongoing activity that focuses on the software, the system, the policies, and the users. Because vulnerabilities can be associated with application software, operating system software, network or device configuration, policies and procedures, security mechanisms, physical security, and employee usage, monitoring is not limited to software alone but extends to the environment that surrounds the software. Technical monitoring techniques include intrusion detection, penetration testing, and violation processing using clipping levels.
Reviewing is a periodic activity that evaluates the software, the system, the policies and procedures, and the users’ usage against established standards. Reviews may consist of walkthroughs, inspections, or audits. They can be both managerial and technical (IEEE 1028).
Identified threats, vulnerabilities, and violations are recorded and reported using a defined problem reporting and resolution process (e.g. ISO 12207 section 6.8 – Problem Resolution Process). The problem resolution process should be tailored to allow decision makers to authorize responses based on their assessment of multi-dimensional risks. Finally, the policies, procedures, tools, and standards used for operational assurance are also continuously assessed so that recommendations can be made to improve or enhance them (e.g. ISO 12207 section 7.3 – Improvement Process).

5.2.1 Initial Startup
It is necessary to ensure that the software is installed in a secure fashion and with all security functions enabled. Therefore it is a requisite of good practice to:
• Identify a feasible security perimeter
• Prepare a concept of operations document
• Prepare an operational testing plan
• Prepare a policy to ensure appropriate response to unexpected incidents
• Prepare a secure site plan
• Prepare a BCP and DRP with explicit RTO, NRO, and RPO set for each item within the secure perimeter
• Ensure the system staff is adequately trained in secure operation
• Ensure the security staff is capable of utilizing all embedded security functionality
• Identify a competent security accreditation process and obtain valid certification of security for the operational system

5.2.2 Operational Testing
It is necessary to monitor the ongoing functioning of the software within the operational environment, because security threats can arise at any point in the process and can represent any range of unanticipated hazards. This should be done on a disciplined and regularly scheduled basis. Therefore it is a requisite of best practice to:
• Deploy a continuous operational testing process to identify security threats, vulnerabilities, and control violations in software and software-intensive systems (e.g. ISO 12207 section 5.4.2 – Operational testing).
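The “clipping levels” used in violation processing (introduced under Operational Assurance above) can be made concrete with a small sketch: occasional violations are tolerated, but once a count passes a set threshold within a time window, an alert is raised. This is a minimal hypothetical example; the threshold, window, and function name are invented for illustration:

    #include <stdio.h>
    #include <time.h>

    enum { CLIP_THRESHOLD = 5,      /* violations tolerated ...   */
           CLIP_WINDOW    = 300 };  /* ... per 5-minute window    */

    static int    violation_count = 0;
    static time_t window_start    = 0;

    /* Call on each security-relevant violation (e.g. failed login).
       Returns 1 when the clipping level is exceeded and an alert
       should be handed off to review. */
    int record_violation(void)
    {
        time_t now = time(NULL);

        if (window_start == 0 || now - window_start > CLIP_WINDOW) {
            window_start = now;         /* start a new counting window */
            violation_count = 0;
        }
        if (++violation_count > CLIP_THRESHOLD) {
            fprintf(stderr, "clipping level exceeded: %d violations\n",
                    violation_count);
            return 1;
        }
        return 0;
    }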
5.2.3 Environmental Monitoring
Operational threats originate from the environment in which the software functions. In that respect, the environment is where early warning of impending hazards or attacks can be obtained. Therefore it is a requisite of best practice to continuously:
• Monitor the operating environment that surrounds software and software-intensive systems to identify and address security and control threats, exposures, vulnerabilities, and violations as they arise (threat identification).

5.2.4 Incident Reporting
Incidents are reported through a disciplined process. The aim is to respond as quickly as possible to trouble arising from the identification of operational vulnerabilities or malfunctions, or from incidents that occur as a result of attempts to exploit vulnerabilities or malfunctions. The process must be standard in both its procedures and its documentation, and it must be well known throughout the organization. Therefore it is a requisite of good practice to institute a systematic process to document and record threat exposures and vulnerabilities (trouble reporting); one possible record format is sketched at the end of this subsection.
Premature disclosure of a discovered or reported vulnerability can lead to serious consequences. Software producers with poor reputations for fixing vulnerabilities, or who do not keep reporters of vulnerabilities informed of progress toward repair, increase the chances of premature disclosure.
Identification occurs in KA 1 (Operational Assurance), but classification, prioritization, and development of remediation options occur in KA 2 (Analysis).
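As one illustration only, a trouble report record for such a systematic process might capture fields like the following. The field set and status values are assumptions for the sketch, not a mandated schema.

from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum

class ReportStatus(Enum):
    OPEN = "open"
    ANALYZING = "analyzing"        # handed to KA 2 (Analysis)
    REMEDIATING = "remediating"    # handed to KA 3 (Response Management)
    CLOSED = "closed"

@dataclass
class TroubleReport:
    report_id: str
    reported_at: datetime
    reporter: str                   # kept informed of progress toward repair
    description: str                # the exposure, vulnerability, or violation
    disclosure_restricted: bool     # guards against premature disclosure
    status: ReportStatus = ReportStatus.OPEN
    history: list = field(default_factory=list)  # audit trail of actions taken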
5.2.5 Operational Process Assurance
It is necessary to ensure that the process of sustaining operational assurance is carried out in the most effective and efficient manner possible. Thus, a continuous program is necessary to monitor the functioning of the assurance process itself. This is done in order to identify and report deviations from correct practice for the purpose of remediation. Therefore it is a requisite of good practice to:
• Assess and audit the policies, procedures, tools, and standards used for operational assurance. Document assessments and audits and make recommendations for improvement to the designated approving authority (e.g. ISO 12207 section 7.3 – Improvement Process).

5.2.6 Recommended References for Operational Assurance
DOD 5200.28-STD, "Department of Defense Trusted Computer System Evaluation Criteria", 1985.
CCIMB-2004-01-001, "Common Criteria for Information Technology Security Evaluation", 2004.
"COBIT 3rd Edition Executive Summary", IT Governance Institute, available from Information Systems Audit and Control Association: http://www.isaca.org, Accessed: July 2005.
"CobiT in Academia", IT Governance Institute, 2004, available from Information Systems Audit and Control Association: http://www.isaca.org, Accessed: July 20, 2005.
Carter, Earl, Cisco Systems Inc., "CCSP Self-Study: Cisco Secure Intrusion Detection System", Cisco Press, 2004.
Escamilla, T., "Intrusion Detection: Network Security Beyond the Firewall", Wiley, 1998, Chap. 5.
Jones, Capers, "Software Defect Removal Efficiency", Computer, Vol. 29, No. 4, April 1996.
Kairab, S., "A Practical Guide to Security Assessments", Auerbach Publications, 2005.
Krutz, R., R. Vines, "The CISSP Prep Guide: Mastering the CISSP and ISSEP Exams, Second Edition", John Wiley & Sons, 2004, Chap. 5, 6, 10.
Lee, E., "Software Inspections: How to Diagnose Problems and Improve the Odds of Organizational Acceptance", Crosstalk, Vol. 10, No. 8, 1997.
Lukatsky, "Protect Your Information with Intrusion Detection", A-LIST Publishing, 2003, Chap. 1, 4, 6.
Northcutt, S., "Computer Security Incident Handling: An Action Plan for Dealing with Intrusions, Cyber-Theft, and Other Security-Related Events", SANS Institute, 2003.
Peltier, T., J. Peltier, J. Blackley, "Managing a Network Vulnerability Assessment", Auerbach Publications, 2003.
Riggs, S., "Network Perimeter Security: Building Defense In-Depth", Auerbach Publications, 2003, Chap. 9, 12.
Swiderski, F., W. Snyder, "Threat Modeling", Microsoft Press, 2004.

5.2.7 List of Further Readings

5.2.8 List of Standards
BS ISO/IEC 17799:2000, Information Technology - Code of practice for information security management
ISO/IEC 12207:1995, Information Technology - Software life cycle processes
IEEE 730-1998, Software Quality Assurance Plans
IEEE 730.1-1995, Software Quality Assurance Planning
IEEE 1012-1986, Software Validation and Verification Plans
IEEE 1012a-1998, Content Map to IEEE/EIA 12207.1-1997
IEEE 1028-1997, Standard for Software Reviews
IEEE 1059-1993, Guideline for SVV Planning
IEEE 828-1998, Software Configuration Management Plans
IEEE 829-1998, Software Test Documentation
IEEE 1008, Standard for Software Unit Testing
IEEE 1042-1987, Guide to Software Configuration Management
IEEE 1045-1992, Standard for Software Productivity Metrics
IEEE/EIA 12207.1-1997, IEEE/EIA Guide: Industry Implementation of International Standard ISO/IEC 12207:1995, 1998.
IEEE/EIA 12207.2-1997, IEEE/EIA Guide: Industry Implementation of International Standard ISO/IEC 12207:1995, 1998.

5.3 KA 2 Analysis
The Analysis Knowledge Area (KA) is concerned with evaluating the impact of a change on all relevant software units, versions, and documentation elements. This is done in order to ensure a proper and correct response. Analysis involves identification of the affected software elements, all of their inherent dependencies, and the associated documentation.
Identified elements are examined to determine the impact of a requested change. Effects on existing software and systems, as well as on any interfacing systems and the organization as a whole, are also characterized. Security and safety impacts of the change are fully examined, documented, and communicated to the Response Management function for authorization (KA 3).
In addition to determining impacts, the results of the analysis are formally recorded and maintained by the organization in order to ensure increased understanding of the content and structure of the software asset, as well as to support any additional causal analysis that might be undertaken to understand security and control issues raised by, or associated with, the change.

5.3.1 Understanding
It is necessary to fully understand all of the tangible elements and implications of a change in order to implement the change properly. This understanding involves fully comprehending the design architecture and the affected code. In order to do this properly it is requisite good practice to:
• Document the problem/modification request and capture all requisite data in standard form as specified by IEEE/EIA 12207.1 – 6.2.
• Replicate or verify the existence of the reported problem – for the sake of resource coordination, confirm that the problem really exists
• Verify the violation, exposure, or vulnerability – understand the precise nature and implications of the anomaly and develop an overall response strategy
• Identify elements to be modified in the existing system – identify all system components that will be changed and develop a specific response strategy for each element using development good practice
• Identify interface elements affected by the modification
• Estimate the impact of the software change on system interfaces – perform impact analysis on the affected interfaces and from that design a specific response strategy for each interface using development good practice
• Identify documentation to be updated
• Identify relevant security policies – validate the recommended response strategy against relevant security and safety policy
• Identify relevant legal, regulatory, and forensic requirements – validate the recommended response strategy against relevant legal, regulatory, or forensic requirements.

5.3.2 Analysis
In order to develop and implement a specifically tailored response, it is necessary to know what the implications of a particular response strategy or action will be. That understanding requires the conduct of a comprehensive and detailed analysis based on a formal methodology. This methodology must ensure comprehensive and unambiguous understanding of the software, its requirements, and its associated architecture. For each remediation option it is correct practice to (the risk and ROI arithmetic is illustrated in the sketch following this list):
• Identify the violation, exposure, or vulnerability type – the threat is explicitly classified by type
• Identify the scope of the violation, exposure, or vulnerability – the extent, or boundary, of the threat is fully and explicitly itemized
• Provide a formal statement of the criticality of the violation, exposure, or vulnerability
• Document all feasible options for analysis
• Perform a comprehensive risk identification – identify the type and extent of risk for each option
• Perform a detailed risk evaluation – assess the likelihood and feasibility of each identified risk for each option
• Estimate the safety and security impacts if the change is implemented – based on likelihood percentages and feasibility for each option
• Estimate the safety and security impacts if the change is NOT implemented – based on the likelihood of occurrence and the financial and operational impacts of each identified option
• Assess the impact of the change on the security and control architecture – perform a software understanding and design description exercise for all automated security and control features; estimate and assess the implications of the change as they impact the policy and procedure infrastructure
• Estimate the impact of the change on the Business Continuity/Disaster Recovery strategy – provide feasible recovery time, NRO, and recovery point impact estimates for each option
• Estimate ROI – estimate the return on investment for each option, including total cost of ownership and marginal loss percentage
• Estimate the level of test and evaluation necessary for verification and validation – for each option prepare a testing program: sample test cases and methods of administration, estimated resource requirements, staff capability, and feasibility of administration
• Financial analysis – estimate financial impacts where appropriate for each option
• Feasibility analysis – estimate feasibility and timelines for implementing each option – prepare a project plan for each option if a detailed level of understanding is indicated.
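To make the risk and ROI bullets concrete, the sketch below scores three hypothetical remediation options. Every figure (likelihoods, loss per incident, option costs) is invented for the example; an organization would substitute its own estimates and, where appropriate, its own formulas.

# Illustrative only: scoring hypothetical remediation options.
options = [
    # (option, annual exploitation likelihood, loss per incident, cost)
    ("implement change now",    0.05, 200_000, 30_000),
    ("defer change six months", 0.25, 200_000, 10_000),
    ("do not implement",        0.60, 200_000,      0),
]

# Expected annual loss if the change is NOT implemented (the baseline).
baseline_loss = 0.60 * 200_000

for name, likelihood, impact, cost in options:
    expected_loss = likelihood * impact            # annualized loss expectancy
    avoided_loss = baseline_loss - expected_loss   # benefit relative to baseline
    roi = (avoided_loss - cost) / cost if cost else None
    roi_text = "n/a" if roi is None else f"{roi:.1f}"
    print(f"{name}: expected annual loss ${expected_loss:,.0f}, ROI {roi_text}")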
5.3.3 Reporting
The results of the Analysis phase must specifically support targeted decision making in the selection of the response. Therefore it is essential that the body of evidence developed in the analysis phase is communicated in a highly understandable fashion to the designated decision maker for authorization of the change. For each change that is requested, it is correct practice to:
• Determine or designate the appropriate decision maker – this identification may be based on the results of the analysis phase, or carried out as a result of pre-designation. It may also be done through a pre-selected control board composed of the appropriate decision makers. If the decision is significant enough, it may also be implemented through a process of identification and selection that is instituted at the conclusion of the analysis phase.
• Report the results of the analysis, with a full explanation of the implementation requirements for each remediation option, to the decision maker or decision-making body. This report is given to the designated decision maker(s); it must clearly outline the impacts of each option and be plainly understandable to lay decision makers.
• Itemize the feasible remediation options; these are expressed in a manner that is understandable to lay decision makers, and each recommended option is fully and demonstrably traceable to the business case.

5.3.4 Recommended References
Dart, Susan A., "Achieving the Best Possible Configuration Management Solution", Crosstalk, September 1996.
Dorofee, A.J., J.A. Walker, R.C. Williams, "Risk Management in Practice", Crosstalk, Vol. 10, No. 4, April 1997.
"Evaluating Information Technology Investments", Office of Management and Budget, at www.itmweb.com, 1999.
Han, Jun, "Designing for Increased Software Maintainability", International Conference on Software Maintenance (ICSM 97), January 1, 1997.
Hatton, L., "Safer Language Subsets: An Overview and a Case History, MISRA C", submitted to Information and Software Technology, June 2002.
Hatton, L., "Exploring the Role of Diagnosis in Software Failure", IEEE Software, July 2001.
Hatton, L., "Repetitive Failure, Feedback and the Lost Art of Diagnosis", Journal of Systems and Software, 1999.
Jones, Capers, "Software Project Management Practices: Failures versus Success", Crosstalk, October 1, 2004, pp. 5-9.
Lee, E., "Software Inspections: How to Diagnose Problems and Improve the Odds of Organizational Acceptance", Crosstalk, Vol. 10, No. 8, 1997.
Pfleeger, S. and Hatton, L., "Do Formal Methods Really Work", IEEE Computer, January 1997.
Violino, R., "Measuring Value: Return on Investment", Information Week, No. 637, June 30, 1997, pp. 36-44.
Zimmerman, Michael, "Configuration Management, Just a Fashion or a Profession", White Paper, usb GmbH, 1997.

5.3.5 List of Further Readings
Berry, John, "IT ROI Metrics Fall Into Four Groups", Internet Week, July 16, 2001.
Babich, W., Software Configuration Management, Addison-Wesley, 1986.
Bersoff, E., Henderson, V., and Siegel, S., Software Configuration Management, Prentice-Hall, 1980.
Feiler, P., "Software Process Support in Software Development Environments", Fifth International Software Process Workshop, ACM Press, October 1990.
Feiler, P., "Configuration Management Models in Commercial Environments", Tech. Report
CMU/SEI-91-TR-7, ADA235782, Software Engineering Institute, Carnegie Mellon University, April 1991.
SEI Bridge, "Configuration Management: State of the Art", Software Engineering Institute, Carnegie Mellon University, March 1990.
Whitgift, D., Methods and Tools for Software Configuration Management, John Wiley and Sons, England, 1991.

5.3.6 List of Standards
ISO/IEC 12207:1995, Information Technology - Software life cycle processes
IEEE/EIA 12207.1-1997, IEEE/EIA Guide: Industry Implementation of International Standard ISO/IEC 12207:1995, 1998.
IEEE/EIA 12207.2-1997, IEEE/EIA Guide: Industry Implementation of International Standard ISO/IEC 12207:1995, 1998.
IEEE Standard for Software Configuration Management Plans, IEEE/ANSI Standard 828-1998, 1998.
IEEE Guide to Software Configuration Management, IEEE/ANSI Standard 1042-1987, 1987.
ISO/IEC 15288:2002, Systems Engineering - System Life Cycle Processes
ISO/IEC 15846:1998, Information technology -- Software life cycle processes -- Configuration Management, May 5, 1998

5.4 KA 3 Response Management (Responding)
Response management encompasses processes that are executed within the larger context of the standard change process (e.g. ISO 12207 section 5.5 – Maintenance Process). Response management involves the management of the remediation option using the development process or the acquisition process (ISO 12207 sections 5.3 – Development Process and 5.1 – Acquisition Process, respectively) as the agent of change.
Policies (e.g., awaiting a patch), procedures, tools, and standards that are employed for response management are also continuously assessed so that recommendations can be made to improve or enhance the process (e.g. ISO 12207 section 7.3 – Improvement Process).

5.4.1 Responding to Known Vulnerabilities without Fixes
Good operational justifications for not yet fixing a vulnerability in software might include the amount of time needed to diagnose the vulnerability, prepare a repair, verify and validate it, and change the assurance case; lack of resources; the difficulty or infeasibility of fixing; or unwillingness to take down a critical operational system. However, known vulnerabilities must be monitored and managed. Therefore it is a requisite of good practice to (a minimal automation of the monitoring and alarm bullets is sketched after this list):
• Maintain continuous records of publicly known vulnerabilities
• Maintain continuous records of privately known vulnerabilities
• Monitor the operational behavior of the system in order to detect and recognize the signature of any attempt to exploit a known vulnerability
• Set automated alarms to give notice of attempts to exploit a known vulnerability
• Maintain a systematic and well-defined response to any identified attempt to exploit a known vulnerability
• Ensure that the system staff understands the required response to an attempt to exploit a known vulnerability
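The following is a minimal sketch of a known-vulnerability watch list with an automated alarm. The vulnerability identifiers, signature format, and alert channel are illustrative assumptions; a real deployment would use its intrusion detection system's own rule language and its predefined response procedure.

import logging

logging.basicConfig(level=logging.WARNING)

# Hypothetical watch list of known-but-unfixed vulnerabilities.
# Signatures here are simple substrings for illustration only.
WATCH_LIST = {
    "VULN-2005-001": "GET /cgi-bin/../../etc/passwd",   # example signature
    "VULN-2005-002": "overlong-auth-header",            # example signature
}

def inspect_event(event_line: str) -> None:
    """Raise an alarm when an observed event matches the signature of
    an attempt to exploit a known vulnerability."""
    for vuln_id, signature in WATCH_LIST.items():
        if signature in event_line:
            logging.warning("exploit attempt against %s: %r",
                            vuln_id, event_line)
            # a real response would invoke the systematic, well-defined
            # response procedure the staff has been trained on

inspect_event("10.0.0.7 GET /cgi-bin/../../etc/passwd HTTP/1.0")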
5.4.2 Change Control
To ensure that the response is correct and effective, a formal process is necessary for change control. This process must unambiguously convey to the change agent all technical and contextual requirements of the selected remediation option. Valid and organizationally persistent controls are put in place to ensure that this is done in a standard and disciplined fashion. Therefore it is good practice to: [ISO04, p.17; ISO96, p.25]
• Identify the appropriate change agent – this may be either an acquisition or a development entity
• Develop and document a Statement of Work (SOW) and define the precise method for communicating it to the change agent
• Develop and document criteria for testing and evaluating the software or software-intensive system to ensure successful remediation; communicate these to the change agent prior to institution of the change process
• Develop and document criteria for ensuring that elements and requirements that should not be modified remain unaffected
Thus, good practice calls for ensuring that the agent performing the change understands all the boundaries for the change. One possible record for conveying these items to the change agent is sketched below.
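The sketch below shows one way such a "change order" might be recorded; the field names and example values are assumptions for illustration, not a mandated format.

from dataclasses import dataclass, field

@dataclass
class ChangeOrder:
    change_agent: str               # acquisition or development entity
    statement_of_work: str          # the SOW communicated to the agent
    acceptance_tests: list = field(default_factory=list)  # T&E criteria
    frozen_elements: list = field(default_factory=list)   # must not change

# Hypothetical example values:
order = ChangeOrder(
    change_agent="development",
    statement_of_work="SOW-2005-017: remediate input validation flaw",
    acceptance_tests=["regression suite", "boundary-value penetration test"],
    frozen_elements=["crypto module", "audit logging interface"],
)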
5.4.2.1 Post-Change Analysis
Changes to the software can eliminate vulnerabilities; however, change can also create new ones. As a consequence, attackers will typically examine changed code to identify new or different methods of exploitation and attack. Correspondingly, the system staff must continue to analyze changes after the fact in order to determine the long-term implications. Therefore it is a requisite of good practice to (one way to begin automating the first two bullets is sketched after this list):
• Perform an analysis of the changed code within its operational environment in order to identify potential points of future exploitation
• Execute penetration testing or other stress-testing exercises to identify any points of weakness or failure
• Continue to update the countermeasure set to reflect the current security status and requirements of the software
• Ensure that all required operational integration and testing processes are executed on the changed code
• Ensure that all test results are reported to the appropriate designated approving authority for resolution.
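As a sketch under assumed conventions, the helper below diffs the changed code against the prior controlled baseline and queues focused security re-tests for each changed unit. The directory layout and the re-test hook are hypothetical.

import filecmp
import os

BASELINE_DIR = "baseline/src"   # prior controlled baseline (assumed layout)
CHANGED_DIR = "release/src"     # code as changed by the remediation

def changed_files(baseline: str, changed: str):
    """Yield paths whose contents differ between the two source trees;
    new files and directories also count as changed."""
    cmp = filecmp.dircmp(baseline, changed)
    for name in cmp.diff_files + cmp.right_only:
        yield os.path.join(changed, name)
    for sub in cmp.common_dirs:
        yield from changed_files(os.path.join(baseline, sub),
                                 os.path.join(changed, sub))

for path in changed_files(BASELINE_DIR, CHANGED_DIR):
    # a real pipeline would enqueue penetration / stress test jobs here
    print("queue security re-test for", path)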
5.4.2.2 Change Assurance
Although the change is implemented through the agency of another process (either Acquisition or Development), the change assurance process ensures that the change is made according to the established criteria. [ISO04, p.17; ISO96, p.25] This is a continuous process involving the joint review and problem resolution processes specified by IEEE/EIA 12207.0. Therefore it is good practice to:
• Monitor the change through joint reviews as specified in the SOW – ensure that all reviews specified by the statement of work are conducted at their appropriate checkpoints
• Ensure that action items issue out of each review and that closure criteria are specified
• Perform audits as specified in the SOW – ensure any audit specified in the statement of work is properly resourced and performed.
• Ensure that any audit findings involving non-concurrence are resolved
• Monitor service levels as agreed to by contract – oversee the performance of the contract to ensure that required service levels are maintained
• Ensure that problems identified through joint reviews and audits are resolved – ensure that any vulnerability, anomaly, or violation identified during the review process is sufficiently addressed by rework and that all closure criteria are satisfied
• Following delivery of the change,
o Perform tests to ensure correctness in accordance with the test and evaluation criteria [ISO96, p.29]
o Conduct a verification and validation program to sufficiently ensure the correctness and integrity of operation of the change as delivered
o Ensure functional completeness against requirements [ISO96, p.29]
o Ensure that the change satisfies all functional requirements specified in the statement of work
o Ensure physical completeness against the technical description [ISO96, p.29] – validate the correctness of the change against the technical description documents
Finally, assuring security requires an audit trail documenting traceability between the change authorization and the delivered product [ISO96, p.29]. Through rigorous review of the documentation, ensure that what was authorized is what will be delivered.

5.4.2.3 Change Reintegration
The change must be correctly integrated back into the system. To assure this, it is necessary to conduct a technically rigorous process to certify that integration is correct and that all interfaces are functioning properly. [ISO96, p.30] A comprehensive testing program should support this. This program should be defined by plan at the point where the changed artifact is delivered.
The designated decision maker who authorized the change must provide approval for re-integration. Once authorization is given, the change is re-integrated into the new controlled baseline. In addition to certification of the correctness of the re-integration, it is necessary to fully document and continuously maintain that new baseline configuration in an organizationally standard repository. [ISO04, p.21] Therefore, to assure security it is good practice to:
• Confirm reintegration maintains the correct level of integrity and security [ISO04, p.21] – this confirmation is certified by the results of the testing program
• Ensure documentation is updated at the requisite level of integrity and security [ISO04, p.21] – this is assured by confirming that the new baseline has been satisfactorily represented in the controlled baseline
• Ensure changed items maintain backward and forward traceability to other baselined states [ISO04, p.21] – this is assured by maintaining a repository of prior baseline states, typically called a static library or archive
• Ensure the configuration record is securely maintained throughout the lifecycle and archived according to best practice [ISO04, p.21]

5.4.3 Recertification and Accreditation
In order to ensure public confidence, software systems should be assessed and authorized by an appropriate third-party agent using a commonly accepted process. The findings of that process are typically accredited by formal certification. In order to ensure continuing confidence, re-accreditation of these results should be obtained on a periodic basis. This basis is typically specified by organizational policy and/or regulation. Therefore, requirements of good practice include:
• A legitimate third-party agency is employed to conduct the certification audits – the audit standard could be set by regulation or contract
• Assessments for certification/re-certification of accreditation are performed by properly certified lead auditors
• Adequate resources are provided to ensure the effectiveness of the audit process
• The independence of the auditing/accrediting agency is assured
• Consistent use of a standard audit method is assured
• Re-certification audits are performed on a timely enough basis to ensure continuing confidence

5.4.4 Secure Migration, Retirement, Loss, and Disposal
The migration and retirement of software is controlled through organizationally standard and rigorous processes. [ISO96, pp.25-26; ISO04, pp.37-38, pp.47-48] This control system ensures that the software asset base evolves through a rational and secure process and that the software maintains its intended functional profile. Therefore it is good practice to:
• Ensure migration and retirement risks and impacts are known [ISO96, pp.25-26] – this is assured by means of the analysis function outlined in KA 2
• Ensure a transition strategy is developed that will assure system services are maintained during transition [ISO04, p.33]. When this strategy is developed, the safety and security aspects of the transition are fully and explicitly studied, characterized, and understood. A designated approving authority authorizes the strategy to execute tests and inspections to assure:
o Transition of the software or system [ISO04, p.33] will be accomplished in a safe and secure fashion – the transition process is confirmed effective and correct
o Proper functioning of the software or system after transition [ISO04, p.33] – the effectiveness of the transition is confirmed and certified by proper verification and validation; the results of this authentication process are documented and retained
o Integrity of the software or system after transition [ISO04, p.33] – all software operation and data integrity is confirmed by an appropriate set of measures; the results of this process are documented and retained
o Documentation reflects the changed state of the software or system

5.4.5 Recommended References for Response Management
C. Riggs, "Network Perimeter Security: Building Defense In-Depth", Auerbach Publications, 2003, Chap. 13.
R. Krutz, R. Vines, "The CISSP Prep Guide: Mastering the CISSP and ISSEP Exams, Second Edition", John Wiley & Sons, 2004, Chap. 5, 6, 10.

5.4.6 List of Further Readings
D. Mann, S. Christey, "Towards a Common Enumeration of Vulnerabilities", The MITRE Corporation, Bedford, MA, 1999.
L. DeLooze, "Classification of Computer Attacks Using a Self-Organizing Map", Proceedings from the Fifth Annual IEEE SMC, 10-11 June 2004, pp. 365-369.
R. Martin, "Integrating Your Information Security Vulnerability Management Capabilities Through Industry Standards (CVE & OVAL)", IEEE International Conference on Systems, Man and Cybernetics, Vol. 2, 5-8 Oct. 2003, pp. 1528-1533.

5.4.7 List of Standards
BS ISO/IEC 17799:2000, Information Technology - Code of practice for information security management
ISO/IEC 12207:1995, Information Technology - Software life cycle processes
IEEE/EIA 12207.1-1997, IEEE/EIA Guide: Industry Implementation of International Standard ISO/IEC 12207:1995, 1998.
IEEE/EIA 12207.2-1997, IEEE/EIA Guide: Industry Implementation of International Standard ISO/IEC 12207:1995, 1998.
ISO/IEC 15288:2002, Systems Engineering - System Life Cycle Processes

5.5 KA 4 Infrastructure Assurance
Infrastructure assurance encompasses the processes that sustain the underlying organizational base, or foundation, of the operational assurance, analysis, and response management knowledge areas. That is, it entails both the security architecture and the policies, processes, and methodologies necessary for those knowledge areas. An organization-wide security architecture is the composite of all security solutions required to sense, analyze, and respond to threats, vulnerabilities, and violations; it also maintains the interrelationships between those solutions. Organization-wide policies, processes, and methodologies are required to rationally control the actions of the organization as the contextual threat situation changes (i.e., as risk changes, the surety of defense changes to match it). Together, these policies, processes, and methodologies constitute the infrastructure that is required to sustain post-release assurance. This infrastructure also ensures continuous coordination between the Post-Release Assurance function and the Acquisition and Development functions.

5.5.1 Security Architecture
Infrastructure assurance from a security architecture standpoint involves the development, deployment, and post-release support of the most appropriate and capable set of security solutions, tools, frameworks, and components in order to maintain a dynamic security context. To accomplish this, the following activities are conducted:
• An organization-wide set of security and control processes is planned, designed, administered, and maintained. The aim is to ensure that effective vision, expertise, and the correct technology-based solutions are installed to assure the security and control of applications and infrastructure.
• Assess, integrate, and optimize the security architecture – the security architecture should be assessed on a regular and systematic basis in order to ensure that it remains effective.
• Evaluate the state of the security architecture within the enterprise. The security architecture is continuously evaluated to ensure that it remains effective.
• Evaluate future security trends to set security architecture strategy. Security trends are assessed as they impact the evolution of the organization's particular security architecture.
• Provide consultation and strategy on security architecture. Expertise is provided to lay practitioners to ensure a minimum acceptable level of awareness of the implications and requirements of the security architecture.
• Evaluate and recommend direction and decisions regarding vendor-purchased products and services and in-house developed tools as they relate to the security architecture. The security aspects of the acquisition process are continuously maintained in conformance with all assurance practices defined by the organization.
• Provide on-call technical support for the security architecture. Any questions or concerns raised in the day-to-day maintenance of the security architecture are responded to in a Do-It-Now (DIN) fashion.
• Maintain an enterprise security and control knowledge base and an information store that will share and cascade security architecture knowledge and information to all levels in the organization.

5.5.2 Policy, Process and Methodology Assurance
Infrastructure assurance from a policy, process, and methodology standpoint involves the
development, deployment, and sustainment of the most appropriate and capable set of security policies, processes, and methodologies. The aim is to maintain a dynamically effective security and controls context. In order to accomplish this, the following activities are conducted:
• Develop and deploy the most current security methodologies, processes, and associated documentation. These are kept in an organizationally standard and accessible repository.
• Collaborate to develop and implement communications and training materials for deployment of the process. These materials will be organizationally standard, and their coordination will be done through a formal process.
• Develop and collect security metrics. Security metrics will be standard, and they will be utilized to perform causal analysis for the purpose of optimizing the ongoing security process.
• Coach teams in the application of methodologies and processes, and assist with assessing compliance.
• Define and analyze metrics to improve methodology and process efficiency, usage, and results.
• Perform modeling of applications and infrastructure from a security perspective.
• Drive ongoing security and control certifications of applications and infrastructure.
• Champion and promote security and control awareness, policies, procedures, tools, and standards.

5.5.3 Recommended References for Infrastructure Assurance
DOD 5200.28-STD, "Department of Defense Trusted Computer System Evaluation Criteria", 1985.
CCIMB-2004-01-001, "Common Criteria for Information Technology Security Evaluation", 2004.
"COBIT 3rd Edition Executive Summary", IT Governance Institute, available from Information Systems Audit and Control Association: http://www.isaca.org, Accessed: July 20, 2005.
"CobiT in Academia", IT Governance Institute, 2004, available from Information Systems Audit and Control Association: http://www.isaca.org, Accessed: July 20, 2005.
S. Kairab, "A Practical Guide to Security Assessments", Auerbach Publications, 2005, Chap. 6, 7.
C. King, C. Dalton, E. Osmanoglu, "Security Architecture: Design, Deployment and Operations", McGraw-Hill/Osborne, 2001, Chap. 5, 14.
R. Krutz, R. Vines, "The CISSP Prep Guide: Mastering the CISSP and ISSEP Exams, Second Edition", John Wiley & Sons, 2004, Chap. 5, 6, 10.
S. Purser, "A Practical Guide to Managing Information Security", Artech House, 2004, Chap. 1, 8.
W. Van Grembergen, "Strategies for Information Technology Governance", Idea Group Publishing, 2004, Chap. 11.

5.5.4 List of Further Readings
Ashton, Gerry, "Cleaning up your Security Act for Inspection", Computer Weekly, Jan 18, 2001.
Dorofee, A.J., J.A. Walker, R.C. Williams, "Risk Management in Practice", Crosstalk, Vol. 10, No. 4, April 1997.
GASSP, Generally Accepted System Security Principles, International Information Security
Forum, June 1999.
General Accounting Office, GAO Internal Control Standard, 1999.
ISACA, "Due Professional Care", IS Auditing Guideline, 1999.
ISACA, "Audit Sampling", IS Auditing Guideline, 1999.
ISACA, "Use of Risk Assessment", IS Auditing Guideline, 2000.
ISACA, "Control Risk Self Assessment", IS Auditing Guideline, 2003.
ISACA, "Intrusion Detection", IS Auditing Guideline, 2003.
ISACA, "SDLC Reviews", IS Auditing Guideline, 2003.
IT Governance Institute, IT Control Objectives for Enterprise Governance, ISACA, 1999.
IT Governance Institute, IT Strategy Committee, ISACA, 2003.
IT Governance Institute, CobiT Mapping, ISACA, 2004.
IT Governance Institute, IT Control Objectives for Sarbanes-Oxley, ISACA, 2004.
Price Waterhouse Coopers, Information Security Breaches 2003, Department of Trade and Industry (DTI), U.K., 2004.
Swanson, Marianne, Security Self-Assessment Guide for Information Technology Systems, National Institute of Standards and Technology, NIST 800-26, November 2001.

5.5.5 List of Standards
BS ISO/IEC 17799:2000, Information technology - Code of practice for information security management
ISO/IEC 12207:1995, Information Technology - Software life cycle processes
IEEE/EIA 12207.1-1997, IEEE/EIA Guide: Industry Implementation of International Standard ISO/IEC 12207:1995, 1998.
IEEE/EIA 12207.2-1997, IEEE/EIA Guide: Industry Implementation of International Standard ISO/IEC 12207:1995, 1998.
ISO/IEC 15288:2002, Systems Engineering - System Life Cycle Processes
6 ACQUISITION/SUPPLY OF SECURE
SOFTWARE
This section is primarily for individuals who provide project/program (hereinafter referred to as
"program") management of software intensive systems. This includes both program management
officers who lead the acquisition (i.e., the receiving end) and those who manage the supply of
software intensive systems.
It is understood that program managers need to make a myriad of decisions in balancing cost,
schedule, and performance as well as risks. One difficulty is that software intensive systems are
vulnerable to security problems that may be introduced either intentionally or unintentionally. This
section attempts to lay out software vulnerabilities and programmatic risks that need to be addressed in
the acquisition/supply process, provide an understanding of the risks (versus cost, schedule, and
performance), and provide guidelines for software risk mitigation and management.
In addition, it is recognized that software assurance is a subset of systems assurance. Most engineers
will agree that, while software may provide much of the functionality of a system, it has two inherent
characteristics which make it difficult to “engineer”: complexity and invisibility. Software is arguably
the most complex artifact made by humans, and this complexity makes “perfection” all but impossible:
there are too many interconnections for anyone to consider, or to keep track of. Additionally, the
invisibility of software means that it is difficult to judge a program’s “goodness”, and we are forced to
base our judgments on various second-order measures: adherence to process, test coverage metrics,
etc. Thus, it is easy to understand that it is also difficult to judge the “goodness” of “software
assurance”.
This section on Acquisition/Supply of Secure Software encompasses the program initiation
(acquisition), request for proposals (acquisition), preparation of response (supply), contract
management (acquisition/supply), and source selection (acquisition) of secure software. Certain
aspects of acquisition/supply are related to the Development and Sustainment sections. Those sections
will be cross-referenced in this section.
This section contains subsections covering the “additional” knowledge needed to acquire/provide
secure software:
• More on Fundamental Concepts
• Program Initiation and Planning
• Request for Proposals--Acquirer
• Preparation of Response--Supplier
• Source Selection--Acquirer
• Contract Preparation and Management--Acquirer
• Contract Preparation and Management—Supplier
Together these cover technical, management, support, and interactions within a larger context.
6.1 More on Fundamental Concepts
This section adds acquisition management-related concepts, principles, and details to those in the prior
fundamentals section, which spanned development, sustainment, and acquisition.
6.1.1 Terms and Definitions
Acquisition
"Acquisition" means the acquiring by contract of supplies or services through purchase or lease, whether the supplies or services are already in existence or must be created, developed, demonstrated, and evaluated. Acquisition begins at the point when needs are established and includes the description of requirements to satisfy those needs, solicitation and selection of sources, award of contracts, contract financing, contract performance, contract administration, and those technical and management functions directly related to the process of fulfilling needs by contract. [The FAR Subpart 2.101(b)(2) definition is revised here to be generally applicable to both government and nongovernment acquisitions.]

Availability
Availability: Ensuring timely and reliable access to and use of information. [44 U.S.C., Sec. 3542]

Categorization of Information and Information Systems
Title III of the E-Government Act of 2002, entitled the Federal Information Security Management Act of 2002 (FISMA), tasked the National Institute of Standards and Technology (NIST) to develop standards to be used by all U.S. Government federal agencies, including the development of:
• Standards to categorize all information and information systems based on the objectives of providing appropriate levels of information security according to a range of risk levels;
• Guidelines recommending the types of information and information systems to be included in each category; and
• Minimum information security requirements (i.e., management, operational, and technical controls) for information and information systems in each such category.
Federal Information Processing Standard 199 requires that all U.S. Federal government agencies "use the security categorizations described in FIPS Publication 199 whenever there is a federal requirement to provide such categorization of information or information systems."
The publication establishes security categories for both information and information systems. "Information" is categorized according to its information type. "Information type" means a specific category of information (e.g., privacy, medical, proprietary, financial, investigative, contractor sensitive, security management) defined by an organization, or in some instances, by a specific law, Executive Order, directive, policy, or regulation. [FIPS Pub 199]
"The security categories are based on the potential impact on an organization should certain events occur which jeopardize the information and information systems needed by the organization to accomplish its assigned mission, protect its assets, fulfill its legal responsibilities, maintain its day-to-day functions, and protect individuals. Security categories are to be used in conjunction with vulnerability and threat information in assessing the risk to an organization." [FIPS Pub 199]
FIPS Publication 199 defines three levels of potential impact on organizations or individuals should there be a breach of security (i.e., a loss of confidentiality, integrity, or availability), as follows:
• Low Impact: The loss of confidentiality, integrity, or availability could be expected to have a limited adverse effect on organizational operations,
organizational assets, or individuals (may include, but is not limited to, loss of privacy). A limited adverse effect means that, for example, the loss of confidentiality, integrity, or availability might: (i) cause a degradation in mission capability to an extent and duration that the organization is able to perform its primary functions, but the effectiveness of the functions is noticeably reduced; (ii) result in minor damage to organizational assets; (iii) result in minor financial loss; or (iv) result in minor harm to individuals.
• Moderate Impact: The loss of confidentiality, integrity, or availability could be expected to have a serious adverse effect on organizational operations, organizational assets, or individuals. A serious adverse effect means that, for example, the loss of confidentiality, integrity, or availability might: (i) cause a significant degradation in mission capability to an extent and duration that the organization is able to perform its primary functions, but the effectiveness of the functions is significantly reduced; (ii) result in significant damage to organizational assets; (iii) result in significant financial loss; or (iv) result in significant harm to individuals that does not involve loss of life or serious life-threatening injuries.
• High Impact: The loss of confidentiality, integrity, or availability could be expected to have a severe or catastrophic adverse effect on organizational operations, organizational assets, or individuals. A severe or catastrophic adverse effect means that, for example, the loss of confidentiality, integrity, or availability might: (i) cause a severe degradation in or loss of mission capability to an extent and duration that the organization is not able to perform one or more of its primary functions; (ii) result in major damage to organizational assets; (iii) result in major financial loss; or (iv) result in severe or catastrophic harm to individuals involving loss of life or serious life-threatening injuries. [FIPS Pub 199]
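FIPS Pub 199 expresses a security category as the triple of impact levels assigned to the three security objectives and, for an information system, applies a "high water mark" across the information types the system processes. The sketch below illustrates that computation; the information types and impact assignments are hypothetical examples, not values taken from FIPS Pub 199.

# Illustrative FIPS Pub 199 categorization with hypothetical inputs.
LEVELS = {"N/A": 0, "LOW": 1, "MODERATE": 2, "HIGH": 3}

# Security category per information type:
# {objective: impact} for confidentiality, integrity, availability.
info_types = {
    "public web content": {"confidentiality": "N/A",
                           "integrity": "MODERATE",
                           "availability": "MODERATE"},
    "personnel records":  {"confidentiality": "HIGH",
                           "integrity": "MODERATE",
                           "availability": "LOW"},
}

# High water mark: the system takes the maximum impact per objective
# across all of the information types it processes.
system_category = {
    objective: max((t[objective] for t in info_types.values()),
                   key=LEVELS.get)
    for objective in ("confidentiality", "integrity", "availability")
}
print(system_category)
# -> {'confidentiality': 'HIGH', 'integrity': 'MODERATE',
#     'availability': 'MODERATE'}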
assets; (iii) result in significant financial available in the commercial marketplace; or
loss; or (iii) result in significant harm to (ii) Minor modifications of a type not
individuals that does not involve loss of life customarily available in the commercial
or serious life threatening injuries. marketplace made to meet Federal Government
• High Impact: The loss of requirements. Minor modifications means
confidentiality, integrity, or availability modifications that do not significantly alter the
could be expected to have a severe or nongovernmental function or essential physical
catastrophic adverse effect on characteristics of an item or component, or
organizational operations, organizational change the purpose of a process. Factors to be
assets, or individuals. considered in determining whether a
A severe or catastrophic adverse effect modification is minor include the value and
means that, for example, the loss of size of the modification and the comparative
confidentiality, integrity, or availability value and size of the final product. Dollar
might: (i) cause a severe degradation in or values and percentages may be used as
loss of mission capability to an extent and guideposts, but are not conclusive evidence
duration that the organization is not able to that a modification is minor;
perform one or more of its primary (4) Any combination of items meeting the
functions; (ii) result in major damage to requirements of paragraphs (1), (2), (3), or (5)
organizational assets; (iii) result in major of this definition that are of a type customarily
financial loss; or (iii) result in severe or combined and sold in combination to the
catastrophic harm to individuals involving general public;
(5) Installation services, maintenance services, repair services, training services, and other services if—
(i) Such services are procured for support of an item referred to in paragraph (1), (2), (3), or (4) of this definition, regardless of whether such services are provided by the same source or at the same time as the item; and
(ii) The source of such services provides similar services contemporaneously to the general public under terms and conditions similar to those offered to the Federal Government;
(6) Services of a type offered and sold competitively in substantial quantities in the commercial marketplace based on established catalog or market prices for specific tasks performed or specific outcomes to be achieved and under standard commercial terms and conditions. This does not include services that are sold based on hourly rates without an established catalog or market price for a specific service performed or a specific outcome to be achieved. For purposes of these services—
(i) "Catalog price" means a price included in a catalog, price list, schedule, or other form that is regularly maintained by the manufacturer or vendor, is either published or otherwise available for inspection by customers, and states prices at which sales are currently, or were last, made to a significant number of buyers constituting the general public; and
(ii) "Market prices" means current prices that are established in the course of ordinary trade between buyers and sellers free to bargain and that can be substantiated through competition or from sources independent of the offerors.
(7) Any item, combination of items, or service referred to in paragraphs (1) through (6) of this definition, notwithstanding the fact that the item, combination of items, or service is transferred between or among separate divisions, subsidiaries, or affiliates of a contractor; or
(8) A nondevelopmental item, if the procuring agency determines the item was developed exclusively at private expense and sold in substantial quantities, on a competitive basis, to multiple State and local governments.

Component
"Component" means any item supplied to the Government as part of an end item or of another component, except that for use in—

Confidentiality
Confidentiality: Preserving authorized restrictions on information access and disclosure, including means for protecting personal privacy and proprietary information. [44 U.S.C., Sec. 3542]

COTS
"COTS" means commercial off-the-shelf software. In the Federal Government, COTS is considered a commercial item.

Information Assurance Architecture
"Information Assurance Architecture" is an abstract expression [used by the U.S. Department of Defense (DoD)] of information assurance (IA) solutions that assigns and portrays IA roles and behavior among a set of information technology assets, and prescribes rules for interaction and interconnection. An IA architecture may be expressed at one of three levels: DoD information system-wide, DoD Component-wide, or Defense-wide. DoD Component-wide and Defense-wide IA architectures provide a uniform and systematic way to assess and specify IA across multiple, interconnecting DoD information systems to ensure that they take advantage of supporting IA infrastructures. [DoD Instruction 8500.2, Enclosure 2]

Security Accreditation
The official management decision given by a senior agency official to authorize operation of an information system and to explicitly accept the risk to agency operations (including mission, functions, image, or reputation), agency assets, or individuals, based on the implementation of an agreed-upon set of security controls. [NIST Special Publication 800-37]

Security Certification
A comprehensive assessment of the management, operational, and technical security controls in an information system, made in support of security accreditation, to determine the extent to which the controls are implemented correctly, operating as intended, and producing the desired outcome with respect to meeting the security requirements for the system. [NIST Special Publication 800-37]

Information System
"Information system" means a discrete set of information resources organized for the collection, processing, maintenance, use, sharing, dissemination, or disposition of information. [44 U.S.C., Sec. 3502]

Information Technology
"Information technology" means any equipment or interconnected system or subsystem of equipment that is used in the automatic acquisition, storage, manipulation, management, movement, control, display, switching, interchange, transmission, or reception of data or information by the executive agency. For purposes of the preceding sentence, equipment is used by an executive agency if the equipment is used by the executive agency directly or is used by a contractor under a contract with the executive agency which: (i) requires the use of such equipment; or (ii) requires the use, to a significant extent, of such equipment in the performance of a service or the furnishing of a product. The term information technology includes computers, ancillary equipment, software, firmware, and similar procedures, services (including support services), and related resources [emphasis added]. [40 U.S.C., Sec. 1401]

Integrity
Integrity: Guarding against improper information modification or destruction, and includes ensuring information non-repudiation and authenticity. [44 U.S.C., Sec. 3542]

Security Category
"Security category" means the characterization of information or an information system based on an assessment of the potential impact that a loss of confidentiality, integrity, or availability of such information or information system would have on organizational operations, organizational assets, or individuals. [FIPS Pub 199]

Security Controls
"Security controls" mean the management, operational, and technical controls (i.e., safeguards or countermeasures) prescribed for an information system to protect the confidentiality, integrity, and availability of the system and its information. [FIPS Pub 199]

Security Objective
"Security objective" means confidentiality, integrity, or availability. [FIPS Pub 199]

Software Assurance Case
"Software assurance case" means a reasoned, auditable argument created to support the contention that the defined software-intensive system will satisfy software security requirements and objectives. [derived from MOD Def Stan 00-42, Part 3, 2003]

Software Assurance Levels
"Software assurance levels" refer to levels of assurance based on the absence of security vulnerabilities within a piece of code:
• Level 1: No assurance. No knowledge of specific developmental practices, or objective evidence (e.g., test results), that would demonstrate the absence of vulnerabilities. (Note: a history of "correct behavior" should not be considered as evidence of the absence of security vulnerabilities.)
• Level 2: Minimal assurance. The software producer has provided tangible evidence of either specific developmental practices, or other objective evidence, that would demonstrate the absence of vulnerabilities.
• Level 3: Demonstrated minimal assurance. An external assessment of the software development practices and/or objective
testing has been conducted and has validated the producer's claims.
• Level 4: Maximal assurance. Extensive independent testing of the software (including static examination of the source code) reveals the absence of detectable software vulnerabilities. Qualified by a list of actual tests conducted, versions tested, results of reports, and other evidence.

Software Vulnerability Classes
Software vulnerabilities can be classified in two ways: by their method of introduction (deliberate, accidental) or by their method of operation (buffer overflow, trapdoor codes, etc.). Within the SwA program, the Acquisition portion of the CBK is concerned with programmatic actions that can be taken to limit:
• The deliberate introduction of vulnerabilities by individuals with malicious intent.
• The accidental introduction of vulnerabilities through either carelessness or poor coding practice.
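The two classification axes just described could be represented with simple enumerations, as in the following sketch. The category names merely echo the examples in the text, and the record fields (including the placeholder identifier) are illustrative assumptions.

from dataclasses import dataclass
from enum import Enum

class IntroductionMethod(Enum):
    DELIBERATE = "deliberate"    # introduced with malicious intent
    ACCIDENTAL = "accidental"    # carelessness or poor coding practice

class OperationMethod(Enum):
    BUFFER_OVERFLOW = "buffer overflow"
    TRAPDOOR_CODE = "trapdoor code"
    OTHER = "other"

@dataclass
class VulnerabilityRecord:
    identifier: str
    introduced_by: IntroductionMethod
    operates_via: OperationMethod

# Hypothetical example record:
v = VulnerabilityRecord("CVE-2005-XXXX",
                        IntroductionMethod.ACCIDENTAL,
                        OperationMethod.BUFFER_OVERFLOW)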
6.2 Program Initiation and Planning
6.2.1 Scope
Assurance begins with program initiation and planning.

6.2.1.1 Define systems capabilities that pose software assurance risk
• Determine whether the system is to be connected to a network (if unsure, assume that it will);
• Identify the specific operating system (if applicable);
• Determine the degree of COTS integration;
• Assess the criticality of the information to decision makers;
• Determine the level(s) of classification for the data or information to be processed;
• Identify user-defined capabilities dealing with software assurance and/or protection profiles;
• Determine stress-related capabilities impacting software assurance.

6.2.1.2 Perform a program risk analysis based on defined system capabilities
• Determine the acceptable level of software assurance risk (system and network, if applicable);
• Develop vendor/supplier pre-qualification criteria;
• Determine the impact of foreign ownership, control, or influence restrictions;
• Gather and assess intelligence on potential bidders and/or their partners—specify possible restrictions;
• Develop a software assurance risk-related cost/benefit analysis;
• Determine whether to outsource or develop in-house based on the risk assessment;
• Evaluate the software assurance-related aspects of the acquisition strategies under consideration.

6.2.1.3 Define software assurance capabilities based on the systems capabilities that pose software assurance risks and the risk analysis

6.2.1.4 Determine software assurance-related program governance
• Specify the need for and criteria of qualified personnel to provide software assurance oversight;
• Resolve software assurance-related source selection board issues (e.g., software assurance-related evaluation criteria, board member qualifications to evaluate software assurance capabilities of the proposed solution).
6.2.2 Request for Proposals—Acquirer
6.2.2.2 Scope
When an organization needs to enter into a contract for the development of secure software, a written request is issued to prospective suppliers (developers/offerors). This request is sometimes referred to as a request for proposals. The request for proposals should:
• Include special terms and conditions related to software assurance;
• Include software assurance capabilities (defined in the Program Initiation stage) and requirements in a statement of work, including security controls and measures/metrics;
• Provide instructions to suppliers (offerors/respondents) regarding the information they must submit to be evaluated; and
• Include appropriate evaluation criteria for software assurance.
In addition to the request for proposals, the organization should develop other documents to manage the selection of the supplier. In the Federal Government this document is often called the Source Selection Plan.

6.2.2.3 Determine Software Assurance Terms and Conditions
[Suggested wording will be provided in later versions. We will review various references that provide "security" related language for terms and conditions.]

6.2.2.4 Develop Software Assurance Requirements
Software assurance requirements

6.2.2.5 Specify Software Assurance-Specific Profiles (Common Criteria) and Security Controls
[Suggested wording will be provided in later versions. We will review various references that provide profiles for secure software and security controls for secure software—DoD Instruction 8500.2, Common Criteria, NIST Special Publication 800-53.]

6.2.2.6 Develop Software Assurance Service Level Agreement (SLA) Measures and Metrics
[Suggested wording will be provided in later versions. We will review various references for measurement concepts. One such concept that is likely to be used is the Ministry of Defence's software assurance case.]

6.2.2.7 Develop Software Assurance Language for a Statement of Work to Develop Secure Software, Including Incentives
This written request should include a statement of requirements. The statement of requirements is often called a Statement of Work (SOW), Statement of Objectives (SOO), Work Statement (WS), or Performance-Based Work Statement (PBWS).
The following is recommended language for the secure software section of a software development work statement. This language is derived from the "Requirements for Secure Software" contained in this document.
"1.0 Secure Software Requirements
1.1 Key definitions:
1.1.1 "Secure software" means software that incorporates the appropriate software security controls for a software intensive system's security category in order to meet software security objectives.
1.1.2 "Software security controls" mean the management, operational, and technical controls (i.e., safeguards or countermeasures) prescribed for a software information system to protect
the confidentiality, integrity, and availability of the system and its information.
1.1.3 "Security category" means the characterization of information or an information system based on an assessment of the potential impact that a loss of confidentiality, integrity, or availability of such information or information system would have on organizational operations, organizational assets, or individuals.
1.1.4 "Software security objectives" means confidentiality, integrity, and availability.
1.1.5 "Software assurance case" means a reasoned, auditable argument created to support the contention that the defined software intensive system will satisfy software security requirements and objectives.
1.1.6 Include other appropriate definitions.
1.2 Security Category
[Suggested wording will be drafted in subsequent versions. We are developing a recommended way to categorize software intensive systems based on DoD's mission assurance categories. See DoD Instruction 8500.2, FIPS Pub 199, and others.]
1.3 Software Security Controls
[Suggested wording will be drafted in subsequent versions. We are developing a recommended way to identify appropriate software security controls based on DoD's mission assurance categories. See DoD Instruction 8500.2, FIPS Pub 199, and others.]
1.4 Software Assurance Case
The contractor shall refine the Software Assurance Case throughout the development process. The contractor shall submit the Case for review -----[specify when the Case should be reviewed, such as with the submission of the software design, etc.]
[Additional suggested wording will be included in subsequent versions. The Case would be used by the acquirer as an acceptance condition—it could also be used to develop incentives.]

6.2.2.8 Develop Software Assurance Language for a Statement of Work to Acquire COTS or Commercial Items, Including Incentives
[Suggested wording will be provided in later versions. We will review various references that provide "security" related language for terms and conditions.]

6.2.2.9 Develop Software Assurance Language for the Instructions to Suppliers
In addition to the statement of work, the organization should also include instructions regarding the information the supplier must submit to enable the organization to evaluate the supplier's proposal for developing the (secure) software.²
[² In the Federal Government, when using the uniform contract format [FAR Subpart 15.204-1] as both a solicitation and contract document, instructions to offerors or responders to a solicitation are separate from the work statement and included at the very end of the solicitation. When a contract is awarded as a result of the solicitation, those instructions are not included. Therefore, it is easy to exclude those instructions when they are separate from the work statement and included at the end of the solicitation.]
[Suggested wording will be refined in subsequent versions—it will include the instruction on the software assurance case, etc.]
"Instructions to Supplier (Offerors or Responders)
1.0 Definitions. See the statement of work.
2.0 In order for the acquirer to evaluate the proposed software assurance capabilities, the offeror must submit an initial Software Assurance Case that addresses the software security controls to meet the security objectives of the software intensive system as specified in the statement of work, and other characteristics as specified in paragraph 3.0 below. This initial Software Assurance Case will subsequently
become a part of the contract and be used by system, and assessing the system are of
the acquirer as an acceptance condition. appropriate depth, breadth, and integrity. The
3.0 Software Assurance Case
A software assurance case should present a convincing argument that the software intensive system will operate in an acceptably secure manner. The case should present definitive evidence, supported by process, procedure, and analysis, that a system and its software will be acceptably secure throughout its life cycle, including termination. The case should demonstrate that, within the totality of the environment in which the software will operate, any problems created by the software itself failing to operate as required have been identified and assessed, and that any necessary amelioration has been made. Subsequent to contract award, the contractor shall modify the case as development progresses and knowledge of the system and its software increases. All the lifecycle products which contribute to and are affected by the security of the software also need to be covered by the case.
Similar to a risk analysis, the following questions are examples of what should be of interest, be examined, and be analyzed through the use of the software assurance case:
a) How can a software failure occur?
b) What might the causes and conditions of failure be?
c) What are the consequences of failure?
d) What is the level of criticality of the consequences?
e) How often is it likely that the failure will occur?
Software assurance cases are likely to contain significant numbers of complex interdependencies that typically result from a wide range of related analyses. They often rest upon a number of explicit, as well as implicit, assumptions and can have a long history, going through multiple versions in the course of their production. Both product and process issues need to be addressed in the case; it must be shown both that the system meets its software assurance requirements and that the processes for deriving the requirements, constructing the system, and assessing the system are of appropriate depth, breadth, and integrity. The analyses crucially depend upon the formulation of suitable models of the system, at various levels of abstraction, produced during the development process. Given these characteristics, it should be noted that software assurance cases are resource intensive to create and support.”
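The structure just described (claims about acceptably secure operation, an argument decomposing them into subclaims, and evidence produced by process, procedure, and analysis, all resting on recorded assumptions) lends itself to a simple tree representation. The following Python sketch is illustrative only; the class, field, and function names are our own and are not prescribed by this document or by any assurance case notation.

    # Illustrative sketch only: a minimal tree model of an assurance case
    # (claim -> subclaims -> evidence) with recorded assumptions.
    # All names here are hypothetical, not taken from this document.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Evidence:
        description: str   # e.g., "static analysis report, build 42"
        source: str        # the process, procedure, or analysis that produced it

    @dataclass
    class Claim:
        statement: str
        assumptions: List[str] = field(default_factory=list)    # explicit assumptions
        evidence: List[Evidence] = field(default_factory=list)
        subclaims: List["Claim"] = field(default_factory=list)  # the supporting argument

    def unsupported(claim: Claim) -> List[Claim]:
        """Return leaf claims that have no evidence: the gaps an assessor would probe."""
        if not claim.subclaims:
            return [] if claim.evidence else [claim]
        gaps: List[Claim] = []
        for sub in claim.subclaims:
            gaps.extend(unsupported(sub))
        return gaps

    top = Claim(
        statement="The system will operate in an acceptably secure manner",
        subclaims=[
            Claim("Failure modes have been identified and assessed",
                  evidence=[Evidence("threat model v1.2", "design analysis")]),
            Claim("Consequences and criticality of each failure are documented"),
        ],
    )
    for gap in unsupported(top):
        print("No evidence yet for:", gap.statement)

A representation along these lines also makes questions a) through e) concrete: each subclaim can record the failure mode, causes, consequences, criticality, and likelihood it addresses, and successive versions of the case can be kept under configuration control.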
6.2.2.10 Develop Software Assurance Language (Evaluation Criteria) for
the Source Selection Plan
6.2.3 Preparation of Response--Supplier
6.2.3.1 Scope
The response to a request for proposals should reflect the software assurance capabilities that have been specified in the terms and conditions, statement of work, and evaluation factors, including the incorporation of an assurance case.
6.2.3.2 Prepare Software Assurance Design and/or Architecture
An information assurance architecture is a framework that assigns and portrays IA roles and behavior among all IT assets and prescribes rules for interaction and interconnection. Likewise, a Software Assurance Design and/or Architecture plays a similar role for secure software assurance.
6.2.3.3 Prepare a Software Assurance Plan
During the acquisition/supply of software, a myriad of plans is required to ensure quality, security/safety, timely delivery within cost, and the use of appropriate engineering best practices.
These plans include the Software Development Plan (reflects security requirements and assurance-oriented activities, including the assurance case); Project Risk Management Plan; Software Test Plan (software-related security vulnerability testing, user testing, stress testing, and/or IV&V); Systems Engineering Plan; Project Quality Assurance Plan; Project Configuration Management Plan; Project Training Plan; Project Information Assurance or Security Plan; Project Incident Response Plan; and Software Contingency Plan. Program officers may opt to include appropriate software assurance items in those plans or to develop a stand-alone plan. In either case, the plans include creating and sustaining an assurance case. A Software Assurance Plan should be developed no later than program initiation, defining the software assurance requirements, processes, procedures, and means of verification, validation, and accreditation for the software system. An example of its format is as follows:
Software Assurance Plan
a. Purpose and Scope (includes having a valid security policy and meeting it)
b. Organization and Responsibilities
c. Staff Qualification
d. Software Assurance Life Cycle (software assurance tasks, products, and documentation by phase)
e. Software Assurance Records
f. Configuration Management (special or unique software assurance requirements)
g. Software Security Risks and Mitigation (identification of vulnerabilities and risks associated with the development and deployment of software and the steps that will be taken to mitigate those risks)
h. Verification & Validation (special or unique software assurance requirements)
i. Tool Support (special or unique software assurance modeling and simulation)
j. Subcontract Management (special or unique software assurance requirements)
k. Process and Product Certification
l. Assurance Case (structure, arguments, and evidence)
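Because the plan format above is a fixed outline (items a. through l.), drafting progress against it can be tracked mechanically. The short Python sketch below is illustrative only; the section names are taken from the outline above, while the tracking approach and variable names are our own.

    # Illustrative sketch: track which sections of a Software Assurance Plan
    # (items a. through l. above) have been drafted.
    PLAN_SECTIONS = [
        "Purpose and Scope",
        "Organization and Responsibilities",
        "Staff Qualification",
        "Software Assurance Life Cycle",
        "Software Assurance Records",
        "Configuration Management",
        "Software Security Risks and Mitigation",
        "Verification & Validation",
        "Tool Support",
        "Subcontract Management",
        "Process and Product Certification",
        "Assurance Case",
    ]
    drafted = {"Purpose and Scope", "Assurance Case"}  # hypothetical status

    missing = [s for s in PLAN_SECTIONS if s not in drafted]
    print(f"{len(PLAN_SECTIONS) - len(missing)}/{len(PLAN_SECTIONS)} sections drafted")
    for s in missing:
        print("still to draft:", s)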
6.2.4 Source Selection--Acquirer
6.2.4.1 Scope
Upon receipt of the proposals from the supplier, the source selection process occurs.
6.2.4.2 Evaluate Proposals
Review proposals and evaluate the following against the stated requirements, evaluation factors, and terms and conditions specified in the request for proposals:
• Software security-oriented capabilities;
• Security-oriented aspects of the proposed process and products (and services);
• Software assurance capabilities and the proposed assurance case.
6.2.4.3 Determine Acceptable Proposals
Determine the acceptability of the proposed security policy and its flexibility, software assurance risks and associated mitigation strategies, and the complete assurance case.
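Evaluations of this kind are often recorded on a weighted scoring sheet so that competing proposals can be compared consistently. The Python sketch below is one hypothetical way to do so; the factor names paraphrase the bullets in 6.2.4.2, while the weights, scores, and offeror names are invented for illustration, not drawn from this document.

    # Illustrative sketch: a weighted scoring sheet for comparing proposals
    # against software assurance evaluation factors. Weights and scores are
    # invented for illustration; real evaluations use the factors and
    # weights stated in the request for proposals.
    FACTORS = {  # factor -> weight (weights sum to 1.0)
        "security-oriented capabilities": 0.4,
        "security aspects of process and products": 0.3,
        "assurance capabilities and proposed assurance case": 0.3,
    }

    def weighted_score(scores: dict) -> float:
        """Combine per-factor scores (0-10) into one weighted total."""
        return sum(FACTORS[f] * scores[f] for f in FACTORS)

    proposals = {
        "Offeror A": {"security-oriented capabilities": 7,
                      "security aspects of process and products": 6,
                      "assurance capabilities and proposed assurance case": 8},
        "Offeror B": {"security-oriented capabilities": 5,
                      "security aspects of process and products": 9,
                      "assurance capabilities and proposed assurance case": 6},
    }
    for name, scores in sorted(proposals.items(),
                               key=lambda kv: weighted_score(kv[1]),
                               reverse=True):
        print(f"{name}: {weighted_score(scores):.1f}")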
6.3 Contract Finalization and Management—Acquirer
6.3.1.1 Scope
This phase includes the final decision on contract language and the preparation of the contract, as well as the management of the performance under the contract.
6.3.1.2 Contract Finalization
• Determine and incorporate negotiated agreements, to include terms and conditions, on software assurance into the final contract;
• Plan for interaction with, and oversight of, development (if warranted and possible).
6.3.1.3 Contract/Program Management
• Develop a schedule to support software assurance-related oversight and project reviews and audits, including oversight of any sub-contract management activities;
• Determine and oversee software assurance-related certifications and/or certification planning requirements;
• Review software assurance aspects of the Program Risk Management Plan and Software Assurance Plan in light of proposed design, development, and test plans;
• Identify and provide qualified software assurance personnel to
o provide oversight;
o review project documentation;
o perform Configuration Control Board or Spiral Development Control Board functions;
• Perform/request a software assurance intelligence update (FOCI review);
• Review the intelligence update (FOCI review) to determine changes in software vulnerabilities and risks;
• Periodically review software assurance risks and mitigation to determine the need for software assurance-related requirements changes;
• Review and approve software assurance-related documentation, particularly the assurance case, ensuring that it is up-to-date and acceptable;
• Oversee software assurance-related developmental testing (DT), user testing, and IV&V;
• Oversee the testing for software assurance vulnerabilities.3
• Ensure all needed corrective actions are taken. If risks rise to an unacceptable level, take appropriate actions.
3 Commercial-off-the-shelf (commercial items) purchases.
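The last two management bullets imply a periodic comparison of current software assurance risk against an acceptability threshold. The Python sketch below shows one hypothetical way to record that comparison; the scale, threshold, and risk entries are invented for illustration, since real programs define their own risk criteria.

    # Illustrative sketch: a minimal risk register supporting periodic
    # software assurance risk review. Scale and threshold are invented.
    ACCEPTABLE = 6  # hypothetical threshold on a 1-9 (likelihood x impact) scale

    risks = [
        # (risk description, likelihood 1-3, impact 1-3)
        ("unvetted third-party component in build", 3, 3),
        ("assurance case not updated since design review", 2, 2),
    ]

    for name, likelihood, impact in risks:
        level = likelihood * impact
        status = "take corrective action" if level > ACCEPTABLE else "monitor"
        print(f"{name}: level {level} -> {status}")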
6.3.2 Contract Finalization and Management—Supplier
6.3.2.1 Scope
The response to a request for proposals should reflect the software assurance capabilities specified in the terms and conditions, statement of work (or statement of objectives), and evaluation factors.
6.3.2.2 Contract Finalization
• Review software assurance-related changes/adjustments requested by client;
• Negotiate and incorporate changes/adjustments requested by client;
• Plan for interaction with, and quality assurance of, work under the contract.
6.3.2.3 Contract/Program Management:
• Develop a schedule to support software assurance-related oversight, project reviews and audits, including oversight of any sub-contract management activities;
• Determine and obtain software assurance-related certifications and/or certification planning requirements;
• Make software assurance-related modifications to the Program Risk Management Plan and other project documentation;
• Provide qualified supervisory and development personnel;
• Identify and provide qualified software assurance personnel;
• Identify and provide qualified software assurance personnel to perform Configuration Control Board or Development Control Board functions;
• Perform a software assurance intelligence update (FOCI review) of prime and 2nd tier suppliers, contractors;
• Review the intelligence update (FOCI review) to determine changes in software vulnerabilities and risks;
• Regularly review the assurance case to ensure its quality and that it is up-to-date and acceptable;
• Review software assurance-related changes and adjustments requested by client and propose solutions;
• Develop test documentation;
• Perform DT, user testing, and IV&V;
• Test for software assurance vulnerabilities.4
4 Commercial-off-the-shelf (commercial items) purchases.
6.4 References
[Abrams 1998] Abrams, M. D., Security Engineering in an Evolutionary Acquisition Environment. New Security Paradigms Workshop 1998.
[DoDD 5200.39] Department of Defense Directive 5200.39, Security, Intelligence and Counterintelligence Support to Acquisition Program Protection. 10 September 1997.
[DoDI 8500.2] Department of Defense Instruction 8500.2, Information Assurance (IA) Implementation. 6 February 2003.
[FISMA] Federal Information Security Management Act of 2002, 44 U.S.C. § 3541 et seq.
[FIPS Pub 199] Federal Information Processing Standard (FIPS) Publication 199, Standards for Security Categorization of Federal Information and Information Systems. February 2004.
[IEEE Std 1062, R2002] IEEE Recommended Practice for Software Acquisition.
[MOD Def Std 00-42, Part 3, 2003] Ministry of Defence Standard 00-42 Issue 2, Reliability and Maintainability (R&M) Assurance Guidance. Part 3, “R&M Case,” 6 June 2003.
[NIST Spec Pub 800-37] National Institute of Standards and Technology (NIST) Special Publication 800-37, Guide for the Security Certification and Accreditation of Federal Information Systems. May 2004.
[NIST Spec Pub 800-53] National Institute of Standards and Technology (NIST) Special Publication 800-53, Recommended Security Controls for Federal Information Systems. February 2005.
[NIST Spec Pub 800-64] National Institute of Standards and Technology (NIST) Special Publication 800-64 Rev 1, Security Considerations in the Information System Development Life Cycle. June 2004.
[Rasmussen 2004] Rasmussen, Michael, with Natalie Lambert, Security Assurance Needed In Software Development Contracts. Research Report, Forrester, May 24, 2004.
[Intek] http://www.intek.net/Secure/PIS/PIS.htm
7 Bibliography
[Abran 2004] Abran, Alain and James W. Moore, eds. SWEBOK: Guide to the Software Engineering Body of Knowledge, 2004 Version. IEEE Computer Society, 2004.
[ACM] ACM Transactions on Information and System Security, Association for Computing
Machinery.
[Atallah, Bryant and Stytz 2004] Atallah, Mikhail, Eric Bryant, and Martin Stytz, “A Survey of Anti-Tamper Technologies,” Crosstalk – The Journal of Defense Software Engineering, Nov. 2004.
[Anderson 2001] Anderson, Ross J., Security Engineering: A Guide to Building Dependable
Distributed Systems. John Wiley and Sons, 2001.
[Apvrille 2005] Apvrille, Axelle, and Makan Pourzandi, “Secure Software Development by Example,”
IEEE Security and Privacy, p. 10-17, July/August 2005
[Avizienis 2004] Avizienis, Algirdas, Jean-Claude Laprie, Brian Randell, and Carl Landwehr. “Basic
Concepts and Taxonomy of Dependable and Secure Computing,” IEEE Transactions on Dependable
and Secure Computing, vol. 1, no. 1, Jan.-Mar. 2004, pp. 11-33.
[Barker 2004] Barker, William C. “Guide for Mapping Types of Information and Information Systems
to Security Categories,” NIST Special Publication 800-60, June 2004.
[Barnes 2003] Barnes, John. High Integrity Software: The SPARK Approach to Safety and Security,
Addison Wesley 2003
[Beitler 2003] Beitler, Michael A., Strategic Organizational Change Practitioner Press International;
January 17, 2003.
[Bishop 2003] Bishop, Matt. Computer Security: Art and Practice, Addison-Wesley, 2003.
[Boehm 2003], Boehm, Barry, and Richard Turner, Balancing Agility and Discipline: A Guide for the
Perplexed. Addison-Wesley 2003.
[Bosworth and Kabay 2002] Bosworth, Seymour and Kabay, M. eds. Computer Security Handbook. 4th
Edition, John Wiley and Sons, 2002.
[Boudra 1993] Boudra,P., Jr. Report on rules of system composition: Principles of secure system
design. Technical Report, National Security Agency, Information Security Systems Organization,
Office of Infosec Systems Engineering, I9 Technical Report 1-93, Library No. S-240, 330, March
1993.
[Bourque and Dupuis 2004] Bourque, Pierre, and Robert Dupuis (Editors). Guide to the Software
Engineering Body of Knowledge. 2004 Edition. Los Alamitos, California: IEEE Computer Society,
Feb. 16, 2004.
[Broadfoot and Broadfoot 2003] Broadfoot, G. and P. Broadfoot, “Academia and Industry Meet: Some
Experiences of Formal Methods in Practice,” Proceedings of the Tenth Asia-Pacific Software
Engineering Conference, Chiang Mai, Thailand, December 2003, IEEE Computer Society.
[Bush 2000] Bush, W.R., J.D. Pincus, and D.J. Sielaff, “A Static Analyzer for Finding Dynamic
Programming Errors,” Software Practice and Experience, vol. 30, June 2000.
[Bynum 2001] Bynum, Terrell. “Computer Ethics: Basic Concepts and Historical Overview,” The
Stanford Encyclopedia of Philosophy, 2001 ed., Edward N. Zalta (ed.).
<http://plato.stanford.edu/archives/win2001/entries/ethics-computer/>.
[Bynum and Rogerson 2004] Bynum, Terrell (ed.) and Simon Rogerson. Computer Ethics and
Professional Responsibility: Introductory Text and Readings, Blackwell Publishing, 2004.
[Chaves et al, 2005] Chaves, C. H. P. C., L. H. Franco, and A. Montes. “Honeynet Maintenance Procedures and Tools,” Proc. 6th IEEE Systems, Man and Cybernetics Information Assurance Workshop (IAW 05), IEEE CS Press, 2005, pp. 252-257.
[Chirillo 2002] Chirillo, John. Hack Attacks Revealed: A Complete Reference for UNIX, Windows,
and Linux with Custom Security Toolset. Wiley Publishing, Inc., 2002.
[Christensen 2003] Christensen, Clayton M., The Innovator's Dilemma. HarperBusiness; January 7,
2003.
[Clark and Wilson 1987] Clark, David D. and David R. Wilson. “A Comparison of Commercial and
Military Computer Security Policies,” Proc. of the 1987 IEEE Symposium on Security and Privacy,
IEEE, 1987, pp. 184-196.
[Common Criteria Part 1] Common Criteria Project, Common Criteria for Information Technology
Security Evaluation Part 1: Introduction and general model, Version 2.1, CCIMB-99-031, August
1999.
[Common Criteria Part 2] Common Criteria Project, Common Criteria for Information Technology
Security Evaluation Part 2: Security Functional Requirements, Version 2.1. CCIMB-99-031, August
1999.
[COBIT ??] COBIT (Control Objectives for Information and Related Technology) framework for IT governance at http://www.isaca.org/Template.cfm?Section=COBIT_Online&Template=/ContentManagement/ContentDisplay.cfm&ContentID=15633.
[Croll 2004] Croll, Paul R. “Best Practice Approach for Software Maintenance - Sustaining Critical
Capability,” Proceedings, 16th Annual Systems and Software Technology Conference, IEEE, April
2004, pp. 1355-1440.
[CSTB 2004] Committee on Certifiably Dependable Software Systems. Summary of a Workshop on
Software Certification and Dependability. National Academies Computer Science and
Telecommunications Board, National Academies Press, 2004.
[Davis and Mullaney 2003] Davis, Noopur, and Mullaney, Julia, “The Team Software Process in
Practice: A Summary of Recent Results,” Technical Report CMU/SEI-2003-TR-014, September 2003.
[DCID 6/3 2000] Director Central Intelligence. Protecting Sensitive Compartmented Information
Within Information Systems (DCID 6/3) Manual. 24 MAY 2000.
[Deming 1986] Deming, W. Edward. Out of the Crisis. Cambridge, MA: MIT Center for Advanced
Engineering, 1986.
[Despotou 2004] Despotou, Georgios, and Tim Kelly, “Extending the Safety Case Concept to Address Dependability,” Proceedings of the 22nd International System Safety Conference, pp. 645-654, 2004.
[DIAM 50-4 1997] DIAM 50-4 Security of Compartmented Computer Operations. Department of
Defense (DoD) Intelligence Information System (DODIIS) Information Systems Security (INFOSEC)
Program, 30 April 1997.
[DITSCAP 1997] DoD Instruction 5200.40, DoD Information Technology Security Certification and
Accreditation Process (DITSCAP), December 30, 1997
[DoD8510.1-M 2000] DoD8510.1-M Application Manual, July 2000
[DoDI S-3600.2] DOD Instruction S-3600.2 Information Operations (IO) Security Classification
Guidance, 6 August 1998
[Evans 2005] Evans, S. and J. Wallner. “Risk-Based Security Engineering Through the Eyes of the
Adversary,” Proc. 6th Ann. IEEE Systems, Man and Cybernetics Information Assurance Workshop
(IAW 05), IEEE CS Press, 2005, pp.158-165.
[Fowler 1995] Fowler, C. A., & Nesbit, R. F. “Tactical deception in air-land warfare,” Journal of
Electronic Defense, vol. 18, no. 6. June, 1995, pp. 37-44 & 76-79.
[French 1999] French, Wendell L. Organization Development and Transformation: Managing Effective Change. McGraw-Hill/Irwin; 5th edition, July 13, 1999.
[Fichman 1993] Fichman and Kemerer, “Adoption of Software Engineering Process Innovations: The
Case of Object Orientation,” Sloan Management Review, Winter 1993, pp. 7-22.
[Flickenger 2003] Flickenger, Rob. Wireless Hacks. O’Reilly and Associates, Inc., 2003.
[Gasser] Gasser, M. Building a Secure Computer System. Van Nostrand Reinhold, 1988.
[Goertzel and Goguen 2005] Goertzel, Karen and Goguen, Alice, et al. Application Developer’s Guide
to Security-Enhancing the Software Development Lifecycle (DRAFT). UNPUBLISHED, June 2005.
[Goldenson and Gibson 2003] Goldenson, Dennis R. and Gibson, Diane L. “Demonstrating the Impact
and Benefits of CMMI”, Special Report CMU/SEI-2003-SR-009, The Software Engineering Institute,
Carnegie Mellon University, 2003.
[Gotterbarn 1991] Gotterbarn, Donald. “Computer Ethics: Responsibility Regained,” National Forum,
vol. 71, 1991, pp. 26-31.
[Graff and van Wyk 2003] Graff, Mark G. and Kenneth R. Van Wyk. Secure Coding: Principles and Practices. O'Reilly & Associates, 2003.
[Gutmann 2004] Gutmann, P. Cryptographic Security Architecture: Design and Verification. Springer-
Verlag, 2004.
[Hall 2002] Hall, Anthony and Rodrick Chapman. “Correctness by Construction: Developing a
Commercial Secure System,” IEEE Software, vol. 19, no. 1, Jan./Feb. 2002, pp.18-25.
[Hall 2004] Hall, Anthony, and Rod Chapman. “Correctness-by-Construction.” Paper written for the Cyber Security Summit Taskforce Subgroup on Software Process. January 2004.
[Harris 2003] Harris, Shon. All-in-One CISSP Certification. McGraw-Hill Inc., 2003.
[Hayes and Over 1997] Hayes, W. and J. W. Over, “The Personal Software Process (PSP): An
Empirical Study of the Impact of PSP on Individual Engineers.” CMU/SEI-97-TR-001, ADA335543.
Pittsburgh, PA: The Software Engineering Institute, Carnegie Mellon University, 1997.
[Herbsleb et al, 1994] Herbsleb, J. et al. "Benefits of CMM-Based Software Process Improvement:
Initial Results." CMU/SEI-94-TR-013, Software Engineering Institute, Carnegie Mellon University,
1994.
[HMAC 2002] “The Keyed-Hash Message Authentication Code (HMAC)”, FIPS 198, March 2002.
[Hoglund 2004] Hoglund, Greg, and Gary McGraw. Exploiting Software: How to break code.
Addison-Wesley, 2004
[Hoglund 2005] Hoglund, Greg and Jamie Butler. Rootkits : Subverting the Windows Kernel. Addison-
Wesley Professional, 2005
[Holz and Ravnal 2005] Holz, T. and F. Ravnal. “Detecting Honeypots and Other Suspicious
Environments,” Proc. 6th Ann. IEEE Systems, Man and Cybernetics Information Assurance Workshop
(IAW 05), IEEE CS Press, 2005, pp.29-36.
[Honeynet 2002] The Honeynet Project. Know Your Enemy. Addison-Wesley, 2002.
[Houston and King 1991] Houston, I., and S. King, "CICS Project Report: Experiences and Results
from the Use of Z," Proc. VDM 1991: Formal Development Methods, Springer-Verlag, New York,
1991.
[Howard 2005] Howard, Michael, David LeBlanc, John Viega, 19 Deadly Sins of Software Security,
McGraw-Hill Osborne Media; 1 edition, 2005
[Howard 2003] Howard, M., and S. Lipner, "Inside the Windows Security Push," IEEE Security &
Privacy, vol.1, no. 1, 2003, pp. 57-61.
[Howard 2002] Howard, Michael, and David C. LeBlanc. Writing Secure Code, 2nd ed., Microsoft
Press, 2002.
[Howard] Howard, M., J. Pincus, and J. Wing. “Measuring relative attack surfaces.” To appear in:
Proceedings of the Workshop on Advanced Developments in Software and Systems Security.
[Humphrey 2000] Humphrey, Watts S. Introduction to the Team Software Process, Reading, MA:
Addison Wesley, 2000.
[Humphrey 2002] Humphrey, Watts S. Winning with Software: An Executive Strategy. Reading, MA:
Addison-Wesley, 2002.
[Hunt et al, 2005] Hunt, C., J. R. Bowes, and D. Gardner. “Net Force Maneuver,” Proc. 6th Ann. IEEE
Systems, Man and Cybernetics Information Assurance Workshop (IAW 05), IEEE CS Press, 2005,
pp.419-423.
[IAD 2004] U. S. Department of Defense. Interim Report on “Software Assurance: Mitigating
Software Risks in the DoD IT and National Security Systems”. Washington, D.C.: Information
Assurance Directorate, Office of Assistant Secretary of Defense, (Networks and Information
Integration), September 2004.
[Ibrahim et al, 2004] Ibrahim, Linda, et al. Safety and Security Extensions for Integrated Capability
Maturity Models. Washington D.C.: United States Federal Aviation Administration, Sept. 2004.
[IEEE] IEEE Security and Privacy magazine and IEEE Transactions on Dependable and Secure
Computing. Institute for Electrical and Electronics Engineers Computer Society.
[IEEE 12207] IEEE/EIA Std. 12207.0:1996, Industry Implementation of International Standard
ISO/IEC 12207:1995 – Standard for Information Technology – Software Lifecycle Processes, Institute
of Electrical and Electronics Engineers, March, 1998.
[IEEE 1062] IEEE Std. 1062:1993, IEEE recommended practice for software acquisition, Institute for
Electrical and Electronics Engineers, 1993.
[ISC 2005] International Information Systems Security Certification Consortium, Code of Ethics,
<http://www.isc2.org>.
[ISO 12207] ISO/IEC Std. 12207:1995, Information Technology - Software Life Cycle Processes,
International Standards Organization, 1995.
[ISO 12207b] ISO/IEC Std. 12207:1995/2002, Information Technology - Software Life Cycle
Processes (Amendment 1), International Standards Organization, 1995.
[ISO 14764] ISO/IEC Std. 14764:1999, Information Technology – Software Maintenance,
International Standards Organization, 1999.
[ISO 15026] ISO/IEC Std. 15026:1998, Information Technology - System and Software Integrity
Levels, International Standards Organization, 1998.
[ISO 15288] ISO/IEC Std. 15288:2002 E, Systems Engineering – System Lifecycle Processes,
International Standards Organization, 2002 .
[ISO-15448] ISO/IEC TR 15446:2004, Information technology - Security techniques - Guide for the
production of Protection Profiles and Security Targets, JTC1/SC27 Technical Report, International
Standards Organization, 2004
[ISO 15408-3] International Standards Organization, International Standard ISO/IEC 15408-3:1999
Information technology – Security techniques – Evaluation criteria for IT security.
[ISO 17799] ISO/IEC Std. 17799:2000, Information Technology - Code of Practice for Information
Security Management, International Standards Organization, 2000.
[Jacquith 2002] Jacquith, Andrew. “The Security of Applications: Not All Are Created Equal.” At
Stake Research. http://www.atstake.com/research/reports/acrobat/atstake_app_unequal.pdf, February
2002.
[JDCSISSS 2001] Joint DoDIIS/Cryptologic SCI Information Systems Security Standards. DIA
DoDIIS Information Assurance (IA) Program, Official Use Only 31 March 2001.
[Jones 2000] Jones, Capers. Software Assessments, Benchmarks, and Best Practices, Reading, MA:
Addison-Wesley, 2000.
[Karger et al, 1990] Karger, Paul A., Mary Ellen Zurko, Douglas W. Benin, Andrew H. Mason, and
Clifford E. Kahn. “A VMM Security Kernel for the VAX Architecture,” 1990 IEEE Symposium on
Security and Privacy, IEEE, 1990.
[Kazman 2000] Kazman, R., M. Klein, and P. Clements. ATAM: Method for Architecture Evaluation. Technical Report CMU/SEI-2000-TR-004. Software Engineering Institute, Carnegie Mellon University, 2000.
[Kazman 2002] Kazman R., Asundi J., Klein M., Making Architecture Design Decisions: An
Economic Approach. SEI-2002-TR-035. Software Engineering Institute, Carnegie Mellon University,
2002.
[Kelly 2003] Kelly, T. P. Managing Complex Safety Cases, Department of Computer Science
University of York, 2003. www.cs.york.ac.uk/~tpk/sss03.pdf
[King et al. 2000] King, Steve, Jonathan Hammond, Rod Chapman, and Andy Pryor. “Is Proof More Cost-Effective Than Testing?” IEEE Transactions on Software Engineering, vol. 26, no. 8, August 2000.
[Kotter 1996] Kotter, John P., Leading Change. Harvard Business School Press; 1st edition January
15, 1996.
[Koziol et al. 2004] Koziol, Jack et al. The Shellcoder’s Handbook: Discovering and Exploiting
Security Holes. Wiley Publishing, Inc., 2004.
[Krutz and Vines 2003] Krutz, Ronald and Russell Vines. The CISSP Prep Guide. John Wiley and
Sons, 2003.
[Landwehr 2001] Landwehr, Carl, Computer Security, IJIS vol. 1, 2001, pp. 3-13.
[Leveson 1995] Leveson, Nancy G. Safeware: System Safety and Computers, Addison-Wesley, 1995.
[Leveson 2004] Leveson, Nancy. “A Systems-Theoretic Approach to Safety in Software-Intensive
Systems.” IEEE Transactions on Dependable and Secure Computing 1, 1 (January-March 2004): 66-
86.
[Linger 1994] Linger, Richard. “Cleanroom Process Model,” IEEE Software, IEEE Computer Society,
March 1994.
[Linger 2004] Linger, Richard, and Stacy Powell, “Developing Secure Software with Cleanroom
Software Engineering”. Paper prepared for the Cyber Security Summit Task Force Subgroup on
Software Process, February 2004.
[McGraw 2003] McGraw, Gary E., “On the Horizon: The DIMACS Workshop on Software Security”,
IEEE Security and Privacy, March/April 2003.
[McGraw and Morrisett] Gary McGraw and Greg Morrisett, “Attacking Malicious Code: A report to
the Infosec Research Council”, submitted to IEEE Software and presented to the Infosec Research
Council. http://www.cigital.com/~gem/malcode.pdf.
[McGraw 2004] McGraw, Gary, “Software Security”, IEEE Security and Privacy, to appear March
2004.
[Manadhata and Wing 2004] Manadhata, P. and J. M. Wing. "Measuring A System's Attack
Surface,"http://www-2.cs.cmu.edu/afs/cs/project/calder/www/tr04-102.pdf, CMU-TR-04-102, January
2004.
[Mantel 2002] Mantel, Heiko, “On the Composition of Secure Systems,” IEEE Symposium on Security
and Privacy, p. 88, 2002
[Meier 2004] Meier, J.D., Alex Mackman, Srinath Vasireddy, Michael Dunner, Ray Escamilla, and
Anandha Murukan, Improving Web Application Security: Threats and Countermeasures, Microsoft,
2004 http://download.microsoft.com/download/d/8/c/d8c02f31-64af-438c-a9f4-
e31acb8e3333/Threats_Countermeasures.pdf
[Meier 2005a] Meier, J.D., Alex Mackman, Blaine Wastell, Threat Modeling Web Applications,
Microsoft Corporation, May 2005 http://msdn.microsoft.com/security/default.aspx?pull=/library/en-
us/dnpag2/html/tmwa.asp
[Meier 2005b] Meier, J.D., Alex Mackman, Blaine Wastell, Cheat Sheet: Web Application Security
Frame, Microsoft Corporation, May 2005 http://msdn.microsoft.com/security/default.aspx?
pull=/library/en-us/dnpag2/html/tmwacheatsheet.asp?_r=1
[Meier 2005c] Meier, J.D., Alex Mackman, Blaine Wastell, Template Sample: Web Application Threat
Model, Microsoft Corporation, May 2005 http://msdn.microsoft.com/security/default.aspx?
pull=/library/en-us/dnpag2/html/tmwatemplatesample.asp?_r=1
[Merkow and Breithaupt 2004] Merkow, Mark S. and Jim Breithaupt. Computer Security Assurance.
Thomson Delmar Learning, 2004.
[Meunier 2004] Meunier, Pascal, “Secure Programming Educational Material,” Purdue University,
http://www.cerias.purdue.edu/homes/pmeunier/secprog/, 2004.
[Mills and Linger 2002] H. Mills and R. Linger, “Cleanroom Software Engineering,” Encyclopedia of
Software Engineering, 2nd ed., (J. Marciniak, ed.), John Wiley & Sons, New York, 2002.
[Ministry of Defence 2003b] Ministry of Defence. Defence Standard 00-42 Issue 2 Reliability and
Maintainability (R&M) Assurance Guidance Part 3 R&M Case, 6 June 2003.
[Ministry of Defence 2004a] Ministry of Defence. Interim Defence Standard 00-56 Safety
Management Requirements for Defence Systems Part 1: Requirements, 17 December 2004
[Ministry of Defence 2004b] Ministry of Defence. Interim Defence Standard 00-56 Safety
Management Requirements for Defence Systems Part 2: Guidance on Establishing a Means of
Complying with Part 1, 17 December 2004
[Moffett 2004] Moffett, Jonathan D. Charles B. Haley, and Bashar Nuseibeh, Core Security
Requirements Artefacts, Security Requirements Group, The Open University, UK, 2004
[Moffett and Nuseibeh 2003] Moffett, Jonathan D. and Bashar A. Nuseibeh. A Framework for Security
Requirements Engineering. Report YCS 368, Department of Computer Science, University of York,
2003.
[Moore 1999] Moore, Geoffrey A., Inside the Tornado : Marketing Strategies from Silicon Valley's
Cutting Edge. HarperBusiness; Reprint edition July 1, 1999.
[Moore 2002] Moore, Geoffrey A. Crossing the Chasm. Harper Business, 2002.
[Moteff 2004] Moteff, John, Computer Security: A Summary of Selected Federal Laws, Executive Orders, and Presidential Directives (Order Code RL32357), Congressional Research Service, April 16, 2004.
[NASA 1995] Formal Methods Specification and Verification Guidebook for Software and Computer
Systems: Volume 1: Planning and Technology Insertion. Available at
http://www.fing.edu.uy/inco/grupos/mf/TPPSF/Bibliografia/fmguide1.pdf, July 1995.
[Naur 1993] Naur, P. "Understanding Turing's Universal Machine - Personal Style in Program
Description", The Computer Journal, Vol 36, Number 4, 1993.
[Neumann 1995] Neumann, Peter G. Architectures and Formal Representations for Secure Systems,
Final Report SRI Project 6401, October 2, 1995
[Neumann 2000] Neumann, P. G. Practical architectures for survivable systems and networks.
Technical report, Final Report, Phase Two, Project 1688, SRI International, Menlo Park, California,
2000.
[Neumann 2003] Neumann, P.G. Principled Assuredly Trustworthy Composable Architectures (Draft),
Dec., 2003.
[NIST 2005] The National Institute of Standards and Technology. Common Criteria v. 3.0, July, 2005.
[NRC 1999] Committee on Information Systems Trustworthiness, Trust in Cyberspace, Computer
Science and Telecommunications Board, National Research Council, 1999.
[NSA 1990] National Security Agency, NSA/CSS Manual 130-1, Operational Computer Security,
October 1990.
[NSA 2002] National Security Agency, The Information Systems Security Engineering Process
(IATF) v3.1., 2002.
[NSA 2004] Information Assurance Directorate, National Security Agency, U.S. Government
Protection Profile for Separation Kernels in Environments Requiring High Robustness Version 0.621.
National Security Agency, 1 July 2004
[Open Group] Open Group. Security Design Patterns (SDP) Technical Guide v.1, April 2004.
[Parnas 1985] Parnas, D. L. and Weiss, D. M. “Active design reviews: principles and practices.” In
Proceedings of the 8th international Conference on Software Engineering (London, England, August
28 - 30, 1985). International Conference on Software Engineering. IEEE Computer Society Press, p.
132-136. 1985.
[Patterson and Conner 1983] Patterson, Robert W. & Conner, Darryl R. “Building Commitment to
Organizational Change.” Training and Development Journal, April 1983, pp. 18-30.
[Payne 2004] Payne, Jeffery E. “Regulation and Information Security: Can Y2K Lessons Help Us?”
IEEE Security and Privacy. March/April 2004
[Pfleeger 1997] Pfleeger, Charles, Security in Computing, Prentice Hall PTR, 1997.
[Pfleeger] Pfleeger, Shari Lawrence, and Les Hatton, "Investigating the Influence of Formal Method",
IEEE Computer, vol. 30, no. 2, Feb 1997.
[Powell 1999] Powell, S., C. Trammell, R. Linger, and J. Poore, Cleanroom Software Engineering:
Technology and Process, Addison Wesley, Reading, MA, 1999.
[Prasad 1998] Prasad, D. Dependable Systems Integration using Measurement Theory and Decision
Analysis. PhD Thesis, Department of Computer Science, University of York, UK, 1998.
[Pullum 2001] Pullum, L. L. Software Fault Tolerance, Artech House, 2001.
[Radack 2005] Radack, Shirley, editor. Standards for Security Categorization of Federal Information
and Information Systems. Federal Information Processing Standard (FIPS) 199, July 10, 2005.
[Rasmussen 2004] Rasmussen, Michael, with Natalie Lambert, Security Assurance Needed In
Software Development Contracts, Research Report, Forrester, May 24, 2004.
http://www.forrester.com/ER/Research/List/Analyst/Personal/0,2237,830,00.html
[Redwine 2004] Redwine, Samuel T., Jr., and Noopur Davis (Editors). Processes for Producing
Secure Software: Towards Secure Software. vols. I and II. Washington, D.C.: National Cyber Security
Partnership, 2004.
[Redwine 2005a] Redwine, Samuel T., Jr. Dependability Properties: Enumeration of Primitives and
Combinations with Observations on their Measurement. Commonwealth Information Security Center
Technical Report CISC-TR-2004-001, June 2005.
[Redwine 2005b] Samuel T. Redwine, Jr., Principles for Secure Software: A Compilation of Lists.
Commonwealth Information Security Center Technical Report CISC-TR-2005-002.
[Rogers 1995] Rogers, Everett. Diffusion of Innovations. Free Press, 1995.
[Ross 2005] Ross, Ron et al. “Recommended Security Controls for Federal Information Systems,”
NIST Special Publication 800-53, Feb. 2005.
[Rowe 2004] Rowe, Neil C. “Designing Good Deceptions in Defense of Information Systems,”
ACSAC 2004 http://www.acsac.org/2004/abstracts/36.html.
[S4EC] System Security, Survivability, and Safety Engineering Criteria (S4EC) Project, www.s4ec.org.
[SafSec Introduction] “Why SafSec?” SafSec: Integration of Safety and Security.
http://www.safsec.com/safsec_files/resources/50.6_Why_SafSec.pdf.
[SafSec Guidance] “SafSec Methodology: Guidance Material.” SafSec: Integration of Safety and
Security. http://www.safsec.com/safsec_files/resources/50_3_SafSec_Method_
Guidance_Material_2.6.pdf.
[SafSec Standard] “SafSec Methodology: Standard.” SafSec: Integration of Safety and Security.
http://www.safsec.com/safsec_files/resources/50_2_SafSec_Method_Standard_2.6.pdf
[Saltzer and Schroeder 1975] Saltzer, J. H. and M. D. Schroeder. “The protection of information in
computer systems,” Proceedings of the IEEE, vol. 63, no. 9, 1975, pp. 1278-1308.
Available on-line at http://cap-lore.com/CapTheory/ProtInf/.
[SCC Part 2 2002] Information Assurance Engineering Division, SPAWAR System Center Charleston. Security Certification Criteria for Information Assurance Enabled Systems and Infrastructures Part 2: Functional Certification Criteria, Version 0.2, 1 August 2002. http://www.s4ec.org/scc_part2_ver0-21.pdf
[Schell 2005] Roger Schell, Keynote Talk, International Workshop on Information Assurance, March
24, 2005.
[Schneier 2000] Schneier, Bruce. Secrets and Lies: Digital Security in a Networked World, John Wiley
& Sons, 2000.
[SDI 1992] Department of Defense Strategic Defense Initiative Organization. Trusted Software
Development Methodology, SDI-S-SD-91-000007, vol. 1, 17 June 1992.
[SEI] SEI, Technology Transition Practices, http://www.sei.cmu.edu/ttp/value-networks.html.
[Senge 1994] Senge, Peter M., The Fifth Discipline. Currency; 1st edition October 1, 1994.
[SHS 2002] “Secure Hash Standard (SHS)”, FIPS 180-2, August 2002.
[Skoudis 2002] Skoudis, Ed, Counter Hack: A Step-by-step Guide to Computer Attacks and Effective
Defenses, Prentice Hall, 2002.
[Smedinghoff 2003] Smedinghoff, Thomas J. “The Developing U.S. Legal Standard for
Cybersecurity,” Baker and McKenzie, Chicago, <www.bakernet.com/ecommerce/us%20cybersecurity
%20standards.pdf>, May 2003.
[Smedinghoff 2004] Smedinghoff, Thomas J. “Trends in the Law of Information Security,” World
Data Protection Report, The Bureau of National Affairs, Inc. vol. 4, no. 2, August 2004.
[Spivey 1992] Spivey, J.M. The Z Notation: A Reference Manual, 2nd Edition. Prentice-Hall, 1992.
[Sommerville 2004] Sommerville, I. Software Engineering. 7th ed. Pearson Education, 2004.
[Stoneburner et al, 2004] Stoneburner, Gary, Hayden, Clark, and Feringa, Alexis. Engineering Principles for Information Technology Security (A Baseline for Achieving Security). NIST Special Publication 800-27 Rev A, June 2004.
[Swiderski 2004] Swiderski, F. and Snyder, W. Threat Modeling. Microsoft Press, 2004.
[Szor 2005] Szor, Peter. The Art of Computer Virus Research and Defense. Addison-Wesley
Professional, 2005
[Thompson 2005] Thompson, H. H. and S. G. Chase. The Software Vulnerability Guide. Charles River
Media, 2005.
[US Army 2003] US Army, Field Manual (FM) 3-13: Information Operations: Doctrine, Tactics,
Techniques, and Procedures, 28th Nov., 2003.
[US DoD 1996] Joint Chiefs of Staff, DoD JP 3-58 Joint Doctrine for Military Deception, 31 May,
1996.
[US DoD 2004] U. S. Department of Defense. Interim Report on “Software Assurance: Mitigating
Software Risks in the DoD IT and National Security Systems”. September 2004.
[Vaughn 2003] Vaughn, Steven J. “Building Better Software with Better Tools”, IEEE Computer,
September 2003, Vol 36, No 9.
[Viega 2005] Viega, J. The CLASP Application Security Process. Secure Software, 2005.
[Viega 2000] Viega, John, et al. “Statically Scanning Java Code: Finding Security Vulnerabilities,”
IEEE Software, vol. 17, no. 5, Sept./Oct. 2000 , pp. 68-74.
[Viega and McGraw 2001] Viega, John, and Gary McGraw. Building Secure Software: How to Avoid
Security Problems the Right Way, Reading, MA: Addison Wesley, 2001.
[Viega and Messier 2003] Viega, John and Matt Messier. Secure Programming Cookbook. O’Reilly
and Associates, Inc., 2003.
[Walsh 2003] Walsh, L. "Trustworthy Yet?" Information Security Magazine, Feb. 2003.
See http://infosecuritymag.techtarget.com/2003/feb/cover.shtml.
[Weinberg] The Virginia Satir change model, adapted from G. Weinberg, Quality Software
Management, Vol. 4: Anticipating Change, Ch 3.
[Whitman and Mattord 2005] Whitman, Michael and Mattord, Herbert, Principles of Information
Security, 2nd Edition, Thomson Course Technology, 2005.
[Whitten 1999] Whitten, A. and Tygar, J.D. “Why Johnny Can’t Encrypt: A Usability Evaluation of
PGP 5.0,” Proc. Ninth USENIX Security Symposium, 1999
[Whittaker and Thompson 2003] Whittaker, James A., and Herbert H. Thompson. How to Break Software Security: Effective Techniques for Security Testing. Addison-Wesley, 2003.
[Wheeler 2003] Wheeler, David A. "Secure Programming for Linux and Unix HOWTO,"
http://www.dwheeler.com/secure-programs, March 2003.
[Wyk and McGraw 2005] van Wyk, Kenneth and Gary McGraw, After the Launch: Security for App
Deployment and Operations. Presentation at Software Security Summit, April 2005.
[Yee 2002] Ka-Ping Yee. User interaction design for secure systems. In Proceedings of the 4th
International Conference on Information and Communications Security. Springer-Verlag, 2002. LNCS
2513.
[Yee 2003] Ka-Ping Yee. Secure interaction design and the principle of least authority. Workshop on Human-Computer Interaction and Security Systems, part of CHI2003. ACM SIGCHI, 2003. http://sims.berkeley.edu/~ping/sid/yee-sidchi2003-workshop.pdf.
[Yee 2004] Ka-Ping Yee. “Aligning security and usability.” Security & Privacy Magazine, 2:
48–55, Sept–Oct 2004.
[Yee 2005] Ka-Ping Yee. Guidelines and strategies for secure interaction design. In Lorrie Cranor and
Simson Garfinkel, editors, Security and Usability. O’Reilly, 2005.
[Zwicky et al, 2000] Zwicky, Elizabeth D., Simon Cooper, and D. Brent Chapman. Building Internet
Firewalls (2nd ed.), O'Reilly, 2000.
8 Standards
ISO 6592:2000 Information Technology - Guidelines for the Documentation of Computer-Based Application Systems
ISO 9126-1:2001 Software Engineering - Product Quality - Part 1: Quality Model
ISO/TR 9126-2:2003 Software Engineering - Product Quality - Part 2: External Metrics
ISO/TR 9126-3:2003 Software Engineering - Product Quality - Part 3: Internal Metrics
ISO 12119:1994 Information Technology - Software Packages - Quality Requirements and Testing
ISO/CD 12119:200x Software Engineering - Software Product Evaluation - Requirements for Quality of COTS Software Product and Instructions for Testing
ISO 12207:1995 Information Technology - Software Life Cycle Processes
ISO 12207:1995/2002 Information Technology - Software Life Cycle Processes (Amendment 1)
ISO/TR 13335-1:1996 Information Technology - Guidelines for the Management of IT Security - Part 1: Concepts and Models for IT Security
ISO/FCD 13335-1:200x Information Technology - Management of Information and Communications Technology Security - Part 1: Concepts and Models for Information and Communications Technology Security Management
ISO/TR 13335-2:1997 Information Technology - Guidelines for the Management of IT Security - Part 2: Managing and Planning IT Security
ISO/TR 13335-3:1998 Information Technology - Guidelines for the Management of IT Security - Part 3: Techniques for the Management of IT Security
ISO/TR 13335-4:2000 Information Technology - Guidelines for the Management of IT Security - Part 4: Selection of Safeguards
ISO/TR 13335-5:2001 Information Technology - Guidelines for the Management of IT Security - Part 5: Management Guidance on Network Security
ISO 14143-1:1998 Information Technology - Software Measurement - Functional Size Measurement - Part 1: Definition of Concepts
ISO/AWI 14143-1:200x Information Technology - Software Measurement - Functional Size Measurement - Part 1: Definition of Concepts
ISO 14143-2:2002 Information Technology - Software Measurement - Functional Size Measurement - Part 2: Conformity Evaluation of Software Size Measurement Methods
ISO/TR 14143-3:2003 Information Technology - Software Measurement - Functional Size Measurement - Part 3: Verification of Functional Size Measurement Methods
ISO/TR 14143-4:2002 Information Technology - Software Measurement - Functional Size Measurement - Part 4: Reference Model
ISO/TR 14143-5:2004 Information Technology - Software Measurement - Functional Size Measurement - Part 5: Determination of Functional Domains for Use with Functional Size Measurement
ISO/TR 14516:2002 Information Technology - Security Techniques - Guidelines for the Use and Management of Trusted Third Party Services
ISO 14598-1:1999 Information Technology - Software Product Evaluation - Part 1: General Overview
ISO 14598-2:2000 Information Technology - Software Product Evaluation - Part 2: Planning and Management
ISO 14598-3:2000 Information Technology - Software Product Evaluation - Part 3: Developer's Guide
ISO 14598-4:1999 Software Engineering - Software Product Evaluation - Part 4: Process for Acquirers
ISO 14598-5:1998 Information Technology - Software Product Evaluation - Part 5: Process for Evaluators
ISO 14598-6:2001 Software Engineering - Product Evaluation - Part 6: Documentation of Evaluation Modules
ISO 14756:1999 Information Technology - Measurement and Rating of Performance of Computer-Based Software Systems
ISO 15026:1998 Information Technology - System and Software Integrity Levels
ISO 15288:2002 Systems Engineering - System Life Cycle Processes
ISO/TR 15504-1:1998 Information Technology - Software Process Assessment - Part 1: Concepts and Introductory Guide
ISO/CD 15504-1:200x Information Technology - Software Process Assessment - Part 1: Concepts and Vocabulary
ISO 15504-2:2003 Software Engineering - Process Assessment - Part 2: Performing an Assessment
ISO/TR 15504-3:2004 Information Technology - Software Process Assessment - Part 3: Guidance on Performing an Assessment
ISO/TR 15504-4:1998 Information Technology - Software Process Assessment - Part 4: Guide to Performing Assessments
ISO/FDIS 15504-4:200x Software Engineering - Process Assessment - Part 4: Guidance on Use for Process Improvement and Process Capability Determination
ISO/TR 15504-5:1999 Information Technology - Software Process Assessment - Part 5: An Assessment Model and Indicator Guidance
ISO/CD 15504-5:200x Information Technology - Software Process Assessment - Part 5: An Exemplar Process Assessment Model
ISO/TR 15504-7:1998 Information Technology - Software Process Assessment - Part 7: Guide for Use in Process Improvement
ISO/TR 15504-8:1998 Information Technology - Software Process Assessment - Part 8: Guide for Use in Determining Supplier Process Capability
ISO/TR 15504-9:1998 Information Technology - Software Process Assessment - Part 9: Vocabulary
ISO/TR 15846:1998 Information Technology - Software Life Cycle Processes: Configuration Management
ISO 15910:1999 Information Technology - Software User Documentation Process
ISO 15939:2002 Information Technology - Software Measurement Process Framework
ISO/FCD 15940:200x Information Technology - Software Engineering - Environment Services
ISO/DIS 16085:200x Information Technology - Software Life Cycle Processes - Risk Management
ISO/TR 16326:1999 Software Engineering - Guide for the Application of ISO 12207 to Project Management
ISO 17799:2000 Information Technology - Code of Practice for Information Security Management
ISO/CD 17799:200x Information Technology - Code of Practice for Information Security Management
ISO 18019:2004 Software and Systems Engineering - Guidelines for the Design and Preparation of User Documentation for Application Software
ISO/DTR 19759:200x Software Engineering - Guide to the Software Engineering Body of Knowledge
ISO/AWI 19796:200x Information Technology - Learning, Education, and Training - Quality Management, Assurance, and Metrics
ISO/CD 25000:200x Software Engineering - Software Product Quality Requirements and Evaluation (SQuaRE) - Guide to SQuaRE
ISO/CD 25020:200x Software and System Engineering - Software Quality Requirements and Evaluation (SQuaRE) - Measurement Reference Model and Guide
ISO/CD 25021:200x Software and System Engineering - Software Product Quality Requirements and Evaluation (SQuaRE) - Measurement Primitives
ISO/CD 25030:200x Software Engineering - Software Quality Requirements and Evaluation (SQuaRE) - Quality Requirements