
Standards and Guidelines Notebook

July 1, 2009
To: Members of the Special Committee on Joint Development and
Product/Project Task Forces Chairpersons
From: Technical & Application Architecture Task Force
Subject: AASHTOWare Standards & Guidelines Notebook

Enclosed for your use is the AASHTOWare Standards & Guidelines Notebook. This notebook
contains all currently approved standards and guidelines. Please refer to the notebook's
introduction for explanations of its content, organization, scope, maintenance, and compliance
requirements. The latter of these, because of its importance, bears restatement.
Compliance with the approved AASHTOWare Standards is required from their effective
date. Any exception to the application of approved standards requires the approval of
the Special Committee on Joint Development.
All new contracts should include the approved standards and guidelines in this notebook.
These standards are living documents. They should be expected to change along with
AASHTOWare development practices and technology. User input is appreciated to ensure
that these documents always reflect these changing circumstances.
A summary of changes made since the previous approved release of the notebook is provided
as an attachment to this letter.
Questions concerning applications for exceptions should be directed to your SCOJD or AASHTO
Staff Liaison. Technical questions about the notebook and its contents may be directed to the
members of the T&AA Task Force.

cc: AASHTO Staff and T&AA Task Force members


Attachment: Summary of Changes
Summary of Changes

Standards and Guidelines Notebook


The following summarizes the changes that have been made to this version of the Standards
and Guidelines Notebook since the previous approved release of the notebook.
● The cover letter was updated and a summary of changes was attached.
● The Introduction was reformatted and updated.
● All standards and guidelines have been renumbered and reformatted, and include a new
cover page.
● Standard numbers are appended with an “S” and guideline numbers with a “G”. The
numbers of reference or informational documents are appended with an “R”.
● All standards and guidelines are published in the notebook in numerical order with a single
table of contents in lieu of the previous grouping of all standards and then all guidelines.
● The AASHTOWare Standards and Guidelines Definition Standard (1.010.01S) replaces the
previous Standards Proposal and Approval Standard (1.02.010.03). This standard
describes the process used by AASHTOWare to establish and maintain standards and
guidelines, the Standards and Guidelines Notebook, and associated collaboration
workspaces.
● The Development Methodology Guideline (1.020.02G) was reformatted and renumbered.
The previous guideline number was 1.02.G35.01.
● The Requirements Standard (3.010.02S) replaces the previous Requirements Management
Standard (3.01.001.01).
■ Provides new procedures for requirements development, analysis, and
validation, in addition to those for requirements management.
■ The System Requirements Specification (SRS) now must include security, accessibility,
interface, user interface, and performance requirements. Interface requirements should
include data transfer/exchange requirements and the use of XML for new development.
■ The requirements that describe the approach for compliance with Section 508 of the
U.S. Rehabilitation Act and the Web Content Accessibility Guidelines (WCAG) of the
World Wide Web Consortium Web Accessibility Initiative (W3C WAI) must be included
with the accessibility requirements.
■ Some required elements of the User Requirements Specification (URS) and the
Requirements Traceability Matrix (RTM) have been eliminated.
■ The Requirements Activity Log and Requirements Acceptance Criteria work products
have been eliminated.
● The XML Standard (3.015.01S) replaces the previous XML Implementation & Migration
Guideline (3.03.G20.01). Compliance with this standard is now required.
■ Supplements the Requirements Standard.
■ Provides direction for developing data transfer and data exchange requirements.
■ Requires the use of XML for data transfer/exchange for new development.
● The Security Standard (3.020.01S) is a new standard created to provide direction for
developing security requirements and implementing security in AASHTOWare products.

Page 1 06/16/2009

● The Product Graphical Interface Standard (3.030.03S) was reformatted and renumbered.
The previous standard number was 3.03.010.02.
● The Communication Interface Guideline (3.03.G40.01) was discontinued and removed from
the notebook.
● The Database Selection and Use Guideline (3.040.02G) replaces the previous Database
Selection Guideline (3.03.G50.01). This guideline, which provides guidance for applying
industry standards in the use of databases for AASHTOWare products, has been updated
and reformatted.
● The Product Documentation Standard (3.050.04S) was reformatted and renumbered. The
previous standard number was 3.04.020.03.
● The Glossary of Product Terminology Standard (3.060.03S) was reformatted and
renumbered. The previous standard number was 3.04.040.02.
● The Application Design Guideline (3.03.G30.03) was discontinued and removed from the
notebook.
● The Testing Standard (3.080.02S) was reformatted and renumbered. The previous
standard number was 3.06.001.01.
● The Product Release Checklists Standard (3.085.05S) was reformatted and renumbered.
References to previous standard numbers within the standard were also changed. The
previous standard number was 3.04.010.04.
● The Installation and Use Training Guideline (3.090.02G) was reformatted and renumbered.
The previous guideline number was 3.04.G50.01.
● The Quality Assurance Standard (4.010.02S) is a new standard that describes the
AASHTOWare quality assurance process.
■ Replaces interim standard used during a pilot QA process.
■ Deliverables are submitted for evaluation twice during the fiscal year.
■ The second evaluation involves a visit/meeting at the contractor site.
■ Evaluation reports are created by an AASHTOWare QA analyst to document areas of
non-compliance with approved standards.
● The Disaster Recovery Standard (4.020.02S) was reformatted and renumbered. The
previous standard number was 4.01.030.01.
● The AASHTOWare Lifecycle Framework (ALF) document (5.010.01R) replaces the
previous AASHTOWare Lifecycle Framework Process Areas (1.01.G01.01) and
AASHTOWare Lifecycle Framework Work Products (1.01.G02.01) documents. The revised
ALF document has been simplified and the amount of content has been reduced. The
document describes the framework used for AASHTOWare process improvements. In
addition, the document describes each process area with the related process areas, specific
goals, specific practices, and typical work products. A general reference to the existing
AASHTOWare standards, guidelines, policies, and procedures that support each process
area is also provided. The ALF document is now located in the notebook Appendices.
● The Standards and Guidelines Glossary has been updated and moved to the notebook
Appendices.

AASHTOWare Standards and Guidelines
Table of Contents
Introduction
1-Process Management
1.010S AASHTOWare Standards and Guidelines Definition Standard
1.020G Development Methodology Guideline

2-Project Management
Project management standards will be created in future versions of
the notebook.

3-Software Engineering
3.010S Requirements Standard
3.015S XML Standard
3.020S Security Standard
3.030S Product Graphical Interface Standard
3.040G Database Selection and Use Guideline
3.050S Product Documentation Standard
3.060S Glossary of Product Terminology Standard
3.080S Testing Standard
3.085S Product Release Checklists Standard
3.090G Installation and Use Training Guideline

4-Support
4.010S Quality Assurance Standard
4.020S Disaster Recovery Standard

5 - Appendices
5.010R AASHTOWare Life Cycle Framework
5.020R Standards and Guidelines Glossary
Introduction

AASHTOWARE STANDARDS & GUIDELINES NOTEBOOK


1. Introduction
The Special Committee on Joint Development (SCOJD) formed the Technical & Application
Architecture (T&AA) Task Force to provide standards & technical guidance for the development
of AASHTOWare software products. The purpose of these standards and guidelines was and is
to maximize the return on investment, improve the quality, and increase the usefulness of the
products.
Later, SCOJD determined that there was a need to improve AASHTOWare
development practices. The AASHTOWare Lifecycle Framework (ALF) was developed to
investigate and recommend potential process improvements. ALF provides a framework for
creating AASHTOWare process improvement projects. These projects involve the development
of new standards and guidelines and the revision of existing standards and guidelines that are
based on goals and practices within the framework.
AASHTOWare’s goal is to improve its software development and maintenance processes, and,
subsequently, improve AASHTOWare products. Over time most of the existing standards and
guidelines will be redeveloped to include elements that address process improvement.
The AASHTOWare Lifecycle Framework document (5.010.01R), located in the Appendices of
the notebook, describes the framework and its implementation within current and planned
standards and guidelines.
While pursuing the above purposes or objectives, the following principles should be emphasized
and employed.
● Standards should be adaptable to changing technological and procedural circumstances so
as not to hamper product growth or viability. They should not be viewed as static, but rather
as dynamic specifications which can be easily revised whenever circumstances change and
be retired whenever they no longer achieve their objectives.

● Standards should not be developed or implemented for their own sake, but only where there
are apparent opportunities for benefiting AASHTOWare products.
● The development and implementation of standards should be a cooperative effort. All
participants in the AASHTOWare development process should be included in the
formulation, review, and implementation of standards, and their perspectives and
requirements should be respected.
● Standards should not ordinarily be applied retroactively. Their application should be
coordinated with scheduled product enhancements in order to avoid any unnecessary
disruptions in service or wasteful use of resources.
● Standards should be designed to avoid, as far as possible, increasing the administrative
burdens of the Project and Product Task Forces.
This notebook is designed to provide a repository and vehicle of communication for the
AASHTOWare Standards and Guidelines.


2. Organization
The notebook is divided into the following sections.
● 1 - Process Management
● 2 - Project Management
● 3 - Software Engineering
● 4 - Support
● 5 - Appendices
The standards and guidelines will be numbered to correspond to the sections to which they
belong. They will be ordered sequentially by number in their respective sections. For a more
detailed description of the numerical format, refer to the “AASHTOWare Standards and
Guidelines Definition Standard” (1.010.01S) in this notebook.
Guidelines, which are identified by the suffix “G” following the guideline number, are used to
convey suggestions and recommendations which may be useful to Project and Product Task
Forces but are not binding. Standards, however, which include an “S” suffix, must be complied
with from their effective date. Only approved standards and guidelines will be included in the
notebook. Reference or informational documents, such as the Glossary, include an “R” suffix.

3. Format
The “AASHTOWare Standards and Guidelines Definition Standard” (1.010.01S) describes the
format of standards and guidelines documents. Each standard and guideline includes sections
that provide the purpose of the standard and overviews of responsibilities and the
required/recommended deliverables and work products. This information should be useful to
the Project and Product Task Forces and the contractor for determining the applicability of the
standard to their endeavors.
Standards also include additional details regarding procedures, technical requirements, and
definitions of required work products and deliverables. Also, many of the standards use red
italicized text to highlight the activities that must be followed and work products that must be
produced in order to comply with the standard.

4. Requirements & Exemptions


All standards are in force from their effective date. Exemptions from their application require the
approval of the Special Committee on Joint Development.

1 – Process
Management
AASHTOWARE STANDARDS AND GUIDELINES DEFINITION STANDARD
S&G Number: 1.010.01S
Effective Date: July 1, 2009

Document History

Version No.: 01
Revision Date: 5/01/2009
Revision Description: Initial Version. Replaces the AASHTOWare Standards Proposal and
Approval Standard (1.02.010.03). Presented to T&AA. Made corrections. Reviewed by
stakeholders with no changes suggested. Additional minor changes and format modifications
for publishing were approved by T&AA on 06/16/2009.
Approval Date: 05/26/2009 (approved by SCOJD)
AASHTOWare Standards and Guidelines Definition Standard 1.010.01S

Table of Contents
1. Purpose ............................................................................................................... 1
2. Responsibilities .................................................................................................. 1
3. Deliverables and Work Products....................................................................... 2
4. Procedures.......................................................................................................... 2
4.1 Establish the Standards and Guidelines Workspaces.....................................2
4.1.1 Create S&G Notebook Workspace ...................................................................... 2
4.1.2 Copy Content to S&G Notebook Workspace....................................................... 3
4.1.3 Create S&G Development Workspace ................................................................ 3
4.2 Develop or Revise Standards and Guidelines..................................................3
4.2.1 Analyze the Objective or Assignment .................................................................. 3
4.2.2 Develop or Revise Standard or Guideline ........................................................... 4
4.2.3 Obtain T&AA and Stakeholder Feedback ............................................................ 5
4.2.4 Approve Standard or Guideline............................................................................ 6
4.2.5 Store Standard or Guideline in S&G Notebook Workspace ................................ 7
4.2.6 Delete Development Files and Sub Folder .......................................................... 7
4.2.7 Update ALF Document ........................................................................................ 7
4.3 Create and Publish the Standards and Guidelines Notebook.........................7
4.3.1 Compile New Notebook ....................................................................................... 7
4.3.2 Move Current Notebook to History....................................................................... 8
4.3.3 Move New Notebook to Current........................................................................... 8
4.3.4 Notify Stakeholders .............................................................................................. 8
4.4 Develop and Maintain Lifecycle Model Descriptions.......................................8
4.4.1 Develop Lifecycle Model ...................................................................................... 8
4.4.2 Review and Approve Lifecycle Model .................................................................. 9
4.4.3 Store and Publish the Life Cycle Model ............................................................... 9
4.4.4 Maintain Life Cycle Model .................................................................................... 9
4.5 Establish Standard Requirements and Customization Guidelines.................9
4.6 Request an Exception to Standards .................................................................9
4.7 Establish Measurement Repository ................................................................10
4.8 Maintain the Standards and Guidelines..........................................................10
5. Technical Requirements .................................................................................. 10
6. Deliverable and Work Product Definitions ..................................................... 11
6.1 Standard or Guideline......................................................................................11
6.1.1 Description ......................................................................................................... 11
6.1.2 Content............................................................................................................... 11
6.2 Standards and Guidelines Notebook ..............................................................12
6.2.1 Description ......................................................................................................... 12
6.2.2 Content............................................................................................................... 12
6.3 S&G Notebook Workspace ..............................................................................12
6.3.1 Description ......................................................................................................... 12
6.3.2 Content (by Workspace Tabs) ........................................................................... 12
6.4 S&G Development Workspace ........................................................................14
6.4.1 Description ......................................................................................................... 14
6.4.2 Content (by Workspace Tabs) ........................................................................... 14
6.5 Updates to the AASHTO Lifecycle Framework (ALF) Specification .............14
6.5.1 Description ......................................................................................................... 14
6.5.2 Content............................................................................................................... 14


1. Purpose
The AASHTOWare Standards and Guidelines Definition Standard defines the process used to
establish and maintain AASHTOWare standards and guidelines.
A standard describes mandatory procedures that must be followed, results that must be
produced, and technologies and technical specifications that must be used or adhered to during
the development and maintenance of AASHTOWare products. AASHTOWare standards are
created and implemented in order to ensure a consistent approach is used to develop, maintain
and deliver software products.
A guideline describes procedures, results, technical specifications and/or technologies that are
considered good practices to follow, produce, or use; however, these are not required. A
proposed standard or standard process may be initially implemented as a guideline with future
plans to implement it as a requirement. Refer to the glossary in the Standards and Guidelines
Notebook for additional definitions of terms used in this document.

2. Responsibilities
Where most of the AASHTOWare standards focus on the project/product task force and
contractor responsibilities, this standard focuses more on the responsibilities of the Special
Committee on Joint Development (SCOJD) and the Technical and Application Architecture
(T&AA) Task Force. SCOJD and T&AA are responsible for the majority of the work associated
with this standard, while the responsibilities of the task forces and contractors are limited.
The responsibilities of all AASHTOWare participants impacted by this standard are summarized
below. Additional details on these responsibilities are provided in the Procedures section of this
document.
● The Special Committee on Joint Development (SCOJD) is responsible for:
■ Defining the needs and setting the objectives for AASHTOWare process improvement,
and for new or revised standards and guidelines.
■ Approving all new and revised standards.
● The Technical and Application Architecture (T&AA) Task Force is responsible for:
■ Developing, reviewing, revising, and maintaining AASHTOWare standards, guidelines,
and related documentation; and for performing analysis and research associated with
these activities.
■ Reviewing all standards and guidelines annually to ensure that each document is
correct, up-to-date, and relevant.
■ Communicating and coordinating with the other AASHTOWare stakeholders to report
status, provide information, and resolve reported issues.
■ Approving guidelines, reference documents, the Standards and Guidelines Notebook,
and any related documentation, other than standards.
■ Approving editorial changes to standards that do not change the intent or the required
components of the standard.
■ Maintaining the Standards and Guidelines Notebook and the Groove workspaces used
to develop, collaborate, store, and share the standards and guidelines.
● The project/product task force chairpersons are responsible for:
■ Reviewing and reporting issues with all new or revised standards and guidelines.
■ Reviewing the current version of each standard and guideline as each is used and
applied, and reporting any issues resulting from the review or use of a standard or
guideline.


■ The chairpersons may choose to appoint designees (Task Force members, Technical
Advisory Group members, Technical Review Team members, or contractor personnel)
to assist in these efforts.
● AASHTO Staff is responsible for:
■ Reviewing and reporting issues with all revised or new standards, guidelines, and
related documentation.
■ Periodically reviewing the current version of each standard and guideline, and reporting
any issues resulting from these reviews.

3. Deliverables and Work Products


The following summarizes the work products that are created or updated by the T&AA Task
Force as a result of the procedures in this document. Definitions and content requirements are
provided in the Deliverable and Work Product Definitions section of this document.
● Standard or Guideline
● Standards and Guidelines Notebook
● S&G Notebook Workspace
● S&G Development Workspace
● Updates to the AASHTOWare Lifecycle Framework (ALF) document and Lifecycle Models

4. Procedures
This section describes the procedures used to develop, maintain, store, and publish individual
standards and guidelines and the complete AASHTOWare Standards and Guidelines Notebook.
Many of the procedures are broken down into activities and some activities may be further
broken down into tasks.

4.1 Establish the Standards and Guidelines Workspaces


The purpose of this procedure is to establish a repository workspace to store and access
approved standards and guidelines and a development/collaboration workspace to develop
and revise standards and guidelines. Both workspaces are created, maintained, and
accessed using the Microsoft Groove software. A workspace administrator is appointed by
the T&AA Task Force Chairperson to develop and maintain these workspaces.
The following activities to create the standards and guidelines workspaces are prerequisites
to all other procedures and activities defined in this standard.
4.1.1 Create S&G Notebook Workspace
This activity should be performed if the S&G Notebook workspace does not exist or
becomes corrupted and cannot be restored. A new workspace should be created in
accordance with the S&G Notebook Workspace work product definition, which is defined
in the Deliverable and Work Product Definition section of this document.
When completed, the S&G Notebook workspace should include tabs for each of the
following. Each of these is discussed in the work product definition.
○ Current S&G Notebook
○ S&G History
○ Next Version
○ Document Library
○ Discussion
The workspace administrator and the T&AA chairperson should be provided with
read/write access to this workspace. All project/product task force members,
contractors, SCOJD members, T&AA members, and AASHTO staff members should be
provided with read access to this workspace.
4.1.2 Copy Content to S&G Notebook Workspace
After the workspace is created, the current version of the Standards and Guidelines
notebook should be copied to the Current S&G Notebook tab. Previous versions should
be copied to root level folders in the S&G History tab. Each version of the notebook
should include the complete notebook in both PDF and word processing formats,
and all files and documents needed to create the complete notebook, including:
○ The word processing documents for all standards and guidelines included in the
notebook;
○ The word processing documents or other editable file for the title page, cover letter,
introduction, table of contents, glossary, and any other supplemental documents
used to create the notebook; and
○ All files or documents used to compose each word processing document, such as
inserted documents, tables, or drawings.
All documents and files should be copied to the tabs using the file structure, content,
format, and naming conventions described in the S&G Notebook Workspace work
product definition.
4.1.3 Create S&G Development Workspace
This activity should be performed if the S&G Development workspace does not exist or
becomes corrupted and cannot be restored. A new workspace should be created in
accordance with the S&G Development Workspace work product definition, which is
defined in the Deliverable and Work Product Definition section of this document. The
S&G Development workspace will not include any content. All content is created or
deleted through the Develop or Revise Standards and Guidelines procedure below.
All T&AA members and the AASHTO and SCOJD liaisons should be provided with
read/write access to the workspace. SCOJD members and AASHTO staff should be
provided with read access to the workspace.

4.2 Develop or Revise Standards and Guidelines


The purpose of this procedure is to define the activities that should be performed to develop
a new standard or guideline or to revise an existing standard or guideline. These activities
begin with the initial analysis to determine if a new or revised standard or guideline is
needed and end with an approved standard or guideline which is ready to be published in
the next version of the Standards and Guidelines Notebook.
The creation and publishing of a new version of the Standards and Guidelines Notebook are
discussed in the Create and Publish the Standards and Guidelines Notebook procedure
below.
4.2.1 Analyze the Objective or Assignment
This activity is performed when T&AA is provided with an objective from the
AASHTOWare Strategic Plan or an assignment from SCOJD that may require an
addition or revision to the standards and guidelines. T&AA begins by analyzing the
objective or assignment to determine the best approach for satisfying the objective or
assignment. The analysis should consider each of the following:
○ Analyze the objective/assignment and determine if it can be satisfied by revising one
or more existing standards or guidelines. This analysis typically involves the review
of existing standards and guidelines, as well as review of the ALF process areas. If
revisions are required to address the objective/assignment, one or more action items
should be defined to revise the applicable standards and guidelines.
○ Analyze the objective/assignment and the ALF process areas and determine if the
goals and practices from one or more process areas should be used to satisfy the
objective/assignment. When applicable, ALF-based solutions should be chosen first
to satisfy the objective/assignment. In this case, one or more action items should be
defined to create a new standard or guideline based on ALF goals and practices.
○ Analyze the objective/assignment and determine if it can be satisfied by creating one
or more new standards or guidelines for processes, practices, and/or requirements
that are not included in existing standards, guidelines or within the ALF process
areas. If this method is selected, one or more action items should be defined to
create a new standard or guideline.
If one or more action items are created from the above analysis, these items should be
included as tasks in future T&AA work plans or added to the current fiscal year's work plan.
When approved, these action items should be implemented using the Develop or Revise
Standard or Guideline procedure described below.
If no action items are created to revise or develop new standards or guidelines, the
T&AA chair should advise SCOJD that another approach will be required to satisfy the
objective/assignment.
4.2.2 Develop or Revise Standard or Guideline
This activity is performed when the T&AA chairperson assigns an analyst the
responsibility for developing a new standard or guideline or revising an existing standard
or guideline. This analyst is normally a T&AA Task Force member or contractor. The
analyst should follow the steps listed below to accomplish the development or revision of
the standard or guideline.
○ Create a new folder for the standard or guideline in the S&G Development
workspace in accordance with the S&G Development Workspace work product
definition.
○ In the case of a revision, copy all documents and files associated with the standard
or guideline from the Current S&G Notebook tab to the standard or guideline folder in
the S&G Development workspace.
○ Collect documentation from the previous analysis above, including the
documentation on the ALF process areas, goals, practices, and work products that
are applicable to developing or revising the standard or guideline and for satisfying
the objective that initiated the assignment.
○ If needed, collect documentation from CMMI-DEV or other source documentation
used for defining applicable ALF process areas, goals, practices, and work products.
○ Perform the appropriate industry research required to develop or revise the standard
or guideline.
○ Review the applicable methods that are currently being used and work products
being created by the project/product task forces, contractors, SCOJD, AASHTO staff
or T&AA.
○ To ensure consistency in each standard or guideline, the AASHTOWare Standard
Template must be used when creating a standard or guideline. The template is used
to define the document layout, fonts, and the required type of content. The template
is stored in the S&G Development workspace. Refer to the Standard or Guideline
work product definition for information on the specific content required in each
standard or guideline and for information regarding the location and use of the
AASHTOWare Standard Template.
○ If a standard or guideline is to be based on one or more ALF process areas, then the
specific practices of those process areas should be used to define the procedures of the
standard or guideline. Current methods used within AASHTOWare should also be
considered for inclusion in the procedures.
Page 4 06/16/2009
AASHTOWare Standards and Guidelines Definition Standard 1.010.01S
○ For standards and guidelines not based on ALF specific practices, use the applicable
information gathered from industry research to define the appropriate procedures
needed to implement the objective of the standard or guideline.
○ Develop each procedure with the appropriate level of detail needed to understand
how to use the procedure. If needed for clarity or simplicity, divide the procedure into
lower level activities and tasks.
○ Develop each standard so that it clearly states what is expected or required of the various stakeholders and which elements of the standard may be customized. Refer to the Establish Standard Requirements and Customization Guidelines procedure for additional information.
○ Each standard and guideline should include the definition and content of the required
or recommended deliverables and work products to be produced during the lifecycle
stages associated with the standard or guideline.
For the purposes of AASHTOWare standards and guidelines, a work product is
defined as a result or artifact of the software development or project management
process. Many of the work products described in a standard are also defined as
deliverables. A deliverable is also a work product; however, deliverables must be
planned and tracked in the project/product work plan and must be formally submitted
to the task force for approval or rejection.
○ For those standards and guidelines based on ALF specific practices, the applicable
work products defined in the ALF specification document should be included in the
standard or guideline work products or deliverables. Work products and deliverables
recommended through industry research and current AASHTOWare practices should
also be considered.
○ Standards and guidelines that are not based on ALF specific practices should also
consider the work products and deliverables from industry research and current
AASHTOWare practices.
○ Ensure that the procedures describe when a deliverable or work product is to be
produced, who is responsible for producing it, and the type of review and approval
required.
○ Each standard and guideline should recommend that all deliverables and work products be versioned, stored, and controlled using configuration management procedures.
○ Define the applicable technical specifications for the standard or guideline.
○ In the case of guidelines, all procedures, work products and technical specifications
should be defined as recommendations or best practices; whereas, standards will
typically include compliance requirements for the majority of these items.
4.2.3 Obtain T&AA and Stakeholder Feedback
During the revision or development of a standard or guideline, the analyst should initiate
reviews by T&AA and AASHTOWare stakeholders, obtain feedback, and make
modifications, as required, to address the feedback. The steps listed below should be
followed.
○ Make presentations at T&AA Task Force meetings and communicate, as required,
with T&AA members and liaisons to review working documents, provide status
information, and to collect comments and issues. The T&AA review of the standard
or guideline should include review for gaps, overlaps, and proper integration with
other standard and guidelines. T&AA online reviews are implemented through
Groove document discussions in the S&G Development workspace.
○ Address the T&AA issues and comments and prepare the standard or guideline for
additional stakeholder review.
○ If the document is a standard, the T&AA Chairperson will distribute the new or
revised standard to the SCOJD chairperson, AASHTO staff manager, and the
project/product task force chairpersons. Reviews should be requested from SCOJD,
task forces, contractors, and AASHTO staff; and comments and issues should be
solicited for return to T&AA by a specific date. The distribution of the standard and
the return of comments and issues are normally accomplished by email.
○ Since there is no requirement to comply with a guideline, guidelines will have limited impact, if any, on the project/product task forces or contractors. T&AA will normally
request a review by the above stakeholders for feedback on the content of the
guideline and the usefulness of the guideline prior to approving the guideline.
○ Prior to beginning the review, the workspace administrator copies the standard or guideline to the _Stakeholder Review folder in the Files tab of the S&G Development workspace. The T&AA chairperson distributes the standard from this folder. After the review is complete, the standard or guideline is deleted from this folder.
○ The stakeholders should review the standard or guideline for how it satisfies the
original objective or assignment, use and understanding of the document,
applicability to each task force, and applicability to the AASHTOWare organization.
For a standard, the review should also determine if the standard introduces any
problems or issues for the stakeholders.
○ The analyst responsible for the standard or guideline or other T&AA members should
communicate with the stakeholders as required to provide additional information or
answer questions. If needed, presentations should be made to assist with
communication and understanding.
○ T&AA will review the issues and comments from the stakeholder review and address
those that are warranted. For standards, stakeholder reviews are repeated, as
required, to address major issues.
4.2.4 Approve Standard or Guideline
After the T&AA and stakeholder reviews are completed and the appropriate revisions are
made, the standard or guideline must be approved as described below:
○ All guidelines are approved by the T&AA Task Force.
○ Revisions to standards which only affect the format or readability of the document,
and do not change the meaning or the impact on stakeholders, are also approved by
the T&AA Task Force.
○ All other revisions to standards and new standards must be approved by SCOJD as
follows:
□ The workspace administrator copies the standard to the _SCOJD Approval folder in the Files tab of the S&G Development workspace. After SCOJD approval, the standard is deleted from this folder.
□ The T&AA chairperson initiates the approval of a new or revised standard by preparing a cover letter to the SCOJD chairperson requesting SCOJD approval of the standard. The cover letter is sent by email to the SCOJD chairperson and to the SCOJD liaison to T&AA.
□ The SCOJD liaison to T&AA copies the standard to the appropriate location for
SCOJD balloting.
□ SCOJD approves or rejects the standard and the SCOJD chairperson notifies the
T&AA chairperson of the approval decision. When rejected, the reason for
rejection is included in the communication to the T&AA chairperson.
□ If the standard is rejected, T&AA reviews the reason for rejection and makes the
appropriate changes to the standard. The approval process is repeated with
submission of the new document to the SCOJD chairperson.
4.2.5 Store Standard or Guideline in S&G Notebook Workspace
After a standard or guideline is approved, the workspace administrator should copy all
files used to compose and edit the standard or guideline to the S&G Notebook
workspace in the Next Version tab. Any presentations, educational materials, checklists,
or other materials that would be useful in implementation or use of the standard or
guideline should also be copied to the Next Version tab.
In addition, a file containing a brief summary of the revisions made should be created
and stored with a revised standard or guideline. For new documents, the file should
contain a brief summary of the purpose of the new standard or guideline.
The folder location and the naming convention for the above files are defined in the S&G
Notebook Workspace work product definition.
4.2.6 Delete Development Files and Sub Folder
After copying the files for the approved standard or guideline to the S&G Notebook
workspace, the workspace administrator should delete the folder and files for the
standard or guideline from the S&G Development workspace. Any remaining files that
are considered important should be copied to an archive workspace. If information from
a discussion tab is needed, it can be exported to XML and then imported into another
workspace. Information needed from other types of tabs in the S&G Development
workspace that is relevant to the standard or guideline may need to be copied manually.
4.2.7 Update ALF Document
The ALF document includes a general reference to the standards, guidelines, policies,
and procedures that support the implementation of each process area. Prior to
publishing a new or revised standard or guideline, the ALF document should be updated
to reflect those process areas supported by the standard or guideline. Other internal
AASHTOWare documentation that tracks the progress of the ALF implementation should
also be updated.
Any changes to the ALF document should be reviewed by the T&AA Task Force and the AASHTO and SCOJD liaisons. The revised document should then be approved by T&AA, stored in the S&G Notebook workspace as described above, and published with the next version of the notebook as described below. The ALF document is published as a reference document in the notebook.
4.3 Create and Publish the Standards and Guidelines Notebook
This procedure is initiated when a new version of the Standards and Guidelines Notebook is
to be created. All approved standard and guidelines and changes to the ALF document will
be included in the next published version of the notebook.
Since standards must be complied with from their effective date, the earliest effective date of all
approved standards will determine the next notebook’s effective date. Most standards are
created with an effective date at the beginning of the next fiscal year (July 1); however,
SCOJD may deem that certain standards become effective at an earlier date. Guidelines
are not binding and normally will not drive the publication of a new version of the notebook.
4.3.1 Compile New Notebook
The workspace administrator begins this activity by verifying that all standards,
guidelines, and ALF documentation that have been approved for the next version of the
notebook have been stored in the Next Version tab of the S&G Notebook workspace,
along with all needed supplemental files. If any approved standard or guideline or file is
missing, the administrator copies the missing files to the appropriate locations following
the activities described previously.
Next, all unchanged documents and files from the Current S&G Notebook tab that are
needed to create a new version are copied to the Next Version tab. This includes all
unchanged standards and guidelines plus the cover page, cover letter, introduction,
table of contents, glossary, and any other reference documents and supplemental files
used to create the complete notebook. All reference documents should be changed as
required to support the new notebook.
All files must be stored and named as described in the S&G Notebook Workspace work
product definition. After all files have been moved and verified, the title page, cover
letter, introduction, table of contents and other supplemental files are modified as
needed for the new version of the notebook. The summary files with each new or
revised standard or guideline should be used to create a summary of changes which is
included after the cover letter.
The complete Standards and Guidelines Notebook is created in PDF format. A separate
PDF file should be created for the current version of each standard, guideline, or
supplemental file needed to create the notebook. The new notebook is then created by
merging all of the PDF files into a PDF document which represents the complete
Standards and Guidelines Notebook. After reviewing the notebook document and
verifying that all content is correct and in the right order, the new notebook file must be
stored and named as described in the S&G Notebook Workspace work product
definition. The new notebook and all reference documents (cover page, cover letter,
summary of changes, table of contents, introduction, ALF document, and glossary) are
all approved by the T&AA Task Force.
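The compile step above merges the per-document PDF files in order of standard or guideline number. A minimal sketch of the ordering logic, assuming file names follow the “NNN AAA” convention from the work product definitions (the file names shown are hypothetical, and the PDF merge itself would be done with a PDF tool of choice, which is not shown):

```python
import re

def merge_order(filenames):
    """Sort section files by their leading standard/guideline number (NNN)."""
    def number(name):
        m = re.match(r"(\d{3})\s", name)
        return int(m.group(1)) if m else 999  # unnumbered files sort last
    return sorted(filenames, key=number)

ordered = merge_order(["080 Testing.pdf", "010 S&G Definition.pdf",
                       "070 Application Design.pdf"])
print(ordered)
# → ['010 S&G Definition.pdf', '070 Application Design.pdf', '080 Testing.pdf']
```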
4.3.2 Move Current Notebook to History
On or before the effective date of the new notebook, the previous version of the
notebook should be copied from the Current S&G Notebook tab to the S&G History
workspace. The root folder in the Current S&G Notebook tab should be copied to the S&G History workspace as a subfolder and renamed. Refer to the S&G Notebook Workspace work product definition for the naming convention for S&G History folders.
After verifying that all content was copied correctly to the S&G History workspace, all subfolders and content in the Current S&G Notebook tab should be deleted.
4.3.3 Move New Notebook to Current
After copying the previous version to history, the subfolders in the Next Version tab should be copied to the Current S&G Notebook tab under the root folder. The new notebook should be in place by the effective date.
After verifying that all content was copied correctly to the Current S&G Notebook tab, all subfolders and content in the Next Version tab should be deleted.
4.3.4 Notify Stakeholders
After the new version of the notebook is available for use, the T&AA chairperson will
provide the T&AA AASHTO Staff liaison with the complete notebook in PDF format, and
the location of the notebook in the S&G Notebook Workspace. AASHTO Staff will then
make all necessary copies and distribute both hardcopy and electronic copies of the
notebook.
4.4 Develop and Maintain Lifecycle Model Descriptions
The purpose of this procedure is to establish and maintain descriptions of the lifecycle
models approved for use in AASHTOWare development.
4.4.1 Develop Lifecycle Model
A lifecycle model is used to partition a project into major stages or phases, such as
requirements analysis and construction. Each stage is typically divided into major
activities and tasks, and includes key milestones and deliverables that must be
completed before the stage can end and the next stage is approved to begin.
The T&AA Task Force develops and maintains the lifecycle models that should be used
for AASHTOWare development and maintenance. The AASHTOWare models should
be described in a level of detail that allows them to be adapted to most development
methodologies, including waterfall, incremental, iterative, and Agile.
The procedures for adapting and customizing the lifecycle models for specific types of projects and methodologies should be included with the lifecycle descriptions. Any requirements that must be complied with when customizing the lifecycle models for specific projects should be clearly documented. Refer to the Establish Standard Requirements and Customization Guidelines section for additional information.
Where applicable, each standard or guideline should also describe the requirements
and/or recommended practices that must or should be followed during lifecycle model
stages.
4.4.2 Review and Approve Lifecycle Model
The approved lifecycle models are documented in the Standards and Guidelines
Glossary located in the appendices of the Standards and Guidelines Notebook. When
changes are made to the lifecycle model, the glossary document or other documentation
containing the lifecycle models should be reviewed and approved using the activities
described previously for guidelines.
4.4.3 Store and Publish the Life Cycle Model
The Standards and Guidelines Glossary, which includes the approved lifecycle models
document should be stored in the S&G Notebook workspace and published in the
Standards and Guidelines Notebook using the procedures defined previously for
guidelines.
4.4.4 Maintain Life Cycle Model
The lifecycle models should be reviewed routinely with the other standards and
procedures and modified as needed to address strategic objectives and issues found
during the reviews. When modified, the activities described above should be used to
approve, store, and publish the modified models.
4.5 Establish Standard Requirements and Customization Guidelines
Each standard should clearly describe the procedures and activities that must be followed
and the deliverables and work products that must be created, submitted, and/or approved.
The required elements of each standard should be clearly identified and should be noted at
the beginning of each standard. All new and revised standards will use red italicized text to
identify the required elements.
The procedures, activities, deliverables, and work products that are not required are based on best practices and are recommended. These may be implemented or customized as deemed appropriate by the project/product task force and contractor. Some standards
may also provide additional details on what may be customized and the approach that may
be used for customizing certain elements.
4.6 Request an Exception to Standards
The project/product task force has the responsibility of ensuring that the required elements
defined in each standard are complied with. In the event that a requirement of the standard
cannot be complied with, the task force chair should advise the SCOJD or T&AA liaison
early in the project/product life cycle.
A formal request for an exception to the standard must also be submitted to the SCOJD.
The exception request is typically sent by letter from the task force chairperson to the
SCOJD chairperson. The letter should include all proposed changes and/or exclusions to
one or more standards, along with documentation that describes or justifies the reasons for the exception(s) and any additional documentation for SCOJD consideration.
The exception request must be submitted to SCOJD prior to beginning the stage of the
project where the applicable standards are to be used.
Approval of exceptions to the standards is under the purview of the SCOJD. SCOJD may
choose to obtain a recommendation from the T&AA Task Force and/or AASHTO staff prior
to making a decision to approve or reject the exception request. In this case, T&AA and/or
AASHTO staff reviews the request and returns their recommendation along with the reasons
for their recommendation to SCOJD.
After reviewing the recommendations from T&AA and/or AASHTO staff, SCOJD makes the
final decision to approve or reject the exception request. The SCOJD chairperson sends a
letter to notify the task force chairperson of the SCOJD decision to approve or reject the
exception request. If rejected, the reasons for the rejection are also provided. The SCOJD
members, AASHTO staff, and the T&AA chairperson should be copied on this letter.
The approval/rejection letter should also be submitted by the task force chairperson to the
AASHTOWare Quality Assurance Analyst prior to the evaluation of work products and
deliverables that are impacted by the exception request.
4.7 Establish Measurement Repository
The purpose of this procedure is to establish and maintain a repository of measurements for
use in AASHTOWare development. This procedure will be defined in a future update to this
standard, as processes are developed to use the measurements in the repository.
4.8 Maintain the Standards and Guidelines
The purpose of this procedure is to ensure that each standard and guideline is correct, up-to-date, and relevant. The T&AA Task Force will perform annual reviews of all standards and guidelines in the current Standards and Guidelines Notebook. These reviews should include, but not be limited to, the following analysis:
■ Determine if any hyperlinks embedded in the standards and guidelines are invalid.
■ Determine if each standard and guideline is still relevant and up to date with industry
directions.
■ Determine if there are issues in consistency, readability, and/or format with specific standards or guidelines when compared to the majority of the existing standards and guidelines.
T&AA may request assistance from the project/product task forces, contractors, and/or AASHTO staff in reviewing the standards and guidelines or in validating issues. In addition, the users of the standards and guidelines should report any issues found while applying the
standards and guidelines to ongoing development and maintenance efforts.
If any issues are found during the reviews or reported by stakeholders, T&AA will review
these and determine which issues warrant corrective actions. For those issues requiring
corrective actions, T&AA will create tasks for the current fiscal year or future work plans.
Issues found with hyperlinks and other minor issues that do not change the meaning or
impact of a standard or guideline are normally corrected in the current fiscal year.
All tasks to correct, revise, eliminate, or replace standards and guidelines must follow the previously defined procedures and activities for revising, developing, reviewing, approving, storing, and publishing standards and guidelines.
5. Technical Requirements
There are no technical requirements for this standard.
6. Deliverable and Work Product Definitions
This section describes the deliverables and work products that are prepared, reviewed,
approved, and saved during the review, creation, and update of the AASHTOWare Standards
and Guidelines.
6.1 Standard or Guideline
6.1.1 Description
This work product definition is used to define the required content, optional content,
format, and structure for new standards and guidelines. The work product definition is
also used when revising an existing standard or guideline to bring an existing document
into compliance with the AASHTOWare Standards and Guidelines Definition Standard.
6.1.2 Content
The following describes the content required for each standard and guideline. The
AASHTOWare Standard Template is used to document the content in a consistent
format, font, style, and structure. The template is a Microsoft Word template and is
stored in the S&G Development Workspace under the Files Tab in the AASHTOWare
Standard Template folder.
○ Cover Sheet
□ Standard or Guideline Name
□ Standard or Guideline Number and Version – Each standard or guideline should
be named using a “C.NNN.VVS” format.
◊ C is a number 1-5 which represents the category of the standard or guideline. The current categories are defined below under the S&G Notebook Workspace definition.
◊ NNN is the number 001-099 which represents the standard or guideline
number within the category. These are currently numbered in increments of
5 and 10.
◊ VV is a number 01-99 which represents the version number of a standard or
guideline.
◊ S is a suffix that indicates the document type, with
− S for standards,
− G for guidelines, and
− R for reference or informational documents (cover page, cover letter,
summary of changes, table of contents, introduction, ALF document, and
glossary, etc.)
Examples of document numbers include 3.080.02S for the Testing Standard and
3.070.03G for the Application Design Guideline.
□ Effective Date of the Standard or Guideline
□ Document History – includes entries for each new version
◊ Version, Date, Revision Description, and Approval Date
○ Table of Contents
○ Purpose – Describe the purpose of the standard or guideline.
○ Task Force/Contractor Responsibilities – Summarize the task force and contractor
responsibilities regarding the standard. If needed, include the responsibilities of
T&AA, SCOJD, and AASHTO staff.
○ Required (or Recommended) Deliverables and Work Products – Summarize the
required deliverables and work products that must be prepared and saved in order to
comply with a standard, as well as any optional ones. In the case of a guideline, the deliverables and work products should all be designated as recommended. If this
section is not applicable to the standard or guideline, note “Not Applicable”.
○ Procedures – Define the procedures that must be carried out to comply with a
standard or to follow the intent of a guideline. If this section is not applicable to the
standard or guideline, note “Not Applicable”.
○ Technical Requirements (or Technical Recommendations) - Define the technical
requirements that must be met to comply with a standard or technical
recommendations for a guideline. If this section is not applicable to the standard or
guideline, note “Not Applicable”.
○ Appendices - Create one or more appendices as required to document any
information needed to supplement the primary content of the standard or guideline.
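The “C.NNN.VVS” numbering scheme described above lends itself to a mechanical check. A minimal sketch (the regular expression is an illustration, not part of the standard):

```python
import re

# "C.NNN.VVS": category 1-5, three-digit document number, two-digit
# version, and a type suffix of S (standard), G (guideline), or
# R (reference/informational document).
DOC_NUMBER = re.compile(r"^[1-5]\.\d{3}\.\d{2}[SGR]$")

def is_valid_doc_number(number):
    return bool(DOC_NUMBER.match(number))

print(is_valid_doc_number("3.080.02S"))    # Testing Standard → True
print(is_valid_doc_number("3.070.03G"))    # Application Design Guideline → True
print(is_valid_doc_number("1.02.G35.01"))  # pre-2009 numbering → False
```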
6.2 Standards and Guidelines Notebook
6.2.1 Description
The Standards and Guidelines Notebook is the official published document that contains
all AASHTOWare standards and guidelines. The currently approved notebook is stored
in the Current S&G Notebook tab of the S&G Notebook workspace.
6.2.2 Content
The notebook is divided into the following sections and subsections:
○ Cover Letter and Summary of Changes
○ Table of Contents
○ Introduction
○ Standards and Guidelines organized by Category and ordered by Standard or
Guideline Number within each Category. The Categories are discussed under the
S&G Notebook Workspace below.
○ Appendices
□ AASHTOWare Lifecycle Framework (ALF) document
□ Standards and Guidelines Glossary
The standards and guidelines are numbered as described in the Standards and Guidelines work product definition. The format and content of the standards and guidelines are also described in that work product definition. The standards and guidelines are ordered in the notebook sequentially by number in their respective sections. The notebook should use a naming convention of “MMDDYYYY S&G Notebook”, where MMDDYYYY is the notebook effective date.
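The notebook naming convention above can be derived directly from the effective date; a brief illustrative sketch:

```python
from datetime import date

def notebook_file_name(effective):
    """Build the 'MMDDYYYY S&G Notebook' name from the effective date."""
    return effective.strftime("%m%d%Y") + " S&G Notebook"

print(notebook_file_name(date(2009, 7, 1)))  # → '07012009 S&G Notebook'
```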
6.3 S&G Notebook Workspace
6.3.1 Description
The S&G Notebook Workspace is a repository workspace used to store and access the current version of the Standards and Guidelines Notebook, previous versions of the notebook, and approved content for the next version of the notebook. All files and documents needed to create the complete versions of the notebook are also stored in the workspace.
The workspace is created, maintained, and accessed using Microsoft Groove 2007
software.
6.3.2 Content (by Workspace Tabs)
○ Welcome – This tab provides a cover sheet for the workspaces and includes a
description, date, and creator of the workspace.
○ Current S&G Notebook – This is a Files tab used to store the current version of the
notebook. The content of the notebook and the files used to create the notebook are
stored in the following hierarchical folder structure.
□ Category
◊ Standard or Guideline
− Construction
− Additional folders (see explanation below)
The following represent the current categories used for the notebook folder structure.
The first four categories represent grouping of the ALF process areas, where the
Notebook category is used to store the complete notebook and items that pertain to
the complete notebook.
□ (1) Process Management
□ (2) Project Management
□ (3) Software Engineering
□ (4) Support
□ (5) Notebook
Each standard and guideline folder is created under the category for the standard or
guideline. These folders are named with a “NNN AAA” format, where NNN is the
standard or guideline number and AAA is the standard or guideline name. The word processing document and the PDF file for the standard or guideline are stored in this folder. These files should be named with the same “NNN AAA” format as the folder.
Files needed to construct the word processing document are stored in the
Construction folder. These files should be named with a “NNN BBB” format, where
NNN is the standard or guideline number and BBB identifies the content of the
construction file.
Additional folders may be created, as required, to contain training material,
presentations, or other information needed to assist with using the standard or
guideline.
The complete notebook is stored in PDF format in the Notebook folder. The
Construction folder includes the word processing documents and PDF files for the
cover letter, title page, glossary, table of contents, and introduction.
○ S&G History – This is a Files tab used to store previous versions of the notebook.
Each of these version folders is organized the same and includes similar content to
that discussed with the Current S&G Notebook.
○ Next Version - This is a Files tab used to store approved standards and guidelines that will be included in the next version of the notebook. The folder structure and content are the same as discussed with the Current S&G Notebook.
○ Document Library – This is a Files tab used to store reports, plans, example
deliverables, and other types of documents that are deemed useful to AASHTOWare
stakeholders. A folder is created for each major type of document that is stored or
grouping of documents.
○ Discussion – This tab is used for stakeholders to document issues and to initiate
discussion regarding the current version of the notebook.
6.4 S&G Development Workspace
6.4.1 Description
The S&G Development Workspace is a development and collaboration workspace used
to develop and revise standards and guidelines. The workspace is created, maintained,
and accessed using Microsoft Groove 2007 software.
6.4.2 Content (by Workspace Tabs)
○ Welcome – This tab provides a cover sheet for the workspaces and includes a
description, date, and creator of the workspace.
○ Files – This tab contains temporary folders for each standard or guideline that is currently being created or revised.
□ Standard or Guideline Folder – Each root-level folder is named for the standard or guideline that is being created or revised. These folders contain the documents that are being developed or revised for each standard or guideline.
◊ Reference – This folder contains additional reference materials collected
during the development or revision of a standard or guideline, such as
research and CMMI information.
◊ Additional folders may be created, as required, to contain training material,
presentations, or other information needed to assist with creating or using the
standard or guideline.
□ _Stakeholder Review - This folder contains standards and guidelines that have
been reviewed and updated by T&AA and are ready for stakeholder review. The
documents are copied from the appropriate standard/guideline folders to this
location when T&AA is ready to send out the documents for stakeholder review
and comment. After the review is completed, all files are deleted from this folder.
□ _SCOJD Approval – This folder contains completed standards and guidelines
that are pending SCOJD approval. The completed documents are copied to this
folder after stakeholder review and updates are completed. After SCOJD
approval, the files are deleted from this folder.
○ Document Discussion – This tab is used to post new or revised standards and
guidelines for review and comment by T&AA.
6.5 Updates to the AASHTO Lifecycle Framework (ALF) Specification
6.5.1 Description
The AASHTOWare Lifecycle Framework (ALF) was developed as a means to implement
process improvement in the AASHTOWare software development and maintenance
processes. The ALF document describes this framework through target process areas,
goals, and practices, and serves as the roadmap for AASHTOWare process
improvement. The ALF document includes a general reference to the standards,
guidelines, policies, and procedures that support the implementation of each process
area. Prior to publishing a new or revised standard or guideline, the ALF document
should be updated to reflect those process areas supported by the standard or guideline.
Other internal AASHTOWare documentation that tracks the progress of the ALF
implementation should also be updated.
6.5.2 Content
Refer to the AASHTOWare Lifecycle Framework (ALF) document in the Standards and
Guidelines Notebook appendices for additional details.

Page 14 06/16/2009
DEVELOPMENT METHODOLOGY GUIDELINE
S&G Number: 1.020.02G
Effective Date: July 1, 2009

Document History

Version No. | Revision Date | Revision Description | Approval Date
01 | April 1995 | Initial Version | Nov. 1996
02 | 6/10/2009 | Changed standard number from 1.02.G35.01 to 1.020.02G. Applied standard template formatting. Made minor changes and format modifications. | 06/16/2009, Approved by T&AA
Development Methodology Guideline 1.020.02G

Table of Contents
1. Background......................................................................................................... 1
2. Methodology Purpose and Requirements........................................................ 3
2.1 Why Use a Development Methodology.............................................................3
2.2 Requirements for an AASHTOWare Methodology ...........................................4
2.3 Proposal Overview .............................................................................................5
3. AASHTOWare Development Methodology ....................................................... 5
3.1 Methodology Background .................................................................................5
3.2 Methodology Overview ......................................................................................7
3.3 Methodology Objectives and Deliverables .....................................................12
4. Personnel .......................................................................................................... 18
4.1 Personnel Roles and Qualifications................................................................18
4.2 Personnel Organization ...................................................................................22
5. Training ............................................................................................................. 25
5.1 Contractor Training..........................................................................................25
5.2 Project/Product Task Force Training ..............................................................25
5.3 AASHTO Staff Training ....................................................................................25
6. Tools and Equipment ....................................................................................... 26
6.1 CASE Tool Selection........................................................................................26
6.2 Development Platform Considerations...........................................................28
6.3 Repository Platform Selection ........................................................................28
6.4 Target Platforms:..............................................................................................28
7. Projects ............................................................................................................. 28
7.1 Project Types....................................................................................................28
7.2 Roles Versus Project Types ............................................................................29
7.3 Project Types Versus the Model Levels .........................................................30
8. Recommendations ........................................................................................... 30
9. APPENDICES .................................................................................................... 32
9.1 APPENDIX A: AASHTOWARE LIFECYCLE CHARTS .....................................32
9.2 APPENDIX B: SYMBOLOGY EXAMPLES........................................................34
9.3 APPENDIX C: AASHTOWARE PROJECT MANAGEMENT CHARTS .............38


1. Background
Most businesses and organizations have seen the importance that information has in the
achievement of business objectives. Those enterprises which can effectively tailor their
information resources to business needs can more effectively compete in the marketplace. This
recognition of the importance of information and the emerging automation technologies that
facilitate its use, has given rise to much interest in development methodologies that are
integrated in and driven by business planning.
These methodologies usually prescribe a process which starts in the business planning cycle by
identifying the business areas and information needed to support the organization’s missions
and objectives as expressed in the plan. Strategies are formulated for providing information
systems which support the objectives. By using modeling techniques and successive stages of
refinement, a logical description of the procedures (processes) and information (data elements),
sufficient to define a particular information system’s requirements, is produced. From these
descriptions, the system can be designed and developed.
The best known of the business planning driven methodologies is Information Engineering, a
phased approach to information systems development which closely parallels the phases of
highway development. The Phases of Information Engineering (IE) are:
● Planning - Information Strategy Planning
● Analysis - Business Area Analysis
● Design - External Design or Functional Design
● Design - Internal Design or Technical Design
● Construction - Development
● Transition - Cutover
● Production.
The first attempts at IE development were frustrated by the lack of Computer-Aided Software
Engineering (CASE) tools. Not until CASE tools which supported the methodology became
commercially available did IE become a practical reality.
This approach with its view toward both horizontal and vertical integration avoids many of the
problems inherent in vertical (only) project-by-project development efforts that are conducted
without a business systems plan. The project-by-project method often results in systems whose
data cannot be integrated or shared and, in many cases, the resulting systems overlap each
other or alternatively leave gaps. Another benefit of this approach is the employment of models
which are suitable for use by both subject matter experts as well as information systems
personnel thus insuring a good correspondence between business requirements and the
resulting applications. In addition, when it is supported at all levels by commercially available
CASE tools, there is the promise of greatly improved efficiency and accuracy in the
development of information systems.
Along with the potential benefits there are some concerns to be resolved and overcome:
1. IE follows the customary waterfall approach to development which contributes to lengthy
production cycles with little provision for course correction in mid-stream.
2. IE does not provide an effective means for incorporating user input into the design and
construction phases of the development cycle.
3. IE is best for very large projects within large organizations. It requires too much overhead in
time and cost for small, self-contained projects.
4. Integrated CASE (I-CASE) tools do not uniformly develop quality applications in every target
environment. Many of the I-CASE tools do not use the full potentials of their target

environments. This is especially true of the microcomputer Graphical User Interface (GUI)
environments.
5. IE and I-CASE cause large cultural changes in the DP organization which, if not planned
for, can severely hamper the success of initial projects.
6. IE complexity lends itself to bureaucratization and can thus contribute to long development
cycles.
To reduce the impact of some of the above problems many organizations have adopted the
Rapid Application Development (RAD) methodology. This methodology can be either used by
itself or as a modification of IE in which case it replaces its last two phases (Design and
Construction). Some of the attributes of RAD which are improvements over standard IE are
described below:
1. RAD is designed to deliver applications within a fixed time period. This contributes to short
and predictable development cycles which will be useful in the AASHTOWare development
environment.
2. RAD provides for Joint Requirements Planning (JRP), Joint Application Design (JAD), and
Joint Application Construction (JAC) sessions which include the user in the planning,
design, and construction of the application. These sessions capture user information and
their approval, using the CASE planning, design, and construction tools. This minimizes the
requirement for manual transference of user information and insures that the application
satisfies user needs.
3. RAD uses iterative processes for design and construction which permits rapid cycling of
these phases as the application is refined. This also provides the flexibility to make course
corrections during the design and construction phases.
4. RAD maximizes the use of CASE tools by bringing them into play during user sessions. The
user can see his suggestions put into place during the session and can be assured that they
contribute to the usability of the application.
The combined phases of IE and RAD may be expressed as follows:
● Planning - Information Strategy Planning (ISP)
● Analysis - Business Area Analysis (BAA)
● Requirements- Joint Requirements Planning (JRP)
● Design - Joint Application Design (JAD)
● Construction - Joint Application Construction (JAC)
● Cutover - Implementation
The IE/RAD phases described above are a definite improvement over the IE approach when
used alone; however, they still have some drawbacks:
1. BAA is a global analysis of a whole business area. It requires a more specific analysis to
determine the needs of a particular application development effort. Also, within the context
of AASHTOWare development, it may be impossible to expend the efforts necessary to
perform a BAA. The RAD phases assume that the necessary analysis has been performed.
2. Although some iteration occurs (spiral development process) during the design and
construction phases of RAD, there is no provision for revisiting the analysis phase, which
either does not exist or is disguised in the JRP sessions.
3. The RAD phases do not emphasize the importance of testing (validation) or reusability
(generalization) to the development cycle.
4. RAD emphasizes joint sessions with the user (JRP, JAD and JAC) while it ignores many of
the more technological aspects of the development process.

To remove these weaknesses the following phases are proposed for the AASHTOWare
Development Methodology:
● Global or background phases to be performed continuously
■ Information Strategy Planning (ISP)
■ Business Area Analysis (BAA)
● Phases specific to a particular development project
■ Application Planning
■ Spiral Application Analysis
■ Spiral Application Design
■ Spiral Application Construction
■ Spiral Application Validation
■ Spiral Component Generalization
■ Application Implementation
Because most AASHTOWare development projects will be performed without the benefit of ISP
and BAA projects, this document will emphasize the phases pertaining to the development of a
particular application. This document will also emphasize use of the most current development
technologies and topologies (e.g. CASE development tools, Object Oriented Programming, and
Client/Server application structuring).

2. Methodology Purpose and Requirements
This document defines a methodology guideline, AASHTOWare Development Methodology
(ADM), which is intended to address some of the unique needs of AASHTOWare product
development. ADM is not meant to be a total methodology, completely defining every possibility
of the development process, nor is it, on the other hand, meant to be followed in all its
particulars. It is intended to provide information which is useful for defining a flexible
development framework that can be tailored to fit most commercially available CASE Tools, and
to support various project needs (Strategic Planning, Business Area Analysis, Application
Development and Maintenance).
This document is also meant to emphasize the importance of the methodological approach to
development, which can best be done by first explaining the purpose of development
methodologies in general.

2.1 Why Use a Development Methodology
1. Avoidance of Costly Mistakes - Methodologies such as IE and RAD have, over time,
become repositories of development experience. They have been amended and
supplemented as development pitfalls were discovered and new technology for
development and implementation became available.
2. Suitability to Requirements - Common failings of development projects have been that
their results do not address the real needs of the enterprise; they are not accepted by
their intended users; they are not completed on time, when they are needed; they
cost more than was anticipated; they have impossible implementation requirements (in
the areas of technology, implementation procedures, and cost); and their quality, upon
implementation, is unacceptable.
A good development methodology should provide tailorable procedures which can be
incorporated in development plans that address these failings.
○ It should provide for analysis of the enterprise’s organization, workflow, and culture
to plan for and insure product acceptability.

○ The methodology should provide techniques for defining the information and
procedural needs of the enterprise, for assessing their relative importance, and for
designing strategies for realization of these prioritized needs.
○ The methodology should provide metrics and procedures which permit the accurate
estimation of project cost and schedule. It should also require that projects are short
in duration, insuring that the product is still relevant to the circumstances it was
expected to address.
○ It should provide for implementation planning that covers all contingencies which
would prevent an easy transition to the new product. These include the introduction
of new technology, changes in work environment, changes in organizational
structure, changes in user skill requirements, preservation of enterprise historical
information, transitions to new work procedures and information, changes in quality
of business product (not necessarily always improved).
○ It should provide for testing procedures which touch all aspects of product quality.
These should include operational quality (no errors), performance, satisfaction of
business requirements, user satisfaction, ease of implementation, supportability,
and documentation.
3. Technology Exploitation - The methodology should promote, certainly not impede,
automation of the development process (adaptable to the many useful and commercially
available CASE Tools). It should support the major application paradigms (Client-
Server, Batch, Data and Processing Distribution, and Object Oriented Programming). It
should also provide development techniques which insulate the application from the
technical environment, improving product portability.

2.2 Requirements for an AASHTOWare Methodology
In addition to the more general methodology purposes described above, there are
requirements specific to AASHTOWare development. They arise from the differences
between the AASHTO organization and the more ordinary software development
enterprises. Some of the differences, which have
necessitated customization of the methodology, are the following:
1. AASHTOWare development is administered and managed through the Special
Committee on Joint Development and the Project/Product Task Forces whose members
are drawn from the user community (i.e. the State Transportation Agencies). Since
these persons are located at their home agencies where they have regular
responsibilities, project work and meeting time is of necessity limited. An AASHTOWare
methodology must maximize the use of this time and provide for the use of electronic
communication tools which facilitate remote conferences, issue resolution, review of
project deliverables, and project management.
2. The potential customers for AASHTOWare products are transportation agencies and
associated contractors. Since these agencies differ in technical environments and vary
in size, AASHTOWare products must be designed to be both portable and scalable.
This requires a methodology tailored to promote the use of CASE tools for technology
insulation and the employment of application structures such as client/server for
scalability and object oriented programming, which enhances code reusability.
3. Most AASHTOWare Projects and all Products are contracted with independent software
developers on a yearly basis to match the AASHTO planning and budget cycles. This
results in yearly development cycles and as a consequence requires a methodology
capable of dealing with variable sized projects within a fixed time frame of not greater
than a year.


2.3 Proposal Overview
To define an AASHTOWare Development Methodology which meets the requirements
described above, this guideline will provide the following:
1. It will define the personnel roles that are required and make suggestions as to how they
should be organized to permit transportation agency participation in requirements
planning, Project/Product Task Force management and contractor development using
the ADM framework.
2. It will suggest training that is appropriate to contractor, Project/Product Task Force and
AASHTO Staff personnel for development using the ADM framework.
3. It will make recommendations for selecting the CASE tools and hardware necessary for
supporting the methodology.
4. It will define a product life cycle which fits the requirements of AASHTOWare
development. Points in this cycle where information can be shared will be identified.
Validation points will also be indicated.
5. A methodology will be recommended which can be tailored to the needs of
AASHTOWare development projects. The phases of this methodology will be keyed to
the product life-cycle, and required deliverables will be described.
6. Suggested variations to the AASHTOWare Testing and Documentation Standards will
be described.

3. AASHTOWare Development Methodology
3.1 Methodology Background
In addition to the "traditional" system development stages of Planning, Analysis, External
Design, Internal Design and Construction, the other variants from which developers
typically choose are "classical" Information Engineering Methodology (IEM), Business
System Implementation (BSI), Rapid Application Development (RAD), and more recently,
iterative or spiral development processes.
"Classical" IEM includes seven stages: Information Strategy Planning (ISP), Business Area
Analysis (BAA), Business System Design (BSD), Technical Design (TD), Construction,
Transition, and Production. The first five stages correspond to the "traditional" system
development stages. Two additional stages, Transition and Production, address the
movement of executing applications into production and the monitoring of applications once
the system is operational.
An alternative approach, Business System Implementation, treats External Design, Internal
Design and Construction as a single stage. During this stage, developers define an outline
of the external and implementation layers, including the business systems and databases to
be implemented. Then they select a section of the outline and immediately begin iterating
through design, construction and test. They do the same for the next section, and so on
until a working system evolves.
Another approach, Rapid Application Development, is a more radical departure from
classical IEM because the durations and cutoff points of stages differ. RAD aims to build
systems very swiftly by relying heavily on group development techniques. It begins with
Joint Requirements Planning (JRP) for a business area over a one- to three-week period.
Users then join developers in Joint Application Design (JAD) sessions over a three- to five-
week period of User Analysis and Design (UD). This stage overlaps the last part of
traditional Analysis and the first part of "traditional" Design. It relies extensively on
prototyping to allow users to verify screen designs and overall system structure. During the
next stage, Joint Application Construction (JAC), a SWAT team of programmers who are
"Skilled with Advanced Tools" builds all the components required to complete the system

from the external design. The final stage, Cutover, is equivalent to the Transition stage of
"classical" IEM.
The most recent entrant to the methodology scene is an iterative process which we are
calling the Spiral Development Process (SDP). This iterative technique combines Analysis,
Design, Construction, Validation, and Generalization into a series of steps which are
repeated numbers of times as the application is developed. These iterated phases are
sandwiched between two non-repeating phases, Planning and Implementation.
The spiral process, which is especially well adapted to object oriented development,
recognizes that system requirements are never completely defined prior to design and
construction. It provides a mechanism for capturing changes in requirements, analysis or
design during the development process. The more times the spiral is completed the better
the delivered application will be.
The final stage in the spiral model is designed to take components, or objects, that were
developed for a specific system and to generalize them for use in other systems throughout
the enterprise.
The AASHTOWare Development Methodology combines attributes and phases from the IE,
RAD and the Spiral Development Process. The first two phases which are taken from IE are
Information Strategic Planning (ISP) and Business Area Analysis (BAA). The ISP activity is
usually performed continuously - its updates being synchronized with the business planning
cycle. BAA is performed as a continuous operation which maintains a high-level description
of the information resources and work activities of the major business areas of the
enterprise. These two phases are global and do not apply to a specific application
development effort. They are performed continuously to provide the foundational strategies,
architectures and business process/information definitions required to situate application
systems in the enterprise environment.
The remaining seven phases of the AASHTOWare Development Methodology are taken
from RAD and the Spiral Development Process. They apply to the development of a
specific application system. The following table illustrates the correspondences between IE,
RAD, and the Spiral Development Process phases.
Comparison of IE, RAD, SDP and ADM Phases

Information Engineering (IE) | Rapid Application Development (RAD) | Spiral Development Process (SDP) | AASHTOWare Development Methodology (ADM)
Planning: Information Strategic Planning (ISP) | -- | -- | Planning: Information Strategic Planning (ISP)
Analysis: Business Area Analysis (BAA) | -- | -- | Analysis: Business Area Analysis (BAA)
-- | Requirements Planning: using Joint Requirements Planning (JRP) Sessions | Planning | Application Planning
(continued from above) | -- | Analysis | Spiral Analysis: using Joint Application Development (JAD) Sessions
Design: Functional & Technical | Design: using Joint Application Design (JAD) Sessions | Design | Spiral Design: using Joint Application Design (JAD) Sessions
Construction | Construction | Construction | Spiral Construction
-- | -- | Validation | Spiral Validation
-- | -- | Generalization | Spiral Generalization
Transition | Cutover | Implementation | Application Implementation

3.2 Methodology Overview
The following paragraphs define the phases of the AASHTOWare Development
Methodology.
1. Information Strategy Planning: This phase is usually performed continuously and is
integrated in the organization’s business planning cycle. Two types of studies are
carried out in ISP. The first determines strategic opportunities, goals, critical success
factors, and business function/information needs of different parts of the enterprise. It also
determines how new technologies might be used to better meet the goals and create new
business opportunities. The second creates an overview model of the
enterprise, and splits this into segments appropriate to Business Area Analysis. These
segments, which are logical groupings of information and processes, define business
areas. The chief deliverable from this stage is a model of the entire enterprise that
includes information and function diagrams. ISP populates what is sometimes called the
architectural layer.
2. Business Area Analysis: During this phase designers/analysts examine a particular
segment of the business called a business area. They develop a detailed conceptual
model of this business area based on its information needs. This model includes entity
type and process diagrams. Analysis populates the conceptual layer. Sometimes a
business area analysis reveals serious problems with the processes and information use
of the business. When this occurs, it is possible to enter a stage called Business
Process Re-engineering (BPR) which, after analysis of impacts and costs, incorporates
in the plan the necessary activities to bring the business into conformity with the models
produced during business area analysis.
3. Application Planning: This is the first phase of application specific development. It is the
logical extension of ISP in that it applies strategy planning information to a specific
system development effort. This phase occurs once in the development project. It
either provides a transition between the upper global phases of ISP/BAA and the
remaining iterative development phases or it is the first phase of a complete
development project. During this phase, the need for the system and resources required
are balanced. The requirements and scope of the system are defined. If the
development is to be performed under contract, a request for proposal is developed and
a contractor is selected. If an ISP or a BAA were being developed, then a phase similar
to this would initiate those projects.
4. Spiral Application Analysis: This phase can be divided into two activities: the
Requirements Analysis Process and the Domain Analysis Process. Refer to “Appendix B:
Symbology Examples” for graphical
modeling symbols.
a. Requirements Analysis Process: This activity consists of defining the system
requirements, both environmental and functional.
It is of first importance to establish the Information Technology Architecture (the
combination of computer technologies, methodologies, software engineering
processes, and tools) that will be used for development of the application system.
This combination is referred to as the Technical Framework for the development
project and may be derived from an ISP effort, where one exists.
Next the Organizational Framework needs to be established. This activity consists of
defining the organization of the users (drawn from the ISP if one exists) and that of
the project team. What functions are to be performed by each, what skills are
needed to perform them, and what reporting mechanisms will be used in the project?
A Requirements Model is developed which includes system goals/objectives,
problem scope, and functional requirements. This activity is a more detailed and a
corrected version of that defined in the previous phase. The requirements model
may be developed using RAD techniques in which case the sessions used to define
the system requirements could be called Joint Requirements Planning (JRP)
sessions. For the purposes of this document, all sessions with the user will be called
Joint Application Development (JAD) sessions.
A Use Case Model should be developed which communicates the system’s
functional requirements from the user’s viewpoint. Taken together, a system’s use
cases represent everything users can do with it. The use case model relates use
cases to actors. The system in a use case model is depicted as a rectangle. Each
use case (type of user interaction) is shown as an oval within the rectangle. Each
actor that participates in a use case is shown outside of the system and is labeled
with a role.
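As an illustration only (the use cases and actors below are invented, not drawn from any AASHTOWare product), the use-case-to-actor relationships described above can be captured in a simple structure:

```python
# Hedged sketch: a use case model reduced to data. Each use case (an oval
# inside the system rectangle) maps to the actors (shown outside the
# rectangle) that participate in it. All names here are hypothetical.
use_case_model = {
    "Submit Pavement Data": ["Field Engineer"],
    "Approve Pavement Data": ["Supervisor"],
    "Generate Condition Report": ["Field Engineer", "Supervisor"],
}

def use_cases_for(actor):
    """Taken together, this is everything the actor can do with the system."""
    return [uc for uc, actors in use_case_model.items() if actor in actors]

print(use_cases_for("Supervisor"))
```

Listing the model as data in this way makes the "everything users can do with it" property easy to check during JAD sessions.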
Lastly, the models defined above are refined and finalized into a Formal Functional
Specification. This is not to say that this specification will not have to be changed in
future iterations of the development process. The changes, however, should usually
be limited to the scope of the phase which necessitated them. That is to say, major
changes in project scope should be avoided.
b. The Domain Analysis Process is based on information derived from requirements
analysis, described above. The remainder of this discussion of problem domain
analysis will center around Object Oriented Development techniques; so it is best to
pause here for some explanation of terminology.
An object is the primary element or basic building block of object oriented systems.
All the data and functionality in an object oriented system is contained within the
objects that make up that system. All the functions that act upon a particular object’s
data structures are contained within that object, and, conversely, all the data
structures that are acted upon by an object’s functions or subroutines are contained
in that object.
An object has three parts: its object name, its data attributes, and its processes or
methods. An example of an object might be the object name “person” where “name,
age, address, and job” are its data and “change job and change address” are its
processes.
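The "person" object above can be sketched directly as a class definition (a sketch only; the attribute values are invented):

```python
class Person:
    """The 'person' object described above: name, age, address, and job are
    its data attributes; change_job and change_address are its processes."""

    def __init__(self, name, age, address, job):
        self.name = name
        self.age = age
        self.address = address
        self.job = job

    def change_job(self, new_job):
        # All functions that act on a Person's data live inside the object.
        self.job = new_job

    def change_address(self, new_address):
        self.address = new_address

# An instance (object) is a physical manifestation of the class:
p = Person("J. Smith", 42, "100 Main St", "designer")
p.change_job("analyst")
```

The class is the blueprint; `p` is one object constructed from it in memory.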
A class is the description of a group of objects or the blueprint that tells the system
how to construct actual objects in memory. The class diagram is the basic building
block of object oriented analysis and design. A class diagram should exactly
describe how to write a class description in a programming language.
The terms instance and object are interchangeable; both refer to a physical
manifestation of a class.
Object messaging is the communication between classes and their instances in
requesting an operation of another object. The requesting object is the client object
and the object supplying the service is the server object. Object messaging is
usually represented by a line connecting the methods portion of two object boxes
with a point in the direction of the client object.
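In code, the object messaging described above is simply a method call: the client object requests an operation of the server object. A minimal sketch (class names are hypothetical):

```python
class Logger:
    """Server object: supplies a service requested by other objects."""
    def __init__(self):
        self.lines = []

    def log(self, text):
        self.lines.append(text)

class Job:
    """Client object: sends a message to its Logger by invoking a method."""
    def __init__(self, logger):
        self.logger = logger

    def run(self):
        # Object messaging: the client requests an operation of the server.
        self.logger.log("job started")

logger = Logger()
Job(logger).run()
```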

Associations between objects describe either object messaging (described above) or
common attributes (similar to primary and foreign keys in a relational database).
Associations are usually indicated by a line between the attribute portions of the
object boxes with a dot on the end to indicate a “many” relationship. An association
is often described as “uses-a” or “used-by”.
Aggregation is a special form of association where one class is subordinate to
another and is only used by that class. Aggregation is diagrammatically expressed
by a line connecting the object boxes with a diamond on the end near the superior
object. It is often described as a “has-a” or “part-of” relationship.
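The "has-a" relationship can be sketched as follows. Car and Engine are hypothetical classes used only to illustrate aggregation: the Engine is subordinate to the Car and is used only by it.

```python
class Engine:
    """Subordinate class: part of, and used only by, its superior class."""

    def __init__(self, horsepower):
        self.horsepower = horsepower


class Car:
    """Superior class: a Car 'has-a' Engine (aggregation)."""

    def __init__(self, horsepower):
        self.engine = Engine(horsepower)  # the 'part-of' relationship


car = Car(150)
```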
Inheritance consists of an ancestor class and a descendant class. The descendant
class can be an extension and/or restriction of the ancestor class. The relationship is
often diagrammatically described by a line connecting the objects with a “Y” opening
on the descending end. Inheritance is expressed as “is-a” or “kind-of”.
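Inheritance can be sketched as follows. Vehicle and Truck are hypothetical classes: Truck is the descendant, which extends the ancestor Vehicle with an additional attribute, so a Truck "is-a" Vehicle.

```python
class Vehicle:
    """Ancestor class."""

    def __init__(self, wheels):
        self.wheels = wheels

    def describe(self):
        return f"vehicle with {self.wheels} wheels"


class Truck(Vehicle):
    """Descendant class: a Truck 'is-a' Vehicle, extended with a payload."""

    def __init__(self, wheels, payload):
        super().__init__(wheels)  # inherit the ancestor's attributes
        self.payload = payload    # extension of the ancestor


truck = Truck(6, 2000)
```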
Returning to the domain analysis process, the first activity required is to identify the
potential objects of the new system. One approach consists of producing an
object/action list by examining, statement by statement, the functional requirements
described above. The basic steps of this process are: 1) identify subjects, verbs,
and direct & indirect objects; 2) categorize them and remove duplicates; 3) treat
subjects as class candidates; 4) treat verbs as method candidates; and 5) treat direct
and indirect objects as class and attribute candidates. Another method of developing
objects, where use case models exist (described above), is to produce object models
from each use case model and then combine all the attributes and methods belonging
to each object.
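The bookkeeping behind the object/action list can be sketched as below. The requirement statements are invented for illustration, and the naive subject/verb/object split stands in for the categorization actually performed by analysts in work sessions.

```python
# Hypothetical requirement statements, one subject-verb-object each
requirements = [
    "clerk records payment",
    "clerk prints receipt",
]

subjects, verbs, objects_ = set(), set(), set()
for statement in requirements:
    subject, verb, direct_object = statement.split(maxsplit=2)
    subjects.add(subject)        # class candidates
    verbs.add(verb)              # method candidates
    objects_.add(direct_object)  # class and attribute candidates
```

Using sets removes duplicates automatically, which corresponds to step 2 of the process.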
After a set of classes, their attributes, and methods have been identified, the next
step is to formalize the definition of them and their relationships to each other with a
problem domain object model. This model should describe real world objects and
their relationship to one another. It should not contain implementation details.
Problem domain object models do not typically get created right the first time. As
system requirements get better defined and understood, the problem domain object
model gets refined. It is not unusual to have 3 to 4 iterations of this model. This
model can be produced using JAD work sessions.
Next a user interface prototype may be developed to facilitate communication with
the user. Use case modeling often does not sufficiently convey the user's
requirements; the user needs to see parts of the system visually. The goals of a user
interface prototype are to: help define the requirements of the system, identify required
attributes, design the user interface, and explain corporate interface standards. The
prototype is not intended to: develop system edits (though many will be detected),
program business rules, or construct an implementable system. The user interface
prototype should be built with tools that provide a realistic facsimile of the system
to be built and that permit rapid changes (during work sessions if possible).
Examine for reuse during analysis. Reuse any existing problem domain classes,
components, or frameworks that have already been defined and identify any object
model patterns (a template of class attributes, methods, and relationships that have
been observed to occur again and again).
Perform risk analysis/resolution. One of the major features of the spiral development
process is that it creates a risk-driven approach to the software process rather than a
primarily document-driven or code-driven process. Becoming aware of areas of
uncertainty early, evaluating the possible consequences and devising ways of
handling or managing them are the primary elements of risk analysis.
The next process of domain analysis is estimating system development. This
process consists of evaluating the number of key classes and subordinate classes,
applying factors relating to class difficulty (type of user interface), and applying the
average number of person days required to develop each class. This results in a
total person-time estimate for the project. See “Object-Oriented Software Metrics: A
Practical Guide” by Mark Lorenz and Jeff Kidd for more detail on this process.
The last activity of domain analysis is production of the systems development plan.
This plan is dependent upon the estimated size and complexity of the system, the
number of systems or subsystems to be concurrently developed, user availability,
development resources, and the number of iterations planned. A common goal is to
produce subsystems or system increments every 3 to 6 months. A skeletal
development plan can be found in Appendix C.
5. Spiral Design: During the design phase the following design models are produced:
Problem Domain Partition Object Model, User Interface Partition Object Model, System
Management Partition Object Model, Dynamic Models, Functional Models, Database
Model. Input to the design models is the requirements model and the analysis model.
The design activity populates the implementation layer. The following paragraphs
describe the models produced in the design phase:
a. Dynamic Models: Dynamic models are often created during system design to define
what states an object is in at any given time or stage of its use. They also define the
events which cause state changes. There are many different kinds of dynamic
models, but all of them provide views of the execution paths of a system. They are
derived from use case models (described above).
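A dynamic model's states and events can be sketched as a transition table. The Order object, its states, and its events below are hypothetical, chosen only to show how events cause state changes along an execution path.

```python
# (state, event) -> next state: the events which cause state changes
TRANSITIONS = {
    ("draft", "submit"): "submitted",
    ("submitted", "approve"): "approved",
    ("submitted", "reject"): "draft",
}


def next_state(state, event):
    """Return the state a hypothetical Order object enters after an event."""
    return TRANSITIONS[(state, event)]


# One execution path through the dynamic model
state = "draft"
for event in ["submit", "reject", "submit", "approve"]:
    state = next_state(state, event)
```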
b. Functional Models: Functional Models show the data an object needs and the
processes necessary to transform it. They specify what happens in a system.
Functional models show the flow of data between actors, processes, and data
stores. Functional models can also be derived from use case models. Dynamic and
Functional Models are combined with the Object Model to make up a complete
object oriented design. The functional model specifies what is to be done, the
dynamic model specifies when it is to be done, and the object model specifies the
structure and relationships of the objects.
c. System Architecture Partitions (System Design Object Models): The System
Architecture Partitions are three models into which the System Design Object Model
is normally divided. These three partitions are called the Problem Domain Partition
Model (PDP), the User Interface Partition Model (UIP), and System Management
Partition Model (SMP). The advantage of dividing the System Design Object Model
into these three parts is that it: promotes creation of more reusable classes, creates
classes that are more easily distributed, promotes encapsulation of business rules,
and provides for faster functional prototyping.
The PDP is the partition which holds the classes containing the application system’s
business rules. The classes in this partition contain only business logic. These
classes are based on actual entities used in the business, the rules that define how
business transactions are processed, and how calculations are applied. This
partition could be distributed to a server to reduce client processing.
The UIP is the partition which holds the visual classes. These classes define how
users interact with the system. Results from the user interface prototype and the use
case models are inputs to this partition.
The SMP is the partition which holds task-oriented classes. These are the classes
that typically contain logic that pertains to communication between two or more other
classes. These classes are all created at design time. Examples of classes that
would be included in the system management partition include: classes that manage
communication between a class and a database, operating system, network, or
peripheral device.
After the PDP, UIP, and SMP have been defined, they can be combined to form the
overall design object model. Classes in the various partitions are usually combined
using association relationships.
If the three partitions are designed correctly, then it will be easy to change the
classes in one partition without affecting the classes in the other partitions.
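The three-partition split can be sketched as follows. Invoice, InvoiceView, and InvoiceStore are hypothetical classes illustrating the PDP, UIP, and SMP roles; each partition can change without affecting the others, and the partitions are combined through association.

```python
class Invoice:
    """PDP class: business rules only -- no user interface or database code."""

    def __init__(self, amount, tax_rate):
        self.amount = amount
        self.tax_rate = tax_rate

    def total(self):
        return self.amount * (1 + self.tax_rate)


class InvoiceView:
    """UIP class: defines how the user sees the invoice."""

    def __init__(self, invoice):
        self.invoice = invoice  # association to the PDP class

    def render(self):
        return f"Total due: {self.invoice.total():.2f}"


class InvoiceStore:
    """SMP class: manages communication between a class and a database."""

    def save(self, invoice):
        # Stubbed: returns the statement it would send to the database
        return ("INSERT INTO invoice (amount, tax_rate) VALUES (?, ?)",
                (invoice.amount, invoice.tax_rate))


view = InvoiceView(Invoice(100.0, 0.05))
```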
d. Database Model: The Database Model is another important model created in the
design phase. Even if a prototype database were created to support the user
interface prototype, the actual database design is not performed until the Problem
Domain Partition Object Model is complete.
There are rules for mapping an object model to a relational database design. These
rules often produce database designs which are over-normalized. Ordinary entity
relationship (ER) modeling theory can then be useful for de-normalizing the database
model for performance.
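One common mapping rule can be sketched under illustrative assumptions (the person/job schema below is invented, not drawn from the guideline): each class becomes a table, each attribute a column, and an association becomes a foreign key.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE person (
        person_id INTEGER PRIMARY KEY,
        name      TEXT,
        age       INTEGER
    );
    CREATE TABLE job (
        job_id    INTEGER PRIMARY KEY,
        title     TEXT,
        person_id INTEGER REFERENCES person(person_id)  -- association
    );
""")
conn.execute("INSERT INTO person VALUES (1, 'Pat', 40)")
conn.execute("INSERT INTO job VALUES (1, 'engineer', 1)")
row = conn.execute(
    "SELECT p.name, j.title FROM person p JOIN job j USING (person_id)"
).fetchone()
```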
Other activities which should occur during the design phase are: identification of
frameworks of user interface, problem domain, and system management
components which can be reused in the system and refinement of work estimates
based on design metrics.
6. Spiral Construction: The key deliverable associated with the construction phase is the
production of a functional prototype which replaces the user interface prototype and
adds the following:
a. Business logic and system edits through connections to the problem domain partition
objects.
b. System management partition objects such as network middleware and security or
object request brokers.
During the construction phase the database is physically created, the user interface
objects are created (if not already), the non-visual objects for problem domain &
system management domain are created, and the object connections are
established.
7. Spiral Validation: The validation phase consists of testing the system. A system can be
subjected to two kinds of tests. Validation is the process of comparing the end product
with the true requirements of the users.
Verification is the process of detecting faults and deviations from the expectations of the
developers when they set out to build the system. Verification answers the question: Is
the system built correctly? Validation answers the question: Is the correct system being
built? The validation process consists of Component Testing, Integration Testing,
System/Subsystem Testing, Alpha Testing, and Beta Testing.
a. Component Testing: In object oriented systems the unit of work is an object, not a
program or a routine. An object contains both the data and the operations that work
on the data. Component testing is thus the testing of each object to be sure it
behaves as designed.
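A component test in this sense can be sketched with a standard unit-testing framework. The Counter class is hypothetical; the test exercises the object's operations against its data to confirm it behaves as designed.

```python
import unittest


class Counter:
    """The object under test: data plus the operations that work on it."""

    def __init__(self):
        self.value = 0

    def increment(self):
        self.value += 1


class CounterComponentTest(unittest.TestCase):
    """Component test: verify the object behaves as designed."""

    def test_increment(self):
        counter = Counter()
        counter.increment()
        self.assertEqual(counter.value, 1)


suite = unittest.defaultTestLoader.loadTestsFromTestCase(CounterComponentTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```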
b. Integration Testing: Ensures that assemblies of objects or components work together
as expected and that no failures occur as a result of their interaction. Because of the
nature of component development, as the services between objects are tested, the
integration between them is tested. As a consequence, component testing and
integration testing are very closely tied together.
c. System/Subsystem Testing: Test scenarios are created which correspond to each
use case in the subsystem and run through the system in a similar way to that for
walkthroughs. System/Subsystem Testing is really an external test of the system to
validate correspondence with the system requirements. It is no different from
customary testing except that it is based on scenarios defined in the Requirements
Analysis.
d. Alpha Testing: Alpha Testing is testing performed in the last iteration of the testing
phase which confirms to the developer that all functionality defined in Requirements
Analysis is present and performing correctly. The system is also configuration,
stress, and performance tested in the target environments until the developer is
satisfied that the system is ready for delivery and implementation.
e. Beta Testing: Beta Testing is testing performed in the last iteration of the testing
phase which confirms to the user that all functionality defined in Requirements
Analysis is present and performing correctly in the user’s own environment and that
the system is ready for delivery and implementation. Beta Testing also includes
review of all the project deliverables such as documentation. Beta Testing
culminates in user acceptance or rejection of the system.
8. Spiral Generalization: The generalization phase consists of maximizing the benefits of
reusability by identifying and generalizing components which may be used in future
systems and by managing component libraries and frameworks which support the
disciplines and technical environments of the user organizations.
9. Implementation: This stage includes the performance of all activities necessary for
successful implementation of the validated system. Some of these activities are: training
(technical support, product administration, and users), testing (setup and installation
procedures), and product packaging for distribution. This phase is performed once in
the development cycle.

3.3 Methodology Objectives and Deliverables


The objectives and deliverables of the stages of the AASHTOWare Development
Methodology are here described sequentially to avoid unnecessary confusion. Bear in mind
that this is an iterative process rather than a sequential one. Two other points which should
be made are the following:
1. The terminology used to describe the different model layers (the Architectural Layer or
the Implementation Layer, for example) is drawn from the IE field and is not used
universally even there. The terminology has been adopted in this paper to make it
easier to reference different levels of the modeling hierarchy.
2. The IE Methodology emphasizes information as a resource over the business processes
which use and produce it. In object oriented development, information and process are
combined and have equal importance.
Information Strategy Planning (ISP) - The Architectural Layer
ISP populates the architectural layer. It is concerned with top management goals and
critical success factors and with how technology can be used to create new opportunities
or competitive advantages. A high-level overview is created of the enterprise, its
functions, data, technical/organizational environment, and automation needs.
Planning supplies a way to plan for the use of technology to meet business objectives
and capitalize on business opportunities. Accordingly, planners typically begin with a
strategic evaluation of business goals and objectives. By interviewing key management
personnel and technical contributors, they determine such factors as the enterprise's
mission and critical success factors. Planners identify the functions and information that
the enterprise needs to carry out its mission, and build a framework for meeting these
goals and opportunities. From this data and from other sources such as the corporate
Business Plan, they build a blueprint (an architectural framework) for future
development.
The framework devised during planning consists of four major architectures: the
Information/Function Architecture, the Business System Architecture, the Technical
Architecture, and the Organizational Architecture. Together, these architectures address
the total information requirements of an enterprise. They form a blueprint from which an
enterprise can build an environment for addressing its long-term needs. By moving
towards this target environment, the enterprise can manage its information resources
effectively and support its business objectives.
The objectives of Planning are to:
1. Assess the function/information requirements of the enterprise.
2. Build an Information/Function Architecture that will meet those requirements. The
Information/Function Architecture defines the activities performed by the enterprise
and the information required to perform them. The result is an overall business
model - a shared architecture tied directly to business goals. This high-level view of
activities and data lays the groundwork for the detailed analysis of business areas
conducted during Analysis.
3. Build a Business System Architecture to support implementation of the Information
Function Architecture. The Business System Architecture describes probable
business systems needed to support the Information/Function Architecture.
Although more detailed analysis in later stages will determine its actual contents, the
Business System Architecture formulates business areas and attempts to predict
subsequent business systems to be developed. Before Design, designers can use
the Business System Architecture to segment business areas correctly into business
systems.
4. Identify the Technical Architecture necessary to support the Business System
Architecture. The Technical Architecture describes the hardware and software
environment required to support the Business System Architecture. The definition of
the Technical Architecture during planning influences the technical support chosen for
business systems during Internal Design.
5. Identify the Organizational Architecture necessary to support the architectures
described above. Determine what organization and human resources are needed to
support the information and business systems needs of the organization.
6. Present the project's findings in a way top management can readily understand,
evaluate, and act upon.
The deliverables from Planning include the following:
1. Facts about the enterprise. Planners document these facts in a mission statement;
an information/function needs map; a list of objectives, strategies, and critical
success factors by organizational unit; a ranked list of objectives; and a number of
supporting matrices.
2. Facts about the current environment (existing organizations and hardware/software
systems). Planners document these facts in an organizational hierarchy list, a
technical inventory list, and supporting matrices.
3. A model of the Information/Function Architecture as it should be to serve the mission
of the enterprise. The model includes a Subject Area Diagram, a high-level Entity
Relationship Diagram, an overall Function Hierarchy Diagram, a set of Function
Dependency Diagrams and supporting matrices. This deliverable incorporates the
three components of Information Engineering: data, activities, and their interaction.
If object oriented development techniques are used, the diagrams defined above can
be replaced with a high-level object model which encompasses the enterprise.
4. The Business System Architecture. Planners determine this architecture by matrix
clustering and affinity analysis, based primarily on common use of data by activities.
They document the architecture in an Implementation Plan as a prioritized list of
potential Analysis projects.
5. The Technical Architecture. Planners document this architecture in a statement of
technical direction. This document describes technical alternatives based on
supporting matrices.
6. A formal Information Strategy Planning Report. This report documents the project
results to top management.

Analysis - The Conceptual Layer


Analysis populates the conceptual layer. During Analysis, analysts examine a selected
area of the business in detail, based on project boundaries established as part of
Planning. An Analysis project refines a business area - a subset of the
Information/Function Architecture developed during planning. In some cases,
businesses may wish to bypass Planning and begin by automating a chosen business
area.
During Analysis, analysts define and refine representations of activities of a business
(functions and processes), fundamental things relevant to the business (entity types),
and the interaction between the two. Analysis provides a foundation for developing
integrated information systems. Its results are independent of any hardware or software
technology. Designers use these results in successive stages to develop the
computerized systems needed to manage the enterprise's information resources.
The objectives of Analysis are to:
1. Fully identify and define the type of data required by the business area.
2. Identify and define the business activities that make up each business function.
3. Define the data required for each business activity.
4. Identify the necessary sequence of business activities.
5. Define how business activities affect data.
6. Produce a plan for Design based on a prioritized sequence of business systems
(normally, multiple business systems are defined to support a single business area).

The deliverables from Analysis are:


1. A data model for each business area. This model consists of an Entity Relationship
Diagram.
2. An activity model for each business area. This model consists of a Process
Hierarchy Diagram and a set of Process Dependency Diagrams.
3. An interaction model for each business area. This model consists of a set of
Process Action Diagrams.
Where object oriented techniques are to be employed, the above three models may
be replaced with a business area object model.
4. An Implementation Plan for each business area. Analysts arrive at this plan through
matrix clustering and affinity analysis. The plan is a prioritized list of potential
External Design projects.

Application Planning - The Architectural Layer


The purpose of Application Planning is to gather together sufficient information to initiate
the development of a specific application system. Where an ISP and a BAA have been
performed, the strategic need for the application and how it fits into the
Information/Function Architecture of the enterprise will have been established. The
Technical and Organizational Architectures will also have been established. What
remains to be defined are the resources required, the methodology of development, a
statement of scope, and a brief system requirements definition. Where the development
is to be performed under contract, an RFP will be developed and a contractor will be
selected.
The objectives of Application Planning are:
1. Gather all architectural and conceptual layer information relevant to the proposed
project.
2. Define the scope and requirements for the proposed system.
3. Decide whether the project should be initiated. This determination is made by
assessing the need for and benefits of the system and balancing these against the
estimated resources (funding, people, equipment, and software) necessary to carry
the project through implementation.
4. If the project is to be performed under contract, develop an RFP and select a
contractor.
5. Develop a project plan.
6. Assemble project participants and initiate the development project.

The deliverables from Application Planning are:


1. Sufficient cost/benefits analysis to initiate the project.
2. Statement of application scope and requirements.
3. Where the project is to be performed under contract, a contract containing a
definition of the deliverables and the work to be performed, the methodology to be
used, a project plan (defining schedules for producing and providing deliverables),
and specification of the standards to be followed. Where the project is to be
performed in-house, a project plan identifying all task schedules and resource
requirements/assignments.

Spiral Analysis - The Conceptual Layer


Spiral Analysis is the first iterative phase of the AASHTOWare Development
Methodology. Though it produces models for the conceptual layer, these models relate
to a specific application to be developed. Where an ISP and BAA have been
performed, Spiral Analysis uses this information, as well as that defined in Application
Planning, to produce models defining the application system’s requirements and
problem domain objects.
The objectives of Spiral Analysis are:
1. Analysis of the technical architecture of the target environment to determine the
hardware, software, tools, components, methodologies, and procedures that will be
needed to develop and test a system which targets that environment.
2. Analysis of the user’s organizational requirements and their implications for the new
system. Determine the project development team composition and organization.
3. Through JAD sessions with the users, discover the system requirements.
4. Through JAD sessions with the users, discover all cases of how the system will be
used. Produce, as a result, a Use Case Model.
5. Analysis of the user requirements and use cases to determine the objects of the
system’s problem domain and their relationships.
6. Through JAD sessions with users, using interface prototyping techniques, discover
user interface requirements.
7. Analyze the risks to the project and formulate resolutions.
8. Estimate system development resource requirements using object analysis metrics.
9. Revise the project plan based upon findings of analysis activities.

The deliverables of Spiral Analysis are:


1. The Formal Functional Specification is produced by refining and incorporating the
Technical Framework, the Organizational Framework, the Requirements Model, and
the Use Case Model.
2. A Problem Domain Object Model is developed.
3. A user interface prototype is developed.
4. A risk analysis is produced.
5. A development plan is produced.

Spiral Design - The Implementation Layer


Spiral Design is an iterated process which populates the implementation layer. Using
terminology drawn from other methodologies, both external/functional and
internal/technical design models are produced. Spiral Design draws on the conceptual
models produced during analysis and translates them into models representing the real
detailed requirements and objects of the system. The designer must define a system
which fits into a real technical environment using actual development tools, interfacing
with real databases, and meeting all of the specific user requirements.
The objectives of Spiral Design are:
1. Produce Functional and Dynamic Models, as needed, to assure that all aspects of
what the system does and when it does it are defined.
2. Identify and partition system objects into three partition models, separating the user
interface, the business logic or computation, and the technical environment
interfaces. This will permit distribution of portions of the system to servers.
Complete the three object models, including information derived from the Functional
and Dynamic Models. Combine them using object associations to produce a
structured object model of the system.
3. If the system uses a relational database, design the database by capturing attribute
information from the designed objects and converting it to an entity relationship
diagram, defining the database.

The deliverables from Spiral Design are:


1. A System Design Object Model consisting of the Problem Domain Partition Model,
the User Interface Partition Model, and the System Management Partition Model
connected by object associations
2. A Database Model in the form of an entity relationship model
3. A new development plan based on estimates made using design metrics

Spiral Construction - Execution Layer


Spiral construction populates the Execution Layer. The purpose of this iterated phase is
to produce a Functional Prototype of the system. This prototype applies all
system edits identified in previous iterations. The Functional Prototype is where new
system components are constructed and actual frameworks are added if they were not
used in the user interface prototype.
The deliverable from Spiral Construction is:
1. A Functional Prototype which replaces the user interface prototype

Spiral Validation - May Cause Changes in all Layers


The purpose of Spiral Validation is to test each Functional Prototype to ensure that it is
developed correctly and that it satisfies all user requirements. To perform these tests,
scenarios are developed which correspond to each use case defined for the system. If a
CASE testing tool is used to develop the scenarios, testing can be automated. Only
those scenarios which test additional new functionality need to be developed for each
iteration of the Construction and Validation phases. JAD sessions are held to
demonstrate developed functionality to the user to capture changes that need to be
made to adhere to user requirements.
The deliverables from Spiral Validation are:
1. A test plan
2. A collection of test scenarios which test all use cases of the system
3. A repository of test results
4. For the last iteration of Spiral Validation, the development team verifies that the
system is free of error from their perspective (Alpha Test) and the users validate that
the system meets their requirements (Beta Test). For this iteration the system is
packaged as it will be delivered and it is tested in a production-like environment. All
implementation enabling software is also tested.
5. Review of system documentation

Spiral Generalization - Affects all Layers


Spiral Generalization is the last iterated phase in the AASHTOWare Development
Methodology. Its purpose is to identify and collect any potentially reusable constructs
which were developed in the analysis, design, and construction phases. After review,
some constructs may be selected for generalization. Generalized objects are placed in
component libraries for future use.
The Deliverables for Spiral Generalization are:
1. Libraries of constructs which are candidates for reuse
2. Library of components which are ready for reuse

Implementation
The purpose of the Implementation phase is to identify and perform all activities
necessary for implementation of the system.
The objectives of Implementation are:
1. Ensure that the training needs for users, product administrators, and technical help
are provided for.
2. Test all implementation scripts, instructions, and procedures.
3. Ensure that the product is properly packaged, containing all materials (software and
documentation) necessary for implementation.
4. Make sure that all necessary hardware and software is present and operating in the
target environment.

The deliverable of Implementation is:


1. A successful system implementation.

4. Personnel
4.1 Personnel Roles and Qualifications
The following discussion is concerned with the roles and qualifications needed to develop
systems using elements of the AASHTOWare Development Methodology (ADM) supported
by CASE tools.
1. Enterprise (Business) Related Skills and Knowledge:
a. Enterprise Administration
There must be direction and input from executives and administrators who have a
broad understanding of the entire enterprise. They must know the organization's
missions, objectives, and the factors critical to its success. They should be familiar
with the organization’s information resources and needs. They should understand
the organization's major business areas, how they interact, and the information they
need and produce. They should have business planning skills.
They should understand how business systems can support the information needs of
the organization and what areas might benefit from business systems. They should
also understand the organizational requirements of the business areas of the
enterprise and how they will be affected by the introduction of business systems.
b. Business Area Management
Once the business areas of the enterprise are identified, there is need for the kind of
information normally possessed by the management of the business area selected.
Someone must have a detailed knowledge of the types of data required by the
business area. They must know all of the functions of the business area and all of
the activities/roles that go to make up those functions. They must know the data
required and produced by each of the activities, as well as their proper sequence.
c. Subject Matter and User Expertise
Once a specific business system, within a business area, is selected for
development, there is a need for the kinds of knowledge possessed by those who
perform the activities and use the data. They should know how the data should be
presented for its most efficient use, what data should be presented, what data should
be requested, and what activities and data may be combined. They should know the
order and flow of work activities and the data that is needed and produced at each
step along the way.
2. Project Management Activities and Skills:
a. Project Administration

Page 18 06/10/2009
Development Methodology Guideline 1.020.02G
Project administration requires an understanding and enforcement of the
organization’s policies and procedures and specifically those which apply to the
development of business systems. Also included is the approval, allocation, and
disbursement of the fiscal resources necessary to begin and bring to conclusion
projects. Project administrators must have the skills necessary for reviewing and
approving technical contracts and insuring that their obligations are met. They
should have knowledge of the deliverables (the models created at each level of the
methodology) and the ability to inspect and verify them.
b. Project Management
Project management, which operates within the province of project administration,
requires the skills necessary to develop a project plan which includes all activities
necessary to complete the project, identification of all resources needed by each
activity, identification of the deliverables that should result from each phase of the
project, and the order and completion times for the activities and phases of the
project.
They should have a thorough knowledge of the methodology and a familiarity with
the CASE tools being employed to support it. The project manager must be able to
interpret all the model information contained in the layers (Architectural, Conceptual,
Implementation, and Execution) of the methodology.
The project manager must also be able to control priorities of project activities.
3. Information Systems Skills and Knowledge:
a. Planning
Planners, by interviewing key management personnel (those possessing the skills
and knowledge described above in "IV.A.1.a. Enterprise Administration"), gather the
necessary information about the enterprise for building the Architectural Layer of the
ADM. They must be thoroughly skilled in the concepts, models, and supporting tools
of the methodology, with special emphasis on the disciplines required to complete
the Architectural layer. This first layer comprises the Information, Business Systems,
Technical, and Organizational Architectures.
b. Analysis
The Business Process Analysts, using the information produced in the Architectural
Layer combined with that derived through interviews and meetings with key business
area managers (those possessing the skills and knowledge described in "IV.A.1.b.
Business Area Management"), build the Conceptual Layer as prescribed by the
methodology. Their skills are the same as those of the planners, except the
emphasis is on the concepts, models, and supporting tools necessary to complete
the Conceptual Layer. They should be experienced with the use of JAD work
sessions for capturing and defining user requirements.
c. Design
System Architects (designers) are responsible for the overall design and
development of an integrated architecture for a system. Their responsibilities are to
design and develop the application architecture, oversee design and integration of
system components, develop the feasibility prototype, if required, and insure the
system meets functional objectives for performance. The design of new application
objects is usually performed by developers. Designers, using information from the
Architectural and Conceptual Layers combined with that derived through JAD work
sessions or interviews with users and subject matter experts (those with the
knowledge and skills described in "IV.A.1.c. Subject Matter and User Expertise") build
the models of the Implementation Layer as prescribed by the ADM. Their skills are
the same as those of the planners and analysts, except the emphasis is on the

concepts, models, and supporting tools necessary to complete the Implementation
Layer.
d. Development
Developers of object oriented software are responsible for the construction and
design of application objects. Their responsibilities are to design and code new
application objects, assemble new and existing objects together, develop user
interface prototypes, build back-end SQL stored-procedures, build and optimize
programming interfaces to database servers, and build other low-level interfaces
such as remote procedure calls and security. Using information produced by the
previous levels, they construct an implementable business system. They must have
an overall knowledge of the Implementation Level models with emphasis on the
concepts, tools, and products necessary to complete the Execution Layer.
e. Validation
Testers are responsible for developing test cases and specifications for expected
results, executing the test cases, examining the results, and identifying those
situations where “actual results” do not match “expected results”. The developed test
cases are drawn from use case models and user requirements.
f. Generalization
Component Developers are responsible for the generalization and maintenance of
reusable components. They develop new components, user navigation styles, and
templates. They make modifications to existing components and frameworks. They
work with development teams in identifying opportunities to generalize objects. They
maintain component libraries and remove obsolete components.
g. Implementation
Those responsible for the implementation of a product must understand both the
product and the user business and technical environment in which it is being
situated. The implementation function includes determination of when the product
has met all requirements and is ready for implementation, planning the
implementation strategy (order and pace of implementation), measurement of user
acceptance of the product, and insuring that product defects are corrected.
h. Technology Specialists
Technical Infrastructure Experts provide information to the planning process for
determining the Technical Architecture, to the design process for determining the
characteristics of the target computing environment, and to the construction process
for determining implementation requirements. They must have strong technical
computing environment skills.
The Database Design Specialist develops and maintains data models, logical and
physical table design, and writes SQL based code.
The Scribe has expertise in using tools which support JAD sessions. The scribe
should be able to capture user interface requirements as they are discussed.
The GUI Designer assists in the design of graphical user interfaces, including the
navigation model, verifying that designs conform to enterprise and industry-wide
design standards.
The Technical Writer develops and maintains all on-line help and user
documentation for the system.
i. Model Management
Personnel with skills in developing, integrating, and maintaining information
engineering models are needed to keep the repositories of models produced as part

of the development effort. They must also be able to verify the correctness and
validity of these models which are project deliverables.
j. Technical Development Support
Technical development support personnel are responsible for implementing the
hardware and software needed to support the development effort and model
management. They should have skills in the installation and maintenance of the
chosen workstations and development tools.
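The expected-versus-actual comparison at the heart of the Validation role described above (item e) can be sketched as follows. This is a minimal illustration: the function under test, the test data, and all names are hypothetical, not drawn from any AASHTOWare product.

```python
# Minimal sketch of the Validation role: each test case pairs inputs drawn
# from a use case with an expected result, and the harness flags every case
# where the actual result does not match the expected one.
# The function under test and the test data are hypothetical examples.

def validate_login(user_id, password):
    # Stand-in for a system function under test.
    return bool(user_id) and len(password) >= 8

test_cases = [
    # (description, inputs, expected result), drawn from a use case model
    ("valid credentials",  ("jsmith", "s3cretpass"), True),
    ("password too short", ("jsmith", "abc"),        False),
    ("missing user id",    ("",       "s3cretpass"), False),
]

def run_tests(cases):
    """Execute each case and collect those where actual != expected."""
    failures = []
    for description, inputs, expected in cases:
        actual = validate_login(*inputs)
        if actual != expected:
            failures.append((description, expected, actual))
    return failures

failures = run_tests(test_cases)  # an empty list means all cases passed
```

In practice the case list would be generated from the use case models and user requirements named above, and each mismatch would be reported with both the expected and the actual value.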
The following table illustrates how the above described roles or skills are utilized in
the Planning, Analysis, Design, and Construction Layers of ADM development.
Information System Development Roles Versus The Modeling Layers of the
AASHTOWare Development Methodology
Layers\Roles: EA BAM SE PA PM PLN ANL D DEV VAL GN IMP TS MM TDS

Planning (ISP) / Architectural Layer    IB A M P IT P S
Analysis (BAA) / Conceptual Layer       IB A M P IT P S
Design (BSD) / Implementation Layer     IB A M P IT P S
Construction / Execution Layer          A M P P P P IT P S

In the above table “A” indicates Administer, “M” indicates Manage, “IB” indicates
Information resource for Business requirements, “IT” indicates Information resource
for Technology, “S” indicates Support, “MIE” indicates Management of Information
Engineering Facilitation, and “P” indicates Performance of the Information
Engineering function.

The acronyms used for column headings have the following meanings:
EA= Enterprise Administration DEV= Development
BAM= Business Area Management VAL= Validation
SE= Subject Matter and User Expertise GN= Generalization
PA= Project Administration IMP= Implementation
PM= Project Management TS= Technical Specialist
PLN= Planning MM= Model Management
ANL= Analysis TDS= Technical Development Support
D= Design


4.2 Personnel Organization


The following descriptions of AASHTO organizational components are supplied to facilitate
comparison of the roles required by ADM, described above, with those of the existing
AASHTO organization. They are not intended to be exhaustive definitions, but rather to
supply only that information which is useful for determining where the skills required can
best be situated and to define additional skills that are needed in that area. For more
complete definitions of many of these roles see the “AASHTO Cooperative Computer
Software Policies, Guidelines and Procedures” publication.
1. Special Committee on Software Systems (SCOSS):
This special committee formulates and oversees the policies and guidelines needed for
the proper conduct of joint computer software development. They insure that actions by
the Special Committee on Joint Development (SCOJD) and the Project/Product Task
Forces are consistent with AASHTO policies. This committee can be a resource to the
strategic planning effort. It can also promote the advantages of joint development with
the transportation agencies.
2. AASHTO Committees:
The AASHTO committees are most usually engaged in defining standard practices in
transportation business areas. Some AASHTOWare products are centered around
standards developed by these committees. Since the standards and reports of these
committees reflect the consensus of many transportation agencies, their input
concerning business practices and rules can be valuable guides to planning, analysis,
and design in a given business area. Examples of AASHTOWare Products which are
based on committee publications are IGRDS and DARWin which are coordinated with
the publications “A Policy on the Geometric Design of Highways” and “AASHTO Guide
for Design of Pavement Structures” respectively.
3. Special Committee on Joint Development (SCOJD):
The chairperson and the members of the SCOJD perform strategic planning for the
AASHTOWare projects and products. They direct and oversee the activities of the
Project/Product Task Forces. They are customarily chosen for their experience in
managing and utilizing information technologies in transportation agencies. The skills
required for these positions are: management experience, knowledge of transportation
agency planning processes, incite into the application of information technologies to
achieve transportation agency goals and a general knowledge of agency business
areas.
4. AASHTO Executive Staff:
The Executive Director and his authorized delegate(s) perform the contractual and fiscal
operations of joint development. These activities include contract negotiation and
revision, contract administration and management guidelines, solicitation, billing, budget
preparation, contractor payment, and financial planning. The AASHTO Executive Staff
along with the Project Staff are the primary resources for determining the legal, financial,
and contractual policies governing the cooperative development of AASHTOWare.
Since these persons are retained on a permanent full-time basis, they provide for
continuity and consistency in policy over time.
5. AASHTO Project Staff:
The AASHTO Project Staff are authorized by the Executive Director to participate, as
non-voting members, in the Product/Project Task Forces and the SCOJD activities.
They assist the task forces in contractual, fiscal and AASHTO policy areas while
providing a liaison between the task force and the Executive Staff.
6. Project/Product Chairperson:

The Chairperson directs the activities of the Project/Product Task Force, described
below, and is the person who is primarily responsible for supervising the efforts of the
contractor towards a product which satisfies user requirements and stays within budget.
Some useful skills for a project chairperson are: management experience in the business
area of the system development, project management experience, familiarity with the
capabilities of automation technologies, and knowledge of AASHTOWare policy,
procedures, standards and guidelines. Since the chairperson must interact effectively
with users, user groups, the task force members, the contractor, AASHTO staff, and the
SCOJD, communication skills are required.
7. Project/Product Task Force Members:
The Project/Product Task Force is normally composed of representatives from states
that have an interest in the information system being developed, because their agency
is either sponsoring or using the application in question. The designation of Project or
Product Task Force distinguishes between groups supervising new development and
those which are performing maintenance and continued enhancement of an existing
product. Personnel who are appointed to a task force have the responsibility, under the
direction of their chairperson, of planning for future development and enhancement,
supervising current development, and insuring that user requirements are met. This
implies that subject matter expertise, knowledge of user business requirements, and an
overall understanding of the business area are attributes which should be sought in task
force members.
Since the task force members are usually chosen for their business knowledge, they
must depend on the SCOJD, AASHTO Staff, T&AA Liaisons, and TAG Groups for
information systems development experience.
8. State Sponsors:
Development of new information systems or major additions to the functionality of
existing ones is usually financed through state sponsorship. The decision to sponsor an
AASHTOWare development project is primarily based on whether the proposed project
fits within the business systems plans of the sponsoring states, whether there is
sufficient consensus on product requirements to insure that all sponsors are satisfied,
whether the system is affordable, and whether it is perceived to be the safest, most
flexible, and most efficient way to provide the required functionality. The states should
choose personnel who have the requisite skills to make the above described
determinations.
9. Product Users:
Product users are those who have the right to use the AASHTOWare Product as a result
of belonging to a subscribing organization.
10. User Groups:
User group members represent their product licensed agency for the purpose of
providing advice on product effectiveness, deficiencies and needed enhancements. The
user group should also identify training and support needs. Once maintenance,
enhancements and support are identified, they are prioritized and submitted to the
Project/Product Task Force. User group members should have actual experience using
the product and should have knowledge of the business area involved.
11. Technical & Application Architecture Task Force (T&AA):
This group was formed by the Special Committee on Joint Development and works
under its direction to develop AASHTOWare Application Standards and to provide
consultation services relating to Application Architecture and Technology.
12. Technical Advisory Group (TAG):

A Technical Advisory Group is an ad hoc group formed to provide technical advice (can
be either automation or business area related) to the Project/Product Task Force.
Members of these groups are chosen for their expertise in the area where advice is
needed. They are expected to provide advice that is independent of that of the
contractor.
13. Contractor:
AASHTOWare is typically developed and maintained under contract. The contractor is
usually selected on the basis of response to an RFP. The subsequent contract is
composed from the proposal and the customary AASHTO contractual requirements.
The chosen contractor should have all of the technical and business skills necessary to
complete the contract efficiently within the estimated time and budget. The contractor is
usually expected to perform the following functions: project management, planning,
analysis, design, construction, testing, implementation, maintenance, documentation,
support, and sometimes training.
14. Unassigned Support:
Skills that would be useful to support ADM which are not presently assigned to any
AASHTO position are:
a. component and framework development library management skills for maintaining
reusable objects,
b. model management skills for verifying, building, maintaining, and integrating
models for purposes of reusability and to assist the task forces in validating
deliverables.
The following table maps the roles of the AASHTO organization with those needed to
support the AASHTOWare Development Methodology.
ADM Role/Skill Requirements Versus Existing AASHTOWare Development Roles
EA BAM SE PA PM PLN ANL D DEV VAL GN IMP TS MM TDS

SCOSS                   R
AASHTO Committees       R R R
SCOJD                   X X X R
AASHTO Exec Staff       R R+
AASHTO Proj Staff       R R R R R
Proj/Prod Chair         X X X R+ X X R X R+ X
Proj/Prod Task Force    R+ R+ R R R R R R
State Sponsors          R+
Product Users           R R R
User Group              R R R R?
T&AA                    R R R R
TAG Team                R? R R? R? R
Contractor              R? R+ R R+ R+ X R+ X R+ X X X
Unassigned Support      R R

In the above table “X” indicates responsibility for skill area. “R” indicates resource for the
skill. “R?” indicates a possible resource for the skill. “R+” indicates the primary resource for
the skill.

The acronyms used for column headings have the following meanings:
EA= Enterprise Administration DEV= Development
BAM= Business Area Management VAL= Validation
SE= Subject Matter and User Expertise GN= Generalization
PA= Project Administration IMP= Implementation
PM= Project Management TS= Technical Specialists
PLN= Planning MM= Model Management
ANL= Analysis TDS= Technical Development Support
D= Design

5. Training
The table at the end of this chapter describes in summary form the experience or educational
requirements for implementation of ADM.

5.1 Contractor Training


For contracts covering development of new systems, it is preferred that the
contractor’s staff have experience in the skill areas required by the project. This may be
determined by requiring that staff profiles, detailing experience in the appropriate skill areas
and with the tools to be used on the project, be supplied with the proposal. For contracts
covering the continuation of support and maintenance for an existing system, inclusion of
contractor education might be considered as part of the contract. It is not, however,
recommended that existing systems be converted to the use of ADM unless they are
undergoing major revisions or there are serious problems resulting from the methodology
being used.

5.2 Project/Product Task Force Training


As the education/experience chart indicates, the Project/Product Chairperson should be
experienced in the management of the business area, with the subject matter of the
business area, and in project management. The chairperson should have conceptual
knowledge of the phases of ADM. The task force members usually need experience with
the subject matter only.
Training for these personnel can be limited to conceptual training on project management
and the phases of ADM.

5.3 AASHTO Staff Training


The AASHTO Project Staff will require training in Project Administration and Project
Management if they do not already have experience in these areas. In addition, they will
require conceptual education in all other areas where they do not have experience. The
conceptual education should emphasize the phases of ADM. These personnel would
benefit greatly from development experience or training as well.

Beyond the existing AASHTO development roles, there is a need for personnel
experienced in model management and in the generalization process.

ADM Training or Experience Requirements for Existing AASHTOWare Development Roles

EA BAM SE PA PM PLN ANL D DEV VAL GN IMP TS MM TDS

SCOSS                   E A
AASHTO Committees       A? A? A
SCOJD                   E C C E C E C C C C C C C C C
AASHTO Exec Staff       E C C A C C
AASHTO Proj Staff       C C C E/T E/T C C C C C C E C C C
Proj/Prod Chair         E E E C C C C C C C
Proj/Prod Task Force    E C C C C C C C C
State Sponsors          A
Product Users           E C E
User Group              A A C C E
T&AA                    C C C C C A C C C C E C E E C
TAG Team                A? A? A? A? A?
Contractor              E? ET ET ET ET ET ET ET E ET ET E
Unassigned Support      E E
In the above table “E” indicates Experience Required, “T” indicates Training Required,
“ET” indicates experience with tools employed on the project required, and “C”
indicates Conceptual Understanding Required. “A” indicates the Source of Authoritative
Information. A “?” after any of the codes indicates that the function (training,
experience, etc.) is contingent upon the circumstances of the project. The acronyms
used for column headings have the following meanings:

EA= Enterprise Administration DEV= Development


BAM= Business Area Management VAL= Validation
SE= Subject Matter and User Expertise GN= Generalization
PA= Project Administration IMP= Implementation
PM= Project Management TS= Technical Specialists
PLN= Planning MM= Model Management
ANL= Analysis TDS= Technical Development Support
D= Design

6. Tools and Equipment


6.1 CASE Tool Selection
The development of complicated software systems, especially where definition of the
planning, analysis, and design layers is expected, requires the use of CASE Tools. The

question that occurs is: Can a single Integrated CASE (ICASE) tool serve all AASHTOWare
development needs? In answer, ICASE tools usually impose too many restrictions and
have too little flexibility to be used in the development of systems targeted for the diverse
environments of the transportation agencies. ICASE tools often do not support new
technology or support it with severe functionality restrictions. These tools are therefore not
recommended for the development of AASHTOWare. It makes little difference how efficient
the development process is if the resulting product is inadequate in “look and feel”,
performance, or functionality.
The alternative to the one-tool-does-all approach is to pick multiple tools which are as
compatible as possible and cover all of the areas where automation of the development
process can produce savings. Where multiple tools are used, the construction tool (the tool
which supports the development phase of ADM and produces the executable modules of
the system) is by far the most important.
Some of the criteria for selecting the construction tool follow:
1. It must efficiently exploit the full functionality of the target environment. The resulting
systems must perform and fit well in the intended environment. The developer should
be able to use most of the environment’s functionality.
2. The tool should support Object-Oriented Programming techniques and should provide
for object partitioning to facilitate the development of client-server applications.
3. It must support ODBC connections to a wide variety of relational database systems,
either locally or across a wide area network.
4. It should permit quick development of prototypes and have an efficient screen painter
to facilitate use in RAD sessions.
5. It should require no fees or licenses for use of developed software.
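The standard database connectivity called for in criterion 3 can be illustrated with a minimal sketch. Python's DB-API is used here as an analogue of ODBC, with the built-in sqlite3 driver standing in for an ODBC data source; the project table and its contents are hypothetical.

```python
import sqlite3  # stands in for an ODBC driver; any DB-API driver is analogous

# Sketch of criterion 3: application code written against a standard
# call-level interface (Python's DB-API here, as an analogue of ODBC) so the
# underlying relational database can be swapped without touching the query
# logic. The project table and its contents are hypothetical.

def fetch_active_projects(conn):
    # Parameterized query using the driver-neutral "?" placeholder style,
    # which ODBC also uses.
    cur = conn.cursor()
    cur.execute("SELECT name FROM project WHERE active = ? ORDER BY name", (1,))
    return [row[0] for row in cur.fetchall()]

# Local demonstration against an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE project (name TEXT, active INTEGER)")
conn.executemany("INSERT INTO project VALUES (?, ?)",
                 [("DARWin", 1), ("IGRDS", 1), ("Retired", 0)])
active = fetch_active_projects(conn)  # ["DARWin", "IGRDS"]
```

Because only the connection object is database-specific, the same query logic could be pointed at a different relational server by changing the driver and connection string, which is the portability property the criterion asks of a construction tool.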
If the system is to use a relational database, the next most important tool is a database
design and modeling tool which has a two-way interface with the construction tool. If the
database design tool is included in the construction tool, no further selection is needed. If a tool
needs to be selected, it should support the generation and loading of all brands of
databases that will be used by the system. It should also produce entity relationship
diagrams.
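The forward-generation step such a design tool performs can be sketched roughly as follows. The logical-model format and the generic SQL emitted here are simplifying assumptions for illustration, not the behavior of any particular tool.

```python
# Rough sketch of the forward-generation step a database design tool performs:
# emitting DDL for a target database from a logical model. The model format
# and the generic SQL dialect here are simplifying assumptions, not the
# behavior of any particular tool.

logical_model = {
    "project": [("project_id", "INTEGER", "PRIMARY KEY"),
                ("name", "VARCHAR(60)", "NOT NULL")],
    "task":    [("task_id", "INTEGER", "PRIMARY KEY"),
                ("project_id", "INTEGER", "NOT NULL"),
                ("estimated_cost", "DECIMAL(12,2)", "")],
}

def generate_ddl(model):
    """Emit one CREATE TABLE statement per entity in the logical model."""
    statements = []
    for table, columns in model.items():
        cols = ", ".join(
            " ".join(part for part in column if part)  # drop empty constraints
            for column in columns
        )
        statements.append(f"CREATE TABLE {table} ({cols})")
    return statements

ddl = generate_ddl(logical_model)
```

A real tool would additionally emit per-brand dialect variations and load scripts, which is why support for all target database brands is the selection criterion above.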
If it is not already included with the construction tool, select an analysis and design modeling
tool which is as compatible as possible with the construction tool. Some of the desirable
models to be produced are the use case model, the dynamic model, the functional model,
and the partitioned object design model.
Other tools which may be useful are: validation, configuration management/version control,
project management, and help authoring tools. The validation, configuration
management/version control, and help authoring tools should be closely compatible with the
construction tool.
A category of tool which requires special attention in the joint development context is the
communication tool. Because AASHTOWare projects are managed by transportation
agency personnel, whose time is limited and who are geographically distributed, usually no
more than five project meetings per year are possible. This limitation in meeting time
can cause delays in product development decisions, incomplete participation in product
analysis/design, and considerably reduced interaction between contractor and task force
personnel. A communication tool which facilitates issue notification and resolution, review of
deliverables (at the analysis, design, construction, and validation levels), and project
documentation (updated project plans/status, meeting minutes, and project correspondence)
is strongly recommended for AASHTOWare development.


6.2 Development Platform Considerations


Though many construction tools are capable of generating applications for target
environments dissimilar to the development environment, this approach has the
disadvantage of reducing the ease of validation of source additions and changes. In fact,
the testing operation cannot be integrated with the construction tool, since, to be valid, it
must occur on the target platform. The many problems which arise as a result of
development/target environment differences (database data type support, SQL processing
capabilities, client/server system performance, network capability, and middleware
functionality) are not readily apparent to the developer until the application is used in the
production environment.
As a result, the construction tool environment should be as compatible as possible with the
target environment. In the case of client-server systems, the server hardware, the client
hardware, the database software (connectivity standard: i.e. DRDA or ODBC and structure
type: i.e. relational or object oriented), the network (protocols and APIs), and all the
supporting systems software and middleware should be, where possible, of the same type
as that of the production environment. Where the environments cannot be the same,
additional steps should be added to the project plan which define proof-of-concept and
validation operations covering the above mentioned technology differences.
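One proof-of-concept step of the kind suggested above, checking whether the target database accepts the data types the design assumes, might look like the following sketch. Here sqlite3 stands in for the target server, and the probed type list is purely illustrative.

```python
import sqlite3  # stands in for the target database server in this sketch

# Sketch of one proof-of-concept check for development/target environment
# differences: attempt a throwaway DDL statement for each data type the
# design assumes and report the ones the target engine rejects. The probed
# type list is illustrative; a real check would also cover SQL processing
# capabilities, middleware, and network behavior as noted above.

def probe_type_support(conn, type_names):
    unsupported = []
    for type_name in type_names:
        try:
            conn.execute(f"CREATE TABLE _probe (c {type_name})")
            conn.execute("DROP TABLE _probe")
        except sqlite3.Error:
            unsupported.append(type_name)
    return unsupported

conn = sqlite3.connect(":memory:")
result = probe_type_support(conn, ["INTEGER", "VARCHAR(40)", "DECIMAL(9,2)"])
```

Running such probes against the real production server early in the project surfaces the data type and SQL differences that otherwise only appear after deployment.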

6.3 Repository Platform Selection


The repository should reside on a server platform which has the following characteristics:
1. Supports server versions of the construction tool, the configuration management/change
control tool, the communication tool, and, where possible, the target relational database.
2. Supports file sharing for all of the other tools.
3. Should be, where possible, of the same type as the target server.

6.4 Target Platforms


See the “Product Technical and Platform Standards/Guidelines” sections in the
“AASHTOWare Standards & Guidelines Notebook” for recommendations for product target
environments.

7. Projects
7.1 Project Types
1. Information Strategy Plan (ISP):
The Information Strategy Plan populates the Architectural Layer of the ADM (see section
III of this document for a detailed description of the objectives and deliverables of this
layer). ISP can be performed as an independent project or as the first phase of system
development. All of the other layers of the methodology are dependent on the planning
layer, and consequently it must be developed first. When the ISP is developed as a
separate product and under separate contract, it is useful if the deliverables which are
machine readable are usable by the developers of the lower layers.
This type of project is not currently employed for the development of AASHTOWare.
2. Business Area Analysis (BAA):
The Business Area Analysis populates the Analysis or Conceptual Layer of ADM (see
section III of this document for descriptions of the objectives and deliverables of this
layer). The Analysis Layer is dependent upon the product of the Planning Layer and
thus the BAA cannot be begun until there is an ISP. When the BAA is developed as a
separate product and under a separate contract, it is important to pick CASE tools
whose data formats are compatible with those employed in developing the ISP and those
intended for use in populating the lower layers.

This type of project is not currently employed for the development of AASHTOWare.
3. System Development:
System Development requires the completion and population of all of the layers of ADM.
System Development may be based on a preexistent ISP or BAA, or it can include their
development.
An AASHTOWare System Development project can begin with the Application Planning
phase of ADM. In this case the Application Planning and Spiral Analysis phases will be
used in place of an ISP and BAA to populate the architectural and conceptual layers.
Present AASHTOWare development practice relies upon sponsor and user
requirements to define the architectural and conceptual layers. System development is
thus begun at the implementation layer with an external design that is based on
requirements expressed in a contract, or informally by sponsors and users.
4. Business Process Re-Engineering (BPR):
Often the ISP and BAA reveal inefficiencies in business processes, or the External
Design, Business System Design (BSD), reveals that automation can facilitate beneficial
changes in business processes. If this occurs, a Business Process Re-Engineering
project can be initiated to make the indicated changes to the business. When the BPR
project results from and depends on System Development, there must be careful
planning and coordination of the changes to business processes and the implementation
of the new information system.
Though the implementation of AASHTOWare products may require changes to user
business processes, there is no formal use of BPR techniques, and the changes
required are often the result of accident rather than plan.
5. Maintenance:
Maintenance projects may include all four model layers, but most usually they affect
only the last two (Implementation and Execution). The maintenance project must begin
at the highest layer that is affected by the maintenance. This implies that all higher
layers are wholly complete and correct with regard to the planned maintenance.
AASHTOWare product maintenance never begins at a higher level than Design.
Whatever information there is about the higher layers (Architectural and Conceptual) is
contained or implied in user requirements.

7.2 Roles Versus Project Types

Information Engineering Roles Versus The Project Types of Information Engineering


EA BAM SE PA PM PLN ANL D DEV VAL GN IMP TS MM TDS
Information Strategy Plan (ISP):     IB A M P IT S S
Business Area Analysis (BAA):        IB A M P IT S S
System Development:                  IB IB IB A M P P P P P P IT S S
Business Process Re-Engineering:     A M P S C S S S P IT* S S
Maintenance:                         IB* IB* IB* A M P* P* P P P P P IT* S S

Page 29 06/10/2009
Development Methodology Guideline 1.020.02G
In the above table “A” indicates Administer, “M” indicates Management, “IB” Information
resource for Business requirements, “IT” indicates Information resource for Technology, “S”
indicates Support, “C” indicates Coordination with those who are managing and performing
the activities, while “P” indicates Performance of the development function. An “*” following a
table symbol indicates that there may not be a need for the role in some cases.

The acronyms used for column headings have the following meanings:

EA = Enterprise Administration            DEV = Development
BAM = Business Area Management            VAL = Validation
SE = Subject Matter and User Expertise    GN = Generalization
PA = Project Administration               IMP = Implementation
PM = Project Management                   TS = Technical Specialist
PLN = Planning                            MM = Model Management
ANL = Analysis                            TDS = Technical Development Support
D = Design

7.3 Project Types Versus the Model Levels

Project Types Versus The Methodology Model Levels of ADM


                                  Architectural Layer   Conceptual Layer   Implementation Layer   Execution Layer
                                  (Planning)            (Analysis)         (Design)               (Construction)
Information Strategy Plan (ISP)   P                     N/A                N/A                    N/A
Business Area Analysis (BAA)      ME                    P                  N/A                    N/A
System Development                ME or P               ME or P            P                      P
Business Process Re-Engineering   ME                    ME
Maintenance                       ME                    ME or P            P                      P

In the above table “P” indicates that the layer in question is produced in this type of
development project, “N/A” indicates that the layer is not applicable to this type of
development project, and “ME” means that this layer must be populated but is not produced
by this kind of development project.

8. Recommendations
Future RFPs should require, as part of the proposal, a description of the methodology that will
be used to complete or continue the proposed development. All phases of the methodology
should be described, as well as the deliverables that can be expected upon completion of each
phase.
Each proposal should also be required to include a project plan that corresponds to the
proposed methodology. The plan should describe the tasks, including durations and resource
requirements, needed to accomplish the work, and should indicate when the deliverables will
be submitted.

The AASHTOWare Development Methodology, as expressed in this document, should be
treated as the starting point of an ongoing process. This process consists of continuously
revising the methodology based on feedback from developers, changes in information
technologies, and new requirements of participating transportation agencies and joint
development. Where the document does not fit the realities of joint development, contractors
should feel free to modify or omit activities that do not apply.
Constructive critiques of this document are encouraged.


9. APPENDICES
9.1 APPENDIX A: AASHTOWARE LIFECYCLE CHARTS


9.2 APPENDIX B: SYMBOLOGY EXAMPLES


9.3 APPENDIX C: AASHTOWARE PROJECT MANAGEMENT CHARTS

2 – Project
Management
No project management standards or guidelines exist at this time.
Future versions of the notebook will include project management
standards and/or guidelines.
This page is intentionally blank.
3 – Software
Engineering
This page is intentionally blank.
REQUIREMENTS
STANDARD
S&G Number: 3.010.02S
Effective Date: July 1, 2009

Document History

Version No.: 02
Revision Date: 2/02/2009
Revision Description: Replaces REQM Standard (3.01.001.02). Reviewed and modified after
T&AA and AASHTOWare stakeholder reviews. Additional minor changes and format
modifications for publishing were approved by T&AA on 06/16/2009.
Approval Dates: 03/04/2009 (approved by SCOJD); 06/16/2009 (T&AA)
Requirements Standard Version 3.010.02S

Table of Contents
1. Purpose ............................................................................................................... 1
2. Task Force/Contractor Responsibilities........................................................... 1
3. Required Deliverables and Work Products ...................................................... 1
4. Procedures.......................................................................................................... 2
4.1 Develop and Document User Requirements ....................................................2
4.1.1 Elicit Business and User Needs........................................................................... 2
4.1.2 Develop User Requirements................................................................................ 2
4.2 Analyze and Approve User Requirements........................................................3
4.2.1 Review and Analyze User Requirements ............................................................ 3
4.2.2 Approve User Requirements Specification .......................................................... 3
4.2.3 Create Initial Requirements Traceability Matrix ................................................... 4
4.3 Develop and Document System Requirements................................................4
4.3.1 Develop System Requirements ........................................................................... 4
4.3.2 Develop Use Cases and Functional Model.......................................................... 4
4.3.3 Allocate Systems Requirements and Update Requirements Traceability Matrix 5
4.4 Analyze, Validate, and Approve Systems Requirements ................................5
4.4.1 Analyze and Validate System Requirements....................................................... 5
4.4.2 Approve Systems Requirements Specification.................................................... 6
4.5 Manage Changes to Requirements...................................................................6
4.6 Maintain Bi-Directional Traceability..................................................................7
4.7 Identify Inconsistencies.....................................................................................8
5. Technical Requirements .................................................................................... 8
6. Deliverable and Work Product Definitions ....................................................... 9
6.1 User Requirements Specification .....................................................................9
6.1.1 Description: .......................................................................................................... 9
6.1.2 Content................................................................................................................. 9
6.2 System Requirements Specification (SRS) ......................................................9
6.2.1 Description ........................................................................................................... 9
6.2.2 Content................................................................................................................. 9
6.3 Requirements Traceability Matrix ...................................................................10
6.3.1 Description ......................................................................................................... 10
6.3.2 Content............................................................................................................... 11
6.4 Deliverable Acceptance ...................................................................................11
6.4.1 Description ......................................................................................................... 11
6.4.2 Content............................................................................................................... 11
6.5 Change Request Acceptance ..........................................................................12
6.5.1 Description ......................................................................................................... 12
6.5.2 Content............................................................................................................... 12


1. Purpose
The purpose of the AASHTOWare Requirements Standard is to define the responsibilities of the
product task forces and contractors in developing and managing business needs, user
requirements, and system requirements for AASHTOWare product development. This standard
applies to new development and major enhancement projects and does not apply to minor
enhancements and software maintenance efforts. Refer to the Glossary in the Standards and
Guidelines Notebook for definitions of the types of projects and efforts.
The standard defines activities and outcomes that are considered best practices and should be
followed to ensure that AASHTOWare product development uses quality processes for
requirements development and requirements management that can be measured and
subsequently improved.
In addition, the standard defines certain activities that must be followed and work products that
must be produced in order to comply with the standard. These requirements are shown in red
italicized text.

2. Task Force/Contractor Responsibilities


The product task force and contractor responsibilities with regard to the AASHTOWare
Requirements Standard are summarized below. Additional details on these responsibilities are
provided in the “Procedures” section of this document.
● The contractor must prepare and submit the User Requirements Specification (URS),
System Requirements Specification (SRS), and Requirements Traceability Matrix (RTM) for
the proposed product or enhancement(s).
● The task force must review, analyze, validate, and approve or reject the URS, SRS, and
RTM and document and communicate the approval decision.
● The contractor must determine the impact of requested changes to the approved
requirements on the project/product work plan, deliverables, and other planned work
products.
● The task force must approve or reject all changes, additions, or deletions to the user and
system requirements after the URS and SRS have been approved.
● The task force and contractor must manage the impact of changes to requirements on the
project/product work plan, deliverables, and other planned work products, and changes to
these items on the requirements.
● The task force and contractor must ensure that the approval or rejection of the above
deliverables and change requests are documented and communicated to the appropriate
parties.
● All requirements deliverables and work products must be versioned, stored, and controlled
using configuration management procedures.
In addition, the task force has the responsibility of ensuring that the required submissions,
approvals, communications, documentation, and technical requirements defined in this standard
are complied with. In the event that a requirement of the standard cannot be complied with, the
task force chair should advise the SCOJD or T&AA liaison early in the project/product life cycle.
A request for an exception to the standard must be submitted to the SCOJD with any necessary
documentation for their consideration. Approval of exceptions to the standards is under the
purview of the SCOJD.

3. Required Deliverables and Work Products


The following summarizes the required work products that must be prepared and saved in order
to comply with the Requirements Standard. The work products designated as deliverables are


planned and tracked and must also be formally submitted to the task force for approval or
rejection. Definitions and content requirements are provided in the “Deliverable and Work
Product Definitions” section of this document.
● User Requirements Specification (URS) – Deliverable
● System Requirements Specification (SRS) – Deliverable
● Requirements Traceability Matrix (RTM) – Deliverable (An RTM is not normally created for
enhancements; refer to section 4.6 for further details.)
● Deliverable Acceptance for each URS, SRS, and RTM
● Change Request Acceptance for each change request

4. Procedures
The procedures described below define activities that are to be followed by the task force and/or
contractor and the results of those activities.

4.1 Develop and Document User Requirements


4.1.1 Elicit Business and User Needs
♦ Prior to developing the product/project work plan, the task force should provide the
contractor with a list of prioritized business/user needs, enhancements, expectations,
high-level security needs, and constraints. This list should be maintained and
updated over the life of the project/product as items are translated into user
requirements and implemented.
♦ The task force should collect these items by asking business stakeholders to
provide them, as well as by using elicitation techniques to proactively identify
additional needs not explicitly stated by stakeholders.
♦ Interfaces with other business processes or systems should be identified and
documented while eliciting needs.
♦ Users, user groups, Technical Advisory Groups (TAGs), Technical Review Teams
(TRTs), and/or other types of business stakeholder groups should participate in the
definition and review of these items.
4.1.2 Develop User Requirements
♦ The task force and/or the contractor should review and analyze the list of business
and user needs, enhancements, expectations, constraints, high-level security needs,
and interfaces and determine if any conflicts exist.
♦ For existing applications, the analysis should determine whether the proposed needs,
enhancements, expectations, etc. conflict with or “undo” previous change requests
and/or requirements. The analysis should also determine whether the proposed items
will have an adverse impact on the application’s current processes and logic, and
whether any existing interfaces will need modification based on the proposed
requirements.
♦ Items that were collected in prior years, such as enhancement requests, should be
reviewed to determine if they are still valid and still need to be implemented.
♦ After eliminating conflicts and invalid needs/enhancements, the contractor should
translate the business/user needs, enhancements, expectations, constraints, security
needs, and interfaces into a set of user requirements. User requirements should
describe what the users or business stakeholders expect from the proposed product.


♦ The user requirements must be compiled into a User Requirements Specification
(URS) which includes the content listed in the “Deliverable and Work Product
Definitions” section of this document.
♦ The URS is a working document that will eventually become a component of the
project/product work plan. The URS may also be maintained as a separate file
(document, spreadsheet, repository, etc.) and referenced in the work plan.

4.2 Analyze and Approve User Requirements


4.2.1 Review and Analyze User Requirements
♦ Before the URS is completed, the task force and contractor should both review and
analyze the user requirements to ensure that:
○ Each user requirement is traceable to a source business/user need,
enhancement, expectation, constraint, security need, and/or interface.
○ Both parties have a common understanding of the intent of each requirement,
□ Reviews or consultations with the requirement originators or user group
representatives should be held, as needed, to help understand specific
requirements.
○ Each requirement is accepted based on a standard set of acceptance criteria.
The recommended acceptance criteria are listed below:
□ Each user requirement should be clear, complete, uniquely identified,
consistent with the other requirements, appropriate to implement, testable,
and traceable to a source.
○ There is an appropriate balance between the needs of the user and known
constraints. This is especially important for requirements that have a
significant impact on cost, schedule, functionality, risk, maintainability, or
performance. Cost versus requirements tradeoffs may need to be made to
ensure the most cost-effective solution that meets the priority needs of the user
organizations.
♦ If the analysis of the URS reveals any conflicts or problems, then:
○ Requirements should be modified as required, and
○ Requirements that are not needed or cannot be justified should be removed from
the URS.
♦ After defining, analyzing, deleting, and modifying the user requirements, the list of
user needs, enhancements, expectations, constraints, high-level security needs, and
interfaces should be reviewed and updated. Items that will be implemented
should be removed from the list, as should items that were eliminated during the
development and analysis of user requirements or in subsequent analysis.
4.2.2 Approve User Requirements Specification
♦ After the task force and contractor agree on a set of user requirements to include in
the URS, the contractor must include or reference the URS in the project/product
work plan.
♦ Since the URS is included or referenced in the work plan, the task force approval of
the work plan also represents approval of the URS.
♦ The inclusion of the URS in the work plan and the subsequent task force approval
acknowledges:
○ That the task force or its designee has reviewed, analyzed, and accepted each
requirement in the URS, and


○ The commitment of both the task force and contractor to implementing all
requirements in the URS.
4.2.3 Create Initial Requirements Traceability Matrix
♦ After the task force has approved the URS, the contractor must create a
Requirements Traceability Matrix (RTM) that includes all user requirements in the
URS.
♦ Each user requirement entered in the RTM must include the same Requirement ID
used in the URS and must include a backwards reference to a documented need,
enhancement request, expectation, constraint, or interface.
♦ This initial version of the RTM is normally created early in the project life cycle.
♦ An RTM is normally not created for enhancements to an existing product. Refer to the
“Maintain Bi-Directional Traceability” and “Deliverable and Work Product
Definitions” sections for more information on the RTM and the conditions under
which an RTM should be created for enhancements.
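To make the traceability idea concrete, the initial RTM can be sketched as a set of records keyed by the URS Requirement IDs, each carrying a backwards reference to its source. This is a hypothetical illustration only; the standard does not prescribe a file format, and the field names and IDs below are assumptions.

```python
import csv
from io import StringIO

# Hypothetical initial RTM rows: each user requirement keeps the same
# Requirement ID used in the URS and carries a backwards reference to
# the documented need, enhancement request, expectation, constraint,
# or interface that motivated it.
rtm_rows = [
    {"req_id": "UR-001", "source_ref": "NEED-012", "type": "user"},
    {"req_id": "UR-002", "source_ref": "ENH-034", "type": "user"},
]

def write_rtm(rows):
    """Serialize the RTM as CSV; a spreadsheet or repository would work equally well."""
    buf = StringIO()
    writer = csv.DictWriter(buf, fieldnames=["req_id", "source_ref", "type"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(write_rtm(rtm_rows))
```

The CSV form is only one option; as the standard notes, the RTM may equally be a document, spreadsheet, or repository.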

4.3 Develop and Document System Requirements


4.3.1 Develop System Requirements
♦ After the URS is approved, the contractor should begin developing the System
Requirements Specification (SRS).
○ The SRS is a deliverable that contains the requirements which describe, in the
language of the software developer and integrator, what the proposed product
will do. The SRS should describe all functional, non-functional (behavior),
technical, data, and interface requirements of the proposed system in sufficient
detail to support system design.
○ The SRS may be created as a document, spreadsheet, or another type of digital
file or repository.
○ The URS should be reviewed and analyzed and used as the basis for developing
the requirements in the SRS.
○ The SRS is normally created early in the project life cycle or, in the case of an
existing product, early in the work program. When Agile or another type of
incremental development methodology is used, the SRS is typically created
and/or revised with each implementation increment.
♦ Security, accessibility, interface, user interface, and performance requirements must
always be included in the SRS. The approach for compliance with Section 508 of
the U.S. Rehabilitation Act and the Web Content Accessibility Guidelines (WCAG) of
the World Wide Web Consortium Web Accessibility Initiative (W3C WAI) must be
included with the accessibility requirements.
♦ Refer to the “Deliverables and Work Product Definition” section for the required
content of the SRS.
4.3.2 Develop Use Cases and Functional Model
♦ The SRS must contain functional requirements that define the fundamental actions
or behaviors that must take place within the product to accept and process the inputs
and to process and generate the outputs. These describe what the proposed
product must do in order to fulfill the user requirements and are normally described
by use case models.
♦ A functional model (or similar type of product) should be created with the
development and refinement of system requirements and use cases. A functional


model is the hierarchical arrangement of functions and sub functions and their
internal and external functional interfaces and external physical interfaces.
♦ Other products such as operational concepts, scenarios, storyboards, and flow
diagrams may be used in lieu of or in addition to the use case and functional models.
4.3.3 Allocate Systems Requirements and Update Requirements
Traceability Matrix
♦ The contractor must enter each system requirement in the SRS into the RTM and
reference each to its source user requirement.
♦ Each system requirement should be allocated to a function, sub function, screen,
report and/or other design object of the proposed product.
♦ The allocation of system requirements to design objects must be documented as
forward traceability references in the RTM.
♦ Refer to the “Maintain Bi-Directional Traceability” and “Deliverable and Work Product
Definition” sections for more information on the RTM.
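The allocation step above can be sketched as extending each RTM entry with the system requirements derived from it and their design-object allocations. All identifiers and field names here are hypothetical, not a mandated layout.

```python
# Hypothetical RTM entries: each user requirement accumulates the system
# requirements derived from it, and each system requirement records the
# design objects (screens, reports, functions) it is allocated to.
rtm = {
    "UR-001": {"source_ref": "NEED-012", "system_reqs": []},
}

def allocate(rtm, sys_id, user_id, design_objects):
    """Record a system requirement under its source user requirement
    (backwards traceability) with its allocated design objects
    (forward traceability)."""
    rtm[user_id]["system_reqs"].append(
        {"id": sys_id, "design_objects": list(design_objects)}
    )

allocate(rtm, "SR-101", "UR-001", ["Screen: Login", "Function: Authenticate"])
print(rtm["UR-001"]["system_reqs"])
```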

4.4 Analyze, Validate, and Approve Systems Requirements


4.4.1 Analyze and Validate System Requirements
♦ Before the SRS is completed, the task force and contractor should review and
analyze the SRS in the same manner that was used for the URS (refer to section
4.2.1).
♦ After reviewing and analyzing the SRS, the task force and/or a designated
stakeholder group should validate the system requirements to ensure that:
○ The resulting end product will perform as intended if the requirements are
implemented, and
○ Conflicts between the system requirements, user requirements, and the source
needs, enhancement requests, expectations, constraints, and interface
requirements are identified and resolved.
♦ The contractor should conduct facilitated reviews of the system requirements with
the task force and/or stakeholder groups to help validate the system requirements.
♦ The contractor should develop product representations such as prototypes,
mock-ups, simulations, or storyboards to assist in the analysis or validation of the
system requirements. These should aid in validating the requirements by:
○ Describing to the business stakeholders how the proposed product will be used
to accomplish specific goals or tasks,
○ Defining how stakeholders will interact with the product and how they will interact
with each other with regards to the product, and
○ Helping ensure the stakeholders that the right product is being developed and
that the product will meet their needs.
♦ Any issues or new requirements discovered during the analysis or validation of the
SRS should be documented and reviewed by both the task force and contractor.
The RTM should be updated accordingly.
♦ If the SRS is reworked and resubmitted, the task force should repeat the analysis,
validation, and approval process.


4.4.2 Approve Systems Requirements Specification


♦ After the task force completes the validation of the SRS, the contractor should
analyze the impact of the system requirements against the current project/product
work plan, tasks, deliverables, and other planned work products.
♦ The completed SRS must be submitted to the task force for review and approval.
♦ Any significant issues or findings from the impact analysis should also be provided to
the task force.
♦ The task force must approve or reject the SRS and communicate the approval
decision to the contractor.
♦ When Agile or another type of incremental development methodology is used, the
SRS may be approved with each implementation increment.
♦ If the SRS is rejected, then:
○ The task force should provide the reason for rejection to the contractor,
○ The contractor should rework and resubmit the SRS, and
○ The task force should repeat an appropriate level of analysis and validation, and
then repeat the approval process.
♦ Evidence of the task force approval/rejection of the SRS and the communication to
the contractor must be created and saved for future reference. The Deliverable
Acceptance work product in the “Deliverables and Work Product Definition” section
defines the requirements for this documentation.
♦ The submission and approval of the SRS acknowledges:
○ That the task force or its designee has reviewed, analyzed, validated, and accepted
each requirement in the SRS, and
○ The commitment of both the task force and contractor to implementing all
requirements in the SRS.

4.5 Manage Changes to Requirements


♦ Each product task force must have a documented change control procedure that
includes, at a minimum, the features listed below:
○ The ability to monitor requests that add, change, or remove functionality or
requirements documented in the approved URS. All requests must be submitted
to the task force for approval or rejection.
○ The ability to monitor requests that add, change, or remove functionality or
requirements documented in the approved SRS. All requests must be submitted
to the task force for approval or rejection.
○ Change requests should include a priority, clear description, and a justification for
the change.
○ The contractor should perform an impact analysis on each change request and
return the following information to the task force:
□ New, deleted, or changed requirements that result from the change request,
□ The cost and estimated time to implement the change request, and
□ The impact to the work plan, tasks, deliverables, and other work products if
the change request is implemented.
○ The task force should review the change request and the impact analysis,
approve or reject the request, and communicate the approval decision to the


contractor and the originator of the request. If rejected, the reason for rejection
should be included.
○ Evidence of the task force approval/rejection of each change request and the
communication to the contractor must be created and saved for future reference.
The Change Request Acceptance work product in the “Deliverable and Work
Product Definitions” section defines the requirements for this documentation.
○ The contractor must modify the RTM to reflect additions, changes, or deletions
made to user or system requirements.
○ The contractor should modify plans, tasks, deliverables, and work products that
are impacted by approved change requests and should keep the task force
advised of progress and issues associated with the impacted items.
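A change request record covering the information this procedure calls for (priority, description, justification, impact analysis results, and the approval decision) might be sketched as follows. The field names are assumptions for illustration, not a mandated format.

```python
from dataclasses import dataclass, field

# Hypothetical change request record mirroring the information the
# procedure asks for: priority, description, justification, the
# contractor's impact analysis, and the task force decision.
@dataclass
class ChangeRequest:
    cr_id: str
    priority: str                 # e.g. Critical, Urgent, High, Medium, Low
    description: str
    justification: str
    affected_reqs: list = field(default_factory=list)  # new/changed/deleted requirement IDs
    cost_estimate: float = 0.0    # cost to implement (impact analysis)
    schedule_impact_days: int = 0 # estimated time to implement (impact analysis)
    decision: str = "pending"     # approved / rejected / pending
    rejection_reason: str = ""    # required when rejected

cr = ChangeRequest("CR-007", "High", "Add audit trail to approvals",
                   "Requested by participating agencies")
cr.affected_reqs = ["UR-005", "SR-112"]   # results of the impact analysis
cr.decision = "approved"
print(cr.decision, cr.affected_reqs)
```

Saving such records alongside the RTM gives the evidence of approval/rejection that the standard requires to be kept under configuration management.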

4.6 Maintain Bi-Directional Traceability


♦ As previously discussed, the contractor must create a Requirements Traceability
Matrix (RTM) that:
○ Includes all user requirements from the approved URS,
○ Includes backwards traceability references for all user requirements to a source
need, enhancement request, change request, expectation, constraint, or
interface requirement,
○ Includes a backwards traceability reference for all system requirements to a
source user requirement,
○ Includes forward traceability for each system requirement to design objects and
test procedures, and
○ Includes the content listed in the “Deliverable and Work Product Definitions”
section.
♦ If an RTM was previously created for an existing product, then the RTM should be
maintained for enhancement projects. If an RTM does not exist for the product, then
an RTM is not required for enhancements. The only exception is when the size and
scope of a major enhancement is equivalent to a new development project; in that
case, an RTM should be created exclusively for the enhancement project. The
T&AA, SCOJD, and/or AASHTO liaison should assist in identifying these cases.
♦ The references from system requirements to design objects and test procedures
may alternatively be documented in other work products that are reviewed and
approved by the task force. Refer to the “Deliverable and Work Product Definitions”
section for additional information.
♦ The RTM may be created as a document, spreadsheet, or another type of digital file
or repository.
♦ The RTM should be maintained and updated throughout the product/project life cycle
as iterations to the project are completed and as changes to the user and system
requirements are implemented.
♦ The task force must review the RTM at various points in the product/project life
cycle to ensure that:
○ All requirements from the approved URS, or those resulting from approved
change requests, are included in the RTM with references to their source,
○ All requirements from the SRS are referenced to their source user requirement,
and
○ All requirements in the SRS include forward references to design objects and
test procedures.
♦ The RTM must be submitted as a deliverable to the task force prior to beta testing. If
no beta testing is performed, the RTM should be submitted after alpha testing is
completed.
♦ Evidence of the task force approval/rejection of the RTM and the communication to
the contractor must be created and saved for future reference. The Deliverable
Acceptance in the “Deliverables and Work Product Definition” section defines the
requirements for this documentation.
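A minimal sketch of how the backwards and forward references in an RTM could be checked mechanically is shown below, assuming the hypothetical record layout used here; the standard itself does not require automated checking.

```python
# Hypothetical RTM layout: user requirements map to their backwards
# source reference; system requirements map to (source user requirement,
# forward references to design objects or test procedures).
user_reqs = {"UR-001": "NEED-012", "UR-002": None}
sys_reqs = {
    "SR-101": ("UR-001", ["Screen: Login", "TEST-9"]),
    "SR-102": ("UR-003", []),
}

def traceability_gaps(user_reqs, sys_reqs):
    """Return a list of bi-directional traceability violations."""
    gaps = []
    for uid, source in user_reqs.items():
        if not source:
            gaps.append(f"{uid}: missing backwards reference to a source need")
    for sid, (uid, forward) in sys_reqs.items():
        if uid not in user_reqs:
            gaps.append(f"{sid}: source user requirement {uid} is not in the RTM")
        if not forward:
            gaps.append(f"{sid}: no forward reference to design objects or tests")
    return gaps

for gap in traceability_gaps(user_reqs, sys_reqs):
    print(gap)
```

Running a check like this before the RTM is submitted as a deliverable would surface the gaps the task force review is meant to catch.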

4.7 Identify Inconsistencies


The purpose of this procedure is to ensure that inconsistencies between the work plan
and the requirements that arise during project/product execution are identified and
corrected.
■ The task force should review the work plan and the planned deliverables and work
products for inconsistencies with the requirements when the following conditions occur:
♦ Approval of the User Requirements Specification.
♦ Approval of the System Requirements Specification.
♦ Approval of a change request that adds, modifies, or deletes requirements.
♦ Work plan changes.
If inconsistencies are found, proposed changes to the work plan, URS, SRS, or change
request should be submitted to the task force to address the inconsistencies.

5. Technical Requirements
The Technical Architecture portion of the SRS should be used to document the specific
technical requirements that the proposed product must comply with.


6. Deliverable and Work Product Definitions


This section describes the deliverables and work products that must be prepared and saved in
order to comply with the Requirements Standard.
All deliverables and work products must be versioned, stored, and controlled using configuration
management procedures.

6.1 User Requirements Specification


6.1.1 Description
The User Requirements Specification (URS) is a deliverable that contains all of the user
requirements that are approved by the task force to be accomplished in a specified
contract period. The URS, which is incorporated in or referenced by the project or
product work plan, specifies the requirements that the user expects from the proposed
product.
6.1.2 Content
The primary content of the URS should be the information that describes the user
requirements. Each requirement in the URS must include the content listed below. The
task force and contractor may arrange the mandatory items in any order and may add
any additional information to the overall URS or the individual requirements.
♦ Requirement ID: The number or tag that uniquely identifies each requirement.
♦ Description: The full description of the requirement.
♦ Short Description: An optional short description which describes the content of the
requirement but is short enough to use in tables and listings.
♦ Priority: The business priority for implementing the requirement (example - Critical,
Urgent, High, Medium, Low).
♦ Cost: Estimated cost to implement the requirement.
In addition to the requirement information, the URS should include the following
document identification information: Project/Product Name, Contract Period, Version
Number, and Submission Date.

6.2 System Requirements Specification (SRS)


6.2.1 Description
The System Requirements Specification (SRS) is a deliverable that contains the
requirements which describe, in the language of the software developer and integrator,
what the proposed product will do. The SRS should describe all functional, non-
functional, technical, role, and data requirements of the proposed system in sufficient
detail to support system design. Each requirement in the SRS should be traceable to a
requirement in the URS.
6.2.2 Content
There is no rigid format required for the SRS; however, the content listed below must be
included in the SRS. Any other information useful to the task force or contractor may
also be added to the SRS.
♦ Requirement ID: Each requirement included in the SRS must be identified by a
unique number or tag.
♦ Short Description: An optional short description which describes the content of the
requirement but is short enough to use in tables and listings.


♦ Technical Architecture: The SRS must contain requirements that define the
technical environment which must be supported by the proposed product. (Examples
are requirements which define platforms, databases, etc.).
♦ System Roles: The SRS must define the roles and skills needed for use and support
of the system. These identify the system users and stakeholders; define their roles
associated with the system; and define the skills needed to perform their roles. The
system roles are used in conjunction with the security requirements when defining
access permission for specific groups of system users or stakeholders. (Example
roles include users, managers, executives, system administrators, security
administrators, database administrators, and application support personnel).
♦ Functional Requirements: The SRS must contain functional requirements that define
the fundamental actions or behaviors that must take place within the product to
accept and process the inputs and to process and generate the outputs. Functional
requirements describe what the proposed product must do in order to fulfill the user
requirements. (Examples include validity checks, calculations, data manipulations,
input or output sequences, and responses to abnormal situations).
Functional requirements are normally described by or supported by use case
models.
♦ Non-Functional Requirements: The SRS must contain non-functional requirements,
which specify criteria that can be used to judge the operation of a system rather than
specific behaviors. Non-functional requirements are not specifically concerned with
the functionality of a system; they define the overall qualities or attributes of the
resulting product and place restrictions on how the user requirements are to be met.
Non-functional requirements should be broken down into types such as reliability,
accuracy, performance, scalability, testability, maintainability, security, usability,
interface, user interface, design constraints, and implementation constraints.
Security, accessibility, interface, user interface, and performance requirements must
always be included in the SRS.
Refer to the Security Standard for additional information regarding security
requirements.
The requirements that describe the approach for compliance with Section 508 of the
U.S. Rehabilitation Act and the Web Content Accessibility Guidelines (WCAG) of the
World Wide Web Consortium Web Accessibility Initiative (W3C WAI) must be
included with the accessibility requirements.
The interface requirements should include Data Transfer/Exchange requirements as
documented in the XML Standard.
♦ Data Models: The SRS must contain data models for all data to be stored or
exchanged with other systems. For existing systems, a reference or link to the
existing data model should be provided.
In addition to the requirement information, the SRS should include the following
document identification information: Project/Product Name, Contract Period, Version
Number, and Submission Date.

6.3 Requirements Traceability Matrix


6.3.1 Description
The Requirements Traceability Matrix (RTM) is a deliverable that describes the
backward traceability and forward traceability of the requirements in the URS. The RTM


documents that every requirement has been addressed in the design and that every
design object addresses a requirement. The RTM also documents that each requirement
is traced to a testing procedure.
An RTM is not normally created for enhancements to existing products. Refer to the
Maintain Bi-Directional Traceability section (4.6) for additional information.
6.3.2 Content
The RTM must contain the following content. The RTM is normally created as a grid with
columns for each of the following items.
♦ User Requirement ID: The number or tag that uniquely identifies a user requirement.
All requirements from the approved URS must be included in the RTM and use the
same IDs used in the URS. Each requirement is normally a row in the matrix.
♦ User Requirement Source: A reference or link to source user need, expectation,
enhancement request, change request, constraint, interface, or other information that
was used to derive the user requirement. Multiple user requirements may be traced
to a User Requirement Source.
♦ System Requirement ID: The number or tag that uniquely identifies a system
requirement that was derived from the user requirement. Each system requirement
in the approved SRS must be entered in the RTM. Multiple system requirements
may be traced to a source user requirement.
♦ Design Object Reference: A reference or link to a design object that was derived
from a system requirement. Multiple design objects may be traced to a source
system requirement.
♦ Test Reference: A reference or link to the alpha or beta test procedure or script used
to test and accept a user or system requirement. Multiple test references may be
traced to a source requirement.
♦ Note: The reference of system requirements to design objects and test procedures
may be alternately documented in other work products that are reviewed and
approved by the task force. In this case, a document must be prepared that
describes where the components of the RTM are located and how they are used to
define traceability. Each document must use the same Requirement IDs that are
used in the URS, SRS, and RTM.
In addition to the requirement information, the RTM should include the following
document identification information: Project/Product Name, Contract Period, Version
Number, and Submission Date.
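As an illustration only (the standard mandates the RTM's content, not its format), the grid described above could be assembled and exported for task force review as sketched below. All IDs, sources, and references are hypothetical:

```python
import csv
import io

# Hypothetical RTM rows: one row per user requirement in the approved URS.
# Every ID, source, and reference below is illustrative only.
rtm_rows = [
    {
        "User Requirement ID": "UR-001",
        "User Requirement Source": "Change Request CR-12",
        "System Requirement IDs": "SR-001; SR-002",
        "Design Object References": "DO-Login; DO-Session",
        "Test References": "AT-01; BT-03",
    },
]

# Render the matrix as CSV so it can be reviewed in a spreadsheet.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(rtm_rows[0]))
writer.writeheader()
writer.writerows(rtm_rows)
print(buf.getvalue())
```

A spreadsheet or requirements management tool would serve equally well; the point is one row per user requirement with traceable columns.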

6.4 Deliverable Acceptance


6.4.1 Description
This work product is the record of each task force approval or rejection of the SRS
and RTM. A separate Deliverable Acceptance must be created for each of these
deliverables.
Since the work plan approval represents the approval of the URS, a Deliverable
Acceptance for the URS is not required, but may be used if the contractor and/or task
force desires.
6.4.2 Content
Each Deliverable Acceptance must include the following content:
♦ Project/Product Name
♦ Contract period


♦ Submission date
♦ Deliverable Name
♦ Task force approval or rejection decision
♦ Date of the decision
♦ Reason for rejection (if applicable)
The Deliverable Acceptance may be created and saved in any form acceptable to both
the task force and contractor, such as a letter, form, email, or minutes. Although not
required, it is recommended that the document be signed by the task force chair.

6.5 Change Request Acceptance


6.5.1 Description
This work product is the record of a change request submittal, impact analysis, and the
task force approval or rejection decision. A separate Change Request Acceptance must
be created for each change request.
6.5.2 Content
A Change Request Acceptance must include the following content:
♦ Project/Product Name
♦ Submission date
♦ Originator
♦ Change request description
♦ Business priority
♦ Impact analysis
○ New, deleted, or changed requirements that result from the change request,
○ The cost and estimated time to implement the change request, and
○ The impact to the work plan, tasks, deliverables, and other work products.
♦ Task force approval or rejection decision
♦ Date of the decision
♦ Reason for rejection (if applicable)
The Change Request Acceptance may be created and saved in any form acceptable to
both the task force and contractor, such as a letter, email, or minutes. Although not
required, it is recommended that the document be signed by the task force chair.

XML
STANDARD
S&G Number: 3.015.01S
Effective Date: July 1, 2009

Document History
Version No.: 01
Revision Date: 02/03/2009
Approval Date: 03/04/2009 (Approved by SCOJD)
Revision Description: Replaces AASHTOWare XML Implementation and Migration
guideline (3.03.G20.01). Reviewed and updated by T&AA. Reviewed by stakeholders and
then updated. Additional minor changes and format modifications for publishing were
approved by T&AA on 06/16/2009.

06/16/2009
XML Standard 3.015.01S

Table of Contents
1. Purpose ............................................................................................................... 1
2. Task Force/Contractor Responsibilities........................................................... 1
2.1 For New Development Projects.........................................................................1
2.2 For Major Enhancement Projects......................................................................1
3. Required Deliverables and Work Products ...................................................... 2
3.1 For New Development Projects.........................................................................2
3.2 For Major Enhancement Projects......................................................................2
4. Procedures.......................................................................................................... 2
5. Technical Requirements and Recommendations............................................ 2
5.1 XML .....................................................................................................................2
5.2 TRANSXML .........................................................................................................2
5.3 Schemas .............................................................................................................3
5.4 Names .................................................................................................................3
5.5 Namespaces .......................................................................................................3
5.6 Data Dictionaries................................................................................................3
5.7 XML Tools...........................................................................................................3
6. Deliverable and Work Product Definitions ....................................................... 4
6.1 XML Strategy (included in product Strategic Plan)..........................................4
6.1.1 Description ........................................................................................................... 4
6.1.2 Content................................................................................................................. 4
6.2 Data Transfer/Exchange User Requirements ...................................................4
6.2.1 Description ........................................................................................................... 4
6.2.2 Content................................................................................................................. 4
6.3 Data Transfer/Exchange System Requirements ..............................................4
6.3.1 Description ........................................................................................................... 4
6.3.2 Content................................................................................................................. 4
6.3.3 XML Reporting Requirements.............................................................................. 5


1. Purpose
The purpose of this document is to provide details for the use of XML (eXtensible Markup
Language) in AASHTOWare products. This standard applies to new development and major
data transfer/exchange-related enhancement projects. The standard does not normally apply to
minor maintenance and software maintenance efforts; however, it should be reviewed when
these efforts involve data transfer/exchange. Refer to the Glossary in the Standards and
Guidelines Notebook for definitions of the types of projects and efforts.
This standard includes certain activities that must be followed and work products that must be
produced in order to comply with the standard. These requirements are shown in red italicized
text.

2. Task Force/Contractor Responsibilities


The product task force and contractor responsibilities for the XML standard are summarized
below:
In the case of existing products, each task force should develop a strategy for using XML to add
or revise internal and external data transfer/exchange functionality and include the strategy in
the product strategic plan. When deemed beneficial, the strategic plan should also include the
strategy for adding new reports using XML or converting existing reports to XML.
For new products, XML must be used as the method for data transfer and/or exchange and is
strongly recommended for reporting.
In addition to the above responsibilities, the product task force and contractor also have the
following responsibilities regarding the project/product work plan.

2.1 For New Development Projects


■ Document the project needs for new internal and/or external data transfer/exchange
functionality as user requirements in the project work plan. Also, document that this
functionality will be implemented using XML.
■ Using the user requirements, develop system requirements that expand and detail
specifically what the system must do and how it is to be accomplished with regard to
data transfer/exchange and the use of XML. Document these requirements in the
System Requirements Specification (SRS).
■ As discussed above, it is strongly recommended that XML be used for reporting on new
products. When the task force and/or contractor determine that XML based reporting is
beneficial, the same work plan and SRS activities listed above should be followed.
■ Implement and test the XML requirements in the SRS.

2.2 For Major Enhancement Projects


■ If the major enhancement involves new data transfer/exchange needs, document these
as user requirements in the product work plan.
■ If XML is to be used for implementing the data transfer/exchange requirements, note this
in the work plan.
■ Using the user requirements, develop system requirements that expand and detail
specifically what the system must do and how it is to be accomplished with regard to
data transfer/exchange and the use of XML. Document these requirements in the
System Requirements Specification (SRS).
■ When a major enhancement involves reporting, it is recommended that the use of XML
for reporting be considered. When the task force and/or contractor determine that XML
based reporting is beneficial, the same work plan and SRS activities listed above should
be followed.


In addition to the above responsibilities, the task force has the responsibility of ensuring that the
required submissions, approvals, communications, documentation, and technical requirements
defined in this standard are complied with. In the event that a requirement of the standard
cannot be complied with, the task force chair should advise the SCOJD or T&AA liaison early in
the project/product life cycle. A request for an exception to the standard must be submitted to
the SCOJD with any necessary documentation for their consideration. Approval of exceptions
to the standards is under the purview of the SCOJD.

3. Required Deliverables and Work Products


The following summarizes the required deliverables and work products that must be created
and/or delivered in order to comply with the XML standard. Refer to the “Deliverable and Work
Product Definitions” section below for additional information.

3.1 For New Development Projects


■ Include data transfer/exchange and XML user requirements in the project work plan.
■ Include detailed requirements for implementing data transfer/exchange in the SRS.
■ Include XML reporting strategies and requirements in the above deliverables when
applicable.

3.2 For Major Enhancement Projects


■ Include XML strategies in the product strategic plan.
■ Include XML items in the same deliverables listed above, when major enhancements
involve data transfer/exchange.
■ Include XML reporting strategies and requirements in the above deliverables when
applicable.

4. Procedures
Not Applicable

5. Technical Requirements and Recommendations


Technical descriptions and requirements for the use of XML are available on the web. This
section does not attempt to reproduce the web data. Brief descriptions and/or requirements are
provided along with minimal links to associated information.

5.1 XML
XML is a general purpose specification for creating custom markup languages. It is flexible,
or extensible, because it allows users to define their own elements if needed rather than
follow a strict, limited format. The specification is recommended and maintained by the
World Wide Web Consortium (W3C). For a full definition of XML, refer to
http://en.wikipedia.org/wiki/XML.
AASHTOWare recognizes the benefit of XML as a method for data exchange and
recommends that all AASHTOWare products consider how the specification might be
utilized, either internally or externally.
W3C XML web site and link to specifications: http://www.w3.org/XML/
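As a minimal sketch of the kind of XML handling the standard encourages, Python's standard library can parse and walk an exchange document. The element names here (BridgeInventory, Bridge, SpanCount) are hypothetical and are not drawn from any AASHTOWare or TransXML schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical exchange document; element names are illustrative only.
doc = """<BridgeInventory>
  <Bridge id="B-100"><SpanCount>3</SpanCount></Bridge>
</BridgeInventory>"""

# Parse the document and read attribute and element values.
root = ET.fromstring(doc)
for bridge in root.findall("Bridge"):
    print(bridge.get("id"), bridge.findtext("SpanCount"))
```

Any W3C-conformant parser in any language would behave equivalently; the sketch simply shows that a self-describing markup document carries both structure and data.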

5.2 TRANSXML
In March of 2004, the National Cooperative Highway Research Program (NCHRP) began
Project 20-64, XML Schemas for Exchange of Transportation Data. The objectives of the
project were to develop broadly accepted public domain XML schemas for exchange of
transportation data and to develop a framework for development, validation, dissemination,
and extension of current and future schemas. The framework developed was called
TransXML. The project was completed in October of 2006.
Four business area schemas (Bridge, Transportation Safety, Survey/Roadway Design, and
Transportation Construction/Materials) were developed during the project. The final report
from Project 20-64, NCHRP Report 576, is available at various sites on the web.
Abstract and access to contents of the CD-ROM included with the report:
http://www.trb.org/news/blurb_detail.asp?ID=7338
NCHRP Report 576: http://onlinepubs.trb.org/onlinepubs/nchrp/nchrp_rpt_576.pdf
TransXML web site: http://www.transxml.org/
AASHTOWare supports the results of the TransXML project and recommends that all
AASHTOWare products consider the use of the schemas developed and/or modification
thereof when implementing XML functionality.

5.3 Schemas
Schema definitions for AASHTOWare products should be compatible with the W3C
specification and should follow the schemas developed under the TransXML project to the
extent possible. Maximum use should be made of existing schemas; developing a
completely new schema is unacceptable where an existing schema already meets the
needs or can be modified to do so.

5.4 Names
XML names shall be W3C compliant, self-explanatory and meaningful to the business area.
When the possibility of data sharing between products exists, all of the involved product task
forces should review the proposed naming conventions to prevent ambiguous names.
These activities should be coordinated through the AASHTO staff liaisons assigned to each
project or product.
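As an informal aid for screening proposed names, the sketch below approximates the W3C XML Name production. It is a simplified ASCII-only check and an assumption of this notebook, not a substitute for the full specification, which permits many additional Unicode ranges:

```python
import re

# Simplified ASCII approximation of the W3C XML Name production:
# first character is a letter, underscore, or colon; the rest may also
# include digits, hyphens, and periods. The real production allows more.
NAME_RE = re.compile(r"^[A-Za-z_:][A-Za-z0-9_:.\-]*$")

def is_probably_valid_xml_name(name: str) -> bool:
    """Return True if the name passes the simplified Name check."""
    return bool(NAME_RE.match(name))

print(is_probably_valid_xml_name("SpanCount"))  # a well-formed, meaningful name
print(is_probably_valid_xml_name("3rdSpan"))    # invalid: cannot start with a digit
```

A check like this catches only syntactic problems; whether a name is self-explanatory and meaningful to the business area remains a task force review item.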

5.5 Namespaces
Where namespaces are used, they shall be W3C compliant.
Namespaces in XML 1.1 (2nd Edition): http://www.w3.org/TR/2006/REC-xml-names11-20060816/
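The sketch below shows one way to produce a W3C-compliant namespace declaration with Python's standard library. The namespace URI and element names are hypothetical, chosen only to illustrate the mechanics:

```python
import xml.etree.ElementTree as ET

# Hypothetical namespace URI (illustrative only). Registering a prefix
# makes the serialized output carry a readable xmlns declaration.
NS = "http://example.org/aashtoware/bridge"
ET.register_namespace("br", NS)

# ElementTree uses {uri}localname ("Clark notation") for qualified names.
root = ET.Element(f"{{{NS}}}BridgeInventory")
ET.SubElement(root, f"{{{NS}}}Bridge", id="B-100")

xml_bytes = ET.tostring(root)
print(xml_bytes.decode())
```

The serialized document declares `xmlns:br` once on the root element, so every qualified element is unambiguous even when documents from different products are combined.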

5.6 Data Dictionaries


Data dictionaries shall be produced which contain information for each element in the
schema. A brief description of the element should be included in the data dictionary and as
a comment in the schema. When the schema is maintained by a third party, the task force
and/or contractor should only maintain documentation associated with the additions or
modifications unique to the AASHTOWare product.

5.7 XML Tools


There are a variety of tools that can be used for XML development. They range from
extensive suites of tools to shareware and freeware editors. Many of the suites of tools
include all of the products necessary for formatting, generating stylesheets and schemas,
etc. This document does not provide a list of recommended tools. Lists can easily be
obtained using Google Search or other web search engines.


6. Deliverable and Work Product Definitions


All deliverables and work products listed below must be versioned, stored, and controlled using
configuration management procedures. This does not apply to the strategic plan; however,
similar practices are recommended for storing versions of the strategic plan.

6.1 XML Strategy (included in product Strategic Plan)


6.1.1 Description
If XML has not been fully implemented in an existing product, the product Strategic Plan
should include the long term strategy to convert existing data transfer/exchange
functionality to XML and to implement new functionality with XML.
6.1.2 Content
There is no specific format or content required for strategic plan XML strategies.

6.2 Data Transfer/Exchange User Requirements


6.2.1 Description
User requirements that describe the basic functionality needed for data transfer and
exchange should be included in the project/product work plan. In the case of new
development projects, the requirement to implement this functionality with XML should
also be included.
6.2.2 Content
These user requirements should include the same content as all other user requirements
described in the URS, including a requirement ID and description. Refer to the
AASHTOWare Requirements Standard for additional information regarding user
requirements.

6.3 Data Transfer/Exchange System Requirements


6.3.1 Description
For all new development projects, Data Transfer/Exchange requirements must be included
in the SRS or in a separate document referenced by the SRS. All major internal and
external data transfer/exchange instances associated with the new product or a proposed
major enhancement must be defined.
6.3.2 Content
As with other components of the SRS, there is no rigid format for SRS items. For each
major internal and external data transfer/exchange instance, the following content should
be included. Data items and other items needed to define the requirements should be
added or referenced as required. A tabular format is recommended for documenting
Data Transfer/Exchange system requirements.
Refer to the AASHTOWare Requirements Standard for additional information regarding
the system requirements and the SRS.
6.3.2.1 Requirement ID
Each requirement included in the SRS must be identified by a unique number or tag.
6.3.2.2 Short Description
Each requirement included in the SRS should optionally include a short description
which describes the content of the requirement but is short enough to use in tables
and listings.
6.3.2.3 Description
This item provides a detailed description of the data transfer/exchange instance.


6.3.2.4 Existing Process


This item only applies to existing products. The current method used for each
existing data transfer/exchange instance should be identified. Examples include, but
are not limited to, binary, XML, and delimited file.
6.3.2.5 Proposed Process
This identifies the proposed process to be used to implement the data
transfer/exchange instances. Examples include, but are not limited to, binary, XML,
and delimited file. XML should be used for new development unless there is a valid
reason for not using XML.
6.3.2.6 Reason For Proposed Process
If XML was not identified as the proposed process, the reason for not using XML
must be provided. The reason for selecting XML may be optionally provided.
6.3.2.7 XML Implementation Description
This item provides a brief description of how XML will be implemented for the
specified instance.
6.3.3 XML Reporting Requirements
For new development, it is also recommended that XML be strongly considered for
reporting. In this case, each major report instance for the proposed product should also
be included in the data transfer/exchange requirements, or documented in a similar
reporting section, with the reason for choosing or not choosing XML as the method of
implementation.
If a major enhancement involves reporting, it is also recommended that the report
instances associated with the enhancement be included with the data transfer/exchange
requirements or similar reporting section.

This page is intentionally blank.
SECURITY
STANDARD
S&G Number: 3.020.01S
Effective Date: July 1, 2009

Document History
Version No.: 01
Revision Date: 02/02/2009
Approval Date: 03/04/2009 (Approved by SCOJD)
Revision Description: Initial Draft. Reviewed and modified after T&AA and
AASHTOWare stakeholder reviews. Additional minor changes and format modifications
for publishing were approved by T&AA on 06/16/2009.

06/16/2009
Security Standard 3.020.01S

Table of Contents
1. Purpose ............................................................................................................... 1
2. Task Force/Contractor Responsibilities........................................................... 1
3. Required Deliverables and Work Products ...................................................... 1
4. Procedures.......................................................................................................... 1
4.1 Establish Security Requirements......................................................................1
4.2 Include AASHTOWare Security Technical Requirements ...............................2
4.3 Review Impact to Existing Security ..................................................................2
4.4 Test and Implement the Security Requirements ..............................................2
5. Technical Requirements .................................................................................... 2
5.1 Lightweight Directory Access Protocol (LDAP) ...............................................2
5.2 Encryption of Sensitive Data.............................................................................2
5.3 Role Based Security...........................................................................................3
5.4 Industry Standard Passwords ...........................................................................3
5.5 Appropriate Levels of Hardening ......................................................................3
5.6 Security Patches ................................................................................................3
6. Deliverable and Work Product Definitions ....................................................... 4
6.1 Security Requirements ......................................................................................4
6.1.1 Description ........................................................................................................... 4
6.1.2 Content................................................................................................................. 4
6.2 System Roles......................................................................................................4
6.2.1 Description ........................................................................................................... 4
6.2.2 Content................................................................................................................. 4


1. Purpose
AASHTOWare recognizes its responsibility for providing secure applications. Further,
AASHTOWare requires that delivered applications meet user needs and maintain the
highest practical level of application, data, and infrastructure security. This standard defines
the security requirements and responsibilities that must be met when developing
AASHTOWare products.
This standard applies to new development and major security-related enhancement projects.
The standard does not normally apply to minor maintenance and software maintenance efforts;
however, it should be reviewed when these efforts involve security.
Refer to the Glossary in the Standards and Guidelines Notebook for definitions of the types of
projects and efforts. In addition, the standard primarily addresses multi-user applications except
where noted otherwise.
The Security Standard includes certain activities that must be followed and work products that
must be produced in order to comply with the standard. These requirements are shown in red
italicized text.

2. Task Force/Contractor Responsibilities


The product task force and contractor responsibilities for the Security Standard are summarized
below:
● Ensure that business specific security requirements are defined and implemented.
● Ensure that the security technical requirements defined in this standard are implemented in
the product when applicable.
● Ensure that industry best security practices and emerging security trends are considered
and implemented appropriately.
In addition, the task force has the responsibility of ensuring that the required submissions,
approvals, communications, documentation, and technical requirements defined in this standard
are complied with. In the event that a requirement of the standard cannot be complied with, the
task force chair should advise the SCOJD or T&AA liaison early in the project/product life cycle.
A request for an exception to the standard must be submitted to the SCOJD with any necessary
documentation for their consideration. Approval of exceptions to the standards is under the
purview of the SCOJD.

3. Required Deliverables and Work Products


The following summarizes the required deliverables and work products that must be created
and/or delivered in order to comply with the Security Standard. Definitions and content
requirements are provided in the “Deliverable and Work Product Definitions” section of this
document.
● Security Requirements – must be included in the System Requirements Specification (SRS).
● System Roles – must be included in the SRS.

4. Procedures
4.1 Establish Security Requirements
For each new development or major enhancement effort, the task force and/or contractor
should:
■ Analyze the business needs, expectations, and constraints that impact the data,
application, and system security,

Page 1 06/16/2009
Security Standard 3.020.01S

■ Define the applicable security requirements and system roles for the effort and include in
the System Requirements Specification (SRS).

4.2 Include AASHTOWare Security Technical Requirements


Where applicable, the task force and/or contractor must ensure that the technical
requirements listed below are included in the SRS.

4.3 Review Impact to Existing Security


For each enhancement or modification to an existing application, the task force and/or
contractor should ensure that there is no impact to the existing security introduced by the
implementation of the enhancement or modification.

4.4 Test and Implement the Security Requirements


The task force and contractor should ensure that all security requirements in the approved
System Requirements Specification are tested and implemented.

5. Technical Requirements
Research performed by T&AA reveals a wide variety of tools, products, and computer
environments in use at member agencies. So much variety exists that identifying detailed
security requirements is not practical. Therefore, the following high-level security requirements
are identified.
In addition to the standards listed below, product contractors and task forces are responsible for
ensuring that industry best security practices and emerging security trends are considered and
implemented appropriately.

5.1 Lightweight Directory Access Protocol (LDAP)


User authentication routines must support the use of Lightweight Directory Access Protocol
(LDAP) for user authentication and when feasible should support the use of LDAP for
access permissions. The use of internal processes to support user authentication and
access permissions is not prohibited, but AASHTOWare products must also support
authenticating users via LDAP queries.
References:
LDAP http://en.wikipedia.org/wiki/Lightweight_Directory_Access_Protocol
List of LDAP software http://en.wikipedia.org/wiki/List_of_LDAP_software
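The authentication flow described above can be sketched in a few lines of Python. This is a minimal illustration, not part of the standard: it assumes the third-party ldap3 package, and the directory layout (ou=people,dc=example,dc=org) and the uid naming attribute are hypothetical placeholders.

```python
def escape_dn_value(value):
    """Escape characters that are special in an LDAP distinguished name.

    Minimal escaping in the spirit of RFC 4514; the ldap3 package also
    ships its own escaping utilities.
    """
    escaped = "".join("\\" + c if c in ',+"\\<>;=' else c for c in value)
    if escaped.startswith(("#", " ")):
        escaped = "\\" + escaped
    if escaped.endswith(" "):
        escaped = escaped[:-1] + "\\ "
    return escaped


def user_dn(username, base_dn="ou=people,dc=example,dc=org", attribute="uid"):
    """Build the distinguished name used for the simple-bind attempt."""
    return "%s=%s,%s" % (attribute, escape_dn_value(username), base_dn)


def authenticate(server_uri, username, password):
    """Return True when the directory accepts a simple bind as this user.

    ldap3 is a third-party package (pip install ldap3); it is imported
    here so the DN helpers above stay usable without it.
    """
    from ldap3 import Server, Connection

    conn = Connection(Server(server_uri), user=user_dn(username),
                      password=password)
    ok = conn.bind()  # a successful simple bind means the credentials are valid
    conn.unbind()
    return ok
```

In practice the server URI, base DN, and naming attribute would come from each agency's configuration, since member departments run different directory products.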

5.2 Encryption of Sensitive Data


User accounts, passwords, and any other data identified as being sensitive must be
encrypted while in transit or at rest using methods and techniques accepted by the industry
as being reliable and secure. This includes, but is not limited to, data transmitted on
internal, external, public, or private networks and data stored in a database management
system such as Oracle, Microsoft SQL Server, etc.
References:
Data encryption standards: http://en.wikipedia.org/wiki/Data_Encryption_Standard
Microsoft SQL encryption: http://channel9.msdn.com/Showpost.aspx?postid=139794
    http://www.databasejournal.com/features/mssql/article.php/3483931
Encryption and SQL injection: http://articles.techrepublic.com.com/5100-22-5083541.html
Oracle Transparent Data Encryption: http://www.oracle.com/technology/deploy/security/database-security/transparent-data-encryption/index.html
Configuring data encryption and integrity: http://download-west.oracle.com/docs/cd/A97630_01/network.920/a96573/asoconfg.htm
Payment Card Industry standards: https://www.pcisecuritystandards.org/index.shtml
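For stored passwords specifically, the methods accepted by industry are salted one-way hashes rather than reversible encryption, so credentials can be verified but never recovered in plain text. The sketch below, using only the Python standard library, illustrates that practice; the record format and iteration count are assumptions for the example, not values mandated by this standard.

```python
import base64
import hashlib
import hmac
import os


def hash_password(password, iterations=200_000):
    """Store a password as a salted, one-way PBKDF2 hash record."""
    salt = os.urandom(16)  # a fresh random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"),
                                 salt, iterations)
    return "pbkdf2_sha256$%d$%s$%s" % (
        iterations,
        base64.b64encode(salt).decode("ascii"),
        base64.b64encode(digest).decode("ascii"),
    )


def verify_password(password, stored):
    """Check a login attempt against a stored record in constant time."""
    _algo, iterations, salt_b64, digest_b64 = stored.split("$")
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"),
                                    base64.b64decode(salt_b64),
                                    int(iterations))
    # compare_digest avoids leaking information through timing differences
    return hmac.compare_digest(candidate, base64.b64decode(digest_b64))
```

Data that must remain readable (as opposed to verifiable) would instead use the platform facilities referenced above, such as Oracle Transparent Data Encryption or SQL Server encryption.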

5.3 Role Based Security


Applications must use role based security. Roles should be controlled within the application.
This will eliminate the need for users to have accounts that access databases directly, which
improves overall security.
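A role-based check of this kind reduces to a small sketch. The role and permission names below are invented for illustration; an actual product would define its own roles in the SRS.

```python
# Illustrative application-controlled roles; names are not mandated.
ROLE_PERMISSIONS = {
    "user": {"read"},
    "manager": {"read", "update"},
    "security_admin": {"read", "update", "grant_access"},
}


def has_permission(user_roles, permission):
    """Return True when any of the user's roles grants the permission.

    Because every check goes through the application's own roles, end
    users never need database accounts of their own.
    """
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in user_roles)
```

The application connects to the database under a single service account, and all user-level authorization decisions are made by checks like the one above.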

5.4 Industry Standard Passwords


Passwords must follow industry recognized standards for minimum length, makeup (i.e.,
characters, numbers, or symbols), and change frequency.
References:
Federal Information Processing Standards (FIPS 112): http://www.itl.nist.gov/fipspubs/fip112.htm
US Agency for International Development password creation standards: http://www.usaid.gov/policy/ads/500/545mau.pdf
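As an illustration of enforcing such a policy in code, the sketch below checks minimum length and makeup. The thresholds are assumptions for the example; a product should take its actual values from the referenced standards or from agency policy.

```python
import re


def check_password(password, min_length=8):
    """Return a list of policy violations (an empty list means acceptable).

    The rules here are illustrative, not mandated: minimum length plus
    at least one letter, one digit, and one symbol.
    """
    problems = []
    if len(password) < min_length:
        problems.append("shorter than %d characters" % min_length)
    if not re.search(r"[A-Za-z]", password):
        problems.append("contains no letters")
    if not re.search(r"[0-9]", password):
        problems.append("contains no digits")
    if not re.search(r"[^A-Za-z0-9]", password):
        problems.append("contains no symbols")
    return problems
```

Returning the full list of violations, rather than a single pass/fail flag, lets the application tell the user exactly which rule was not met.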

5.5 Appropriate Levels of Hardening


Hardware and software provided to AASHTOWare customers that is exposed to external
network users, including Internet users, must be hardened to levels accepted by the industry
as appropriate and effective for the hardware and software being used.
References:
World Wide Web Consortium FAQ: http://www.w3.org/Security/Faq/www-security-faq.html
CERT: http://www.cert.org/cert/information/sysadmin.html
SANS Institute: http://www.sans.org/
    http://isc.sans.org/diary.html?storyid=1615&rss
Windows hardening guidelines: http://www.first.org/resources/guides/#bp11
    http://www.whitehatinc.com/services/security/hardening
US Security Awareness: http://www.ussecurityawareness.org/highres/infosec-program.html
    http://www.usccu.us/documents/US-CCU%20Cyber-Security%20Check%20List%202007.pdf
Open Web Application Security Project (OWASP): http://www.owasp.org/index.php/Main_Page

5.6 Security Patches


AASHTOWare contractors should assist in identifying and monitoring security patches for
third-party components used in AASHTOWare products. In addition, contractors should
notify the licensees of the location where the patches may be obtained and provide any
specific instructions needed to incorporate the patches into AASHTOWare products within a
reasonable timeframe from when the manufacturer of the third-party component makes
patches available.

6. Deliverable and Work Product Definitions


6.1 Security Requirements
6.1.1 Description
The security requirements of the proposed application, system, database, or
enhancement must be included in the System Requirements Specification (SRS). In
addition, the security requirements must be included in the appropriate test procedures
for alpha and beta testing.
6.1.2 Content
The SRS must include a section where all security requirements are documented. Other
methods that allow all security requirements to be easily identified may be used in lieu of
this method.
The security requirements should define:
○ Privacy concerns associated with the application or data;
○ The types of users that have access to the applications, systems, databases, and
data (see system roles below);
○ What each type of user has access to and the type of access allowed;
○ AASHTOWare and member organization technical and organizational security
requirements and constraints; and
○ Security Requirements

6.2 System Roles


6.2.1 Description
The SRS must define the roles of the various stakeholders that use and support the
system.
6.2.2 Content
The roles may be provided in any format that identifies the groups of users and
stakeholders along with their roles and responsibilities regarding the proposed system.
Example roles include users, managers, executives, system administrators, security
administrators, database administrators, and application support personnel.

Page 4 06/16/2009
PRODUCT GRAPHICAL
INTERFACE STANDARD
S&G Number: 3.030.03S
Effective Date: July 1, 2009

Document History
Version 01 (Oct. 2000): Initial Version. Approved Oct. 2000.
Version 02 (June 2001): This specification is a revision of the “Graphical Interface Guideline”, No: 3.10.G10.01, and replaces “User Interface Standards”, No: 3.10.100.02. Approved April 2002.
Version 03 (06/15/2009): Applied standard template and changed standard number from 3.03.010.02 to 3.030.03S. Made minor changes and format modifications. Approved 06/16/2009 by T&AA.

AASHTOWare Product Graphical Interface 3.030.03S

1. Scope or Area of Application


This specification applies to all AASHTOWare software.

2. Description
This specification describes requirements and recommendations for the graphical user
interfaces of AASHTOWare products. Emphasis will be on MS Windows operating
environments since most AASHTOWare development is for these platforms.

3. Compliance (Requirements & Recommendations Summary)
● All AASHTOWare products shall provide a user interface which is consistent with the best
practices of the Version/Release of the Operating/Windowing Environment for which the
product is designed.
● If the manufacturer of the Operating/Windowing Environment provides a style guide, it shall
be used as a guideline for user interface design.
● Development tools which limit useful functionality of the Operating/Windowing Environment
shall be avoided.
● Products with browser interfaces should support Microsoft Internet Explorer and Netscape
Navigator. The user interface shall conform to the capabilities of the Version/Release
required by the product user community.
● The user interface, where possible, should operate at the application specific minimum
screen resolution without resorting to window or panel scrolling. For example, if the
application design specifies 800x600 pixels as the minimum screen resolution, then all
window/panel objects (i.e. buttons, tabs, menus) should be visible without scrolling when the
window/panel is maximized on the screen.
● Products shall conform to the initiation, termination, installation, and removal conventions of
the Operating/Windowing Environment.
● Products shall be designed to coexist with other concurrently running applications without
modifying or interfering with their user interfaces. The minimizing, maximizing, window
positioning, window sizing, and window activation functions appropriate to the
Operating/Windowing Environment shall be enabled.
● Products shall provide application help facilities which are consistent with those provided by
the environment. The granularity of context sensitive help shall be at least to the
window/frame level. Bubble help should also be employed where useful.
● All AASHTOWare products shall display their registered name, AASHTO logo,
AASHTOWare configuration name, and AASHTOWare copyright notice in a manner
consistent with the environment.

4. Benefits or Advantages
Observance of this specification will reduce the time required to learn how to use the product. It
will also improve the product's marketability.

5. Costs or Disadvantages
Additional development costs may be incurred. These should be offset by training and usability
improvements.
Page 1 06/15/2009
6. Implementation Method
This specification should be observed for all AASHTOWare product releases that occur after its
effective date.

This page is intentionally blank.
DATABASE SELECTION
AND USE GUIDELINE
S&G Number: 3.040.02G
Effective Date: July 1, 2009

Document History
Version 02 (06/09/2009): Replaces existing Database Selection Guideline (3.03.G50.01). Reviewed by T&AA. Reviewed by stakeholders and changes were made. Approved 06/16/2009 by T&AA.

Database Selection and Use Guideline 3.040.02G

Table of Contents

1. Purpose ............................................................................................................... 1
2. Task Force/Contractor Responsibilities........................................................... 1
3. Recommended Deliverables and Work Products ............................................ 1
4. Procedures.......................................................................................................... 1
4.1 New Database Notification ................................................................................1
4.2 Discontinued Database Planning......................................................................2
4.3 Database Selection ............................................................................................2
4.3.1 Database Selection Criteria ................................................................................. 2
4.3.2 Additional Considerations .................................................................................... 3
4.4 Update Public Web Site and/or Product Catalog .............................................3
5. Technical Recommendations ............................................................................ 3
5.1 Enterprise (Multi-User) User Databases ...........................................................3
5.2 Standalone (Single User) Databases ................................................................3
6. Deliverable and Work Product Definitions ....................................................... 4
6.1 Published List of Database Platforms and Versions .......................................4
6.1.1 Description ........................................................................................................... 4
6.1.2 Content................................................................................................................. 4

Page i 06/09/2009

1. Purpose
Relational databases are the preferred method of data storage for application programs. This is
especially true for multi-user applications, where data update coordination between many users
is essential. Databases provide built-in functions that lend themselves to performance, security,
and multi-user access.
It is the intent of this guideline to apply industry standards in the use of databases in
AASHTOWare product development. In addition, the guideline provides information and
recommendations which promote the preservation, sharing, and exchange of data supported by
AASHTOWare products. This guideline is applicable and should be considered for new product
databases; database support of existing products; development efforts that include the
establishment/replacement of an application data storage repository; and efforts that include
major enhancements to the data storage repository.

2. Task Force/Contractor Responsibilities


The project/product task force and contractor responsibilities regarding this guideline are
summarized below:
● Routinely survey the current and potential user base to determine what databases are
supported, planned, being eliminated, and regarded as the preferred databases.
● Recommend new database platforms to be supported in specific products.
● Notify the T&AA Task Force when new database platforms are planned.
● Participate in research and testing associated with evaluating and accepting new database
platforms.
● Maintain a list of supported database platforms and versions on a public web site and/or in
the product catalog.
● Develop a product migration plan before or shortly after the date that the database
version will no longer be supported.
● Ensure compliance with all license requirements and report potential issues to AASHTO.

3. Recommended Deliverables and Work Products


The following summarizes the recommended deliverables and work products for this guideline.
Refer to the “Deliverable and Work Product Definitions” section below for additional information.
● Published list of supported database platforms and versions.

4. Procedures
4.1 New Database Notification
When a project/product task force is making plans to add support for a new database
platform, the task force chairperson should advise the T&AA liaison or the T&AA task force
chairperson. This is strictly a courtesy notification and may be communicated in person, by
phone, or by email. This will allow T&AA to communicate any concerns to the project/product
task force and contractor early in the product development life cycle.

Created on 06/09/2009 13:35 PM Page 1 Modified on 06/24/2009 23:21 PM



4.2 Discontinued Database Planning


When a vendor announces the discontinuation of support for a specific version of a
database product, a plan should be developed to migrate the product away from that
version. An action should be included in the next product tactical work plan to address the
discontinuation of AASHTOWare support for that database version.

4.3 Database Selection


Database software is selected for use in AASHTOWare products by using the selection
criteria and additional consideration described below.

4.3.1 Database Selection Criteria


The following selection criteria are used as a basis for evaluation of database products
and for their recommended use in the development of AASHTOWare products.

4.3.1.1 Standards Conformance


The products recommended are chosen on the basis of their conformance with
industry standards such as SQL and connectivity.

4.3.1.2 Platform Support


The products recommended are chosen because of their support of a broad range of
development and operational platforms (operating software/hardware). Special
attention is given to those platforms which are currently employed by AASHTOWare
products. Consideration is also given to those products which are current industry
leaders.

4.3.1.3 Scalability
The products recommended are highly scalable within their product family.

4.3.1.4 Security
The product recommended should have adequate security features for database
administration.

4.3.1.5 Development Tools


The products recommended are accessible and usable by a broad range of
development tools which are suitable for the development of AASHTOWare
products.

4.3.1.6 Middleware and Gateway


The recommended database product families provide middleware and gateways
which permit access to and from other manufacturers’ database products over a
variety of networking types (differing network protocols).

4.3.1.7 Replication
The products chosen support replicating data across a network and to different
server environments.

4.3.1.8 Product Viability


All products recommended are well established in the market place and/or the user
community.




4.3.2 Additional Considerations


New AASHTOWare product development should also consider the items listed below
when determining which database(s) to support. It is also suggested that existing
products utilize the items to determine if the list of currently supported databases can be
reduced.

4.3.2.1 Use of the Latest ODBC and JDBC Client Drivers


Software database drivers are available for most database platforms so that
application software can use a common Application Programming Interface (API) to
retrieve the information stored in a database. AASHTOWare product development
should ensure that the latest stable ODBC and JDBC client drivers are used when
developing and maintaining AASHTOWare products.

4.3.2.2 Surveying the User Base


In order to stay abreast of database platforms being used in the current and potential
user base, AASHTOWare management should routinely survey the member
departments to determine what databases are: preferred, currently supported, not
used, planned for future use, and planned for retirement.
□ The project/product task force should routinely solicit this information when
surveying the current organizations licensing their products, as well as potential
new users.
□ The SCOJD and the T&AA Task Force should routinely include questions
regarding database platforms in the AASHTO Annual IT survey, which is sent to
the Chief Information Officer in each of the AASHTO member departments.

4.3.2.3 Maintain the Minimum Number of Databases


AASHTOWare should select and maintain support for the minimum number of
database platforms required to meet the user and organizational requirements for
new and existing product development.

4.4 Update Public Web Site and/or Product Catalog


When support for a new database platform or new version of an existing platform has been
added to an AASHTOWare product, the web site that is used to provide information to the
public should be updated to show the new platform and/or version number. This information
should also be updated when support for a platform or version is eliminated. If this
information is not maintained on a web site, this information should be updated in the next
release of the product catalog.

5. Technical Recommendations
5.1 Enterprise (Multi-User) User Databases
The following enterprise databases are recommended for new and existing AASHTOWare
product development.
■ Oracle
■ Microsoft SQL Server

5.2 Standalone (Single User) Databases


When using standalone databases the following recommendations should be considered:




■ Use a single standalone database engine within the application.


■ Licenses should be included and distributed with the AASHTOWare product.
■ Functionality to transfer data to and from the enterprise database should be included
in the application.
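The recommendations above can be illustrated with the Python standard library's embedded SQLite engine, one common standalone database. The project table, its columns, and the CSV export format are hypothetical; the point is a single standalone engine in the application plus an export path for moving data to the enterprise database.

```python
import csv
import sqlite3


def create_standalone_db(path):
    """Open (or create) a single-user database; the table is illustrative."""
    conn = sqlite3.connect(path)
    conn.execute("CREATE TABLE IF NOT EXISTS project "
                 "(project_id TEXT PRIMARY KEY, description TEXT)")
    conn.commit()
    return conn


def export_projects(conn, csv_path):
    """Dump the standalone data to CSV so it can be loaded into the
    enterprise database (Oracle, SQL Server, etc.) with that platform's
    bulk-load tooling."""
    rows = conn.execute(
        "SELECT project_id, description FROM project ORDER BY project_id")
    with open(csv_path, "w", newline="") as handle:
        writer = csv.writer(handle)
        writer.writerow(["project_id", "description"])
        writer.writerows(rows)
```

A matching import routine would read the same CSV back into the standalone engine, completing the round trip between the single-user and enterprise environments.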

6. Deliverable and Work Product Definitions


6.1 Published List of Database Platforms and Versions
6.1.1 Description
A list of supported database platforms and versions for each AASHTOWare product
should be published for public access on a web site and/or in the product catalog.

6.1.2 Content
The content should include the product, product version, database platforms, and
database versions.



PRODUCT
DOCUMENTATION
STANDARD
S&G Number: 3.050.04S
Effective Date: July 1, 2009

Document History
Version 01 (Nov. 1993): Initial version of the specification. Approved Jan. 1994.
Version 02 (April 1997): Update to permit distribution of machine readable documentation. Approved June 1997.
Version 03 (Dec. 2005): Remove all definitions relating to requirements to prepare for Requirements Management implementation. Remove Appendix B which duplicates information in the AASHTOWare Lifecycle Framework (ALF). Simplify information defining page numbering. Approved Jun 2006.
Version 04 (06/10/2009): Changed standard number from 3.04.020.03 to 3.050.04S; and applied standard template. Made minor changes and format modifications. Approved 06/16/2009 by T&AA.

Product Documentation Standard 3.050.04S

Table of Contents
1. Documentation Types ........................................................................................ 1
1.1 Requirements Documentation - User and System Requirements .................1
1.1.1 Purpose:............................................................................................................... 1
1.1.2 Audience: ............................................................................................................. 2
1.1.3 Distribution: .......................................................................................................... 2
1.2 Internal Documentation - Product Development / Modification, and
Maintenance ..................................................................................................................2
1.2.1 Purpose:............................................................................................................... 2
1.2.2 Audience: ............................................................................................................. 2
1.2.3 Distribution: .......................................................................................................... 2
1.3 User Documentation - Product Use ..................................................................3
1.3.1 Purpose:............................................................................................................... 3
1.3.2 Audience: ............................................................................................................. 3
1.3.3 Distribution: .......................................................................................................... 3
1.4 System Documentation - Product Implementation and Administration.........3
1.4.1 Purpose:............................................................................................................... 3
1.4.2 Audience: ............................................................................................................. 3
1.4.3 Distribution: .......................................................................................................... 3
1.5 Promotional Documentation..............................................................................3
1.5.1 Purpose:............................................................................................................... 3
1.5.2 Audience: ............................................................................................................. 3
1.5.3 Distribution: .......................................................................................................... 4
1.6 Requirements: ....................................................................................................4
2. Document Format and Organization................................................................. 4
2.1 Medium ...............................................................................................................4
2.2 Organization Components.................................................................................4
2.3 Numbering Requirements..................................................................................6
2.4 Revision Recommendations .............................................................................6
3. Requirements Documentation........................................................................... 6
3.1 User Requirements Specification .....................................................................6
3.2 System Requirement Specification...................................................................6
4. Internal Documentation ..................................................................................... 6
4.1 Physical Data Design or Model .........................................................................6
4.1.1 Physical Database Design ................................................................................... 6
4.1.2 Data Model/Physical Data Mapping..................................................................... 6
4.2 Physical Process Design or Model ...................................................................7
4.2.1 Physical Process Design ..................................................................................... 7
4.2.2 Processing Model/Module Mapping..................................................................... 7
4.2.3 Module Relationship Mapping.............................................................................. 7
4.2.4 Environment Specifications.................................................................................. 7
5. User Documentation .......................................................................................... 7
5.1 Reference............................................................................................................7
5.1.1 Quick Reference .................................................................................................. 7
5.1.2 Expanded Reference ........................................................................................... 7
5.2 Learning Guide...................................................................................................8
6. Systems Documentation.................................................................................... 8

Page i 06/10/2009

6.1 Implementation Documentation ........................................................................8


6.1.1 Differences ........................................................................................................... 8
6.1.2 Environment ......................................................................................................... 8
6.1.3 Warnings .............................................................................................................. 8
6.1.4 Recovery and De-installation ............................................................................... 9
6.1.5 Installation ............................................................................................................ 9
6.1.6 Problem Resolution.............................................................................................. 9
6.1.7 Interfaces to Systems and Other Applications Software...................................... 9
6.1.8 Required Maintenance (system care and feeding, not changes) ........................ 9
6.1.9 Customization Features ....................................................................................... 9
6.2 Security Management ........................................................................................9
6.3 Administration Documentation .........................................................................9
6.3.1 Operator Documentation...................................................................................... 9
7. Appendices ....................................................................................................... 10
7.1 Appendix A: Document Requirements Table .................................................10
7.1.1 Keys: .................................................................................................................. 10
7.1.2 Notes: ................................................................................................................. 10


1. Documentation Types
Development of effective documentation depends upon an understanding of the document's
purpose and an appreciation of the intended audience. The following sections define and
distinguish, on the basis of purpose and audience, the five types of documentation which apply
to AASHTOWare applications.

1.1 Requirements Documentation - User and System Requirements


Refer to the “30100101 AASHTOWare Requirements Management Standard” to find
definitions of user and system requirements. The following definitions are provided merely to
maintain the completeness of this document.
1.1.1 Purpose:
The purpose of requirements documentation is to ensure efficient and correct execution
of product requirements and to define for the professional what the product is supposed
to do. This documentation may be divided into the two following categories:
○ The User Requirements Specification (URS) should describe all aspects of required
product functionality and performance. This documentation is the responsibility of the
Project or Product Task Force though it may derive from submissions by User
Groups or product sponsors. This documentation should be designed to fulfill the
following needs.
□ Provide sufficient information to user groups and sponsors to permit prioritization
and approval of user or sponsor requirements for enhancement of existing
products.
□ Provide sufficient information for communicating scope and deliverables of new
development projects.
□ Provide information suitable for inclusion in the Project or Product work plan.
Costs, times of completion, and priorities are examples of items which should be
included.
○ The System Requirements Specification (SRS) - sometimes referred to as Functional
or Logical Design Documentation - is a refined statement of what the existing or
proposed system does or will do. This documentation serves as a bridge between
the user requirements specifications and the internal documentation described
below. The SRS should provide the following elements:
□ Organizational Architecture defining the roles and skills needed for use and
support of the system. Requirements that identify the system actors and define
their roles (examples of these actor roles are: roles of usage - users,
managers, and executives; roles of administration - application, security, and
data; and roles of technical support - installation and performance monitoring).
□ Business Rules defining the non-functional requirements that describe how the
various entities concerned with the system conduct business.
□ System Behavior definitions that describe the results of all the actions of the
system actors (whose roles are defined in the Organizational Architecture area
above). These system behaviors shall be described with Use Case models linked
to the applicable business rules (defined in the Business Rules area above).
□ Data Models defining all data to be stored or exchanged with other systems.

Page 1 06/10/2009
Product Documentation Standard 3.050.04S

□ Interface Description Models defining the interrelation of processing and data for
all interfaces with other systems.
□ Other Analysis Models: optional area containing all other analysis models that
the Task Force and contractor deem necessary or useful for defining the system.
□ User Interface Prototype: optional area containing information on prototypes
developed to confirm interface requirements.
1.1.2 Audience:
The user requirements specification should be expressed in terms suitable for
submission to users, sponsors, and non-information-processing professionals. The
system requirements specification should be addressed to both the non-professional and
the professional, so that it may serve as a bridge between the user’s and the
developer’s understanding of what the system should do.
1.1.3 Distribution:
External documentation should be supplied by AASHTO Administration upon request.
1.2 Internal Documentation - Product Development / Modification, and Maintenance
1.2.1 Purpose:
The purpose of internal documentation is to reduce the cost of, and provide portability
for, product development, modification, and maintenance. It also ensures that the
resulting product is faithful to the user and system requirements specifications
described above.
Internal documentation - sometimes referred to as Development or Physical Design
Documentation - should include such elements as physical database design, data
model/physical database entity mapping, physical data structure mapping, processing
model/module mapping, module specifications, module relationship mapping,
environment specifications, and source code.
1.2.2 Audience:
All internal documentation should be written to satisfy the needs of data processing
professionals, specifically those needing to develop, modify, and maintain the product.
1.2.3 Distribution:
Internal documentation should not normally be needed by users of the product. The
most common reasons cited for needing source code and internal documentation are
that the product requires modification to conform to the user's needs and that the
software has serious problems which are not solved in a timely fashion.
The first of these reasons can be addressed by providing and documenting mechanisms
for product alteration, such as exits or replaceable modules, which permit customization
without requiring source code modification.
The second problem can be resolved by improving the quality of new releases (fewer
and less severe software errors) and by providing error analysis and resolution services
which inspire the customer's confidence.
Internal documentation should be distributed by AASHTO staff upon request and as the
current policy concerning the right to source code and the subject documentation
dictates.
1.3 User Documentation - Product Use
1.3.1 Purpose:
The purpose of user documentation is to provide sufficient information to enable the
unassisted and correct use of the software product.
The two major categories of user documentation are Learning Guides and Reference
Manuals. The purpose of the first is to train personnel who are unfamiliar with the
application and its functionality. The purpose of the second is to provide easy access to
detailed information about the product that cannot be or is not committed to memory.
1.3.2 Audience:
These documents should be written in terms which are understandable by the users (not
installers, administrators, or programmers, unless of course they are also the only users)
of the software application.
1.3.3 Distribution:
This documentation should be distributed to all licensees and should be written to
address the needs of the users of the product.
Both learning and reference material may be incorporated in the software product itself.
To the degree that this is successfully achieved, reliance on education and manuals can
be reduced.
1.4 System Documentation - Product Implementation and Administration
1.4.1 Purpose:
The purpose of system documentation is to provide installers and product managers with
sufficient information to safely integrate the software product into their computing
environment and to understand the consequences of such integration.
1.4.2 Audience:
This documentation should be written to satisfy the needs of product installers,
managers, and administrators.
1.4.3 Distribution:
This documentation should be distributed to all licensees. Because the System
Documentation may contain sensitive information, such as security administration
details, it should be structured so that the sensitive material can be distributed only
to those persons authorized to use it.
1.5 Promotional Documentation
1.5.1 Purpose:
The purpose of promotional documentation is to acquire new customers and to keep
existing ones.
1.5.2 Audience:
This documentation, which is the responsibility of the Project or Product Task Force,
should be governed by the following principles.
○ It should show potential and current customers the benefits of using the product
(both cost savings and intangibles).
○ It should promote confidence in the product, where warranted, by describing such
things as a proven track record (history), favorable customer experiences, ease of
implementation, quality of service, and plans for future enhancements.
○ It should describe what the product does in terms the manager, administrator,
executive, and procurement agent can understand.
○ It should provide means for acquiring additional information or for ordering the
product either for demonstration or permanent use.
○ It should provide information on future events such as new releases, new features,
and user group conferences.
1.5.3 Distribution:
Production and distribution of this material are the responsibilities of the Project or
Product Task Forces.
1.6 Requirements:
Of the above document types, only Promotional Documentation is optional at the
discretion of the project or product task force; the rest are required as defined.
The above document types and the information they contain must remain distinct even
when they are combined in a single volume. In other words, information appropriate to
the different document types should not be intermixed. Information which is of a
sensitive nature should be segregated in such a fashion that it can be separated easily.
Finally, information which has different audiences should be segregated into separate
volumes.
2. Document Format and Organization
2.1 Medium
Appendix A defines all of the required and permissible media for distribution and use of
documentation. Paper, magnetic tape, optical disk, and floppy disk are examples of
media which could be used.
2.2 Organization Components
The following items describe organizational components which may be present in
documentation. For descriptions of the information which will vary depending on the type
of document, see the specifications dealing specifically with each document type.
■ The cover and/or title page should include the following information: AASHTO logo,
document name, application name, version, platform, revision level, release number,
preparer information, telephone numbers, and addresses.
This component is required for all AASHTOWare application documents.
■ The notices component should contain AASHTOWare required information (logo,
copyrights, disclaimers, and rights of use, copying & quotation).
The standard required AASHTO copyright notice is as follows:
Copyright © 200X by the American Association of State Highway and
Transportation Officials, Inc.
444 North Capitol Street, N. W., Suite 249
Washington, D.C. 20001 U.S.A.
(202) 624-5800
All rights reserved. Printed in the United States of America. This book, or parts
thereof, may not be reproduced in any form without the permission of the
publishers.
Credits for all trademarks used in the document should also appear in this component.
This component is required for all AASHTOWare application documents.
■ The table of contents entries should contain the title of the information referenced and its
page number. If the page number and the title are widely separated, lines, periods or
some other character should be used to lead the eye across the intervening space.
Table of contents entries should be arranged in the same order as the topics referenced.
Table of contents entries should distinguish between major topics and subordinate topics
by bolding or indentation. The structure of the document should be evident in the
structure of the table of contents.
This component is optional only for very small documents, promotional documents, and
internal documentation.
■ The table of figures or illustrations should provide a list of the titled graphics contained in
the document.
This component is optional.
■ The preface or summary component should define the purpose of the document,
summarize its contents, and describe the audience for whom it is intended.
The requirements for this component are the same as those for the table of contents.
■ Document text varies according to the type of document (see the following sections for
the specific requirements relating to each specific document type).
This component is required for all AASHTOWare application documents.
■ A glossary should serve as a dictionary for terminology used in the document which the
reader might not understand or which requires precise or special definition. The expansion
and definition of acronyms which appear in the text should be included.
The entries in the glossary should be sequenced alphabetically.
This component is optional but strongly recommended.
■ The appendices should contain information which is occasionally needed and would not
be appropriate in the text portion.
Examples of information which might go into an appendix are error codes with
explanations of corrective actions, useful examples of product usage, command
summaries, keyboard and mouse assignments, and specifications on limits and capacities
of the product.
This component is optional.
■ A list of references supporting all citations occurring in the document should be provided.
This component should employ standard bibliography formats.
■ The index should enable the user to locate key information in the document.
Index entries should be arranged alphabetically and include the location in the manual
where the term or keyword is used.
The index should not reference text where terms are merely used and no substantive
information explaining them is provided.
This component is optional for all types of documents except User and Systems.
■ The order notice, when present, should include information for acquiring additional
copies of the document or for requesting permission to copy.
This component is recommended but optional.
■ The reader response notice, when present, should include information on how the user
of the document may make suggestions for its improvement.
This component may be combined with the previous component when both are present.
This component is recommended but optional.
2.3 Numbering Requirements
Page numbering is required. Page numbers on title pages may be omitted. Page numbers
may apply to the chapters of the document or may be continuous for the whole document.
Where they apply to the chapters of a document, the chapter name should appear in the
footer along with the page number.
2.4 Revision Recommendations
Documents may be revised in the following ways:
■ Total replacement - usually applicable when a large percentage of the material in the
document has been revised.
■ Replacement and insertion of pages in the document - usually reserved for making
minor changes to a document.
Revision marks in the margins to indicate material which has changed may be useful but are
not required. An example of a revision mark is a vertical bar in the right-hand margin which
extends along all of the changed lines. Revision marks should not be used to mark
insignificant changes, cosmetic changes for instance, which are of no interest to the reader.
3. Requirements Documentation
3.1 User Requirements Specification
Refer to the “30100101 AASHTOWare Requirements Management Standard” to find
definitions of user and system requirements.
3.2 System Requirements Specification
Refer to the “30100101 AASHTOWare Requirements Management Standard” to find
definitions of user and system requirements.
4. Internal Documentation
The Internal Documentation (Physical Database and Process Design) includes all elements
necessary to develop, implement, and maintain a working system on all the platforms
supported.
4.1 Physical Data Design or Model
The following components of the physical data design will be produced.
4.1.1 Physical Database Design
Include all procedures, control statements, and definitions (Structured Query Language
(SQL), XML Schema, and Data Definition Language (DDL) for example) necessary to
actually create the physical database in the target environment.
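As a hedged illustration of the kind of material this section requires (the standard prescribes no particular database product or schema; the table, the columns, and the use of SQLite here are invented for the example), a DDL fragment of the physical database design might be exercised against a target environment as follows:

```python
import sqlite3

# Hypothetical DDL fragment of the kind the physical database design must
# include; the table and column names are invented for illustration.
DDL = """
CREATE TABLE project (
    project_id   INTEGER PRIMARY KEY,
    project_name TEXT NOT NULL,
    start_date   TEXT
);
"""

# Create the physical database in a (here, in-memory) target environment.
conn = sqlite3.connect(":memory:")
conn.executescript(DDL)

# Confirm that the physical structure was actually created.
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
print(tables)  # ['project']
```

In practice the deliverable is the complete set of such procedures and control statements, sufficient to rebuild the database from scratch on every supported platform.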
4.1.2 Data Model/Physical Data Mapping
Provide a map which relates the physical data entity, structure, and element definitions
(e.g. database tables, screens, inter-process communications areas, temporary storage
queues, and external (flat) files) with those of the logical data model as defined in the
System Requirements Specification (see the “30100101 AASHTOWare Requirements
Management Standard”).
This is a quality control check point which assures that the physical data design meets
all the requirements of the logical data design.
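As an illustrative sketch of such a mapping and its quality-control check (every entity, table, and file name below is invented; the standard does not prescribe this representation):

```python
# Hypothetical mapping between logical-model entities and the physical
# structures that realize them; all names are invented for illustration.
data_mapping = {
    "Contract":     {"table": "CONTRACT", "file": None},
    "PayItem":      {"table": "PAY_ITEM", "file": None},
    "ExportRecord": {"table": None,       "file": "export.dat"},
}

# Quality-control check point: every logical entity must map to at least
# one physical structure.
unmapped = [entity for entity, physical in data_mapping.items()
            if not any(physical.values())]
print("Unmapped entities:", unmapped)  # Unmapped entities: []
```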
4.2 Physical Process Design or Model
4.2.1 Physical Process Design
Provide all source code; all compile, link, make, and build procedures; and all CASE
tool inputs necessary to build the physical software in the target environment.
4.2.2 Processing Model/Module Mapping
Provide a map which relates the physical modules provided in Section 4.2.1 above with
the definitions of the logical functions defined in the function inventory of the SRS
(see the “30100101 AASHTOWare Requirements Management Standard”).
4.2.3 Module Relationship Mapping
Provide graphic and/or tabular representation of the relationships (e.g. Include, Call,
Transfer of Control) between modules. All symbology used should be defined.
4.2.4 Environment Specifications
All interactions and requirements of the target environment should be documented. Any
required library, communications, or systems software definition requirements must be
documented. Any CASE tools used to produce the system or its documentation must also
be specified.
The following must be supplied for each tool used.
○ Name of tool
○ Supplier and manufacturer
○ Commercial availability and cost (indicate if not commercially available)
○ Description of what the tool does with regard to the AASHTOWare application.
○ The input and output formats that the tool is capable of using and producing.
○ The input and output formats used to produce AASHTOWare applications.
5. User Documentation
5.1 Reference
5.1.1 Quick Reference
This optional document usually includes frequently used commands and procedures
taken from the expanded reference and presented in schematic form. It is useful to
users who are familiar with the operation of the application and as a consequence do not
need explanation of the commands or procedures in question. Quick references should
be produced if they save the time and effort of experienced users.
This form of documentation can often be incorporated in the software application itself
(on-line command references and context specific help are examples of this) and in that
case is not needed as a separate hard copy document.
5.1.2 Expanded Reference
○ The expanded reference should express briefly and simply all information needed to
use (not learn to use) the product. Every functional capability, all inputs, and all
outputs must be described. In short, the reference manual should be a complete
description, with respect to use, of what the application can do and how to cause it to
do it.
○ The following organizational components are recommended:
□ An introduction to the application should provide an overview of the system - its
work flows and its functionality.
□ Discussion of all operational aspects of the system which should include such
topics as screen navigation, management of storage media, data entry, initiation
of procedures, production of output, and backup of data.
□ Process, procedure, and command documentation providing generalized
command or procedure formats and specific examples of their use to perform
relevant work.
○ Examples of product usage should be provided and graphics or illustrations should
be included where useful, especially for description of data input, data presentation
or output, and data flow between components of the system.
○ The format of the text should follow the logical structure of the application when
possible, grouping together commands and procedures which are related and used
together. The index should contain all commands and thereby satisfy the need for
an alphabetic reference.
○ The document should contain an appendix describing methods for the diagnosis of
problems. Lists of error and warning messages should also be provided. These
messages should be arranged by number or alphabetically and should be followed
by corrective actions to be taken.
5.2 Learning Guide
No standards have been established in this area at this time.
6. Systems Documentation
Systems documentation should normally be divided by function. The following sections
describe these functions.
6.1 Implementation Documentation
6.1.1 Differences
Provide brief descriptions of the differences (deltas) between this and the previous
version. Release level, maintenance level, fixes applied, and testing level information
should also be supplied in this section.
6.1.2 Environment
Provide description of environment and resource requirements. These descriptions
should include documentation of interactions with systems & communications software,
dependencies on or interfaces with other products, resource requirements, hardware
feature or device requirements, and performance characteristics.
6.1.3 Warnings
Provide warning messages with clear descriptions of any potential for destroying or
corrupting data, as well as any irreversible actions.
6.1.4 Recovery and De-installation
Provide instructions for removing the product either because of a failed installation or to
return to a previous version.
6.1.5 Installation
Provide instructions for installation of the whole product, maintenance, and fixes. Also
provide instructions for running test scripts to verify correct installation and operation.
6.1.6 Problem Resolution
Describe the methods and procedures that should be employed for isolating, identifying,
documenting, and reporting errors.
6.1.7 Interfaces to Systems and Other Applications Software
Describe data formats and method of interaction.
6.1.8 Required Maintenance (system care and feeding, not changes)
6.1.9 Customization Features
○ Describe customization features such as generation or installation parameters.
Explain implications of choosing different options.
○ User maintainable system parameters such as initialization files, account profiles,
performance parameters, or configuration definitions should be documented.
○ User exits, hooks, and replaceable modules should be documented along with the
processes and procedures necessary to activate them.
○ A data dictionary defining the data elements required for the implementation of the
exits, hooks, and replaceable modules described above should be provided. This
dictionary should also define those data elements input from and output to external
files which the user is permitted to access.
6.2 Security Management
This section contains information appropriate for distribution to installation security
managers. This component should be separately packaged when it is likely that security
administration will be performed by personnel other than the installers.
6.3 Administration Documentation
Where it is likely that the product will require management or administration by personnel
separate from installers or maintenance personnel, this section may be separately
packaged. Some examples of such management are data file maintenance, performance
monitoring, problem resolution, resource allocation, account management, database
maintenance, work scheduling, and report distribution.
6.3.1 Operator Documentation
Operator documentation, where separate from user documentation as in the case of
shared use systems (mainframes, servers), should be separately packaged. This
documentation should contain all operator messages. These messages should be
segregated by severity and by whether or not they require responses.
7. Appendices
7.1 Appendix A: Document Requirements Table
Document Type              Required? (1)   Medium Requirements (2)   Development Tool Source (3)
Internal Documentation
   Physical Data           R               S or MRD                  (O,MRS)
   Physical Process        R               S, NSM, or MRD            (R,MRS)
User Documentation
   Quick Reference         O               S, NSP, or MRD            N/A
   Expanded Reference      R               S, NSP, or MRD            (O,MRS)
   Learning Guide          O               S, NSP, or MRD            (O,MRS)
System Documentation
   Implementation          R               S or MRD                  N/A
   Customizable Features   R               S or MRD                  N/A
   Security Management     R               S or MRD                  N/A
   Manager or Admin.       R               S or MRD                  N/A
   Operator                R               S or MRD                  N/A
Requirements Documentation
   User Requirements       (See the 30100101 Standard for specification of this document)
   System Requirements     (See the 30100101 Standard for specification of this document)
7.1.1 Keys:
R = Required
O = Optional (at the discretion of the Project or Product Task Force)
S = Standard Paper Size of 8.5" X 11" (consistent with method of update)
NSM = Non Standard Size Machine Listings
NSP = Non Standard Size Printed Text
MRD = Machine Readable Documentation residing on tape, diskette or CD ROM for
example (If a required document is supplied in machine readable format only, then the
software and instructions for printing the document must also be supplied.)
MRS = Machine Readable Development Tool Data such as Analysis, Design, and
Construction Models or Source Code
7.1.2 Notes:
(1) This column indicates whether a document is required or optional.
(2) This column indicates medium types which are permissible for the given document.
(3) This column indicates requirements for machine readable development tool input
and output. All non-proprietary machine readable material necessary to maintain or
modify the product must be provided to AASHTO staff with the distribution of each
new release. This should include such items as source code, frameworks, load
modules, data dictionaries, and model or component libraries.
GLOSSARY OF PRODUCT
TERMINOLOGY
STANDARD
S&G Number: 3.060.03S
Effective Date: July 1, 2009
Document History
Version No.   Revision Date   Revision Description                        Approval Date
01 April 1995 Initial Version April 1995
02 March 2000 This standard was updated to current March 2000
AASHTOWare and industry practices. Software
systems have become increasing complex,
requiring more advanced product versioning
control and documentation. That same
information must now be reflected through
standard GUI interfaces to provide the developer,
implementer and end-user fast, reliable and
accurate information about the software
application and related components.
03 06/10/2009 Applied standard template and changed the 06/16/2009
standard number from 3.04.040.02 to 3.060.03S. Approved by
Made minor changes and format modifications. T&AA

06/10/2009
Glossary of Product Terminology Standard 3.060.03S

Table of Contents
1. Introduction......................................................................................................... 1
1.1 AASHTO..............................................................................................................1
1.2 AASHTOWare .....................................................................................................1
2. AASHTOWare Product Nomenclature .............................................................. 1
2.1 Owner Name .......................................................................................................1
2.2 Family Name .......................................................................................................1
2.3 Product Name.....................................................................................................1
2.4 Module Name......................................................................................................1
2.5 Version Name .....................................................................................................2
2.5.1 Major Version Number ......................................................................................... 2
2.5.2 Minor Version Number ......................................................................................... 2
2.5.3 Maintenance Version Number ............................................................................. 2
2.5.4 Build Version Number .......................................................................................... 2
2.6 Platform Name....................................................................................................2
2.6.1 Syntax .................................................................................................................. 2
2.6.2 Examples ............................................................................................................. 2
2.7 Edition Name ......................................................................................................3
2.7.1 Syntax .................................................................................................................. 3
2.7.2 Examples ............................................................................................................. 3
2.7.3 Functional Name .................................................................................................. 3
2.7.4 Syntax .................................................................................................................. 3
2.7.5 Examples ............................................................................................................. 3
2.8 Trade Name.........................................................................................................3
2.8.1 Syntax .................................................................................................................. 3
2.8.2 Examples ............................................................................................................. 3
3. AASHTOWare Product Identification................................................................ 3
3.1 AASHTO Logo ....................................................................................................4
3.2 AASHTOWare Logo............................................................................................4
3.3 AASHTOWare Family Logo ...............................................................................4
3.4 AASHTOWare Product Logo .............................................................................4
3.5 AASHTOWare Product Icon...............................................................................5
3.6 AASHTOWare Product Splash Screen..............................................................5
3.7 AASHTOWare Product About Dialog Box ........................................................5
1. Introduction
AASHTO has established the AASHTOWare Product Terminology standard to assist
AASHTOWare contractors and users in proper use of the AASHTOWare terminology for
product nomenclature and identification. AASHTO reserves the right to change this standard at
any time at its discretion. The AASHTOWare contractors must comply with this standard as
amended from time to time.
The AASHTOWare Product Terminology standard provides a source for consistent and correct
usage of terms and graphics that are specific to the AASHTOWare products. This standard is
applicable to all AASHTOWare documentation and packaging describing the AASHTOWare
products and services.
To comply with the AASHTOWare Product Terminology standard, it is important to
understand and differentiate the usage of the terms AASHTO and AASHTOWare.
1.1 AASHTO
The term AASHTO is the acronym for American Association of State Highway and
Transportation Officials and is a registered trademark of the American Association of State
Highway and Transportation Officials, Inc.
1.2 AASHTOWare
The term AASHTOWare is a registered trademark and service mark of AASHTO. It
collectively represents all intellectual property including computer software products
resulting from the AASHTO Cooperative Software Development Program.
2. AASHTOWare Product Nomenclature
The AASHTOWare product nomenclature provides definitions of terms specific to the
AASHTOWare environment for uniform naming of the AASHTOWare products. AASHTOWare
product names based on this nomenclature are generally submitted to the United States Patent
and Trademark Office to obtain official trademark registration.
2.1 Owner Name
This term represents the name of the legal owner of the AASHTOWare products. An
AASHTOWare product may include intellectual property or components legally licensed by
AASHTO for distribution. AASHTO is the designated Owner Name for all AASHTOWare
products.
2.2 Family Name
This term designates a group of AASHTOWare products designed for a specific
transportation-engineering domain. The use of Family Name for AASHTOWare product
naming is optional. Trns•port and BRIDGEWare are examples of the existing
AASHTOWare Family Names.
2.3 Product Name
This term designates an AASHTOWare product that provides information and functionality
for an identifiable or definable segment within a transportation-engineering domain.
DARWin, SDMS, and SiteManager are examples of some of the existing AASHTOWare
Product Names.
2.4 Module Name
The term Module Name designates a portion of an AASHTOWare product that can operate
independently but is usually data compatible with the other portions or modules of the
product. The use of Module Name for AASHTOWare product naming is optional. AASHTO
SDMS Collector and Processor are examples of two modules of an AASHTOWare product.
2.5 Version Name
The Version Name represents a specific designation for each compiled component of an
AASHTOWare Product. The Version Name is composed of four distinct numerical terms
separated by decimal points, specifying the chronological order of the software releases.
The AASHTOWare contractor should maintain a written record of Version Names with a
description of the software changes associated with each version. A complete Version Name
must appear in the AASHTOWare product About Dialog Box for product identification. The
Version Name can be truncated to the first two terms for display in the AASHTOWare
product Splash Screen and other documentation.
2.5.1 Major Version Number
The first term designates the major revisions to the AASHTOWare product, which
usually include major functional additions and enhancements. AASHTOWare Task
Force approval is required to update this term.
2.5.2 Minor Version Number
The second term designates minor changes to the AASHTOWare product such as minor
functional additions, improved performance, and improved user interface. AASHTOWare Task
Force approval is required to update this term.
2.5.3 Maintenance Version Number
The third term designates maintenance updates to the AASHTOWare product resulting
from software malfunction corrections. The AASHTOWare contractor can update this
term with every software maintenance release.
2.5.4 Build Version Number
The fourth term is an incremental software build indicator. The AASHTOWare
contractor should update this term with every build of the AASHTOWare product.
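The four-term scheme above can be illustrated with a short sketch; the class and method names below are hypothetical, not part of the standard:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VersionName:
    """Illustrative model of the four-term Version Name (names are hypothetical)."""
    major: int        # 2.5.1 - changes require AASHTOWare Task Force approval
    minor: int        # 2.5.2 - changes require AASHTOWare Task Force approval
    maintenance: int  # 2.5.3 - contractor updates per maintenance release
    build: int        # 2.5.4 - contractor updates per build

    def full(self) -> str:
        # Complete Version Name, as required in the About Dialog Box
        return f"{self.major}.{self.minor}.{self.maintenance}.{self.build}"

    def short(self) -> str:
        # Two-term truncation, as permitted for the Splash Screen
        return f"{self.major}.{self.minor}"

v = VersionName(2, 0, 1, 347)
print(v.full())   # 2.0.1.347
print(v.short())  # 2.0
```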

2.6 Platform Name


The term Platform Name designates the operating platform for the AASHTOWare product.
The operating platform includes the operating system and any other operating environment
software necessary for designed functional use of the AASHTOWare product. The
AASHTOWare product naming convention requires the use of the word "for" before the
Platform Name.
The syntax and examples of the AASHTOWare product naming convention using the Owner
Name, Family Name, Product Name, Module Name, Version Name, and Platform Name terms
are presented below (optional terms are shown in brackets):
2.6.1 Syntax
Owner Name [Family Name] Product Name [Module Name] Version Name for Platform
Name
2.6.2 Examples
AASHTO Trns•port SiteManager 2.0 for Microsoft Windows
AASHTO Trns•port BAMS/DSS 6.1 for Microsoft Windows
AASHTO DARWin 3.1 for Microsoft Windows
AASHTO BRIDGEWare Opis 3.0 for Microsoft Windows
AASHTO SDMS Collector 3.4 for DOS
AASHTO SDMS Processor 1.0 for Microsoft Windows
AASHTO IGrds 9.0 for Sun Unix and Bentley MicroStation
AASHTO IGrds 9.0 for Microsoft Windows and Bentley MicroStation
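As an informal illustration of the syntax above, the naming convention can be mechanized as follows; the helper function and its argument names are assumptions for illustration, not part of the standard:

```python
def product_name(owner, product, version, platform, family=None, module=None):
    """Assemble a name: Owner [Family] Product [Module] Version for Platform.

    Illustrative sketch only; the function and argument names are hypothetical.
    """
    parts = [owner]
    if family:                 # Family Name is optional
        parts.append(family)
    parts.append(product)
    if module:                 # Module Name is optional
        parts.append(module)
    parts.append(version)
    parts.append("for " + platform)
    return " ".join(parts)

print(product_name("AASHTO", "SiteManager", "2.0", "Microsoft Windows",
                   family="Trns\u2022port"))
# AASHTO Trns•port SiteManager 2.0 for Microsoft Windows
print(product_name("AASHTO", "SDMS", "3.4", "DOS", module="Collector"))
# AASHTO SDMS Collector 3.4 for DOS
```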


2.7 Edition Name


The term Edition Name designates the AASHTOWare product implementation alternatives.
An AASHTOWare product may offer a number of implementation alternatives depending
upon user computing infrastructure, export restrictions, and commercial use restrictions.
The use of Edition Name for AASHTOWare product naming is optional. The AASHTOWare
product naming convention requires the use of the word "Edition" after the Edition Name.
The syntax and examples of the AASHTOWare product naming convention using the Edition
Name term are presented below:
2.7.1 Syntax
[Edition Name] Edition
2.7.2 Examples
Client/Server Edition
Developer Edition
Evaluation Edition
Educational Edition
International Edition
2.7.3 Functional Name
The term Functional Name designates a descriptive name of the AASHTOWare product
corresponding to the functionality of the product. The syntax and examples of the
AASHTOWare product naming convention using the Functional Name term are presented
below:
2.7.4 Syntax
Functional Name
2.7.5 Examples
Pavement Analysis and Design System
Bridge Management System
Cost Estimation System
Interactive Graphics Roadway Design System

2.8 Trade Name


The term Trade Name designates an abbreviated marketing name for the AASHTOWare
product. The Trade Name is generally adopted to distinguish a specific release of the
AASHTOWare product from the standard annual release of the product. The use of
Trade Name for AASHTOWare product naming is optional. The syntax and examples of the
AASHTOWare product naming convention using the Trade Name term are presented below.
2.8.1 Syntax
[Trade Name]
2.8.2 Examples
Pontis 2000
IGrds 2000
Superpave LE

3. AASHTOWare Product Identification


AASHTOWare product identification through the use of appropriate graphic elements is
recommended to enhance the appearance of the AASHTOWare products. This section provides
information on the different types of graphic elements recognized by AASHTO for
AASHTOWare product identification. AASHTOWare Task Force approval is required for
incorporating a graphic element into an AASHTOWare product for identification. Alteration
of color and aspect ratio is not allowed.

3.1 AASHTO Logo


The AASHTO logo is a registered trademark of AASHTO.

Figure 1. Sample AASHTO Logo.

3.2 AASHTOWare Logo


The AASHTOWare logo is a trademark and service mark of AASHTO. This logo should be
used to identify a product as an AASHTOWare product.

Figure 2. Sample AASHTOWare Logo

3.3 AASHTOWare Family Logo


The AASHTOWare family logos should be used to identify a product as a member of an
AASHTOWare family of products.

Figure 3. Sample AASHTOWare Family Logos.

3.4 AASHTOWare Product Logo


AASHTOWare product logos are the most visible form of product identification elements. It
is recommended that product logos within an AASHTOWare product family share common
graphical elements to allow visual association of the individual products within the
family.

Figure 4. Sample AASHTOWare Product Logos.


3.5 AASHTOWare Product Icon


AASHTOWare product icons are the most recognizable graphical element for the product
user. Consistency should be maintained in updating product icons between major releases
of the AASHTOWare products.

Figure 5. Sample AASHTOWare Product Icons.

3.6 AASHTOWare Product Splash Screen


The AASHTOWare product splash screen should be used to illustrate product quality and
consistency. The splash screen can serve as a strong identification mark for a family of
AASHTOWare products. The splash screen should contain the complete product nomenclature
and the product logo or other graphics representing the product.

Figure 6. Sample AASHTOWare Product Splash Screen.

3.7 AASHTOWare Product About Dialog Box


The AASHTOWare product About Dialog Box is the most significant product identification
component. The About Dialog Box must contain complete product nomenclature, copyright
notices, product icon, and information for product registration and support.


Figure 7. Sample AASHTOWare Product About Dialog Box.

TESTING STANDARD

S&G Number: 3.080.02S


Effective Date: July 1, 2009

Document History
Version No.   Revision Date   Revision Description                              Approval Date
01            Sep. 2006       Initial version of the specification              Oct. 2006
02            06/10/2009      Changed standard number from 3.06.001.01          06/16/2009
                              to 3.080.02S, and applied standard template.      Approved by T&AA
                              Made minor changes and format modifications.

06/10/2009
Testing Standard 3.080.02S

Table of Contents
1. Introduction......................................................................................................... 6
1.1 Purpose of Testing.............................................................................................6
1.2 Definitions ..........................................................................................................7
1.3 Deliverables ........................................................................................................8
2. Procedure Definitions ........................................................................................ 9
2.1 Test 1: Test Planning .........................................................................................9
2.1.1 Description ........................................................................................................... 9
2.1.2 Participation in Testing Process Flow .................................................................. 9
2.1.3 Task Force Activities ............................................................................................ 9
2.2 Test 2: Preparation of Test Instance .................................................................9
2.2.1 Description ........................................................................................................... 9
2.2.2 Participation in Testing Process Flow ................................................................ 10
2.3 Test 3: Walkthrough.........................................................................................10
2.3.1 Description ......................................................................................................... 10
2.3.2 Participation in Testing Process Flow ................................................................ 10
2.3.3 Stakeholder Activities......................................................................................... 10
2.4 Test 4: Unit Testing ..........................................................................................11
2.4.1 Description ......................................................................................................... 11
2.4.2 Participation in Testing Process Flow ................................................................ 11
2.5 Test 5: Build Testing ........................................................................................11
2.5.1 Description ......................................................................................................... 11
2.5.2 Participation in Testing Process Flow ................................................................ 12
2.6 Test 6: System Testing ....................................................................................12
2.6.1 Description ......................................................................................................... 12
2.6.2 Participation in Testing Process Flow ................................................................ 12
2.7 Test 7: Alpha Testing .......................................................................................12
2.7.1 Description ......................................................................................................... 12
2.7.2 Participation in Testing Process Flow ................................................................ 13
2.7.3 Task Force Activities .......................................................................................... 13
2.8 Test 8: Beta Testing .........................................................................................13
2.8.1 Description ......................................................................................................... 13
2.8.2 Participation in Testing Process Flow ................................................................ 14
2.8.3 Task Force Activities .......................................................................................... 14
2.8.4 Tester Activities.................................................................................................. 15
2.9 Test 9: Peer Review and Exception Correction..............................................16
2.9.1 Description ......................................................................................................... 16
2.9.2 Participation in Testing Process Flow ................................................................ 16
2.10 Test 10: Alpha Testing Acceptance ................................................................16
2.10.1 Description ......................................................................................................... 16
2.10.2 Participation in Testing Process Flow ................................................................ 16
2.10.3 Task Force Activities .......................................................................................... 16
2.11 Test 11: Beta Testing Acceptance ..................................................................17
2.11.1 Description ......................................................................................................... 17
2.11.2 Participation in Testing Process Flow ................................................................ 17
2.11.3 Task Force Activities .......................................................................................... 17
2.12 Test 12: Installation..........................................................................................17
2.12.1 Description ......................................................................................................... 17
2.12.2 Participation in Testing Process Flow ................................................................ 18

2.12.3 Task Force Activities .......................................................................... 18
2.12.4 Tester Activities.................................................................................................. 19
2.12.5 Standing Committee on Joint Development Activities ....................................... 19
3. Work Product Definitions................................................................................. 19
3.1 Test Plan ...........................................................................................................19
3.1.1 Description ......................................................................................................... 19
3.1.2 Plan Elements .................................................................................................... 19
3.1.3 Payment and Deliverable Considerations.......................................................... 19
3.2 Test Criteria ......................................................................................................20
3.2.1 Walkthrough Criteria .......................................................................................... 20
3.2.2 Unit Test Criteria ................................................................................................ 20
3.2.3 Build Test Criteria............................................................................................... 20
3.2.4 System Test Criteria........................................................................................... 21
3.2.5 Alpha Test Criteria ............................................................................................. 22
3.2.6 Beta Test Criteria ............................................................................................... 23
3.3 Installation Materials........................................................................................24
3.3.1 Description ......................................................................................................... 24
3.3.2 Content............................................................................................................... 24
3.3.3 Payment and Deliverable Considerations.......................................................... 24
3.4 Distribution Test Materials ..............................................................................24
3.4.1 Description ......................................................................................................... 24
3.4.2 Content............................................................................................................... 24
3.4.3 Payment and Deliverable Considerations.......................................................... 25
3.5 Test Results Repository ..................................................................................25
3.5.1 Description ......................................................................................................... 25
3.5.2 Content............................................................................................................... 25
3.5.3 Payment and Deliverable Considerations.......................................................... 26
3.6 Alpha Test Acceptance Report........................................................................26
3.6.1 Description ......................................................................................................... 26
3.6.2 Content............................................................................................................... 26
3.6.3 Payment and Deliverable Considerations.......................................................... 27
3.7 Beta Test Acceptance Report..........................................................................27
3.7.1 Description ......................................................................................................... 27
3.7.2 Content............................................................................................................... 27
3.7.3 Payment and Deliverable Considerations.......................................................... 27
3.8 Installation Status Report ................................................................................27
3.8.1 Description ......................................................................................................... 27
3.8.2 Content............................................................................................................... 27
3.8.3 Payment and Deliverable Considerations.......................................................... 28
3.9 Test Instance Report........................................................................................28
3.9.1 Description ......................................................................................................... 28
3.9.2 Format................................................................................................................ 28
3.9.3 Payment and Deliverable Considerations.......................................................... 29
4. Appendices ....................................................................................................... 30
4.1 Appendix A: Procedure Activity Diagrams.....................................................30
4.1.1 Test 1: Test Planning ......................................................................................... 31
4.1.2 Test 2: Preparation of Test Instance.................................................................. 32
4.1.3 Test 3: Walkthrough ........................................................................................... 33
4.1.4 Test 4: Unit Testing ............................................................................................ 33
4.1.5 Test 5: Build Testing .......................................................................................... 34
4.1.6 Test 6: System Testing ...................................................................................... 34

4.1.7 Test 7: Alpha Testing ......................................................................... 35
4.1.8 Test 8: Beta Testing ........................................................................................... 36
4.1.9 Test 9: Peer Review and Exception Correction ................................................. 37
4.1.10 Test 10: Alpha Testing Acceptance ................................................................... 38
4.1.11 Test 11: Beta Testing Acceptance ..................................................................... 38
4.1.12 Test 12: Installation ............................................................................................ 39
4.2 Appendix B: Test Criteria Examples ...............................................................40
4.2.1 Examples of Types of Walkthrough ................................................................... 40
4.2.2 Examples of Unit Testing ................................................................................... 40
4.2.3 Examples of Build Testing ................................................................................. 40
4.2.4 Examples of System Testing ............................................................................. 42
4.2.5 Examples of Alpha Testing ................................................................................ 43
4.2.6 Examples of Beta Testing .................................................................................. 45


Test Phase and Work Product Definition Compliance


Testing Procedures                              Required?   Project Types (2)
Test 1    Test Planning                         Yes         All Types
Test 2    Preparation of Test Instance          Yes         All Types
Test 3    Walkthrough                           No          All Types
Test 4    Unit Testing                          Yes         All Types
Test 5    Build Testing                         Yes         All Types
Test 6    System Testing                        Yes         All Types
Test 7    Alpha Testing                         Yes         All Types
Test 8    Beta Testing                          Yes         All Types
Test 9    Peer Review and Exception Correction  Yes         All Types
Test 10   Alpha Testing Acceptance              Yes         All Types
Test 11   Beta Testing Acceptance               Yes         All Types, MajM-Optional
Test 12   Installation                          Yes         All Types

Testing Work Product Definitions    Required? / Deliverable?        Project Types (2)
Test Plan                           Yes / Yes                       All Types
Test Criteria                       Yes (1) / No                    All Types
Test Instance Report Format         Optional (1) / No               All Types
Distribution Test Materials         Yes (1) / Yes                   All Types
Requirements Traceability Matrix    Optional (1,3) / Optional (3)   All Types
Test Results Repository             Yes (1) / No                    All Types
Alpha Test Acceptance Report        Yes (1) / Yes                   All Types
Beta Test Acceptance Report         Yes (1) / Yes                   All Types
Installation Materials              Yes / Yes                       All Types
Installation Status Report          Yes / Yes                       All Types

(1) Where testing tools are employed by AASHTOWare contractors, formats of Work
Product Definitions may be modified to conform to those provided by the chosen tool.
The content of deliverables, however, may not be changed. If the tool does not provide
for recording some of the content information, it will have to be added, as a supplement,
by the contractor.


(2) Project Type Requirements - For the purposes of the table above “All Types” is defined
as including “New Development,” “Enhancement,” and “Major Maintenance,” while it is
defined as excluding “Minor Maintenance.”
○ New Development (ND) – For each new development effort all Testing Procedures
and Work Product Definitions are required.
○ Enhancement (E) – For each enhancement effort, a statement will be included
defining the Testing Procedures and Work Product Definitions to be followed based
on this specification. The Task Force will review and determine, for each
enhancement, whether deviations from the Testing Specification are warranted; if so,
they must be justified and documented in the Product Work Plan.
○ Major Maintenance (MajM) – For each major maintenance version, the Task Force
will receive from the contractor a statement defining the Test Procedures and Work
Product Definitions to be followed, based on the Testing Specification. If any
deviations from the Testing Specification are proposed, the Task Force will review
them to determine whether they are justified. The Task Force is responsible for
ensuring that these decisions are documented before work is initiated.
○ Minor Maintenance (MinM) – This project type represents the temporary fix or repair
of an existing product module. The temporary fix or repair must not add to, change,
or delete from the functionality of a product module. Minor maintenance is outside
the scope of this Specification.
(3) Use of the “Requirements Traceability Matrix” to store and deliver the Test Procedures
and Result Criteria is optional. They may instead be delivered in the “Alpha Test
Acceptance Report” or the “Beta Test Acceptance Report”. Wherever they are stored and
delivered, they must be backward linked to a requirement in the “Requirements
Traceability Matrix”.
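The backward-linking requirement can be pictured with a minimal sketch; the record layout, field names, and identifiers below are assumptions for illustration, not a mandated format:

```python
# Minimal sketch of backward linking test procedures / result criteria to
# requirements in a Requirements Traceability Matrix. All identifiers and
# field names here are hypothetical examples, not part of the standard.
matrix = {
    "REQ-001": "Example requirement text",  # hypothetical matrix entry
}

test_records = [
    {
        "requirement_id": "REQ-001",  # backward link into the matrix
        "procedure": "Example test procedure",
        "result_criteria": "Example expected result",
    },
]

def unlinked(records, matrix):
    """Return records whose backward link does not resolve to a requirement."""
    return [r for r in records if r["requirement_id"] not in matrix]

print(len(unlinked(test_records, matrix)))  # 0 - every record is linked
```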

See the attached specification for descriptions of the Testing Procedures and Work Product
Definitions.


1. Introduction
1.1 Purpose of Testing
The purpose of testing AASHTOWare systems or system components is to ensure that the
specified requirements are met (this is called verification) and to demonstrate that a system
or system component fulfills its intended use when placed in its intended environment (this
is called validation). In other words, verification ensures that “you built it right,” whereas
validation ensures that “you built the right thing.”
The following table describes how Verification and Validation are supported by the
procedures of this specification.
Testing Procedure                        Verification?   Validation?
Test 1: Test Planning                    Yes             Yes
Test 2: Preparation of Test Instance     Yes             Yes
Test 3: Walkthrough                      Yes             Yes
Test 4: Unit Testing                     Yes             N/A
Test 5: Build Testing                    Yes             N/A
Test 6: System Testing                   Yes             N/A
Test 7: Alpha Testing                    Yes             N/A
Test 8: Beta Testing                     N/A             Yes
Test 9: Peer Review and Correction       Yes             Yes
Test 10: Alpha Testing Acceptance        Yes             N/A
Test 11: Beta Testing Acceptance         N/A             Yes
Test 12: Installation                    N/A             Yes

Verification activities are usually performed in a testing environment provided by the
developer. This environment balances testing efficiency, cost of maintenance, and
compatibility with customer target environments. Verification tests are as varied as the
requirements for the system being tested.
Validation activities use approaches similar to verification (e.g., test, analysis, inspection,
demonstration, or simulation). Most often, the end users are involved in the validation
activities. Where possible, validation should be accomplished using the system or system
component operating in its intended environment and using real data.
Because AASHTOWare systems are produced using differing development methods, this
specification is designed to be neutral to development methodologies. It can be easily used
with either waterfall or iterative methodologies.
An appendix is included to aid understanding and execution of AASHTOWare testing.
It provides a detailed pictorial view of the tasks and activities that should take place
in the test procedures.


The following highlighted table shows the AASHTOWare Lifecycle phases, which contain
testing activities.
Strategy/    Tactic/        Requirement/   Verification/   Product
Contract     Planning       Design         Construction    Implementation
Proposal     Solicitation   Analysis       Validation      Maintenance

1.2 Definitions
To reduce confusion and simplify the text of this document, the following definitions
are provided:
■ Project/Product Work Plan – this term refers to the activities, schedule, and resource
costs proposed and contracted to satisfy the defined user requirements. This plan is
developed in the Tactics / Solicitation phase of the AASHTOWare Lifecycle and is
established in the Contract phase. A Project/Product Work Plan is usually an annual
plan, and the work described within it is scheduled to correspond to the AASHTO fiscal
year.
■ Work Product Definition – A Work Product Definition usually specifies the acceptance
criteria, format, content, responsibilities, and usage of an artifact.
■ Requirements Traceability Matrix – the repository of all traceable objects. It is the
method used to manage the requirements and is capable of all the attribute definition
and linking (source, derivative, and horizontal) needed to support traceability. The
Requirements Traceability Matrix is an artifact of requirements management.
■ Project Types:
○ New Development – This project type represents the addition of major new functional
requirements to an existing product line or to an existing product module, or the
creation of a new product line or product module. New Development is formally
identified and approved through user groups, technical advisory committees, project
task forces, and the Special Committee on Joint Development.
Example: Addition of a new module to a product line.
○ Enhancement – This project type represents the addition of new features to an
existing product module, or the correction of limited-scope, non-critical
inconsistencies or inadequacies of a product module. Enhancements are formally
identified and approved through user groups, technical advisory committees, and
product task forces. For each enhancement effort, a statement will be included
defining the test procedures to be followed based on the Testing Specification. The
Task Force will review and determine, for each enhancement, whether deviations from
the Testing Specification are warranted; if so, they must be justified and documented
in the Product Work Plan.
Example: Upgrade of a product's or a module's technical platform (e.g., use of new
database or teleprocessing technology).
○ Major Maintenance – This project type represents the SCHEDULED repair of an
existing product module or of the product's technical operating environment, which is
required to enable successful execution of the product as prescribed by business
requirements. For each major maintenance version, the Task Force will receive from
the contractor a statement defining the test procedures to be followed, based on the
Testing Specification. If any deviations from the Testing Specification are proposed,
the Task Force will review them to determine whether they are justified. The Task
Force is responsible for ensuring that these decisions are documented before work is
initiated.


Example: Maintenance release which could contain permanent fixes as a result of
unscheduled repairs.
○ Minor Maintenance – This project type represents the temporary fix or repair of an
existing product module. The temporary fix or repair must not add to, change, or
delete from the functionality of a product module. Minor maintenance is outside
the scope of this Testing Specification.
Example: Client site A is not able to successfully execute module XYZ. After
attempting to resolve the problem without success, client notifies the Contractor with
appropriate termination codes, messages, and information. Contractor provides a
new system load module to be re-linked, or other technical resolution.

1.3 Deliverables
The required deliverables for testing are:
■ A Test Plan, described in the Work Product Definitions section below, specifies the
schedule, activities, and resources required to perform the testing of a system, system
components, documentation, or procedures. It also includes a schedule of deliverables.
It is a component of the Project / Product Work Plan.
■ The Distribution Test Materials, described in the Work Product Definitions section below,
contains all of the materials needed by the beta test participant to implement and
perform beta testing in the appropriate environment and to report the results.
■ A Traceability Matrix (see Requirements Traceability Matrix in the Work Product
Definitions section of the “301001 Requirements Management” specification) contains
the requirements to be tested. This deliverable is optional; it may be used to store the
test procedures and result criteria and to backward and forward link them to the tested
requirement. Because the test procedures and result criteria must already be stored, as
part of the Test Instance Report, in the Alpha Test Acceptance Report or the Beta Test
Acceptance Report, their storage in the Requirements Traceability Matrix is allowed as
a convenience.
■ Alpha Test Acceptance Report, described in the Work Product Definitions section below,
contains the identification (ID) of requirements (from the “Requirements Traceability
Matrix” - this is the same as a backward link to the requirement), the test procedures /
result criteria, the identification of the system being tested, the summary of test results,
the documented exceptions, and the approved / accepted resolutions for all contributing
test types (Unit, Build, System, and Alpha).
■ Beta Test Acceptance Report, described in the “Work Product Definitions” section below,
contains the Requirement ID (from the “Requirements Traceability Matrix” - this is the
same as a backward link to the requirement), Test Procedures / Result Criteria,
summary of Test Results, documented Exceptions, Approved and Accepted Exception
Resolutions. The report contains all tests performed for Beta Testing.
■ Installation Materials contain all procedures, executables, and documentation needed to
implement and operate the delivered system at the user agency site.
■ Installation Status Report contains the number of licensees, date/agency of each
successful installation, date/agency/description of each problem encountered, and
date/agency/description of each problem resolution. When the Task Force approves the
Installation Status Report, testing is complete.
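To make the linkage concrete, the row structure of a Requirements Traceability Matrix described above can be sketched as follows. This is purely illustrative; the standard does not prescribe a format, and the requirement ID, procedure text, and result criteria shown are invented.

```python
from dataclasses import dataclass, field

@dataclass
class TraceabilityEntry:
    """One row of a Requirements Traceability Matrix (illustrative only)."""
    requirement_id: str                 # backward link to the tested requirement
    test_procedure: str                 # steps used to exercise the requirement
    result_criteria: str                # expected outcome that defines success
    test_types: list = field(default_factory=list)  # e.g. ["Unit", "System", "Alpha"]
    exceptions: list = field(default_factory=list)  # documented departures from criteria

# Hypothetical entry; the requirement ID and text are invented for illustration.
entry = TraceabilityEntry(
    requirement_id="REQ-042",
    test_procedure="Submit a pay estimate containing a zero-quantity line item.",
    result_criteria="The system rejects the estimate with a validation message.",
    test_types=["System", "Alpha"],
)
```

Each entry carries the backward link (the requirement ID) alongside the procedures and criteria, so results can be reported per test type.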
Page 8 06/10/2009
Testing Standard 3.080.02S
2. Procedure Definitions
2.1 Test 1: Test Planning
2.1.1 Description
The purpose of the Test Plan is to define the schedule of activities for testing the system
being developed. The Test Plan is developed prior to the performance of any testing
activities, though it may be modified, using the procedures of Project Planning,
whenever the need arises.
The Test Plan is a required deliverable which must be consistent with the specification of
the same name, provided below in the Work Product Definitions section.
Test 1: Test Planning is required for all project types (new development, enhancements,
maintenance).
This procedure defines the activities needed to accomplish the following tasks:
○ Develop Test Plan.
○ Accomplish Task Force review and approval.
2.1.2 Participation in Testing Process Flow
This procedure may be started any time after contract approval. This procedure is
complete when the Test Plan is approved by the Task Force and the Supplier
Agreement Management procedures are completed.
2.1.3 Task Force Activities
The contractor will develop the Test Plan and submit it to the Project/Product Task Force
for approval.
2.1.3.1 Review and Approve or Reject with Reasons the Test Plan
The Task Force shall review the plan for consistency with the Test Plan work product
definition. After its review, the Task Force shall notify the contractor of approval or rejection.
If the Test Plan is rejected, reasons for that rejection shall be included in the
notification so that the contractor may make appropriate revisions to the plan and
resubmit it.
If the Test Plan is approved, it is forwarded to the Project Planning procedures for
integration into the appropriate plan. After this is complete, the Test Plan is submitted
as a deliverable for processing by the Supplier Agreement Management procedures.
2.2 Test 2: Preparation of Test Instance
2.2.1 Description
The purpose of this procedure is to prepare everything that is needed to perform a test.
The types of tests that are prepared for are Walkthrough, Unit, Build, System, Alpha, and
Beta.
A test instance may be developed when it is possible to identify or develop requirements
to be tested, when test procedures and result criteria may be developed, and when the
system or system components may be identified or developed. No testing may be
performed until all of the components of a test instance are prepared. The test instances
must be consistent with the section of the Test Criteria work product definition, which
describes test type (i.e. Walkthrough, Unit, Build, System, Alpha, or Beta).
Each new test instance and its test results shall be documented in accordance with the
Test Instance Report work product definition. These documents, when complete, shall
be stored in accordance with the Test Results Repository work product definition.
Test 2: Preparation of Test Instance is mandatory for all project types (new
development, enhancements, maintenance).
This procedure defines the activities needed to accomplish the following tasks:
□ Identify system or component and test type for a new version of a test instance.
□ Establish test requirements.
□ Establish test procedures and result criteria.
□ Establish system or system component to be tested.
2.2.2 Participation in Testing Process Flow
See Figure 1 for a description of this procedure, and Figures 1 through 4 below for
demonstrations of all of the possible interactions with other procedures.
2.3 Test 3: Walkthrough
2.3.1 Description
Walkthroughs, in this specification, are used to examine (test the concepts of) the
components of test instances. A walkthrough may examine the requirements, the test
procedures / result criteria, or the system / component of the test instance. A
Walkthrough should directly precede any planned testing activity (Unit, Build, System,
Alpha, or Beta) that will benefit from presentation to and collaboration with peers or stakeholders.
Stakeholders (AASHTO organization personnel, Task Force members, technical
experts, user or agency representatives, subject matter experts) should be invited
whenever their participation would be useful.
For a more thorough discussion, see the Walkthrough Criteria in the Test Criteria work
product definition.
Test 3: Walkthrough is optional for all project types (new development, enhancements,
maintenance).
This procedure defines the activities needed to accomplish the following tasks:
○ Confirm walkthrough environment and contents.
○ Invite walkthrough participants.
○ Conduct walkthrough and perform resulting actions.
2.3.2 Participation in Testing Process Flow
2.3.3 Stakeholder Activities
2.3.3.1 Stakeholders Participate in Walkthrough
After the contractor has produced the environment and materials for the
Walkthrough, any stakeholders that are needed beyond the contractor staff are
invited. The stakeholders participate in the walkthrough, representing their concern
or area of expertise. Their contributions are documented in the Test Instance Report.
2.4 Test 4: Unit Testing
2.4.1 Description
Unit testing is performed on each class/object or program module to reduce the
complexity of testing the entire system. It is also useful for discovering errors which
might be difficult to detect or isolate at a higher level of testing. For object-oriented
development, the methods of the class/object are the natural unit of testing.
For a more thorough discussion, see the Unit Test Criteria section of the Test Criteria
work product definition.
Test 4: Unit Testing is mandatory for all project types (new development, enhancements,
maintenance).
This procedure defines the activities needed to accomplish the following tasks:
○ Confirm unit test environment.
○ Perform unit test procedures and document results.
○ Store unit test results and complete test or continue iteration.
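As an illustration of unit testing a single method of a class, the following sketch uses Python's unittest framework. The class, its method, and the result criteria are invented for the example; the standard itself is language-neutral.

```python
import unittest

class PayItem:
    """Invented example class: one bid item with a quantity and a unit price."""
    def __init__(self, quantity, unit_price):
        self.quantity = quantity
        self.unit_price = unit_price

    def extended_amount(self):
        """Method under test: quantity times unit price, rounded to cents."""
        return round(self.quantity * self.unit_price, 2)

class PayItemTest(unittest.TestCase):
    def test_extended_amount(self):
        # Result criterion: 12.5 units at 3.20 extends to 40.00.
        self.assertEqual(PayItem(12.5, 3.20).extended_amount(), 40.00)

    def test_zero_quantity(self):
        # Result criterion: zero quantity always extends to 0.0.
        self.assertEqual(PayItem(0, 99.99).extended_amount(), 0.0)

# Run the two unit tests programmatically (equivalent to `python -m unittest`).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(PayItemTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Isolating the method this way keeps the test independent of the rest of the system, which is the point made in the Unit Test Criteria below.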
2.4.2 Participation in Testing Process Flow
The logic of the following diagram is the same for Build Testing and System Testing as it
is for Unit Testing; consequently, it will not be repeated to describe the process
flow of those procedures.
2.5 Test 5: Build Testing
2.5.1 Description
Build Testing is the means used for testing a component that is made up of lesser
components (units or other builds). Build boundaries usually correspond to the
boundaries established by the application architecture employed. The build component
must be isolatable from the rest of the system so that it can be independently tested.
Because of this, its interfaces should be clearly definable. Build testing can reduce the
complexity of testing and simplify system maintenance.
For a more thorough discussion, see the Build Test Criteria section of the Test Criteria
work product definition.
Test 5: Build Testing is mandatory for all project types (new development,
enhancements, maintenance).
This procedure defines the activities needed to accomplish the following tasks:
○ Confirm build test environment.
○ Perform build test procedures and document results.
○ Store the build test results and complete test or continue iteration.
2.5.2 Participation in Testing Process Flow
See Figure 2: Unit, Build, and System Test Iterations above.
2.6 Test 6: System Testing
2.6.1 Description
The purpose of System Testing is to test the system as a whole to ensure that the
integration is completed and that it performs as required. All use cases or user stories
and business rules should also be tested. System Testing leads to Alpha Testing and
may be used to prepare for formal Alpha Test Acceptance.
For a more thorough discussion, see the System Test Criteria section of Test Criteria
work product definition.
Test 6: System Testing is mandatory for all project types (new development,
enhancements, maintenance).
This procedure defines the activities needed to accomplish the following tasks:
○ Confirm system test environment.
○ Perform system test procedures and document results.
○ Store system test results and complete test or continue iteration.
2.6.2 Participation in Testing Process Flow
See Figure 2: Unit, Build, and System Test Iterations above.
2.7 Test 7: Alpha Testing
2.7.1 Description
Alpha Testing covers the same system and system components as does System
Testing. The emphasis is, however, on breaking the system, checking the user
requirements, and reviewing all documentation for completeness by using the
application as if it were in production.
For a more thorough discussion, see the Alpha Test Criteria section of the Test Criteria
work product definition.
Test 7: Alpha Testing is mandatory for all project types (new development,
enhancements, maintenance).
This procedure defines the activities needed to accomplish the following tasks:
○ Confirm alpha test environment.
○ Perform alpha test procedures and document results.
○ Provide for Task Force participation in alpha test.
○ Store alpha test results and continue iteration.
2.7.2 Participation in Testing Process Flow
2.7.3 Task Force Activities
2.7.3.1 Review Test Instance Report and Approve or Reject with Reasons
After the contractor has developed the Test Instance Report for the Alpha Test, it is
delivered to the Task Force for review to determine if the test instance components
are in accordance with the Test Criteria work product definition (are all requirements
developed, do the test procedures and result criteria cover all requirements, and is
the system fully developed and ready for testing). If the alpha test instance is
approved, the contractor is advised that Alpha Testing may begin. If the test instance
is rejected, the contractor is advised of the reasons for rejection and the re-
development of the test instance begins again at Test 2: Preparation of Test
Instance.
2.7.3.2 Review Test Results and Determine which Exceptions Need
Resolution; Report to Contractor
After the Alpha Test has been performed, the contractor documents the test results
in the Test Instance Report and then compares them with the result criteria
documenting all exceptions in the same document. The contractor then delivers the
Test Instance Report to the Task Force for confirmation of the discovered
exceptions. The Task Force reviews the materials and determines which exceptions
require resolution. These determinations are returned to the contractor who
documents them by updating the Test Instance Report. The test continues with Test
9: Peer Review and Exception Correction.
2.8 Test 8: Beta Testing
2.8.1 Description
Beta Testing confirms to the user / tester that all functionality and operability
requirements are satisfied and that the system is ready for delivery and implementation.
Beta Testing also includes the review of all the project deliverables such as
documentation and installation procedures.
Beta testing provides for the development of the Distribution Test Materials defined
below. These materials are provided to agency participants so they may test the
developed system using ‘real’ data, in the ‘real’ work environments, on all intended
platforms. The participants, who represent licensee agencies, install the
system or system components, perform the included test procedures, and report the
testing results, especially exceptions to the result criteria.
For a more thorough discussion, see the Beta Test Criteria section of the Test Criteria
work product definition.
The Distribution Test Materials document is a required deliverable.
Test 8: Beta Testing is mandatory for new development and enhancements and is
optional for maintenance (Determined by Product Task Force).
This procedure defines the activities needed to accomplish the following tasks:
○ Prepare and approve distribution test materials.
○ Select testers and send materials.
○ Confirm beta test environment.
○ Perform beta test procedures and compare to criteria.
○ Store, review, and document exceptions.
2.8.2 Participation in Testing Process Flow
2.8.3 Task Force Activities
2.8.3.1 Review and Approve or Reject with Reasons the Distribution Test
Materials
After the contractor has developed the Distribution Test Materials document, it is
delivered to the Task Force for review and approval or rejection with reasons. The
basis for rejection of the Distribution Test Materials is usually that they do not
conform to the work product definition of the same name, that they do not cover the
areas of system functionality that should be tested, or that the developed test
procedures and result criteria do not adequately conform to the Beta Test Criteria.
If the Distribution Test Materials are rejected, they are returned to the contractor for
resolution of the rejection reasons and resubmission.
If the Distribution Test Materials are approved, they are returned to the contractor
who performs the following activities:
□ Perform Configuration Management procedures to store and identify the
approved version of the Distribution Test Materials document.
□ Perform Supplier Agreement Management procedures to submit and process the
Distribution Test Materials document as a deliverable.
2.8.3.2 Review and Send Invitations for Beta Testing Participation
After the contractor has selected potential agency testers for validating the system in
all of the intended environments, invitations are prepared and given to the Task
Force for review. If the Task Force approves the invitations, they are sent to the
agencies requesting their participation.
2.8.4 Tester Activities
2.8.4.1 Assess Invitation and Determine whether or not to Participate
Each agency tester assesses the invitation and decides whether to commit to beta
test participation. The contractor then determines whether all of the intended
environments are represented. If not, additional potential sites need to be selected
and invitations sent.
When all intended environments are covered, the contractor ships the Distribution
Test Materials to the beta test participants.
2.8.4.2 Install Product and Report any Problems
After receiving the Distribution Test Materials, including the system to be tested (included
in the Installation Materials), the agency beta tester installs the system and reports
any installation problems to the contractor. The contractor will then resolve the
issues and redistribute all materials if necessary.
2.8.4.3 Perform Beta Test Procedures
After the successful completion of the installation, the tester will perform each of the
beta test procedures that were included in the Distribution Test Materials.
2.8.4.4 Compare Results to Test Criteria and Report the Results in the Test
Instance Report Work Product Definition
The beta tester then compares the results of the performed procedures with the test
result criteria and records the results and exceptions. The Test Instance Report,
which shall be used to distribute the test procedures and result criteria, may also be
used by the tester to record test results and exceptions. After all of the results and
exceptions are recorded, the tester will return the report to the contractor. This should
not be interpreted to mean that exceptions cannot be reported as soon as they are
found, to speed up their correction.
When the contractor receives the results and exceptions from an agency beta tester,
the following activities will be performed:
□ Discover any additional exceptions missed by the beta tester and store testing
results and exceptions in the Test Instance Report. If test procedures, result
criteria, test results and discovered exceptions are to be stored in the
Requirements Traceability Matrix, backward links may be maintained there.
Otherwise, all links will be made in the updated Test Instance Report.
□ After all Test Instance Reports describing the test results and exceptions that
have been collected from Beta Testing participants are complete, combine them
into a single Test Instance Report, which contains all results and documented
exceptions.
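The combination step above can be pictured as a merge of per-agency results keyed by test procedure. This is a hypothetical sketch only; the actual Test Instance Report is a document, and the agency names, procedure IDs, and exception text below are invented.

```python
def combine_reports(reports):
    """Merge per-agency beta test results into one combined structure.

    `reports` maps an agency name to {procedure_id: {"result": ...,
    "exceptions": [...]}}; the combined structure keeps every result and
    accumulates all documented exceptions per test procedure.
    """
    combined = {}
    for agency, report in reports.items():
        for proc_id, outcome in report.items():
            slot = combined.setdefault(proc_id, {"results": [], "exceptions": []})
            slot["results"].append((agency, outcome["result"]))
            slot["exceptions"].extend(outcome.get("exceptions", []))
    return combined

# Invented example data from two hypothetical agency testers.
reports = {
    "Agency A": {"BT-01": {"result": "pass", "exceptions": []}},
    "Agency B": {"BT-01": {"result": "fail",
                           "exceptions": ["Report footer truncated on A4 paper"]}},
}
merged = combine_reports(reports)
```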
2.9 Test 9: Peer Review and Exception Correction
2.9.1 Description
Peer Review and Exception Correction is concerned with resolving test exceptions by
correcting one or more of the following test instance components: system or system
component, test procedures / result criteria, and test requirements. The exceptions are
generated by the execution of Unit, Build, System, Alpha, or Beta Tests described
above.
Test 9: Peer Review and Exception Correction is mandatory for all project types (new
development, enhancements, maintenance).
This procedure defines the activities needed to accomplish the following tasks:
○ Peer review of test exceptions.
○ Perform revisions to requirements, system/component, and/or procedures/criteria.
2.9.2 Participation in Testing Process Flow
See Figure 2, Figure 3, and Figure 4, which demonstrate all of the possible interactions
with this procedure.
2.10 Test 10: Alpha Testing Acceptance
2.10.1 Description
In the Alpha Testing Acceptance procedure, the results of the resolution of exceptions
are reported in the “Alpha Test Acceptance Report” which is submitted to the Task Force
for approval. Approval of the results indicates that no further cycles of alpha testing are
needed.
The Alpha Test Acceptance Report is a required deliverable.
Test 10: Alpha Testing Acceptance is mandatory for all project types (new development,
enhancements, maintenance).
This procedure defines the activities needed to accomplish the following task:
○ Approve alpha test results or re-issue test.
2.10.2 Participation in Testing Process Flow
For a description of the inputs and outputs of this procedure and how it fits in the flow of
testing procedures, see Figure 3 above.
2.10.3 Task Force Activities
2.10.3.1 Review and Approve or Reject with Reasons the Alpha Test
Acceptance Report
After all of the alpha test results have been reviewed and all mitigating revisions to
the tested requirements, the developed system, and/or the test procedures / result
criteria have been made, the contractor produces the Alpha Test Acceptance Report,
which is a required deliverable that is defined by the work product definition of the
same name. The report contains links to the tested requirements, the testing
procedures / result criteria, and the documented testing exceptions / resolutions. The
report is presented by the contractor to the Task Force for review and approval, or
rejection.
If the report is rejected, the Task Force shall notify the contractor of the reasons for
the rejection. After review of the rejection reasons the contractor shall make any
necessary corrections and restart the alpha testing process with Test 7: Alpha
Testing, described above.
If the report is approved the contractor will be notified. The contractor shall then
submit the Alpha Test Acceptance Report for Configuration Management and submit
it as a deliverable for Supplier Agreement Management processing.
2.11 Test 11: Beta Testing Acceptance
2.11.1 Description
In the Beta Testing Acceptance procedure, the results of the resolution of exceptions are
reported in the Beta Test Acceptance Report which is submitted to the Task Force for
acceptance. Acceptance of the results indicates that no further cycles of beta testing are
needed and that distribution to agency licensees may begin.
The Beta Test Acceptance Report is a required deliverable.
Test 11: Beta Testing Acceptance is mandatory for new development and
enhancements. It is optional for maintenance (Determined by the Task Force).
This procedure defines the activities needed to accomplish the following task:
□ Accept Beta Test Results or Re-issue Test
2.11.2 Participation in Testing Process Flow
For a description of the inputs and outputs of this procedure and how it fits in the flow of
testing procedures, see Figure 4 above.
2.11.3 Task Force Activities
2.11.3.1 Review and Approve or Reject with Reasons the Beta Test
Acceptance Report
After corrections of documented exceptions have been made, the contractor
produces the Beta Test Acceptance Report. This report is a required deliverable that
is defined by the Beta Test Acceptance Report work product definition. The report
contains links to the tested requirements, the test procedures / result criteria, and the
documented testing exceptions / resolutions. The report is presented by the
contractor to the Task Force for review and approval, or rejection.
If the report is rejected, the Task Force shall notify the contractor of the reasons for
the rejection. These reasons will usually consist of departures from the Beta Test
Acceptance Report work product definition or inadequate resolutions of exceptions
discovered during testing. After review of the rejection reasons the contractor shall
make any necessary corrections and begin another iteration of the beta testing
process with Test 8: Beta Testing described above.
If the report is approved the contractor will be notified. The contractor shall then
submit the Beta Test Acceptance Report for Configuration Management and submit
it as a deliverable for Supplier Agreement Management processing.
2.12 Test 12: Installation
2.12.1 Description
The purpose of this procedure is to validate the installation of the system at
user/licensee sites. The contractor assembles the Installation Materials, defined in the
work product definition of the same name, and distributes them. The first task, installing
on the intended platform, may include installing hardware, installing the program(s) on
the computer, reformatting/creating the database(s), and verifying that all components
have been included. The system is then placed into operation either by phasing it in or
by operating it in parallel with the pre-existing version of the system. The delivered system must
successfully install and operate in the intended environment.
The Installation Materials are required deliverables of this procedure.
The Installation Status Report is a required deliverable of this procedure.
Test 12: Installation is mandatory for all project types (new development, enhancements,
maintenance).
This procedure defines the activities needed to accomplish the following tasks:
○ Assemble and Distribute Project/Product Materials
○ Install Product
○ Complete Testing
2.12.2 Participation in Testing Process Flow
This procedure may be started after the Beta Test Acceptance Report is accepted. In
this procedure, the installation materials are produced, these materials are approved by
the Task Force, they are delivered to the user agencies where installations occur, and
the contractor records installation progress in the Installation Status Report. When this
report is approved by the Task Force, all development testing is complete.
2.12.3 Task Force Activities
2.12.3.1 Review, Approve or Reject with Reasons the Installation Materials
After the contractor has produced and delivered to the Task Force the Installation
Materials for approval, the Task Force shall review and approve or reject the
materials. If the materials are rejected, the contractor will be asked to address the
supplied reasons for the rejection and resubmit the corrected Installation Materials. If
the materials are approved, the contractor will be notified to begin distribution of the
system. The approved Installation Materials will be processed by Configuration
Management and Supplier Agreement Management procedures.
2.12.3.2 Review Installation Problems and Resolution Efforts to Determine
that Problems are Resolved
The contractor will provide to the Task Force any installation problems reported by
the tester/user of the product and resolution efforts taken by the contractor to resolve
the problems. The Task Force will review the installation problems and resolution
efforts to ensure that problems are being resolved by the contractor. If the Task Force
decides that a reported problem is not satisfactorily resolved they shall notify the
contractor to continue efforts to resolve the issue.
2.12.3.3 Review, Approve or Reject with Reasons the Installation Status
Report
After the contractor has produced and delivered to the Task Force the combined
“Installation Status Report” for approval, the Task Force shall review and approve or
reject the report. If the report is approved, it will be sent to the Standing Committee
on Joint Development for their review. The approved Installation Status Report will
be processed by Supplier Agreement Management procedures. All testing
procedures are considered complete at this point. If the report is rejected, the
contractor will be asked to begin requirements analysis (described in the section
titled “Test 2: Preparation of Test Instance”) of a new beta testing cycle.
2.12.4 Tester Activities
2.12.4.1 Install Product and Report Any Installation Problems Encountered
After the Task Force-approved Installation Materials have been delivered to a
Tester/User by the contractor, the Tester/User shall install the system and report to
the contractor any problems encountered. The Tester/User and the contractor shall
work to resolve these problems.
2.12.5 Standing Committee on Joint Development Activities
2.12.5.1 Review the Combined Installation Status Report
After Task Force approval of the combined Installation Status Report (which marks
testing completion), the report is provided to the Standing Committee on Joint
Development (SCOJD) for its review.
3. Work Product Definitions
The Work Product Definitions used for testing are defined in the following sections.
3.1 Test Plan
3.1.1 Description
The Test Plan specifies the schedule and duration of activities required to perform the
testing of a system, system components, documentation, and procedures. It also
includes a schedule of deliverables. It is a component of the contractor’s project plan.
The elements required to populate the plan are described below.
3.1.2 Plan Elements
○ System or system components to be tested and the testing methods to be used.
○ Schedule for submission of each testing deliverable.
○ Schedules for the following activities:
□ Schedule (ending date and duration) of Beta Test activities (inclusive of all
walkthroughs, unit, and build testing and regression testing to correct problems)
□ Schedule (ending date and duration) of Alpha Test activities (inclusive of all
walkthroughs, unit, and build testing and regression testing to correct problems)
□ Schedule (ending date and duration) of System Test activities (inclusive of all
walkthroughs, unit, and build testing and regression testing to correct problems)
□ Schedule (ending date and duration) of the high level Build Test activities
(inclusive of all walkthroughs, unit testing, subordinate build testing and
regression testing to correct problems)
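One possible way to capture these plan elements is a small schedule structure, shown below as an illustration only. The dates and durations are invented placeholders; the standard does not require a machine-readable plan.

```python
from datetime import date

# Hypothetical schedule: one entry per test activity, each with an ending
# date and a duration, mirroring the plan elements listed above.
test_schedule = {
    "Build Test":  {"ends": date(2009, 8, 14), "duration_days": 20},
    "System Test": {"ends": date(2009, 9, 11), "duration_days": 15},
    "Alpha Test":  {"ends": date(2009, 10, 9), "duration_days": 15},
    "Beta Test":   {"ends": date(2009, 11, 20), "duration_days": 25},
}

def latest_end(schedule):
    """Ending date of the last scheduled test activity."""
    return max(entry["ends"] for entry in schedule.values())
```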
3.1.3 Payment and Deliverable Considerations
The document(s) described by this specification is required.
It is eligible for payment by deliverable when approved. Cost and schedule may be
provided by procedures of Project Planning. The document(s) is processed as a
deliverable by Supplier Agreement Management procedures.
The document(s) shall be delivered, complete with contractor configuration management
versioning information, in production tool format and in PDF format on a medium which
is acceptable to both parties.
3.2 Test Criteria
3.2.1 Walkthrough Criteria
3.2.1.1 Description
Walkthroughs, in this specification, are used to examine (test the concepts of) the
components of test instances. They may examine the requirements, the test
procedures / result criteria, or the system / component of the test instance. A
Walkthrough may precede any type of test: Unit, Build, System, Alpha, and Beta. If
errors or exceptions are found during a walkthrough, action items are developed that
when performed will correct them.
Stakeholders (AASHTO organization personnel, Task Force members, technical
experts, user or agency representatives, subject matter experts) should be invited to
walkthroughs whenever their participation would be useful.
3.2.1.2 Areas to Test
The following Test Instance Types may be tested with a Walkthrough:
□ Unit Test
□ Build Test
□ System Test
□ Alpha Test
□ Beta Test
For each of the above types, the following Test Instance Components may be tested:
□ System or System Component being tested
□ Test Procedures and Result Criteria
□ Requirements being tested
3.2.2 Unit Test Criteria
3.2.2.1 Description
Unit testing consists of testing the smallest isolatable development product. This can
be the method of a class/object or, where object-oriented development is not being
used, a function of a module. Unit testing reduces the complexity of testing the
entire system. It is also useful for discovering errors which might be difficult to detect
or isolate at a higher level of testing.
3.2.2.2 Areas to Test
The ease of unit testing is contingent on how many units are independent of other
portions of the system and can thus be independently tested. A system architecture
which isolates functionality and reduces dependencies is the best method for easing
the unit test burden as well as reducing maintenance effort and facilitating system
portability. For more information on system architecture, see Build Test Criteria
below.
3.2.3 Build Test Criteria
3.2.3.1 Description
Build Testing is the means used for testing a component that is made up of lesser
components (units or other builds). A Build is expected to operate as a functional
whole. All of its parts are expected to be well integrated and to contribute to its
functional mission. The build component must be isolatable from the rest of the
system so that it can be independently tested. Because of this, its interfaces should
be clearly definable. Build testing can reduce the complexity of testing and simplify
system maintenance.
3.2.3.2 Areas to Test
Build boundaries should correspond to the abstract boundaries established by the
application architecture (for an example of application architecture boundary
definitions, see the Examples of Build Criteria section of Appendix B). These
boundaries must be capable of being mapped to the physical boundaries of the
deployed system. These boundaries are established through refactoring the design
to achieve the layer (boundary) characteristics described below:
Functionality layers are used to segregate functionality within an application into
independently testable segments. These layers will have the following
characteristics:
□ Layers should have minimal interdependence with other layers.
□ Layers should have clearly defined interfaces. Interfaces should not be insisted
upon for layers with no dependencies, such as a domain layer.
□ Layers should have a minimum of duplicated code or definition.
□ Layers must map to the tiers by which the system is to be physically deployed.
Since layer boundaries are dependent on application architecture, they will be as
varied as the architectures needed to solve the entire range of automation problems.
The layers represent the highest level of build testing. Each layer should be tested in
accordance with the capabilities provided by its interface(s). Any dependencies to
other layers should be mocked. Sub builds that are within the boundaries of a layer
may be performed where useful.
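As an illustration (not required by this standard), a layer can be build-tested through its interface while the layer beneath it is mocked. The names here (ReportService, fetch_records) are invented for the sketch, using Python's standard unittest.mock library:

```python
from unittest.mock import Mock

class ReportService:
    """Hypothetical service-layer component; the data-access layer below
    it is reached only through an injected interface (the layer boundary)."""
    def __init__(self, dao):
        self.dao = dao

    def summarize(self, project_id):
        records = self.dao.fetch_records(project_id)
        return {"project": project_id, "count": len(records)}

def test_summarize_with_mocked_lower_layer():
    # Dependencies on other layers are mocked, so only this layer's
    # interface behavior is under test.
    dao = Mock()
    dao.fetch_records.return_value = ["r1", "r2", "r3"]
    result = ReportService(dao).summarize(42)
    assert result == {"project": 42, "count": 3}
    dao.fetch_records.assert_called_once_with(42)

test_summarize_with_mocked_lower_layer()
```

Because the mock honors only the declared interface, the test also verifies that the layer boundary is clearly defined.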
3.2.4 System Test Criteria
3.2.4.1 Description
The purpose of System Testing is to test the system as a whole to ensure that the
integration is completed and that it performs as required. System Testing leads to
Alpha Testing and may be used to prepare for formal Alpha Test Acceptance.
3.2.4.2 Areas to Test
The emphasis of system testing is to ensure that the system functionality is properly
integrated. Secondarily and in anticipation of Alpha testing, it is important to show
that the system meets user requirements.
Integration
□ Test the build functionality to ensure that nothing has been lost as a result of
integration.
□ Test that activities which cross build (layer) boundaries perform correctly, thus
checking the integration of the system. This means that wherever dependencies
were mocked, tests must be devised that exercise the real interface.
Meet User Requirements
□ Test all functional requirements that are defined by use cases or user stories.
□ Test all requirements that are defined by business rules.
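As an illustration (not required by this standard), a system-level test replaces the mocks used during build testing with a real implementation, exercising the actual interface across the layer boundary. The names (ReportService, InMemoryDao) are invented for the sketch:

```python
class InMemoryDao:
    """Stands in for the production persistence implementation; unlike a
    mock, it honors the real fetch_records interface contract."""
    def __init__(self, data):
        self._data = data

    def fetch_records(self, project_id):
        return self._data.get(project_id, [])

class ReportService:
    def __init__(self, dao):
        self.dao = dao

    def summarize(self, project_id):
        records = self.dao.fetch_records(project_id)
        return {"project": project_id, "count": len(records)}

def system_test_summarize():
    # The boundary that was mocked during build testing is now crossed
    # for real, checking the integration of the two layers.
    service = ReportService(InMemoryDao({7: ["a", "b"]}))
    assert service.summarize(7) == {"project": 7, "count": 2}
    assert service.summarize(99) == {"project": 99, "count": 0}

system_test_summarize()
```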

3.2.5 Alpha Test Criteria


3.2.5.1 Description
Alpha Testing should demonstrate that the system conforms to its requirements;
however, the emphasis is on breaking the system. For that reason, only discovery of
system faults is discussed here. For conformity to requirements see the discussion of
System Testing above. To clarify the differences, consider the following:
□ Intended Functionality: That which is represented in the requirements.
□ Actual Functionality: That which is known to be represented by the developed
system.
□ Ordinary Faults: Those which represent known system behavior that is contrary
to intended functionality.
□ Unintended, Undocumented, Unknown Functionality: That which lies unknown in
the developed system.
Because unintended and unknown functionality may contain faults which cause the
system to fail or security to be compromised, it is important to test for these
possibilities.
3.2.5.2 Areas to Test
The emphasis of alpha testing is to break the system or its security. Secondarily, it is
important to show that the system meets user requirements.
Break the System and System Security
□ Attacking Software Dependencies – Applications rely heavily on their
environment in order to work properly. They depend on the OS (operating
system) to provide resources like memory and disk space; they rely on the file
system to read and write data; they use structures such as the Windows Registry
to store and retrieve information; the list goes on and on. These resources all
provide input to the software – not as overtly as the human user does – but input
nevertheless. Like any input, if the software receives a value outside of its
expected range, it can fail.
□ Breaking Security Through User Interface – Most security faults result from
additional, unintended, and undocumented user behavior from the UI (user
interface). This amounts to handling unexpected input from the user in a way that
compromises the application, the system it runs on, or its data. The result could
be privilege escalation or secret information being viewed by unauthorized
users.
□ Attacking Design – Design documentation does not often reveal potential design
faults. The problem is that subtle design decisions can lead to component
interaction and inherent flaws that lead to exploitable vulnerabilities.
□ Attacking Implementation – A perfect design can still be made vulnerable by
imperfect implementation. For example, the Kerberos authentication scheme is
renowned as a well thought out and secure authentication scheme, yet the MIT
implementation has had many serious security vulnerabilities in its
implementation, most notably buffer overruns.
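The attack styles above can be sketched (not required by this standard) as two small probes: one withholds an environment resource the software depends on, and one feeds hostile input through the user-facing interface. The functions and file names are invented for the sketch:

```python
import os
import tempfile

def load_settings(path):
    """Hypothetical loader: the application depends on its environment
    (the file system) for this input, so it must tolerate the resource
    being withheld."""
    try:
        with open(path) as f:
            return f.read()
    except OSError:
        return None  # degrade gracefully instead of crashing

def fetch_document(name, authorized):
    """Hypothetical UI-facing handler: unexpected user input such as a
    path-traversal string must not escape the authorized set."""
    if name not in authorized:
        raise PermissionError("access denied")
    return f"contents of {name}"

# Attack the software dependency: point it at a resource the
# environment does not provide.
missing = os.path.join(tempfile.gettempdir(), "no_such_settings_file.cfg")
assert load_settings(missing) is None

# Attack through the user interface: hostile input must be rejected,
# not silently honored.
try:
    fetch_document("../../etc/passwd", {"report.txt"})
    raise AssertionError("traversal input was accepted")
except PermissionError:
    pass
```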
Test User Requirements
These tests should be brought forward from system testing described above.
□ Test all functional requirements that are defined by use cases or user stories.

□ Test all requirements that are defined by business rules.


3.2.6 Beta Test Criteria
3.2.6.1 Description
The purpose of Beta Testing is to demonstrate that a system or system component
fulfills its intended use when placed in its intended environment. The Beta Test
Criteria defined below provide indications as to what should be tested to establish
the validity of a system, system components, or documentation.
3.2.6.2 Areas to Test
Technical Environment
□ Design Operating Environments (what operating environments must the system
support)
□ Design Hardware Capacities (how must the product perform on its minimum and
recommended hardware configurations)
□ Design Network Environments (what network types, topologies, and capacities
are supported)
□ Design Database Environments (what releases of which DBM systems are
supported)
□ Design Interfaces (what releases of which other software products are
supported)
Organizational Environment
□ Supports the organization's workflow (what information is passed from one user to
another)
□ Supports workplace organizational structure and job functions (i.e., subject
matter professionals, supervisors, managers, application administrators, technical
support)
□ Intuitively understandable by users in each functional area (screen interfaces and
product outputs are readily understandable within the context of the technical
environment and professional discipline)
Method or Philosophy of Solution
□ Method and Philosophy of solution that satisfies customer subject matter and
information system experts
□ Repeatability of results from release to release or from product to product where
method and philosophy are the same.
Implementation and Operation Characteristics
□ Accuracy of implementation and operation documentation within target
environments
□ Procedures or methods for backward and forward preservation of data in target
environments
□ Methods for incorporating maintenance in target environments
Additional Customer Testing
□ During Beta Testing at the customer site, the users may test the product in any
way they see fit as long as the testing is completed by a date that fits the project
plan. The contractor would not provide procedures for these tests.

□ The contractor shall provide the user requirements tests used in System Testing
and Alpha Testing which may be performed at the customer site as long as they
do not interfere with the timely completion of beta testing.

3.3 Installation Materials


3.3.1 Description
The Installation Materials contain all procedures, executables, and documentation
needed to implement and operate the System at the user agency site. These materials
are a required deliverable.
3.3.2 Content
○ System components
○ Installation software
○ Installation procedures and documentation
○ All system documentation
○ All other materials needed to install, operate, and maintain the delivered system
3.3.3 Payment and Deliverable Considerations
The document(s) described by this specification is required.
It is eligible for payment by deliverable when approved. Cost and schedule may be
provided by procedures of Project Planning. The document(s) is processed as a
deliverable by Supplier Agreement Management procedures.
The document(s) shall be delivered, complete with contractor configuration management
versioning information, in production tool format and in PDF format on a medium which
is acceptable to both parties.

3.4 Distribution Test Materials


3.4.1 Description
The Distribution Test Materials document contains all of the materials pertaining to a
release of beta testing which are needed by the Beta Testers to perform their testing
activities. The Distribution Test Materials document is part of a beta testing distribution,
which also contains the system or system components and the system
documentation.
3.4.2 Content
○ Procedures for the implementation of the beta test environment. Since there may be
multiple different environments for which the system is to be tested, there may need
to be multiple different versions of the Distribution Test Materials work product.
○ Test Instructions:
□ Test Identification (Test Phase, Test Version, Product/Component name,
Product/Component version)
□ Test Purpose (in the case of beta testing the purpose is to achieve acceptance of
the Product/Component).
□ Description and use of test materials.
□ Method for reporting problems and getting help with the test.
□ Test schedule.

□ Identification of optional tests (included verification tests or user-composed tests,
for example).
□ Method for reporting results of test.
□ Method for reporting product acceptance.
○ Beta test procedures and result criteria (see Test Instance Report).
○ Map of relationships of selected requirements, test procedures / result criteria (see
REQM Requirements Traceability Matrix (RTM)).
3.4.3 Payment and Deliverable Considerations
The document(s) described by this specification is required.
It is eligible for payment by deliverable when approved. Cost and schedule may be
provided by procedures of Project Planning. The document(s) is processed as a
deliverable by Supplier Agreement Management procedures.
The document(s) shall be delivered, complete with contractor configuration management
versioning information, in production tool format and in PDF format on a medium which
is acceptable to both parties.

3.5 Test Results Repository


3.5.1 Description
The Test Results Repository contains all materials relating to testing and the results of
that testing. The repository may be organized in any manner that is convenient to the
contractor.
3.5.2 Content
The following definition of the contents of the Test Result Repository is organized by
testing types for convenience of description only. The contents may be organized in
whatever scheme is convenient:
3.5.2.1 Walkthroughs
The Walkthrough information is reported in the Test Instance Report and therefore is
contained in the documentation of the testing instance it preceded.
3.5.2.2 Unit, Build, and System Testing
The Unit, Build, and System Testing information has the same structure and content
but is segregated by type:
□ Name of Unit, Build, or System component / system and version tested (as many
instances as are needed).
• Name of Test and Version (as many instances as are needed to complete
testing).
♦ “Test Instance Report” includes Results and documented Exceptions.

3.5.2.3 Alpha Testing and Alpha Testing Acceptance


The Alpha Testing information has the following structure and content:
□ Name of System and Version tested (as many instances as are needed).
• Name of Test and Version (as many instances as are needed to complete
alpha testing).
♦ “Test Instance Report” document with Results and documented
Exceptions.

♦ “Alpha Testing Acceptance Report” (included in those Alpha Testing
iterations in which acceptance is sought). This entry should also include
reasons for rejection where they apply.
3.5.2.4 Beta Testing and Beta Testing Acceptance
The Beta Testing information has the following structure and content:
□ Name of System and Version tested (as many instances as are needed).
• “Distribution Test Materials”, with Name of Test and Version (as many
instances as are needed to complete beta testing).
♦ Agency (as many instances as are needed).

♠ “Test Instance Report” document with Results and documented
Exceptions (an instance for each participating agency).
♠ Combined “Test Instance Report” with all Results and documented
Exceptions.
♠ “Beta Testing Acceptance Report” (included in those Beta Testing
iterations in which acceptance is sought). This entry should also
include reasons for rejection where they apply.
3.5.3 Payment and Deliverable Considerations
The document(s) described by this specification are required as supporting documents
during operation of the testing procedures. They are not deliverables.
They are not eligible for payment by deliverable.

3.6 Alpha Test Acceptance Report


3.6.1 Description
The Alpha Test Acceptance Report (ATAR) documents the last instance of Alpha
Testing.
3.6.2 Content
The contained Test Instance Reports may be reorganized, links should be expanded,
and explanatory information inserted, where necessary, to make the report
understandable and readable.
○ System Name and Version being submitted.
○ Date submitted for Task Force Approval.
○ Person and Organization submitting the report.
○ Justification for Alpha Test Acceptance Report approval.
○ The complete and final Test Instance Report (see the work product definition of the
same name) for Alpha Testing.
○ The complete and final Test Instance Report (see the work product definition of the
same name) for System Testing.
○ All of the complete and final Test Instance Reports (see the work product definition of
the same name) for Build components that contributed to the final System.
○ All of the complete and final Test Instance Reports (see the work product definition of
the same name) for Unit components that contributed to the Builds that make up the
final System.

3.6.3 Payment and Deliverable Considerations


The document(s) described by this specification is required.
It is eligible for payment by deliverable when approved. Cost and schedule may be
provided by procedures of Project Planning. The document(s) is processed as a
deliverable by Supplier Agreement Management procedures.
The document(s) shall be delivered, complete with contractor configuration management
versioning information, in production tool format and in PDF format on a medium which
is acceptable to both parties.

3.7 Beta Test Acceptance Report


3.7.1 Description
The Beta Test Acceptance Report (BTAR) documents the Acceptance and Approval of
the last instance of Beta Testing.
3.7.2 Content
The contained Test Instance Report may be reorganized, links should be expanded, and
explanatory information inserted, where necessary, to make the report understandable
and readable.
○ Name and Version of system tested.
○ Date submitted for Task Force Approval.
○ Person and Organization submitting the report.
○ Justification for Beta Test Acceptance Report approval.
○ The complete and final Test Instance Report (see the work product definition of the
same name) for Beta Testing. This report combines all results collected from the
beta testing agencies.
3.7.3 Payment and Deliverable Considerations
The document(s) described by this specification is required.
It is eligible for payment by deliverable when approved. Cost and schedule may be
provided by procedures of Project Planning. The document(s) is processed as a
deliverable by Supplier Agreement Management procedures.
The document(s) shall be delivered, complete with contractor configuration management
versioning information, in production tool format and in PDF format on a medium which
is acceptable to both parties.

3.8 Installation Status Report


3.8.1 Description
The Installation Status Report is a request for approval of installation progress. It
includes a summary of installation progress. When the Task Force approves the
Installation Status Report, testing procedures are complete. This report is a required
deliverable.
3.8.2 Content
The Installation Status Report contains the following information:
○ Date and Title of report
○ Recommendation for approval with reasons.

○ Total number of licensees for the product
○ Total number of licenses covered by installs performed.
○ Date/agency of each successful installation
○ Date/agency/description of each problem encountered
○ Date/agency/description of each problem resolution.
3.8.3 Payment and Deliverable Considerations
The report described by this specification is required to acquire approval for system
installation, but it is not eligible for payment by deliverable.

3.9 Test Instance Report


3.9.1 Description
A Test Instance Report combines all of the components needed to perform a test and to
document its results. It identifies the requirements that are being tested, the system or
system component that is being tested, the test procedures / result criteria used to
perform and measure the test, the results of the test, discovered exceptions, and
proposed resolutions.
A Test Procedure, also called a test script or test scenario, is a sequence of
activities/events that will test the system or system component for compliance with
functional and performance requirements. The procedure represents a "transaction"
with a desired result (a result could include a desired response time or measurement of
resource consumption, etc.). The specification of the boundaries or equalities of the
result are the Result Criteria. All major functional and performance requirements are
translated into Test Procedures and Result Criteria. Once developed, the procedures will
become part of the product documentation to be used repeatedly during product
development, enhancement, and maintenance.
Successful testing of all procedures against the product is a minimum requirement
before a product can be released commercially.
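As an illustration (not required by this standard), a test procedure can be encoded as a scripted sequence of activities/events executed against the system, with the result criteria expressed as assertions on the outcome. The toy "account" system and function names are invented for the sketch:

```python
def run_procedure(state, steps):
    """A test procedure: a sequence of activities/events (one
    "transaction") executed against the system under test."""
    for activity, payload in steps:
        state = activity(state, payload)
    return state

# A toy system: an account represented as a dict.
def deposit(account, amount):
    account["balance"] += amount
    return account

# Procedure derived from a hypothetical functional requirement.
procedure = [(deposit, 50), (deposit, 25)]
result = run_procedure({"balance": 0}, procedure)

# Result criteria: the boundaries or equalities the result must satisfy.
assert result["balance"] == 75
```

Capturing procedures in executable form lets them be rerun repeatedly during development, enhancement, and maintenance, as the standard requires.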
3.9.2 Format
A form may be used for documenting testing procedures, result criteria, and the results
of testing. It could be useful during Beta Testing as a vehicle for capturing testing results
from the user. Any method for capturing the results, which captures all of the needed
information, as described below, is satisfactory.
3.9.2.1 Test Instance Identification
□ Test Instance Number: Unique identification of the test instance.
□ Test Instance Name: Name which is meaningful to users of this documentation
and its derivatives.
□ Test Instance Description.
□ Test Type: Designates whether the test is Unit, Build, System, Alpha, or Beta.
□ Test Objective: Describes what the test is to achieve (Test Functionality, Test
Performance, etc.).
3.9.2.2 System or System Component Identification
□ System or System Component Identification: Identification of system or system
component tested.

□ System or System Component Version: Unique version identification of system
or system component tested.
□ Current Author Name.
□ Development or Revision Date.
3.9.2.3 Requirements Definition
□ Requirement ID: Unique ID of the requirement in the Requirements Traceability
Matrix. There should be as many instances of this as there are requirements to
be tested by this test instance.
• Requirement Descriptive Name: This name also comes from the
Requirements Traceability Matrix.
3.9.2.4 Test Procedure Definition
□ Procedure Unique Name: There may be as many instances of this as are needed
to perform the test instance.
□ Activity / Event / Script: The actual activities, events or scripts may be placed
here or links to them.
□ Link to Requirement(s) Tested: There may be as many instances of this as are
needed for this procedure.
• Data Used Name: There may be as many instances of this as are needed to
perform the test instance.
♦ Use of Data
♦ Source or Destination of Data
□ Miscellaneous Notes: Notes describing test considerations, special arrangements
or platforms, etc.
3.9.2.5 Test Criteria Definition
□ Result Criteria Unique Name: There may be as many instances of this as are
needed to perform the test instance.
• Link to Procedure Unique Name.
• Result Criteria Description: This shall contain the criteria or a link to it.
3.9.2.6 Actual Results
□ Result Instance Name: There can be as many instances of this as are needed.
• Link to Criteria Unique Name.
• Tester Agency or Name
• Result of Test
• Exception Description: This item and its subordinates will appear only if an
exception is discovered.
♦ Proposed Resolution.

♦ Action Performed.
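As an illustration (not required by this standard), the fields defined in 3.9.2 can be captured in a simple record structure; the class and field names below are our own, not mandated by the standard:

```python
from dataclasses import dataclass, field

@dataclass
class TestInstanceReport:
    """Illustrative structure mirroring the fields of section 3.9.2."""
    instance_number: str          # unique identification of the test instance
    instance_name: str
    test_type: str                # Unit, Build, System, Alpha, or Beta
    component: str                # system or system component tested
    component_version: str
    requirement_ids: list = field(default_factory=list)  # links into the RTM
    procedures: list = field(default_factory=list)
    result_criteria: list = field(default_factory=list)
    results: list = field(default_factory=list)          # incl. exceptions

report = TestInstanceReport(
    instance_number="TI-001",
    instance_name="Login build test",
    test_type="Build",
    component="AuthLayer",
    component_version="1.2.0",
    requirement_ids=["REQ-17"],
)
assert report.test_type == "Build"
```

Any equivalent form or record layout that captures the same information satisfies section 3.9.2.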
3.9.3 Payment and Deliverable Considerations
The document(s) described by this specification contains information which is used in
the Distribution Test Materials, Alpha Test Acceptance Report and the Beta Test
Acceptance Report work products, which are required deliverables. The Test Results
Repository work product provides for the retention of these documents.
The document(s) is not a required deliverable and is not eligible for payment by
deliverable.

4. Appendices
4.1 Appendix A: Procedure Activity Diagrams
The procedure activity diagrams provided in this appendix are not required and are for
reference only. They are intended to help in understanding the context in which the Task
Force, Testers, and Stakeholders activities are performed and to help the contractor prepare
the materials needed to complete those activities.
The activity diagrams (flow charts) are color coded. See the legend provided with each
procedure for coding. The following list describes the graphic figures used in
the procedure diagrams:
■ Rectangles represent activities to be performed.
■ Diamonds represent decisions to be made.
■ Rectangular areas with round ends represent entrance to, exit from, or use of another
procedure. These procedures may be within the Testing Process Area or they may be in
some other process area.
■ Circles represent a jump to some other part of the same procedure.
■ Arrow points on connecting lines represent the direction of flow.
When process areas that are not yet developed are referenced, the currently accepted methods of
AASHTOWare development will be used. The following list describes the external process
areas referenced by the procedures:
■ Requirements Development (RD): The purpose of RD is to produce and analyze
customer, project/product, and project/product component requirements.
■ Requirements Management (REQM): The purpose of REQM is to manage and maintain
customer, project/product, and project/product component requirements.
■ Supplier Agreement Management (SAM): The Purpose of SAM is to manage the
acquisition of products from suppliers for which there exists a formal agreement. This
process area is used to specify and manage AASHTOWare Product Contracts.
■ Configuration Management (CM): The purpose of CM is to establish and maintain the
integrity of work products using configuration identification, configuration control,
configuration status accounting, and configuration audits.
■ Project Planning (PP): The purpose of PP is to establish and maintain plans that define
project activities.
■ Technical Solution (TS): The purpose of TS is to design, develop, and implement
solutions to requirements.
■ Product Integration (PI): The purpose of PI is to integrate the components of the system.

4.1.1 Test 1: Test Planning

4.1.2 Test 2: Preparation of Test Instance

4.1.3 Test 3: Walkthrough

4.1.4 Test 4: Unit Testing

4.1.5 Test 5: Build Testing

4.1.6 Test 6: System Testing

4.1.7 Test 7: Alpha Testing

4.1.8 Test 8: Beta Testing

4.1.9 Test 9: Peer Review and Exception Correction

4.1.10 Test 10: Alpha Testing Acceptance

4.1.11 Test 11: Beta Testing Acceptance

4.1.12 Test 12: Installation

4.2 Appendix B: Test Criteria Examples


The examples of test criteria provided here are not required. They are provided to illustrate
the kinds of tests which might be employed for each of the testing types (Walkthrough, Unit,
Build, System, Alpha, and Beta).
4.2.1 Examples of Types of Walkthrough
One example would be a Logic Walkthrough, which is a verbal description of the logic in
a class method or program module. This description is provided by the component’s
developer to members of the development team, who follow the presentation of the logic
structure of the component. The objective of the logic walkthrough is to detect logic
errors.
In a larger sense, walkthroughs may be used in any instance where the presentation of
intended actions to peers or stakeholders might uncover errors or difficulties. This
principle is used in pair programming, and in Joint Application Development sessions.
Other uses of walkthroughs would be the development or confirmation of User Stories or
Use Cases, confirmation of Test Procedures or Result Criteria, re-factoring system
design, and confirmation of system or system component functionality / logic.
4.2.2 Examples of Unit Testing
Classes may be tested by instantiating all variations of objects of which the class is
capable and testing each method and associated attributes of these objects. This can be
done by writing a ‘main’ program that performs the instantiation and testing. Tools are
available which automate this process.
Modules may be tested in a similar manner by writing an executive program which calls
each module function with all of its possible parameters / arguments, and checks its
returns. For this process to be useful, it is assumed that good modular programming
techniques have been employed.
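The hand-written 'main' driver described above is what xUnit-style tools automate. As an illustration (not required by this standard), the sketch below uses Python's standard unittest framework against an invented Counter class:

```python
import unittest

class Counter:
    """Hypothetical class under unit test."""
    def __init__(self, start=0):
        self.value = start

    def increment(self, step=1):
        self.value += step
        return self.value

class CounterUnitTest(unittest.TestCase):
    # Each method is exercised with its possible argument variations,
    # replacing the hand-written driver program.
    def test_default_instantiation(self):
        self.assertEqual(Counter().value, 0)

    def test_custom_start(self):
        self.assertEqual(Counter(10).value, 10)

    def test_increment_variants(self):
        c = Counter()
        self.assertEqual(c.increment(), 1)
        self.assertEqual(c.increment(5), 6)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(CounterUnitTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
assert result.wasSuccessful()
```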
4.2.3 Examples of Build Testing
Build boundaries are dependent upon the application architecture used to structure the
application. Functionality layers are used to segregate functionality within an application
into independently testable segments. The following examples of application layers are
taken from “Expert Spring MVC and Web Flow” by Seth Ladd.
○ User Interface (View) – The user interface layer is responsible for presenting the
application to the end user. This layer renders the response generated by the web
layer into the form requested by the client. For cell phones this may be WML or at
least specializations of XHTML. Other clients may want PDFs for their user interface.
And of course, browsers want the response rendered as XHTML. Reasons for
isolating the user interface layer are the following:
□ De-linking the processing of the rest of the application from the processing that is
dependent on the unreliable network.
□ Isolation of the user interface layer allows the changing of rendering technologies
(examples of rendering tools are Velocity, FreeMarker, XSLT) without affecting
the other layers.
□ This isolation permits UI specialists to be shielded from the intricacies of the
application. These specialists are usually focused on different concerns than the
rest of the developers.
○ Web – The web layer is responsible for navigation through the web site, which may
be as simple as mapping a single URL to a single page or as complex as

implementing a full workflow engine. Most layers are stateless; however, the web
layer must maintain state in order to guide the user through the correct path. The
web layer also provides the glue that binds the world of HTTP and the service layer.
The HTTP world is populated with request parameters, HTTP headers, and cookies.
These aspects are not business logic specific and thus are kept isolated from the
service layer. This layer logically contains all of the connection mechanisms such as
HTTP, SOAP, or XML-RPC. The following are reasons for isolating the web layer:
□ Divorcing the web concerns from the service layer means that the system can
export the same business logic via multiple methods.
□ Isolation of navigation creates a more flexible design, because the individual
functions of the domain model can be combined in many different ways to create
many different user experiences.
□ Moving the web concerns out of the business logic makes the core logic very
easy to test. You won’t be worrying about setting request variables, session
variables, HTTP response codes, or the like when testing the business layer.
Likewise, when testing the web layer, you can easily mock the business layer
and worry only about issues such as request parameters.
The web layer is dependent on the service layer and the domain model layer.
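As an illustration (not required by this standard) of testing the web layer with the business layer mocked, the sketch below worries only about request parameters; the handler and service names are invented:

```python
from unittest.mock import Mock

def handle_request(params, service):
    """Hypothetical web-layer handler: translates raw request
    parameters for the service layer and renders a status + body."""
    try:
        user_id = int(params["id"])
    except (KeyError, ValueError):
        return 400, "bad request"
    return 200, service.get_user_name(user_id)

# Web-layer test: the service layer is mocked, so only the HTTP-world
# concerns (parameter parsing, status codes) are under test.
service = Mock()
service.get_user_name.return_value = "Ada"
assert handle_request({"id": "7"}, service) == (200, "Ada")
assert handle_request({"id": "oops"}, service)[0] == 400
```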
○ Service – For the client, the service layer exposes and encapsulates coarse-grained
system functionality (Use Cases) for easy client usage. A method is coarse grained
when it is very high level, encapsulating a broad workflow and shielding the client
from many small interactions with the system. The service layer should be the only
way a client can interact with the system, keeping coupling low because the client is
shielded from all of the interactions that implement the use case. For the system the
service layer’s methods represent transactional units of work. This means that with
one method call, many objects and their interactions will be performed under a single
transaction. Performing all of the work inside the service layer keeps communication
between the client and the system to a minimum (in fact down to one single call).
Each method in the service layer should be stateless so that many transactions may
be handled concurrently without collisions. This layer provides encapsulations for all
of the use cases of the system. A single use case is often one transactional unit of
work. Consolidating the units of work behind a service layer creates a single point of
entry into the system for end users and clients.
○ The service layer is dependent on the domain model and the persistence layer. It
combines and coordinates calls to both the data access objects and domain model
objects. The service layer should never have a dependency on the view or web
layers.
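As an illustration (not required by this standard), a coarse-grained service method hides many small interactions behind one stateless call; the PurchaseService and DAO names below are invented, and the fakes stand in for real persistence:

```python
class PurchaseService:
    """Hypothetical coarse-grained service method: one stateless call
    performs an entire use case as a single unit of work."""
    def __init__(self, cart_dao, order_dao):
        self.cart_dao = cart_dao
        self.order_dao = order_dao

    def purchase(self, cart_id):
        # Many small object interactions are hidden behind this one call.
        cart = self.cart_dao.load(cart_id)
        total = sum(cart["items"].values())
        return self.order_dao.save({"cart": cart_id, "total": total})

class FakeCartDao:
    def load(self, cart_id):
        return {"items": {"book": 20, "pen": 2}}

class FakeOrderDao:
    def __init__(self):
        self.orders = []
    def save(self, order):
        self.orders.append(order)
        return len(self.orders)  # order id

service = PurchaseService(FakeCartDao(), FakeOrderDao())
assert service.purchase("c1") == 1
```

In a real system the purchase method would also demarcate the transaction so that all of the coordinated calls commit or roll back together.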
○ Domain Object Model – The domain object model is the most important layer in the
system. This layer contains the business logic of the system, and thus, the true
implementation of the use cases. The domain model is the collection of nouns in the
system, implemented as objects. These nouns, such as User, Address, and
ShoppingCart, contain both state (user's first name, user's last name) and behavior
(shoppingCart.purchase()).
It may be helpful to think of the domain model as a vertical layer. In other words, all
of the other layers have dependencies on the domain model. The objects of the
domain model, however, have no dependencies on any other layer or the framework
employed. Thus the domain model can be decoupled from its environment. This
means the business logic can be tested outside the container and independently of
the framework. This speeds up development tremendously, as no deployments are

required for testing. Unit tests become very simple to create, as they are testing
simple code, without any reliance on database connections, web frameworks, or
other layers in the system. Each layer is responsible for its own problem domain, but
all layers exist to serve the domain model.
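As an illustration (not required by this standard), a domain object like the ShoppingCart mentioned above carries state and behavior with no dependency on any framework, so it can be tested outside the container with no deployment:

```python
class ShoppingCart:
    """Hypothetical domain object: state plus behavior, depending on
    no framework, database, or other layer."""
    def __init__(self):
        self.items = {}
        self.purchased = False

    def add(self, name, price):
        self.items[name] = price

    def total(self):
        return sum(self.items.values())

    def purchase(self):
        # Business rule lives in the domain model itself.
        if not self.items:
            raise ValueError("cannot purchase an empty cart")
        self.purchased = True
        return self.total()

# Testable in isolation: no container, no connection, no deployment.
cart = ShoppingCart()
cart.add("book", 20)
cart.add("pen", 2)
assert cart.purchase() == 22
assert cart.purchased
```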
○ Persistence (Data Access) – The data access layer is responsible for interfacing with
the persistence mechanism to store and retrieve instances of the object model. The
data access functionality gets its own layer for two reasons. One of the primary
reasons for the layering abstraction in object oriented systems is to isolate sections
of the application from change. The data access functionality is no different, and it is
designed to isolate the system from changes in the persistence mechanisms. As an
example, a business requirement change might force all user accounts to be stored
inside an LDAP-compliant directory instead of a relational database. While this might
happen rarely, abstracting the persistence operations behind a single interface
makes this a low impact change for the system. Keeping the time to run the system
tests low is the other key reason the data access layer is isolated. Database
connections are expensive resources to create and maintain. Unit tests should be
very quick to run, and they will slow down tremendously if they require connections
to the RDBMS. Isolating the persistence operations to one layer makes it easy to
mock those operations, keeping test runs fast.
Typically only the service layer has a dependency on the data access layer. From a
practical standpoint, the service layer coordinates the data access layer and the
object domain layer such that the appropriate objects are loaded and persisted for
the use case.
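The layering described above can be sketched in a short, framework-free example. Everything here is illustrative: the `User`, `UserRepository`, and `register_user` names are hypothetical stand-ins, not drawn from any AASHTOWare product.

```python
# Minimal sketch of the layering described above (all names hypothetical).
# The domain object has no framework or database dependencies; persistence
# hides behind a single repository interface; a service function coordinates
# the two. Unit tests can therefore use an in-memory repository and never
# open an RDBMS connection, keeping test runs fast.

class User:
    """Domain model: state plus behavior, no outward dependencies."""
    def __init__(self, first_name, last_name):
        self.first_name = first_name
        self.last_name = last_name

    def full_name(self):
        return f"{self.first_name} {self.last_name}"


class UserRepository:
    """Data access layer: the single interface that isolates persistence."""
    def load(self, user_id):
        raise NotImplementedError

    def store(self, user_id, user):
        raise NotImplementedError


class InMemoryUserRepository(UserRepository):
    """Test double: same interface, no database connection."""
    def __init__(self):
        self._rows = {}

    def load(self, user_id):
        return self._rows[user_id]

    def store(self, user_id, user):
        self._rows[user_id] = user


def register_user(repo, user_id, first, last):
    """Service layer: coordinates the domain object and the data access layer."""
    repo.store(user_id, User(first, last))
    return repo.load(user_id).full_name()
```

Swapping `InMemoryUserRepository` for an RDBMS-backed implementation changes nothing in the domain or service code, which is exactly the low-impact change the text describes.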
4.2.4 Examples of System Testing
The following items are examples of system testing.
○ Test that all client screens and reports are formatted and rendered properly for the
type of client (PDA, Cell Phone, internet attached Workstation, hosting Workstation).
○ Test navigation through the system functionality.
○ Test workflow by user organizational type (technical support, application
administrator, executive, manager and user) and request type.
○ Check HTTP transaction handling.
○ Test that all supported connection types (HTTP, XHTML, XML-RPC, and SOAP) are
working properly.
○ Test connections to other applications.
○ Test to see that the function described by each use case is exposed with appropriate
security.
○ Test that all functions of all use cases are implemented correctly.
○ Test to see that all required business rules are properly applied.
○ Test to see that appropriate data or objects are correctly loaded or stored in all the
appropriate persistence mechanisms.
○ Test that the system delivers the expected throughput, when implemented in the
designed minimum environment and using the network and persistence mechanisms
of the contractor test environment.


4.2.5 Examples of Alpha Testing


Information supplied below is drawn from “How to Break Software: A Practical Guide to
Testing,” James A. Whittaker, Addison-Wesley, 2003, and “How to Break Software
Security,” James A. Whittaker and Herbert H. Thompson, Addison-Wesley, 2003.
Both books supply free tools which allow manipulation of system interfaces to perform
testing. The lists of examples are not meant to be exhaustive, but rather to give an idea
of this type of testing and to provide a beginning nucleus for a fault model which may be
added to over time and over system iterations. Breaking the system and breaking security
are close in kind: usually a security breach is the result of a system fault that can just as
easily cause the system to break.
4.2.5.1 Breaking the System and Security through the User Interface
□ Overflow input buffers. Buffer overflows are by far the most notorious security
problems in software. A hacker can append code to the end of a long input string,
and the process may execute that code because the overflow has overwritten
(replaced) application code. These problems occur when applications fail to
constrain input lengths.
□ Examine all common switches and options. Changing application switches from
default settings to obscure or inappropriate options may force the application into
poorly tested code or error conditions with no recovery routine or a poorly written
one.
□ Explore escape characters, character sets, and commands. If the program
accepts strings as input, which characters does the application treat as special
cases? Testing these special case values is a good way to find bugs that can
leave the application vulnerable.
□ Force the screen to refresh in various circumstances. Refreshing a screen where
objects have been added, moved and resized or where data has been added can
be problematic. Testing the situations in which the screen is or should be forced
to refresh will save your users many headaches.
□ Investigate alternative ways to modify internal data constraints. This attack is
more general than the one concentrating on overflow size. In this attack we are
concerned with investigating all of the access points to any constraint on the data
structure including its size. Such constraints can be size, dimension, type, shape,
location on the screen, and so forth.
□ Experiment with invalid operator and operand combinations.
□ Force computation results to be too large or too small.
□ Find features which share data or interact poorly. Feature interaction failures are
most often caused when two or more features work with a shared set of data and
each feature enforces a different set of constraints on the data.
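A minimal sketch of the first and third attacks above (overlong input and special-case characters), assuming a hypothetical `validate_name` routine as the code under test:

```python
# Hypothetical sketch of user-interface attacks: feed the validator an
# overlong string and each special-case character, and record whether the
# input is rejected. The validator and its limits are illustrative only.

MAX_LEN = 64
SPECIAL = ["'", '"', ";", "%", "\\", "\x00", "../"]

def validate_name(value):
    """Example code under test (assumed behavior, not a real product API)."""
    if len(value) > MAX_LEN:
        raise ValueError("input too long")
    for ch in SPECIAL:
        if ch in value:
            raise ValueError("illegal character sequence")
    return value

def _raises(fn):
    """True if the call raises ValueError (i.e., the input was rejected)."""
    try:
        fn()
        return False
    except ValueError:
        return True

def attack_results():
    results = {}
    # Attack 1: overflow the input buffer with a very long string.
    results["overflow"] = "rejected" if _raises(
        lambda: validate_name("A" * 10_000)) else "accepted"
    # Attack 2: probe each special-case character in turn.
    results["special_rejected"] = sum(
        1 for ch in SPECIAL if _raises(lambda: validate_name("name" + ch)))
    return results
```

A well-behaved validator rejects both classes of input; any "accepted" result marks a spot where the fault model above has found a candidate bug.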
4.2.5.2 Attacking Software Dependencies
□ Block access to libraries. Software depends on libraries from the operating
system, third-party vendors, and components bundled with the application. This
attack ensures that the application under test does not behave insecurely if
software libraries fail to load.
□ Manipulate the application’s registry values. In the Windows world, the registry
contains information crucial to the normal operation of the operating system and
installed applications. For the OS the registry keeps track of information such as
key file locations, directory structure, execution paths and library version
numbers. Applications rely on this and other information stored in the registry to


work properly. However, not all information stored in the registry is secured from
users or other installed programs. This attack tests that applications do not store
sensitive information in the registry or trust the registry to always behave
predictably.
□ Force the application to use corrupt files. Applications can only do so much
before they need to store or retrieve persistent data. It is the tester’s job to make
sure the application can handle bad data gracefully, without exposing sensitive
information or allowing insecure behavior.
□ Force the application to operate in low memory, disk-space, and network-
availability conditions. When an application is forced to operate in low memory,
disk-space, and network-availability conditions, it is forced into error recovery
conditions. Forcing error recovery in this way is common to the other attacks
described above. If the error recovery routines are poorly written, the application
may be left in an insecure state.
□ With the supplied tools, memory or network conditions can be varied to inject
memory faults, to determine which functions are memory hogs, to determine the
application’s lower-bound threshold of tolerance for low memory, to inject faults at
runtime during memory use, to determine the application’s lower-bound threshold
of tolerance for a slow network, and to inject faults at runtime during network use.
□ With the supplied tools, it is possible to fill the file system to its capacity or to
force the media to be busy or unavailable. Application dependencies on the file
system are tested to see whether the application will handle the operating system
generated error or fail.
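The corrupt-file attack can be sketched as follows. The `load_settings` function and its JSON settings format are assumed stand-ins for real application code, not part of any product:

```python
# Hypothetical sketch of a dependency attack: force the application to read
# a corrupt data file and verify it fails gracefully, without crashing or
# echoing the raw corrupt content back to the user.

import json

def load_settings(path):
    """Code under test: must not leak file contents or crash on bad data."""
    try:
        with open(path, encoding="utf-8") as fh:
            return json.load(fh)
    except (OSError, ValueError):
        # Graceful recovery: safe defaults and a generic diagnostic.
        return {"error": "settings unreadable, defaults applied",
                "debug": False}

def corrupt_file_attack(tmp_path):
    # Write deliberately corrupt bytes where well-formed JSON is expected.
    with open(tmp_path, "w", encoding="utf-8") as fh:
        fh.write("{not valid json !!!")
    result = load_settings(tmp_path)
    # The test passes if the app recovered and did not echo the corrupt data.
    return "error" in result and "not valid json" not in str(result)
```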
4.2.5.3 Attacking Design
□ Test for common default and test account names and passwords. Applications
may have undocumented, invisible, and un-configurable accounts that ship with
the product. This attack is often the result of leftover test accounts, legacy
support accounts and incomplete documentation.
□ Expose unprotected test APIs. Most user-accessible APIs don’t include the
capabilities necessary for efficient testing. Test APIs and hooks are added to the
application with the intention of removing them before release. In practice, they
may become so integrated into the code that they are sometimes not removed.
Since the purpose of these APIs and hooks is to make testing more efficient,
application security is not their concern. These APIs may be found by using the
supplied tools while testing tools and scripts are used.
□ Connect to all ports. Applications commonly open ports to send data across the
network. However, an open port is not automatically a secure conduit for
communication. Without proper measures taken by application developers, an
open port is a welcome mat for a hacker attempting to gain unauthorized access
to the system. The ports may be scanned, using supplied tools, to see which are
open and to capture error messages from those that are closed. When an open
port is found, it is the tester’s job to find the application which has opened it and
to determine if it represents a security threat.
□ Fake the source of data. Some data is trusted implicitly, based on its source; for
example, applications tend to accept data from sources like the OS with minimal
scrutiny. Some sources must, in fact, be trusted for the application to function.
Problems arise when the trust an application extends to a particular source is not
commensurate with the checks it makes to ensure that data is indeed from that
source. The usual problem is that trust is extended solely


on the basis of identification without being coupled with authentication.


Identification is the act of saying who you are; authentication is the act of
proving you are who you say you are. An example would be the user name and
password combination. Security is compromised whenever we can get the
application to accept data or commands from an un-trusted or un-authenticated
source that passes itself off as legitimate.
□ Create loop conditions in any application that interprets script, code, or other
user-supplied logic. Force a function to call itself recursively by repeating a
command over and over, thus denying needed resources and functionality to
entitled users and processes. The major cause of this class of bugs is that
developers write code that does not guarantee that loops and recursive calls
terminate.
□ Force the system to reset values. This attack can be applied to all kinds of
software. It is easy to apply because you don’t have to do anything; indeed that is
the whole point of the attack. Leave entry fields blank, click finish instead of next,
or just delete values. These kinds of actions force the application to provide a
value when you haven’t. Testers need to test for two types of faults: variables
that have illegal or non-existent default values and default values or
configurations that leave the software in an unsafe state.
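The last attack, forcing the system to supply values you did not, can be sketched like this; the form fields and defaults are hypothetical illustrations:

```python
# Hypothetical sketch of the "force the system to reset values" attack:
# submit a form with every entry field left blank and verify the application
# supplies legal, safe defaults rather than undefined or unsafe ones.

SAFE_DEFAULTS = {"timeout_seconds": 30, "admin_mode": False}

def apply_form(form):
    """Code under test: fill omitted fields with explicit safe defaults."""
    settings = dict(SAFE_DEFAULTS)
    for key, value in form.items():
        if value not in ("", None):   # blank entry -> keep the default
            settings[key] = value
    return settings

def blank_input_attack():
    # Leave every entry field blank, as the attack prescribes.
    settings = apply_form({"timeout_seconds": "", "admin_mode": None})
    # The two fault types from the text: illegal/non-existent defaults,
    # and defaults that leave the software in an unsafe state.
    has_legal_defaults = settings["timeout_seconds"] > 0
    stays_safe = settings["admin_mode"] is False
    return has_legal_defaults and stays_safe
```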
4.2.5.4 Attacking Implementation
□ Get between time of check and time of use. Data is at risk whenever an attacker
can separate the functions that check security around a feature or a piece of data
from the functions that use it.
□ Create files of the same name as files protected with a higher classification. Files
which are given higher classification based on their name or location may be
mimicked by placing a file of the same name in a directory that comes earlier in
the search path than the protected original. This applies to DLLs, which are
acquired by name alone. The application should be able to identify files by some
method that is more definite than name alone.
□ Force all error messages. Apply inputs that force all error messages to occur and
apply inputs that will result in computational error messages. Inspect error
messages to see if they reveal information that will make the application more
vulnerable to attack. The reason this is an effective attack is that error cases
require developers to write additional error-checking code. It is very difficult to
make a program fail gracefully and such difficulty usually means bugs. Solutions
are input filters, input checking, and error handlers.
□ Examine temporary files and screen contents for sensitive information. Temporary
files are a convenient way to store data. This is particularly true when large
amounts of data are being accessed or when an application needs to retain
information between executions. Cookies are a good example of small files that
hold persistent data between executions. Web developers often work on the
implicit assumption that cookies will not be viewed or altered by the user or will
not be viewed by a different user or web site. This leads to personal information
(name, login information, shopping habits, and other data) being stored in plain
text on the client machine.
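The error-message attack above can be sketched as a small check that a forced computational error yields a generic message rather than leaking internals; `divide` is an assumed stand-in for application code:

```python
# Hypothetical sketch of the "force all error messages" attack: trigger a
# computational error and inspect the resulting message for details (paths,
# SQL, stack traces) that would make the application easier to attack.

SENSITIVE_MARKERS = ("Traceback", "SELECT ", "C:\\", "/etc/")

def divide(a, b):
    """Code under test: convert internal failures into a generic message."""
    try:
        return str(a / b)
    except ZeroDivisionError:
        # An input filter plus error handler, as the text recommends.
        return "Error: the calculation could not be completed."

def error_message_attack():
    message = divide(1, 0)   # apply input that forces a computational error
    leaks = [m for m in SENSITIVE_MARKERS if m in message]
    return {"message": message, "leaks": leaks}
```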
4.2.6 Examples of Beta Testing
○ Proof of Concept tests may be employed to determine that product functionality can
be supported within a certain technical environment or interfaced with other products
in the environment.


○ JAD sessions supported by look-and-feel prototypes may be used to determine
suitability of the product in the user’s work environment.
○ Beta testing in the intended environment, using real data, and operated by typical
users is the most certain way to prove the validity of a product.

PRODUCT RELEASE
CHECK LISTS STANDARD
S&G Number: 3.085.05S
Effective Date: July 1, 2009

Document History

Version No.  Revision Date  Revision Description                                Approval Date
01           Feb. 1999      Initial Version                                     Feb. 1999
02           June 2001      Revised to reflect changes to the Product           April 2002
                            Deliverables and Product Testing Standards.
03           Jan. 2006      Revised to reflect the establishment of the         June 2006
                            Requirements Management Standard.
04           Sep. 2006      Revised to reflect requirements for 2 copies,       Oct. 2006
                            extended life media in final shipments to AASHTO.
                            Revised to reflect establishment of the Testing
                            Standard.
05           06/10/2009     Changed standard number from 3.04.010.04 to         06/16/2009
                            3.085.05S, and applied standard template.           Approved by
                            Changed standard number references in               T&AA
                            checklists. Made minor changes and format
                            modifications.


Table of Contents
1. INTRODUCTION .................................................................................................. 1
1.1 Preface ................................................................................................................1
1.2 Objectives...........................................................................................................1
2. SOFTWARE PREPARATION .............................................................................. 2
2.1 Software Preparation Checklist.........................................................................2
2.2 Explanation of Checklist Items..........................................................................3
2.2.1 Product Naming Conventions .............................................................................. 3
2.2.2 Testing Standards ................................................................................................ 3
2.2.3 Product Graphical Interface Standards................................................................ 3
2.2.4 Requirements Deliverables .................................................................................. 4
2.2.5 System Documentation ........................................................................................ 4
2.2.6 Task Force Approval ............................................................................................ 4
3. SHIPMENT PREPARATION ................................................................................ 5
3.1 Shipment Checklist ............................................................................................5
3.2 Explanation of Checklist Items..........................................................................5
3.2.1 Documentation ..................................................................................................... 6
3.2.2 Software ............................................................................................................... 6
3.2.3 Hardware.............................................................................................................. 7
4. CONTENTS LIST ................................................................................................. 8


1. INTRODUCTION
1.1 Preface
As a software product goes from its first release to the many following releases, there is a
need for consistency among the releases. This consistency applies not only to the working
of the software but also to the steps taken to put out the release. For example, the release
numbers should be consistent from release to release. Going from 5.2.3 to 6.0.0 should
indicate a major revision, while going from 5.2.1 to 5.3.0 should indicate only minor changes.
This specification applies to all new releases of AASHTOWare products and to all
enhancement and/or maintenance releases. Temporary emergency fixes, where there is not
a new build of the product, are outside the scope of this specification. For maintenance
releases, however, this specification is required.

1.2 Objectives
The objective of this standard is to present a list of items that shall be performed and
documented prior to distributing a new release of software. This checklist, which
summarizes the AASHTOWare standards, identifies the minimum requirements needed
before a new release is shipped. The goal of this specification is to promote consistency in
the preparation of deliverables, documentation, and packaging of new AASHTOWare
software releases.


2. SOFTWARE PREPARATION
This checklist shall be completed by the contractor, with the exception of “Task Force Approval.”
The Task Force then reviews the checklist and, if appropriate, checks off “Task Force Approval.”

2.1 Software Preparation Checklist

ITEM                                                            Standard #   X

Product Naming Conventions
  Product Naming Conventions have been followed                 3.060.xxS

Testing Standards
  The following deliverables have been approved and delivered:
  ■ Test Plan                                                   3.080.xxS
  ■ Alpha Test Acceptance Report                                3.080.xxS
  ■ Distribution Test Materials                                 3.080.xxS
  ■ Beta Test Acceptance Report                                 3.080.xxS
  ■ Installation Materials                                      3.080.xxS
  ■ Installation Test Report                                    3.080.xxS

Product Graphical Interface
  Product Graphical Interface standards have been followed      3.030.xxS

Requirements and Documentation Deliverables
  The following documents have been produced/updated and are being
  released with the software:
  ■ User Requirements Specification                             3.010.xxS
  ■ System Requirements Specification                           3.010.xxS
  ■ Requirements Traceability Matrix                            3.010.xxS
  ■ Design Documentation                                        3.050.xxS

Task Force Approval
  ■ The Task Force has approved this release for shipment


2.2 Explanation of Checklist Items


2.2.1 Product Naming Conventions
A fully qualified name for the AASHTOWare product is required. This fully qualified name
will include a release number. See the AASHTOWare Standards & Guidelines
Notebook, standard number 3.060.xxS (Glossary of AASHTOWare Product
Terminology), using the most current version, for more on product naming conventions.
2.2.2 Testing Standards
Following a well-thought-out test program will produce a product that is more stable, easier to
enhance or maintain, and that will hold its value longer. A list of testing deliverables follows. For
more details, see the AASHTOWare Standards & Guidelines Notebook, standard number
3.080.xxS (AASHTOWare Testing); use the most current version of this standard.
2.2.2.1 Test Plan
The Test Plan specifies the schedule, activities, and resources required to perform
the testing of a system, system components, documentation, or procedures. It also
includes a schedule of deliverables.
2.2.2.2 Alpha Test Acceptance Report
The Alpha Test Acceptance Report contains the identification of requirements, the
test procedures / result criteria, the identification of the system being tested, the
summary of test results, the documented exceptions, and the approved / accepted
resolutions for all contributing test types (Unit, Build, System, and Alpha).
2.2.2.3 Distribution Test Materials
The Distribution Test Materials contain all of the materials needed by the beta test
participant to implement and perform beta testing in the appropriate environment and
to report the results.
2.2.2.4 Beta Test Acceptance Report
The Beta Test Acceptance Report contains the identification of requirements, the test
procedures / result criteria, the identification of the system being tested, the summary
of test results, the documented exceptions, and the approved / accepted resolutions
for all tests performed for Beta Testing.
2.2.2.5 Installation Materials
Installation Materials contain all procedures, executables, and documentation
needed to implement and operate the delivered system at the user agency site.
2.2.2.6 Installation Status Report
Installation Status Report contains the number of licensees, date/agency of each
successful installation, date/agency/description of each problem encountered, and
date/agency/description of each problem resolution. When the Task Force approves
the Installation Status Report, testing is complete and the system is accepted.
2.2.3 Product Graphical Interface Standards
All AASHTOWare products should employ graphical interfaces that are as consistent
as possible within the parameters of best practice for the supported platform. This
means the look and feel presented to the user should be as consistent as possible
across the AASHTOWare product line. See the most current version of the
AASHTOWare Standards & Guidelines Notebook, standard number 3.030.xxS
(AASHTOWare Product Graphical Interface), for the complete specification of the
user interface.


2.2.4 Requirements Deliverables


The following deliverables are required by the AASHTOWare Requirements Standard,
standard number 3.010.xxS.
2.2.4.1 User Requirements Specification
The User Requirements Specification becomes a part of the Project or Product Work
Plan and is the definition of work to be done under the authority of the Project or
Product Contract.
2.2.4.2 System Requirements Specification
The Systems Requirement Specification represents the technical requirements used
by developers to design and construct the product in conformance with the User
Requirements approved in the contract.
2.2.4.3 Requirements Traceability Matrix
The Requirements Traceability Matrix permits all requirements to be fully identified
and traced both forward and backward to determine their origin and outcome. Such
bidirectional traceability helps determine that all source requirements have been
completely addressed.
2.2.5 System Documentation
This documentation is specified in the AASHTOWare Product Documentation
Standards, standard number 3.050.xxS.
2.2.5.1 Design Documentation
This documentation is specified in the AASHTOWare Product Documentation
Standards (3.050.xxS).
2.2.6 Task Force Approval
After the vendor has successfully completed the above steps and any additional
requirements needed for the release of this product, the Project/Product Task Force
shall consider approval for shipment. The product must not be shipped until this approval
is given.
Once the vendor receives approval for shipment from the Task Force, it should then ship
the code and documentation to AASHTO Headquarters. AASHTO headquarters will then
archive this material for safekeeping.
It is important that the start and length of the warranty period be clearly defined in the
contractor’s contract. For consistency among AASHTO products, it is recommended that
any warranty period start at the point at which the AASHTO Product/Project Manager
accepts the code and documentation.


3. SHIPMENT PREPARATION
3.1 Shipment Checklist
This checklist is for the vendor to use in preparing for a shipment.

ITEMS X
Documentation
■ Name of Product being Shipped
■ Platform (computing environment) this Shipment is for
■ New Manuals or Updates for Existing Manuals (including update
instructions)
■ Platform Specific Installation Instructions
■ Summary of CD, Tape, or Cartridge Contents
■ Summary of Changes in this Release
■ Special Instructions
■ Checklists
■ Contents List
■ Cover Letter

Software
■ Appropriate Media
■ Product Software
■ Command Language Procedures (Scripts, JCL, EXECs, EXEs)
■ Database Definition Procedures
■ Installation Jobs
■ Third Party Software at appropriate release (if applicable)
■ Virus Scan has been Passed

Hardware
■ Hardware Security Device (if applicable)

3.2 Explanation of Checklist Items


It should be noted that a shipment may include some items, or even all of them, that
have been sent electronically. The fact that an item has been sent electronically should
be noted on the checklist. If the entire shipment is sent electronically, it must still include
all items (electronic checklist, contents list, etc.).
3.2.1 Documentation
3.2.1.1 Name of Product being Shipped
The complete name of the Product being shipped should be clearly stated on all
items shipped.
3.2.1.2 Platform this Shipment is for
It should be clearly stated on the Contents List what platform (computing
environment) this shipment was prepared for.
3.2.1.3 New Manuals or Updates for Existing Manuals
New manuals or updates to manuals the recipient already has should be shipped. If
updates are shipped, clear instructions for updating must be included.
3.2.1.4 Platform Specific Installation Instructions
Any instructions specific to the platform this shipment is to be installed on must be
included.
3.2.1.5 Summary of CD, Tape, or Cartridge Contents
The electronic medium used to ship machine readable components should be
identified. Also the platform requirements for reading the electronic medium should
be specified. If a tape is being shipped, a tape map must be included in the shipment.
The kinds of things the tape map should show are: the number of files, how the tape
was created, and its density.
3.2.1.6 Summary of Changes in the Release
A summary of new features, changes, or features removed must be included in the
shipment.
3.2.1.7 Special Instructions
Any special instructions unique to this customer must be included in the shipment.
Also, any known malfunctions must be clearly noted with the appropriate
workarounds documented.
3.2.1.8 Checklists
This checklist must be included in the shipment.
3.2.1.9 Contents List
A contents list must be included in the shipment showing what is being shipped.
3.2.1.10 Cover Letter
The cover letter includes information like: whom the shipment is being sent to, who is
shipping it, what is being shipped, and for what reason.
3.2.2 Software
3.2.2.1 Appropriate Media
The software must be shipped on extended life (minimum 50 years archival life
expectancy) media that provides ease of installation and use to the recipient. A
duplicate copy of the software, also on extended life media, must be supplied to
assist in archival processes.
In instances where media with a shorter life expectancy than 50 years is required
because of installation processes, exceptions to the use of extended life media may


be requested in writing by the contractor or task force and granted by the Special
Committee on Joint Development.
3.2.2.2 Product Software
All software the recipients are entitled to must be shipped.
3.2.2.3 Command Language Procedures
Command language procedures needed to install or run the product must be
shipped.
3.2.2.4 Database Definition Procedures
The necessary procedures and schema needed to set up the customer-chosen (and
supported) database must be included.
3.2.2.5 Installation Jobs
Installation jobs and procedures to install the product on the platform the shipment is
being prepared for must be included.
3.2.2.6 Third Party Software
If the AASHTOWare software being shipped requires third party software, the
following should be considered. If the third party software is being shipped with the
AASHTOWare software, the latest release of the third party software that has been
tested should be shipped. If the Third party software is not being shipped, it should
be clearly stated in the install document what third party software is needed and what
release it should be.
3.2.2.7 Virus-Scan has been Passed
Any media that is shipped must be scanned for viruses if a commonly used virus-
scanning product is available for that media. The virus-scan software must be of
current release and an industry leader. The scan must show no viruses on the
media.
3.2.3 Hardware
3.2.3.1 Hardware Security Device
If the product requires a hardware security device or software security key to
operate, this device/key should be included in a shipment to a first time recipient. In
all first time shipment cases arrangements must be made to get this device/key to
the recipient. If this shipment is an update of the software and the update does not
require a change in the hardware security device or software security key, a new one
need not be shipped.


4. CONTENTS LIST
This is an example of a contents list which is shipped as part of the new release package.

Shipped From:                              Shipped To:

Shipment Date ____/____/____               Release Number -->

Product Name

Platform / Version (computing environment)

Tape / Cartridge / Diskette / CD-ROM

Hardware Security Device or Software Security Key
( if applicable )

Documentation                              Documentation Type

User Reference Manual                      ( ) New Manual / ( ) Updates

Implementation Manual                      ( ) New Manual / ( ) Updates

Security Management Manual                 ( ) New Manual / ( ) Updates

Manager or Administration Manual           ( ) New Manual / ( ) Updates

Operator Manual                            ( ) New Manual / ( ) Updates

Page 8 06/10/2009
INSTALLATION AND USE
TRAINING GUIDELINE
S&G Number: 3.090.02G
Effective Date: July 1, 2009

Document History

Version No.  Revision Date  Revision Description                                Approval Date
01           June 1998      Initial Version                                     July 1998
02           6/15/2009      Changed guideline number from 3.04.G50.01 to        06/16/2009
                            3.090.02G; and applied standard template. Made      Approved by
                            minor changes and format modifications.             T&AA


Table of Contents
1. Introduction......................................................................................................... 1
1.1 Purpose...............................................................................................................1
1.2 Background ........................................................................................................1
1.3 Results of Survey ...............................................................................................1
2. Training ............................................................................................................... 1
2.1 Planning for Training .........................................................................................1
2.2 Development of Training ...................................................................................2
2.3 Methods of Training Delivery ............................................................................2
2.4 Evaluate Effectiveness of Training ...................................................................3


1. Introduction
1.1 Purpose
Each AASHTOWare product should provide training to its customers in order to meet the
goal for which that particular product was designed and developed.
Good training, delivered in a timely and effective manner, will result in (1) high customer
satisfaction and (2) effective and correct use of the product at the customer work site.
Meaningful training should also reduce the Help Desk and support calls that might result
if the product is not well understood.

1.2 Background
In the past, training for AASHTOWare products has been developed and delivered in an
inconsistent manner. TAA was assigned by the SCOJD to perform an analysis to determine
a better method to accomplish consistent and effective training, and to develop a Guideline
for this purpose.
(1) TAA was assigned this task in the annual work plan for 1996/97.
(2) The Special Committee on Joint Development (SCOJD) proposed an amendment to
“The Governing Policies, Guidelines and Procedures (PG&P) Document for AASHTO’s
Cooperative Computer Software Program.” An item was added which states “End user
training and related training materials may be included in the license fees so long as
they are offered and performed in an equitable manner. Product training materials shall
comply with established guidelines and procedures.”
(3) In order to assess current conditions, TAA conducted a customer satisfaction survey in
the Fall of 1997. Results have been compiled.

1.3 Results of Survey


(1) The survey yielded the following results:
(a) No product received an “excellent” rating from customers.
(b) The preferred method for delivery of training is instructor-led, classroom-style training
or CBT (Computer Based Training). (People like hands-on training and being able to
ask questions.)
(c) Users' opinions of training for installation differ from their opinions of training for
product use.
(d) As could be expected, some older products did not provide training in methods
currently preferred. Many of the older products’ training was by way of “passing
down” the information from experienced users to newer employees at the customer’s
state.
(e) On-line help is considered an important part of training/help.
(2) Review of these results gives clear indication that we can improve in the area of training.
(3) A repeat survey at periodic intervals could be performed to build a comparison to the
initial (base line) survey. This method could be used to track progress toward
accomplishing training improvements.

2. Training
2.1 Planning for Training
(1) A needs assessment should be conducted periodically to determine the training needs
associated with each product.
(2) Distinguish between training materials and the reference manual so that training follows
a set of orderly steps.
(3) Determine which method(s) of delivery of training will be available.


(4) Consider making training a component of each product’s Annual Work Plan.

2.2 Development of Training

(1) Timing - The timing of training is important. Develop training to be made available at the
following times:
(a) Shipment of product to new customer
(b) Shipment of new product release which includes any significant changes
(c) Annually upon request; consideration should be given to using annual User Group
meeting time, if applicable.
(d) Specialized, customized training to specific customer sites may still be purchased as
service units.
(e) For customized training, the state should contact the trainers in advance in order to
ensure that the training will be customized to fit “the way that state does things.”
(2) Key components of and considerations for training
(a) Product Overview - describe, in business terms, what the software actually does.
Provide a framework for where the product fits into the customer's business. Describe
the main features in chronological sequence.
(b) Provide sequential steps to demonstrate how the product is to be used and identify
sequential dependencies and prerequisites. Training should as nearly as possible
simulate “real life” examples that users of the product might encounter when running
it in their own state.
Generally speaking, “worked through” examples may not be sufficient to cover the
reality of product use in the customer’s home state. Therefore, additional training and
product use by the student should be encouraged, to allow them to try out and
experiment with the product in a manner that most closely simulates their state’s
experience.
(c) Provide test data which is complete enough to demonstrate all main product
capabilities. Test data should be easy to load and evaluate using software that is
shipped with the product, without requiring the purchase and installation of separate
(database or file) products specifically for testing purposes.
(d) Provide scripts for the User to follow. Training should include “hands-on” exercises in
which the User actually executes the required and frequently used parts of the product.
(e) Give information on optional or advanced features and functions. Training should
include information and examples on optional features of the product. For example,
if a product provides report-writing capability for the User to write their own ad-hoc
reports, training should include an example of how such a report could be
constructed, saved, run and retrieved for future changes.
(f) Complexity plays a role. For very complex products or features of a product, break
the training out into units which are manageable. Then tie the units logically together
as the training progresses.
(g) Periodic refresher training is highly desirable; turnover in customer agencies and loss
of internal expertise can result in declining use of the product or declining satisfaction.

2.3 Methods of Training Delivery


In determining the preferred method of delivery of training for the product, the following
criteria can be used:


(1) Classroom, instructor-led training is often preferred. This method should be considered
when it makes sense to do so. Conditions that lend themselves to this method include a
large customer base; many customers receiving a new product or a new release at the
same time; and occasions when many of the customers are assembled in one place,
which provides an opportunity for training and reduces travel.
(2) Computer-based training, delivered on CD, has the distinct advantage of being available whenever
the customer is ready to use it. This, combined with a support desk telephone number,
may be the most effective of all. This works well when a few users at a time are
beginning to use the product or a new release of the product since it can accommodate
these differences in timing.
(3) On-line Help and a User Reference Manual are both important parts of the customer’s
overall understanding of the product and their proficiency at using it. On-line help in
particular should be encouraged for all new development. However, neither should be
considered a replacement for training.

2.4 Evaluate Effectiveness of Training


(1) Each training opportunity (by any method) should be accompanied by a training
evaluation. These results should be kept and reviewed by the Task Force to determine
effectiveness of the training.
(2) Periodic customer satisfaction surveys should be conducted and reviewed to determine
if modifications/improvements are needed.

This page is intentionally blank.
4 – Support
This page is intentionally blank.
QUALITY ASSURANCE
STANDARD
S&G Number: 4.010.02S
Effective Date: July, 01, 2009

Document History

Version No.: 01
Revision Date: 02/12/2008
Revision Description: Initial Version for Pilot Process.

Version No.: 02
Revision Date: 02/03/2009
Revision Description: Modified initial version based on T&AA and stakeholder review.
Implemented standard template, changed to six month review cycle. Completed
additional T&AA/stakeholder reviews. Additional minor changes and format
modifications for publishing were approved by T&AA on 06/16/2009.
Approval Date: 03/04/2009, Approved by SCOJD
Quality Assurance Standard 4.010.02S

Table of Contents
1. Purpose ............................................................................................................... 1
2. Task Force/Contractor Responsibilities........................................................... 1
3. Required Deliverables and Work Products ...................................................... 2
4. Procedures.......................................................................................................... 2
4.1 Submit First List of Completed Deliverables and Work Products ..................2
4.2 Select Deliverables and Work Products for QA Evaluation.............................2
4.3 Submit Deliverables and Work Products for Evaluation .................................2
4.4 Evaluate Deliverables and Work Products .......................................................2
4.5 Review Evaluation Reports and Provide Comments .......................................3
4.6 Resolve Issues and Provide Comments...........................................................3
4.7 Prepare and Distribute Final Evaluation Reports.............................................3
4.8 Submit Second List of Completed Deliverables and Work Products .............3
4.9 Repeat procedures 4.2 through 4.7...................................................................3
4.10 Meet With QA Analyst at Contractor Work Site................................................3
4.11 Prepare and Review Annual QA Report............................................................4
5. Technical Requirements .................................................................................... 4
6. Deliverable and Work Product Definitions ....................................................... 4
6.1 List of Deliverables and Work Products Completed........................................4
6.1.1 Description ........................................................................................................... 4
6.1.2 Content................................................................................................................. 4
6.2 QA Evaluation Report ........................................................................................4
6.2.1 Description ........................................................................................................... 4
6.2.2 Content................................................................................................................. 4
6.3 Annual QA Report ..............................................................................................5
6.3.1 Description ........................................................................................................... 5
6.3.2 Content................................................................................................................. 5

Page i 06/16/2009
Quality Assurance Standard 4.010.02S

1. Purpose
The purpose of the Quality Assurance (QA) Standard is to define the responsibilities of the
product task forces and contractors in ensuring that products are being developed and
implemented in compliance with the published AASHTOWare Standards. The activities in the
standard focus on evaluating if required deliverables and work products are created in
compliance with standards and if required processes in the standards are being followed.
The activities do not address whether a deliverable or work product meets its intent or purpose.
Review and acceptance are the responsibility of the task force, and should be completed prior
to submission for QA evaluation. The activities also do not require areas of non-compliance to
be resolved; however, recommendations for resolution and common problems found will be used
for process improvement within the applicable standards and within the internal procedures
used by each task force and contractor.
For the purposes of this and other AASHTOWare standards, a work product is defined as a
result or artifact of the software development or project management process. The majority of
work products in each AASHTOWare standard are also defined as deliverables. Deliverables
are always required, must be planned and tracked in the project/product work plan, and must be
formally submitted to the task force for approval or rejection. In addition to deliverables there
are other work products that document the results or outcomes of the processes defined in the
standard and provide evidence that required processes have been followed.
This standard applies to those deliverables and work products that are documented as a
requirement to comply with an AASHTOWare standard. Examples of required deliverables
include the user requirements specification, requirements traceability matrix, and beta test
acceptance report. Examples of required work products that demonstrate process compliance
are the Alpha Testing Acceptance and the Beta Testing Acceptance.
The Quality Assurance Standard includes certain activities that must be followed and work
products that must be produced in order to comply with the standard. These requirements are
shown in red italicized text.

2. Task Force/Contractor Responsibilities


The product task force and contractor responsibilities in regards to the AASHTOWare Quality
Assurance (QA) Standard are summarized below. Additional details on these responsibilities
are provided in the “Procedures” section of this document.
● Provide a list of deliverables and work products completed during the first six months of the
fiscal year and at the end of the fiscal year.
● Submit requested deliverables and work products to the AASHTOWare QA Analyst for
evaluation.
● Meet with QA Analyst at the contractor work site at least once a year.
● During the above meetings, provide the QA Analyst with access to all deliverables, work
products, and other artifacts that demonstrate that AASHTOWare standards are being
followed.
● Review evaluation reports and provide comments.
● Review the Annual QA Report.
In addition, the task force has the responsibility of ensuring that the required submissions,
approvals, communications, documentation, and technical requirements defined in this standard
are complied with. In the event that a requirement of the standard cannot be complied with, the
task force chair should advise the SCOJD or T&AA liaison early in the project/product life cycle.
A request for an exception to the standard must be submitted to the SCOJD with any necessary


documentation for their consideration. Approval of exceptions to the standards is under the
purview of the SCOJD.

3. Required Deliverables and Work Products


The following summarizes the required deliverables and work products that must be created
and/or delivered in order to comply with the Quality Assurance Standard.
● List of deliverables and work products completed: This list is prepared for the first six months
of the fiscal year and at the end of the fiscal year.

4. Procedures
The following provides detailed descriptions of Quality Assurance procedures that involve the
product task force and/or contractor.

4.1 Submit First List of Completed Deliverables and Work Products


At the end of the first six months of the fiscal year, the product task force chairperson will
send an email to the SCOJD and T&AA liaisons providing a list of deliverables and work
products completed during the prior six months. The list may be submitted by email or letter
and should be submitted by January 10 of each fiscal year.
If a task force is responsible for multiple AASHTOWare products or projects, a separate list
may be submitted for each product/project or a combined list can be submitted for all. Also,
if the task force uses multiple contractors with specifically defined individual deliverables and
work products, a separate list may be submitted for each contractor.

4.2 Select Deliverables and Work Products for QA Evaluation


The SCOJD and T&AA liaisons will review the lists and will select a sampling of the
completed deliverables and work products for evaluation. The items will be selected using
the following criteria:
■ The item is important in determining if the processes in a standard are being followed.
■ The content of the item is required to comply with the applicable standard.
■ A type of item has not been evaluated recently.
■ The evaluation of a specific item is important to the product task force.
After the items are selected for evaluation, the SCOJD liaison will notify the product task
force chairperson and the QA Analyst of the items selected.

4.3 Submit Deliverables and Work Products for Evaluation


After receiving the list of selected deliverable and work products, the task force chairperson
will send the selected items to the QA analyst electronically. If an exception was approved
regarding a standard applicable to the deliverable or work product, the exception approval
letter from SCOJD should also be included with the submission.

4.4 Evaluate Deliverables and Work Products


After receiving the deliverables and work products, the QA analyst will evaluate each item
for compliance against the applicable standard(s). The results of each evaluation are
documented in preliminary QA evaluation reports. Each report documents where the items
are not in compliance with the applicable standards and references any exceptions that
have been granted. The report also includes recommended actions to address the areas of
non-compliance. When completed, the preliminary evaluation reports are returned to the
product task force chairperson.


4.5 Review Evaluation Reports and Provide Comments


After receiving the preliminary evaluation reports, the task force chairperson should
distribute the reports to the task force members and contractor. The QA analyst will meet
(normally by conference call) with task force and contractor representatives to review the
evaluation results and noncompliance issues.

4.6 Resolve Issues and Provide Comments


Following the meeting/telephone call with the QA analyst, the task force should analyze the
preliminary evaluation reports and any notes from the meeting and decide if any corrective
actions will be taken to resolve the noncompliance issues. The task force chair should then
prepare a response to the evaluation reports and send it to the QA analyst, copying
the task force members, contractor, and liaisons. The decision to resolve or not resolve
noncompliance issues should be included in the response. If a deliverable or work product
will be updated, a target date should be provided for the revised deliverable. Revised
deliverables and work products should be submitted through the task force chairperson to
the QA analyst for re-evaluation.

4.7 Prepare and Distribute Final Evaluation Reports


After receiving the task force response, the QA analyst will prepare final QA evaluation
reports, which include the task force response. If an item was resubmitted and re-
evaluated, these results/actions are included in the final report.
A cover letter to the task force chairperson will be prepared and sent with the final QA
evaluation reports. A copy is provided to the SCOJD chairperson and liaison, T&AA
chairperson and liaison, and AASHTO Staff manager and liaison. The product task force
chairperson distributes the reports to the task force members and contractor
representatives. Any additional distribution should be handled by the recipients.

4.8 Submit Second List of Completed Deliverables and Work Products


Near the end of the fiscal year, the product task force chairperson will send an email to
SCOJD and T&AA liaisons providing a list of deliverables and work products completed
during the prior six months. The list may be submitted by email or letter and should be
submitted by June 30 of each fiscal year.
As with the first list, multiple lists may be submitted in cases of multiple products, projects,
and/or contractors.

4.9 Repeat procedures 4.2 through 4.7.


The procedures to select, submit, and evaluate deliverables or work products are the same
as those for the first list of completed items. The procedures to review reports, resolve
issues, and prepare the final evaluation reports are also the same.

4.10 Meet With QA Analyst at Contractor Work Site


The QA analyst will schedule at least one visit per year at the contractor work site. This visit
will normally be scheduled after the end of the fiscal year or early in the following fiscal year.
If needed, the task force may request an additional visit in conjunction with the first six
month review. The purpose of these visits will be to review deliverables, work products, and
other items that help determine standard compliance. In addition, the QA analyst will solicit
feedback from the product contractor on issues and concerns with the current QA standard,
as well as the other AASHTOWare standards. The QA analyst will also collect suggestions
for improving standards. The T&AA product task force liaison will normally attend these
visits, and it is recommended that the task force chairperson or their designee attend.


4.11 Prepare and Review Annual QA Report


At the beginning of each fiscal year, the QA analyst will produce an Annual QA Report that
summarizes the results of all evaluations performed during the past fiscal year. The report
will include the number of evaluations performed, the types of deliverables or work products
evaluated, noncompliance issues that were identified and resolved, noncompliance issues
that were identified but not resolved, and the number and types of exceptions that have been
approved. The report will also document any trends found and will document the findings
and recommendations from contractor work site visits.
The QA analyst will submit the Annual QA Report to the SCOJD chairperson. The SCOJD
chairperson will then distribute the report to the SCOJD members, AASHTO Staff, T&AA
chairperson, and the project task force chairpersons. SCOJD will solicit comments and
recommendations that would help in minimizing future noncompliance issues and exception
requests.
The task force and contractor should review the annual report, and may choose to provide a
response to SCOJD. Based on the comments and recommendations from all parties,
SCOJD and T&AA will determine if any changes in the QA standard or other AASHTOWare
standards are needed. If changes are needed, SCOJD will provide direction to T&AA
regarding the time frames for planning and implementing the standard revisions.

5. Technical Requirements
There are no technical requirements for this standard.

6. Deliverable and Work Product Definitions


6.1 List of Deliverables and Work Products Completed
6.1.1 Description
This list is prepared for the first six months of the fiscal year and at the end of the fiscal
year, and should be submitted to both the SCOJD and T&AA liaisons. The list may be
sent by email or by letter.
6.1.2 Content
No specific content other than the names of the deliverables is required.

6.2 QA Evaluation Report


6.2.1 Description
This report is not prepared by the task force or contractor; however, the task force and
contractor should review the report and provide a response to the findings.
6.2.2 Content
The results of the QA evaluation will be provided in this report. The report will include
the following content. Other content may also be added.
○ Report Date: The date the report was prepared.
○ Prepared By: The name of the person that prepared the QA evaluation report.
○ Task Force: The name of the task force submitting the deliverable or work product.
○ Project/Product: Name of the project or product of the deliverable or work product.
○ Deliverable: Name of the deliverable that was evaluated.
○ Deliverable/WP Version: The version of the deliverable or work product.
○ Submitted Date: The date the deliverable was submitted for evaluation.


○ Standard: The name and date of the AASHTOWare standard used for evaluation.
○ Evaluation Results: The evaluation results and areas of non-compliance.
○ Recommended Action: Recommended action to address non compliance.
○ Comments: Any overall comments regarding the evaluation of all items.
○ Task Force Response: This section will initially be blank. It will be updated after the
QA analyst receives comments from the task force.
○ Resolution: Text describing the resolution (if any) to the area of noncompliance.

6.3 Annual QA Report


6.3.1 Description
This report is not prepared by the task force or contractor; however, the task force and
contractor should review the report and optionally provide comments.
6.3.2 Content
The report summarizes findings, areas of noncompliance, exceptions to standards,
trends, resource requirements for QA activities, recommendations, and other significant
results from the prior year’s quality assurance activities. The specific content of this
report will be developed during the first annual reporting period and the standard will be
updated after feedback is gathered.

This page is intentionally blank.
DISASTER RECOVERY
STANDARD
S&G Number: 4.020.02S
Effective Date: July 1, 2009

Document History

Version No.: 01
Revision Date: Feb. 1999
Revision Description: Initial Version
Approval Date: April 1999

Version No.: 02
Revision Date: 06/10/2009
Revision Description: Changed standard number from 4.01.030.01 to 4.020.01S and applied
the standard template. Made minor changes and format modifications.
Approval Date: 06/16/2009, Approved by T&AA


Disaster Recovery Standard 4.020.02S

Table of Contents
1. Introduction ..................................................................................................... 1
1.1 Preface ...........................................................................................................1
1.2 Objectives...........................................................................................................1
2. Backups .............................................................................................................. 1
2.1 The Backup Plan ................................................................................................1
2.2 What Must be Backed Up...................................................................................1
2.3 How Often and How Long Must the Backups be Kept.....................................1
2.4 Incremental Backups .........................................................................................2
2.5 Full Backups.......................................................................................................2
2.6 Offsite Storage ...................................................................................................2
2.7 Compatible Media...............................................................................................3
2.8 Care of Media......................................................................................................3
2.9 Document What is Being Done .........................................................................3
3. Contractor Check List ........................................................................................ 5
4. Compliancy Check List ...................................................................................... 6

Created on 04/30/2009 11:39 PM Page i Modified on 06/24/2009 11:25 PM


Disaster Recovery Standard 4.020.02S

1. Introduction
1.1 Preface
When a contractor works on an AASHTOWare Product/Project, AASHTO is investing both
time and money into this effort. Until a release point is reached, AASHTO has nothing in
hand to show for this investment. If a disaster were to happen at the contractor’s site, the
work the contractor had done on the project may be lost. This would be a loss to AASHTO in
both time and money invested to the point of the disaster. Therefore, it is not unreasonable
to expect the contractor to have steps in place that would protect AASHTO’s investment in
the event of a disaster.

1.2 Objectives
The objective of this standard is to present the minimum steps an AASHTOWare contractor
must take to safeguard AASHTO’s development investment in a Product/Project should a
disaster occur. These steps must be in place throughout the time the contractor is
developing new or maintaining present AASHTOWare software, as well as anytime the
contractor is working on Task Force directed assignments.

2. Backups
One of the best ways to guard against the loss of software is to have more than one copy of the
software at any point in time. While the software is in development it is changing daily, so copies
must be made daily. Backups to tape are one of the best and most cost-effective ways to save
multiple copies of software in development. Following are some of the items to be considered
when setting up backups.

2.1 The Backup Plan


To protect software in development from loss with backups, a good backup plan must be in
place. This plan needs to consider and address all the topics in this backup section as well
as any special circumstances the project may have. The plan must be documented in writing
and updated whenever things change. It must include the procedures, that is, the steps to be
followed to recover the production working environment at a backup site. Roles and responsibilities
also need to be clearly stated in the plan. The backup plan should also be reviewed on an
annual basis.

2.2 What Must be Backed Up


A primary reason for backing up data is to safeguard it against loss, so that the work will not
have to be redone. This means that all parts of the development effort must be backed up. This must
include, but is not restricted to the following:
■ Source code
■ All documentation for the system
■ Tools and development software used (have a copy available)
■ Test Scripts and test data
■ Databases used to document system processes. (Include problem report and test status
databases.)

2.3 How Often and How Long Must the Backups be Kept
At a minimum, backups must be done daily. Multiple copies at all levels of backups must be
kept and not released until the next level up has been completed. A typical example would
be:
■ Daily backups kept for 14 days.
■ Weekly backups kept for 5 weeks.

■ Monthly backups kept for 13 months.
■ Yearly backups kept for the life of the development effort.
The backup cycle must be documented in the Backup Plan.
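As an illustration only, the example cycle above can be sketched as a small routine that decides whether a backup at a given level is still within its retention window. The function name and the cutoff values are taken from the example figures in this section; they are assumptions for the sketch, not limits mandated by this standard.

```python
from datetime import date, timedelta

# Example retention windows from this section; an actual backup plan may differ.
RETENTION = {
    "daily": timedelta(days=14),         # daily backups kept for 14 days
    "weekly": timedelta(weeks=5),        # weekly backups kept for 5 weeks
    "monthly": timedelta(days=13 * 31),  # monthly backups kept for roughly 13 months
    "yearly": None,                      # yearly backups kept for the life of the effort
}

def should_retain(level: str, backup_date: date, today: date) -> bool:
    """Return True if a backup at the given level is still within its window."""
    window = RETENTION[level]
    if window is None:  # yearly backups are never released
        return True
    return today - backup_date <= window
```

A backup is released only when a newer backup at the next level up exists and the window has elapsed; the sketch covers only the window check, which is the part the example figures define.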

2.4 Incremental Backups


Full backups are usually best, but where data volumes are large and backup windows are
small, incremental daily backups may be done. This means only the data that has changed
that day is backed up. If a drive is lost on the seventh day of the backup cycle and a full
restore is needed, all six of the daily tapes as well as the last full backup will be needed,
applied in the right order. As with all backup tapes, incremental tapes need to be well
documented.
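The selection step behind an incremental backup can be sketched as follows. This is a hypothetical illustration, not a tool or procedure prescribed by this standard; it assumes file modification times are used to detect change, and the function and names are invented for the example.

```python
from datetime import datetime

def changed_since(mtimes: dict, last_backup: datetime) -> list:
    """Given a mapping of file path -> last-modified time, return the paths that
    changed after the previous backup and therefore belong in the incremental set."""
    return sorted(path for path, mtime in mtimes.items() if mtime > last_backup)
```

Each daily incremental set captures only what this selection returns, which is why a restore must replay the last full backup followed by every incremental set since, in order.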

2.5 Full Backups


Full backups are backups that copy all the data, not just data that has been changed in a
system. The advantage of doing a full backup daily is that if the system should need to be
restored it could be done from this one backup. There would be no problems making sure all
the right incremental backups were applied and in the right order. A full backup would be
needed in any case as a starting point for the restore.

2.6 Offsite Storage


One of the reasons for doing backups is to safeguard the work and investment being made
against loss in the event of a disaster. Disasters come in many forms: hardware failures,
fires, floods, or a disgruntled employee, to name a few. Storing a recent copy of the
system being worked on at another site is a good safeguard against loss should a disaster
happen at the main site. The offsite location should be picked carefully, keeping in mind
why the data is stored offsite: to safeguard it against loss should a disaster occur. Some of
the things that should be considered when picking an offsite location are listed here.
1. Distance from Main Site
The offsite storage location should not be located near the main site. If the disaster were
a fire or flood and the offsite location was next door or in the same block as the main
site, the offsite location might also be lost. The offsite location should be at a distance
great enough so that any disaster at the main site will not impact the offsite location. One
must also make sure that the data can easily be moved from the main site to the offsite
location and back.
2. Environment
The environment at the offsite storage location should be one suited for the storage of
the backup tapes. A damp warehouse basement must not be used as an offsite storage
location.
3. Kind of Location
One of the best places to store offsite backups is at a site set up specifically for this
function. This site would have a controlled environment and could be managed by a
company whose business is this type of offsite storage. Another office of the main
company might also be an option, if the location and environment were right.
Another location that is sometimes used is an employee’s home; and although it is better
than nothing, it is not recommended as the main offsite storage location. This site is
often near the main site and has no controlled environment. In addition, if the disaster is
a disgruntled employee and the backups are kept at that employee’s home, all could be
lost.

Created on 04/30/2009 11:39 PM Page 2 Modified on 06/24/2009 11:25 PM


Disaster Recovery Standard 4.020.02S
4. Frequency of Storage
Backups must be moved to the offsite storage location at least weekly. This should be
done more often if the development work is large with many changes happening daily.
This is not a complete list but does provide a starting point when picking a location for
offsite storage. The main thing to consider is that the offsite location should be safe and
accessible if a disaster should happen at the main site. Your planning should cover as many
types of disaster at the main site as possible.

2.7 Compatible Media


When one backs up a system for the purpose of disaster recovery, it is important to
consider the compatibility of the media being used for the backup. Putting the backup on
a tape that requires special equipment or special software to read is not desirable. The
special equipment or software may not be available, or easily obtained, at the site being
used to recover from a disaster at the main site.
It is best to pick a media, usually tape, and the equipment to read and write that media,
that are common to many sites and can easily be obtained from a market leader in supplying
this type of hardware. Picking the software that is used to write the data to the media is
also important, because it is best to have the same type of software to read the media
when restoring the data. Again, it is best to use backup software that is common to many
sites and can easily be obtained from a market leader in supplying this type of software.
One of the best ways to make sure your media is compatible is to take the media to your
backup location, or a location set up like one you could use as a backup site, and try to
restore the data. If you can restore from the backup tape and use the data at the backup
site, you have performed a good test of the media’s compatibility.

2.8 Care of Media


Once the backup of the system has been done, the media it is backed up on must be
properly cared for. The manufacturer of the media being used should have guidelines on the
care of its media. Some of the things these guidelines will cover and that the AASHTOWare
contractor must follow are:
1. Environment
The environment the media is stored in is important. Temperature, humidity, and air
quality are a few environmental factors that affect the life of the media being stored.
2. Handling
The media should be handled properly throughout its life. Transport to and from the
offsite storage location are important factors.
3. Age
The age of the media should be tracked; and when it passes the manufacturer’s
recommendation for media life, it must be replaced.
4. Usage
The use the media gets (the number of times the media has been written to and read from)
should be tracked; and when it passes the manufacturer’s usage recommendation, the
media must be replaced.
The manufacturer’s guidelines for the care of the media being used must be completely
followed to ensure the best chance of having readable media if it is needed.
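Tracking media age and usage against the manufacturer's limits, as required above, might be sketched like this. The limit values and the tape label are illustrative assumptions; real limits come from the media manufacturer's guidelines:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class BackupMedium:
    """One tracked piece of backup media, e.g., a single tape."""
    label: str
    purchased: date
    uses: int = 0                  # times written to or read from so far
    max_age_years: float = 5.0     # illustrative manufacturer limit
    max_uses: int = 200            # illustrative manufacturer limit

    def record_use(self) -> None:
        """Count one more write or read against the usage limit."""
        self.uses += 1

    def needs_replacement(self, today: date) -> bool:
        """True once the medium exceeds either the age or the usage limit."""
        age_years = (today - self.purchased).days / 365.25
        return age_years > self.max_age_years or self.uses > self.max_uses

tape = BackupMedium(label="WEEKLY-03", purchased=date(2003, 1, 15))
print(tape.needs_replacement(today=date(2009, 7, 1)))  # → True (past the age limit)
```

A record like this, kept for every medium, makes the replacement decision a routine check rather than a guess.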

2.9 Document What is Being Done


Documentation is an important part of your backups in your Disaster Recovery Plan. Should
a disaster happen at the main site and the system need to be restored at another site, not

only would the backup media be needed, but knowing what is on each of the media would also be
very useful. Knowing what is on each of the backups, which is the latest, and what might not
be on the backups, will all help shorten the time it takes to recover a system. To ensure that
this documentation is available when recovering from a disaster, it must also be stored
offsite.
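As a sketch of the kind of documentation this section calls for, a simple machine-readable catalog could record what is on each medium and identify the latest backup. All labels, dates, and contents below are hypothetical:

```python
# A minimal catalog of what is on each backup medium.  Storing a copy of
# this documentation offsite answers the key recovery questions: what is
# on each medium, and which backup is the latest.
catalog = [
    {"label": "FULL-2009-06-21", "type": "full",
     "date": "2009-06-21", "contents": ["source", "documentation", "databases"]},
    {"label": "INC-2009-06-22", "type": "incremental",
     "date": "2009-06-22", "contents": ["source"]},
]

def latest_backup(entries):
    """Return the catalog entry with the most recent backup date.

    ISO-formatted date strings (YYYY-MM-DD) sort chronologically,
    so a plain string comparison is enough here.
    """
    return max(entries, key=lambda e: e["date"])

print(latest_backup(catalog)["label"])  # → INC-2009-06-22
```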


3. Contractor Check List


The Contractor Check List must be completed by the contractor and sent to the Task Force for
their review.

Contractor Check List


Is there a written backup plan? (yes/no)
What is being backed up
Source code (yes/no)
Project/Product documentation (yes/no)
Test scripts and test data (yes/no)
Databases used for the project/product (yes/no)
At a minimum, are daily backups of changed data being done (yes/no)
Are weekly, monthly, and yearly backups of all data being done (yes/no)
Is the media and software being used common, a market leader (yes/no)
List type and brand of media:
___________________________________________________
What software is being used for the backups:
___________________________________________________
Is the media being tracked for age and use (yes/no)
Offsite Storage
Is backup media being stored offsite (yes/no)
Distance main site is from offsite storage location
Does offsite storage location have a controlled environment for
media storage (yes/no)
Location of offsite storage:
_____________________________________________
How often is media taken to offsite location
Is the documentation of what is being stored on what media
being stored offsite as well (yes/no)


4. Compliancy Check List


The Compliancy Check List, shown on the next page, is the Contractor Check List with the
minimum level for compliancy written in italic. The Task Force may take this check list and
compare it with the Contractor Check List (made out by the contractor) to see if the contractor is
in compliance with this standard.

Compliancy Check List


Is there a written backup plan? (yes/no) YES
What is being backed up
Source code (yes/no) YES
Project/Product documentation (yes/no) YES
Test scripts and test data (yes/no) YES
Databases used for the project/product (yes/no) YES
At a minimum, are daily backups of changed data being done (yes/no) YES
Are weekly, monthly, and yearly backups of all data being done (yes/no) YES
Is the media and software being used common, a market leader (yes/no) YES
List type and brand of media:
(A market leading brand media must be listed here)
What software is being used for the backups:
(A market leading brand software must be listed here)
Is the media being tracked for age and use (yes/no) YES
Offsite Storage
Is backup media being stored offsite (yes/no) YES
Distance main site is from offsite storage location MILES (see note)
Does offsite storage location have a controlled environment for YES
media storage (yes/no)
Location of offsite storage:
(Offsite storage location must be listed here)
How often is media taken to offsite location WEEKLY
Is the documentation of what is being stored on what media YES
being stored offsite as well (yes/no)

Note: Best distance will depend on location and type of controlled environment. In most cases it must be
at least a few miles from the main site.



5. Appendices
This page is intentionally blank.
AASHTOWARE LIFECYCLE
FRAMEWORK (ALF)
S&G Number: 5.010.02R
Effective Date: July 1, 2009

Document History
Version No.: 02
Revision Date: 06/16/2009
Revision Description: Replaces AASHTOWare Lifecycle Framework Process Areas and Work Products documents (1.01.G01.01 and 1.01.G02.01).
Approval Date: 06/16/2009, Approved by T&AA
AASHTOWare Lifecycle Framework (ALF) 05.010.02R

Table of Contents

1. Purpose ............................................................................................................... 1
2. Overview of ALF and CMMI-DEV....................................................................... 1
2.1 Process Areas.............................................................................................................2
2.2 Related Process Areas...............................................................................................2
2.3 Process Area Categories ...........................................................................................2
2.4 List of Categories and Process Areas.......................................................................2
2.5 Specific Goals.............................................................................................................3
2.6 Specific Practices.......................................................................................................3
2.7 Typical Work Products...............................................................................................4
2.8 Generic Goals .............................................................................................................4
2.9 Generic Practices .......................................................................................................4
2.10 Staged and Continuous Representation...................................................................4
2.11 Capability Levels ........................................................................................................5
2.12 AASHTOWare Implementation of Process Areas.....................................................6
3. Generic Goals and Practices ............................................................................. 6
3.1 GG 1: Achieve Specific Goals....................................................................................6
3.2 GG 2: Institutionalize a Managed Process................................................................6
3.3 GG 3: Institutionalize a Defined Process ..................................................................9
3.4 Applying Generic Practices .......................................................................................9
3.5 Process Areas That Support Generic Practices.......................................................9
4. Process Area Descriptions .............................................................................. 11
4.1 Organizational Process Focus ................................................................................11
4.2 Organizational Process Definition...........................................................................13
4.3 Organizational Training............................................................................................14
4.4 Project Planning .......................................................................................................15
4.5 Project Monitoring and Control ...............................................................................17
4.6 Supplier Agreement Management ...........................................................................19
4.7 Requirements Development ....................................................................................20
4.8 Requirements Management .....................................................................................22
4.9 Technical Solution....................................................................................................23
4.10 Product Integration...................................................................................................25
4.11 Verification................................................................................................................27
4.12 Validation ..................................................................................................................29
4.13 Configuration Management .....................................................................................30
4.14 Process and Product Quality Assurance................................................................31
4.15 Measurement and Analysis......................................................................................32
4.16 Advanced Process Areas.........................................................................................34

Page i 06/16/2009

1. Purpose
The AASHTOWare Lifecycle Framework (ALF) was developed to:
● Improve the AASHTOWare software development and maintenance processes and,
subsequently, improve AASHTOWare products.
● Provide a framework for creating AASHTOWare process improvement projects. These
projects will involve the development of new standards and guidelines and the revision of
existing standards and guidelines that are based on goals and practices within the
framework.
● Recommend typical work products that should be created to support each standard or
guideline based on ALF. These work products are the recommended output or results that
should be created when implementing the practices defined by each standard and guideline.
● Provide a method for mapping the AASHTOWare standards and guidelines against the
framework and for reporting the status of process improvement projects.
● Provide a method for measuring improvement in AASHTOWare processes.
Process improvement projects will normally involve the development of standard processes that
implement specific practices with required outcomes or work products; therefore, most of these
projects will involve the development of new standards or the revision of existing standards.
Guidelines may also be developed in those cases where AASHTOWare management
determines that it’s best to implement the process as recommended practices rather than as a
requirement. In addition, AASHTOWare may choose to implement certain processes as a
guideline for an evaluation period with a future goal of implementing the processes as a
standard.
It should be noted, that additional standards and guidelines will be developed and maintained
independent of AASHTOWare Lifecycle Framework. These standards and guidelines typically
involve technical specifications or requirements for AASHTOWare software development and
maintenance. The process to develop and maintain the standards and guidelines is defined in
the AASHTOWare Standards and Guidelines Definition Process (ASGD) which is included in
the AASHTOWare Standards and Guidelines Notebook.

2. Overview of ALF and CMMI-DEV


The AASHTOWare Lifecycle Framework (ALF) is based on the Capability Maturity Model
Integration for Development (CMMI-DEV) which was developed by the Software Engineering
Institute (SEI) of Carnegie Mellon University. The CMMI-DEV model consists of best practices
that address development and maintenance activities that cover the product lifecycle from
conception through delivery and maintenance.
The current version of ALF is based on version 1.2 of CMMI-DEV. The complete
documentation for CMMI-DEV, V1.2 is available on the SEI web site at the following address:
http://www.sei.cmu.edu/publications/documents/06.reports/06tr008.html. Much of the content in
this document was extracted from the CMMI-DEV V1.2 document. It should be noted that ALF
does not include the additional requirements for the CMMI-DEV+IPPD model which is also
described in the CMMI-DEV V1.2 document.
The SEI has taken the process management premise that “the quality of a system or product is
highly influenced by the quality of the process used to develop and maintain it.” The CMMI-DEV
model was created to embrace this premise. This premise is the primary reason AASHTOWare
has chosen CMMI-DEV as the basis for improving its software development and maintenance
processes.


2.1 Process Areas


A process area is a cluster of related practices in an area that, when implemented
collectively, satisfy a set of goals considered important for making improvement in that area.
The ALF model currently includes the fifteen process areas from the CMMI-DEV V1.2 model
that are classified as “Basic”. Refer to the next section for more information.
Each process area in the framework also includes information on related process areas,
specific goals and practices, and typical work products, as well as generic goals and
practices that apply to multiple process areas. Each of these topics is discussed below.

2.2 Related Process Areas


There are certain interactions among process areas that help to show an organization’s view
of process improvement and which process areas build on the implementation of
other process areas. Relationships among process areas are presented in two dimensions.
The first dimension comprises the interactions of individual process areas that show how
information and artifacts flow from one process area to another. These interactions help to
see a larger view of process improvement. The second dimension comprises the
interactions of groups of process areas. Process areas are classified as either “Basic” or
“Advanced”. The “Basic” process areas should be implemented before the “Advanced”
process areas to ensure that the prerequisites are met to successfully implement the
“Advanced” process areas.
The “Process Area Descriptions” section includes a table of related process areas for each
process area described. An example of a related process area for the “Requirements
Management” process area is the “Project Planning” process area which provides additional
information about how project plans reflect requirements and need to be revised as
requirements change.

2.3 Process Area Categories


Process areas can also be grouped into groups of related process areas under the four
categories listed below:
■ Process Management process areas contain the cross-project activities related to
defining, planning, deploying, implementing, monitoring, controlling, appraising,
measuring, and improving processes.
■ Project Management process areas cover the project management activities related to
planning, monitoring, and controlling the project.
■ Software Engineering process areas cover the development and maintenance activities
that are shared across software engineering disciplines. Software engineering includes
the requirements development, requirements management, technical solution, product
integration, verification, and validation process areas.
■ Support process areas cover the activities that support product development and
maintenance. The Support process areas address processes that are used in the
context of performing other processes. In general, the Support process areas address
processes that are targeted toward the project and may address processes that apply
more generally to the organization. For example, the “Process and Product Quality
Assurance” process area can be used with all the process areas to provide an objective
evaluation of the processes and work products described in all the process areas.

2.4 List of Categories and Process Areas


In the following table, each process area is listed with its category and its
classification of “Basic” or “Advanced”. For the time being, development of process areas
under the AASHTOWare Lifecycle Framework (ALF) will be limited to the “Basic” process
areas. Refer to the CMMI-DEV V1.2 document for additional information on the
relationships among process areas.
Categories / Process Areas Basic / Advanced
Process Management
Organizational Process Focus (OPF) Basic
Organizational Process Definition (OPD) Basic
Organizational Training (OT) Basic
Organizational Process Performance (OPP) Advanced
Organizational Innovation and Deployment (OID) Advanced
Project Management
Project Planning (PP) Basic
Project Monitoring and Control (PMC) Basic
Supplier Agreement Management (SAM) Basic
Integrated Project Management (IPM) Advanced
Risk Management (RSKM) Advanced
Quantitative Project Management (QPM) Advanced
Software Engineering
Requirements Management (REQM) Basic
Requirements Development (RD) Basic
Technical Solution (TS) Basic
Product Integration (PI) Basic
Verification (VER) Basic
Validation (VAL) Basic
Support
Configuration Management (CM) Basic
Process and Product Quality Assurance (PPQA) Basic
Measurement and Analysis (MA) Basic
Decision Analysis and Resolution (DAR) Advanced
Causal Analysis and Resolution (CAR) Advanced

2.5 Specific Goals


A specific goal describes the unique characteristics that must be present to satisfy the
process area. A specific goal is used in appraisals to help determine whether a process
area is satisfied. An example of a specific goal from the “Requirements Management”
process area is: “Requirements are managed and inconsistencies with project plans and
work products are identified”. The “Process Area Descriptions” section below provides a list of the
specific goals for each ALF process area.

2.6 Specific Practices


A specific practice is the description of an activity that is considered important in achieving
the associated specific goal. The specific practices describe the activities that are expected
to result in achievement of the specific goals of a process area. A specific practice is an
expected model component. An example of a specific practice from the “Requirements
Management” process area is: “Maintain bidirectional traceability among the requirements
and work products”. The specific will be implemented as procedures in each standard or
guideline that is based on ALF. If needed for clarity or simplicity, the procedures will be
divided into lower level activities and tasks. The “ALF Process Area” section below provides
a list of the specific practices for each specific goal in the process area.


2.7 Typical Work Products


Most specific practices include one or more typical work products. These are typical outputs
or results from the specific practice. An example of a typical work product from the
“Maintain bidirectional traceability among the requirements and work products” specific
practice in the “Requirements Management” process area is the “Requirements Traceability
Matrix”.
Each process area in the “Process Area Descriptions” section below includes a list of Typical Work
Products. These represent potential outcomes or work products that should be considered
when developing new and revised standards and guidelines.

2.8 Generic Goals


A generic goal applies to multiple process areas, and describes the characteristics that must
be present to institutionalize the processes that implement a process area. As with a
specific goal, a generic goal is used in appraisals to determine whether a process area is
satisfied. An example of a generic goal is: “The process is institutionalized as a defined
process”. The “Generic Goals and Practices” section below provides a description of each
ALF generic goal.

2.9 Generic Practices


A generic practice applies to multiple process areas, and describes an activity that is
considered important in achieving the associated generic goal. An example generic practice
for the generic goal “The process is institutionalized as a managed process” is “Provide
adequate resources for performing the process, developing the work products, and
providing the services of the process.” The “Generic Goals and Practices” section below
provides a description of the generic practices for each generic goal.

2.10 Staged and Continuous Representation


Levels are used in CMMI to describe an evolutionary path recommended for an organization
that wants to improve the processes it uses to develop and maintain its products and
services. Levels can also be the outcome of the rating activity of appraisals. Appraisals can
be performed for organizations that comprise entire (usually small) companies, or for smaller
groups such as a group of projects or a division within a company. CMMI-DEV enables an
organization to approach process improvement and appraisals using two different
representations: continuous and staged.
2.10.1 Staged Representation
The staged representation is concerned with the overall maturity level of the
organization; whether individual processes are performed or incomplete is not the
primary focus. It prescribes an order for implementing process areas according to
maturity levels, which define the improvement path for an organization from the initial
level to the optimizing level.
If you do not know where to start and which processes to choose to improve, the staged
representation is a good choice for you. It gives you a specific set of processes to
improve at each stage that has been determined through more than a decade of
research and experience with process improvement.
2.10.2 Continuous Representation
The continuous representation offers maximum flexibility when using a CMMI model for
process improvement. An organization may choose to improve the performance of a
single process-related trouble spot, or it can work on several areas that are closely
aligned to the organization’s business objectives. The continuous representation also
allows an organization to improve different processes at different rates. There are some
limitations on an organization’s choices because of the dependencies among some process
areas.
The continuous representation is concerned with selecting both a particular process area
to improve and the desired capability level for that process area. It has the same
process areas as staged but provides more flexibility for picking the development of
process areas in an order that fits business needs. The continuous representation uses
capability levels to characterize improvement in an individual process area and was
chosen as the best fit for ALF.
Refer to the “Tying It All Together” chapter in the CMMI-DEV V1.2 document for more
information on staged and continuous representations.

2.11 Capability Levels


Capability levels are used to support those using the continuous representation of the
CMMI-DEV model. A capability level consists of a generic goal and its related generic
practices as they relate to a process area, which can improve the organization’s processes
associated with that process area. As you satisfy the generic goal and its generic practices
at each capability level, you reap the benefits of process improvement for that process area.
CMMI-DEV includes six capability levels, designated by the numbers 0 through 5; however,
ALF will only include capability levels 0-3. Each process area in ALF may be developed to a
higher capability level independently of the other process areas in the framework. Initial
development of standard processes will be to capability level 1 and none will be developed
beyond capability level 3. The ALF capability levels are defined below:
■ Capability Level 0: Incomplete. One or more of the specific goals of the process area
are not satisfied.
■ Capability Level 1: Performed. The process satisfies the specific goals and specific
practices of the process area; however, it is not institutionalized.
■ Capability Level 2: Managed. The process which was performed at capability level 1
becomes a managed process when:
○ There is a policy that indicates the process will be performed,
○ It is planned and executed in accordance with policy,
○ There are resources provided to support and implement the process and produce the
required work products,
○ Training is provided on how to perform the process,
○ The process and work products are monitored, controlled, and reviewed, and
○ The process and work products are evaluated for adherence to the standard
process.
■ Capability Level 3: Defined. The process which was managed at capability level 2
becomes a defined process when:
Tailoring guidelines are established that allow a specific project to customize the
standard process to suit the needs of that particular project. This allows consistency,
except for the differences allowed by the tailoring guidelines.
○ The process contributes work products, measures, and other process improvement
information to the organizational process assets.
○ The process clearly states the purpose, inputs, entry criteria, activities, roles,
measures, verification steps, outputs, and exit criteria. At capability level 3,
processes are managed more proactively using an understanding of the
interrelationships of the process activities and detailed measures of the process, its
work products, and its services.
Refer to the “Tying It All Together” chapter in the CMMI-DEV V1.2 document for more
information on capability levels.
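The level definitions above follow a simple rule: a process area is at the highest level n for which generic goals 1 through n are all satisfied, and at level 0 (incomplete) otherwise. A minimal sketch of that rule, purely as an illustration of the definition and not an appraisal tool:

```python
def capability_level(satisfied_goals: set[int]) -> int:
    """Return the ALF capability level (0-3) for a process area.

    satisfied_goals holds the numbers of the generic goals (1, 2, 3)
    that the process area satisfies.  A level is reached only when
    that goal and every goal below it are satisfied; a gap lower in
    the sequence caps the level there.
    """
    level = 0
    for goal in (1, 2, 3):
        if goal in satisfied_goals:
            level = goal
        else:
            break
    return level

print(capability_level({1, 2}))  # → 2, a managed process
```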

2.12 AASHTOWare Implementation of Process Areas


As discussed above, AASHTOWare will implement process improvement through
developing and implementing standards and guidelines that are based on a framework of
CMMI-DEV process areas, goals, and practices, as well as other industry process
improvement best practices, where appropriate.
The implementation of ALF will be accomplished over multiple years by incrementally
developing new standards and guidelines and by revising existing standards and guidelines.
Each standard or guideline will be based on the process improvement goals in ALF and will
include procedures that implement the ALF practices. A single ALF standard may include
the practices from a single process area or from multiple process areas. The time line to
develop and implement the ALF-based standards will be planned and scheduled through the
AASHTOWare strategic plans and annual work plans.
Although the goal will be to implement standards for all process areas, goals, and
practices, AASHTOWare management recognizes that this may not be possible within the
constraints of an organization that is primarily composed of part-time, volunteer employees.
Due to these constraints, in some cases, certain practices may not be included in initial
process implementations and others may never be implemented. In addition, certain
process areas may never be implemented. As discussed previously, ALF will initially only
address the process areas in the Basic classification.
Detailed descriptions of each ALF process area are included in the “Process Area
Descriptions” section below.

3. Generic Goals and Practices


This section describes generic goals one through three and the generic practices for each of
these goals. Each goal includes a number GG n followed by a title of the goal. The text of the
goal follows the goal number and title in italicized text.
As discussed previously, to achieve capability level one for a process area, all generic practices
for goal one must be met. Capability level two is achieved by meeting all of the generic
practices for goal two, and capability level three by satisfying the generic practices for
goal three.

3.1 GG 1: Achieve Specific Goals


The process supports and enables achievement of the specific goals of the process area by
transforming identifiable input work products to produce identifiable output work products.
To achieve capability level one for a process area, the following practice must be performed
for that process area.
3.1.1 GP 1.1: Perform Specific Practices
Perform the specific practices of the process area to develop work products and provide
services to achieve the specific goals of the process area.
This practice is performed by producing the work products and delivering the services
that are defined for the process area. For example, by performing the specific practices
in the Project Management process area and by producing the recommended work
products, this general practice is satisfied.

3.2 GG 2: Institutionalize a Managed Process


The process is institutionalized as a managed process.
Achieving capability level two for a process area is equivalent to saying you manage the
performance of processes associated with the process area. To achieve capability level two

Page 6 06/16/2009
AASHTOWare Lifecycle Framework (ALF) 05.010.02R

for a process area, the work products for that process area must be produced, as in level
one, and all of the practices listed below must be performed.
3.2.1 GP 2.1: Establish an Organizational Policy
Establish and maintain an organizational policy for planning and performing the process.
This generic practice is performed for a process area when AASHTOWare implements a
standard or policy that requires the practices defined in the process area to be planned
and performed. For example, the Requirements Standard defines organizational
procedures and required work products that must be planned, created, submitted, and
approved for the Requirements Development and Requirements Management process
areas.
3.2.2 GP 2.2: Plan the Process
Establish and maintain the plan for performing the process.
This generic practice is performed for a process area when the project/product task force
or contractor plans the tasks and work products for that process area in the project plan,
work plan, or another planning document. An example of this is including tasks to
develop, submit, and obtain approval for the System Requirements Specification in the
work plan. Another example is to plan the configuration management activities and work
products as a component of the work plan or as a separate configuration management
plan.
3.2.3 GP 2.3: Provide Resources
Provide adequate resources for performing the process, developing the work products,
and providing the services of the process.
Resources include adequate funding, appropriate physical facilities, skilled people, and
appropriate tools. Examples include the following:
○ Skilled staff: Project management, quality assurance, configuration management,
database management, system analysis, software development, subject matter experts,
etc.
○ Tools: Project management and scheduling, configuration management, problem
tracking, software development, prototyping, process modeling, database
management, testing, requirements tracking, etc.
3.2.4 GP 2.4: Assign Responsibility
Assign responsibility and authority for performing the process, developing the work
products, and providing the services of the process.
Examples would be assigning staff to perform configuration management and quality
assurance processes.
3.2.5 GP 2.5: Train People
Train the people performing or supporting the process as needed.
Examples of training topics include the following:
○ Planning, managing, and monitoring projects
○ Change management
○ Configuration management
○ Process modeling
○ Risk management
○ Data management
○ Requirements definition and analysis

○ Design methods
○ Testing
3.2.6 GP 2.6: Manage Configurations
Place designated work products of the process under appropriate levels of control.
Examples of work products that should be placed under control include the following:
○ Project plans
○ Organization’s set of standard processes
○ Work breakdown structures and project schedules
○ Status reports
○ Change requests
○ Quality assurance reports
○ User and system requirements
○ Requirements traceability matrix
○ System design documents
○ Code, build scripts, and installation scripts
○ Test plans, scripts, and test results
○ User, installation, operation, and maintenance documentation
○ Training materials
○ Deliverable submittal and acceptance documentation
3.2.7 GP 2.7: Identify and Involve Relevant Stakeholders
Identify and involve the relevant stakeholders of the process as planned.
Examples of stakeholder involvement include stakeholders reviewing work plans;
stakeholders participating in requirements collection, review, and validation;
stakeholders participating in problem or issue resolutions; and stakeholders participating
in testing activities.
3.2.8 GP 2.8: Monitor and Control the Process
Monitor and control the process against the plan for performing the process and take
appropriate corrective action.
An example is to monitor and control the schedule and budget against the project plan
and take appropriate corrective action. Another example is to monitor and control
the process used for requirements changes against the plan for performing the change
control process and take appropriate corrective action. Monitoring and controlling the
test process against the test plan is another example.
3.2.9 GP 2.9: Objectively Evaluate Adherence
Objectively evaluate adherence of the process against its process description,
standards, and procedures, and address noncompliance.
Examples are objectively evaluating processes and work products against the
Requirements and Testing Standards and tracking and communicating noncompliance
issues.
3.2.10 GP 2.10: Review Status with Higher Level Management
Review the activities, status, and results of the process with higher level management
and resolve issues.
Examples include reviewing the status of process improvement projects with SCOJD,
and reviewing the results of a pilot process with SCOJD.

3.3 GG 3: Institutionalize a Defined Process


The process is institutionalized as a defined process.
3.3.1 GP 3.1: Establish a Defined Process
Establish and maintain the description of a defined process.
An example is to define and maintain AASHTOWare standards that define organizational
processes for Project Management, Requirements, and Testing. These would be standards
that the project/product task forces and contractors are required to comply with. Defined
tailoring methods would allow project-specific modifications to the standards.
3.3.2 GP 3.2: Collect Improvement Information
Collect work products, measures, measurement results, and improvement information
derived from planning and performing the process to support the future use and
improvement of the organization’s processes and process assets.
Examples of work products, measures, measurement results, and improvement
information include the following:
○ Records of significant deviations from plans
○ Corrective action results
○ Estimated costs versus actual costs
○ Quality assurance report that identifies areas for improvement
○ Number of requirements introduced at each phase of the project lifecycle
○ Number of unfunded requirements changes after baselining
○ Lessons learned reports
○ Results of applying new methods and tools
○ Number of product defects found during each testing phase

3.4 Applying Generic Practices


Generic practices are components that are common to all process areas. Think of generic
practices as reminders. They serve the purpose of reminding you to do things right, and are
expected model components.
For example, when you are achieving the specific goals of the Project Planning process
area, you are establishing and maintaining a plan that defines project activities. One of the
generic practices that applies to the Project Planning process area is “Establish and
maintain the plan for performing the project planning process” (GP 2.2). When applied to
this process area, this generic practice reminds you to plan the activities involved in creating
the plan for the project.
When you are satisfying the specific goals of the Organizational Training process area, you
are developing the skills and knowledge of people in your project and organization so that
they can perform their roles effectively and efficiently. When applying the same generic
practice (GP 2.2) to the Organizational Training process area, this generic practice reminds
you to plan the activities involved in developing the skills and knowledge of people in the
organization.

3.5 Process Areas That Support Generic Practices


While generic goals and generic practices are the model components that directly address
the institutionalization of a process across the organization, many process areas likewise
address institutionalization by supporting the implementation of the generic practices.
Knowing these relationships will help you effectively implement the generic practices.

Such process areas contain one or more specific practices that, when implemented, may also
fully implement a generic practice or generate a work product that is used in the
implementation of a generic practice. The following types of relationships between generic
practices and process areas occur:
■ The process areas that support the implementation of generic practices
■ The recursive relationships between generic practices and their closely related process areas
Both types of relationships are important to remember during process improvement to take
advantage of the natural synergies that exist between the generic practices and their related
process areas. Given the dependencies that generic practices have on these process
areas, and given the more “holistic” view that many of these process areas provide, these
process areas are often implemented early, in whole or in part, before or concurrent with
implementing the associated generic practices.
To support meeting the generic practices of generic goal 2 and achieving Capability Level 2,
the following process areas should be considered for early implementation:
■ Organizational Training
■ Project Planning
■ Project Monitoring and Control
■ Integrated Project Management (Advanced)
■ Configuration Management
■ Measurement and Analysis
■ Process and Product Quality Assurance
To support meeting the generic practices of generic goal 3 and achieving Capability Level 3,
the following process areas should be considered for early implementation:
■ Organizational Process Focus
■ Organizational Process Definition
■ Integrated Project Management (Advanced)
Refer to the “Process Areas That Support Generic Practices” section in “Part Two - Generic
Goals and Generic Practices, and the Process Areas” in the CMMI-DEV V1.2 document for
more information on this topic.

4. Process Area Descriptions


This section provides a description of each of the ALF process areas, including the purpose,
related process areas, specific goals, specific practices, and typical work products. Each ALF
specific goal and practice includes the same number, title, and description as used in CMMI-
DEV V1.2. As discussed previously, each process area from the CMMI-DEV V1.2 model is
included in the ALF framework; however, only those classified as Basic will be addressed
initially. Those process areas that are classified as Advanced will not be addressed in the
foreseeable future, and will only be listed with their purpose. Additional details regarding each
process area, including those classified as Advanced, can be found in the “Generic Goals and
Generic Practices, and the Process Areas” chapter in the CMMI-DEV V1.2 document.
Each Basic process area description also includes a reference to the current standards,
guidelines, policies, or procedures that require or recommend the implementation of the
practices in the process area. Future implementations are also noted in general terms.

4.1 Organizational Process Focus


The purpose of Organizational Process Focus (OPF) is to plan, implement, and deploy
organizational process improvements based on a thorough understanding of the current
strengths and weaknesses of the organization’s processes and process assets.
AASHTOWare’s set of processes includes the Standards and Guidelines Notebook, the
Cooperative Computer Software Policies, Guidelines and Procedures (PG&P), and the
AASHTOWare Project/Product Task Force Handbook. The process assets include the
various standards, guidelines, the ALF framework, Lifecycle Models, policies, procedures,
Groove workspaces, templates, work products, QA evaluation reports, and other artifacts
and tools used to plan, develop, implement, manage, and improve the set of processes.
This process area is currently supported by the AASHTOWare Strategic Planning Process,
the AASHTOWare Standards and Guidelines Definition Standard, and the Quality
Assurance Standard. Organizational Process Focus will be further addressed in the future by
a new or revised standard, guideline, policy, or procedure.
4.1.1 Related Process Areas
Process Area Related Topic
Organizational Process Definition Organizational process assets

4.1.2 Specific Goals and Practices


Specific Goals and Practices
SG 1: Determine Process Improvement Opportunities
Strengths, weaknesses, and improvement opportunities for the organization's processes are
identified periodically and as needed.
SP 1.1: Establish Organizational Process Needs
Establish and maintain the description of the process needs and objectives for the
organization.
Typical Work Products
• Organization’s process needs and objectives
SP 1.2: Appraise the Organization’s Processes
Appraise the organization's processes periodically and as needed to maintain an
understanding of their strengths and weaknesses.
Typical Work Products
• Plans for the organization's process appraisals
• Appraisal findings that address strengths and weaknesses of the organization's processes
• Improvement recommendations for the organization's processes

SP 1.3: Identify the Organization's Process Improvements
Identify improvements to the organization's processes and process assets.
Typical Work Products
• Analysis of candidate process improvements
• Identification of improvements for the organization's processes
SG 2: Plan and Implement Process Improvements
Process actions that address improvements to the organization’s processes and process assets
are planned and implemented.
SP 2.1: Establish Process Action Plans
Establish and maintain process action plans to address improvements to the organization's
processes and process assets.
Typical Work Products
• Organization's approved process action plans
SP 2.2: Implement Process Action Plans
Implement process action plans.
Typical Work Products
• Commitments among the various process action teams
• Status and results of implementing process action plans
• Plans for pilots
SG 3: Deploy Organizational Process Assets and Incorporate Lessons Learned
The organizational process assets are deployed across the organization and process-related
experiences are incorporated into the organizational process assets.
SP 3.1: Deploy Organizational Process Assets
Deploy organizational process assets across the organization.
Typical Work Products
• Plans for deploying organizational process assets and changes to them across the
organization
• Training materials for deploying organizational process assets and changes to them
• Documentation of changes to organizational process assets
• Support materials for deploying organizational process assets and changes to them
SP 3.2: Deploy Standard Processes
Deploy the organization’s set of standard processes to projects at their startup and deploy
changes to them as appropriate throughout the life of each project.
Typical Work Products
• Organization's list of projects and status of process deployment on each project (i.e., existing
and planned projects)
• Guidelines for deploying the organization’s set of standard processes on new projects
• Records of tailoring the organization’s set of standard processes and implementing them on
identified projects
SP 3.3: Monitor Implementation
Monitor the implementation of the organization’s set of standard processes and use of process
assets on all projects.
Typical Work Products
• Results of monitoring process implementation on projects
• Status and results of process-compliance evaluations
• Results of reviewing selected process artifacts created as part of process tailoring and
implementation
SP 3.4: Incorporate Process-Related Experiences into the Organizational Process
Assets
Incorporate process-related work products, measures, and improvement information derived
from planning and performing the process into the organizational process assets.

Typical Work Products
• Process improvement proposals
• Process lessons learned
• Measurements on the organizational process assets
• Improvement recommendations for the organizational process assets
• Records of the organization's process improvement activities
• Information on the organizational process assets and improvements to them

4.2 Organizational Process Definition


The purpose of Organizational Process Definition (OPD) is to establish and maintain a
usable set of organizational process assets and work environment standards.
This process area is currently supported by the AASHTOWare Standards and Guidelines
Definition Standard.
4.2.1 Related Process Areas
Process Area Related Topic
Organizational Process Focus Organizational process-related matters

4.2.2 Specific Goals and Practices


Specific Goals and Practices
SG 1: Establish Organizational Process Assets
A set of organizational process assets is established and maintained.
SP 1.1: Establish Standard Processes
Establish and maintain the organization's set of standard processes.
Typical Work Products
• Organization's set of standard processes
SP 1.2: Establish Life-Cycle Model Descriptions
Establish and maintain descriptions of the life-cycle models approved for use in the
organization.
Typical Work Products
• Descriptions of lifecycle models
SP 1.3: Establish Tailoring Criteria and Guidelines
Establish and maintain the tailoring criteria and guidelines for the organization's set of standard
processes.
Typical Work Products
• Tailoring guidelines for the organization's set of standard processes
SP 1.4: Establish the Organization’s Measurement Repository
Establish and maintain the organization’s measurement repository.
Typical Work Products
• Definition of the common set of product and process measures for the organization’s set of
standard processes
• Design of the organization’s measurement repository
• Organization's measurement repository (that is, the repository structure and support
environment)
• Organization’s measurement data
SP 1.5: Establish the Organization’s Process Asset Library
Establish and maintain the organization's process asset library.

Typical Work Products
• Design of the organization’s process asset library
• Organization's process asset library
• Selected items to be included in the organization’s process asset library
• Catalog of items in the organization’s process asset library
SP 1.6: Establish Work Environment Standards
Establish and maintain work environment standards.
Typical Work Products
• Work environment standards

4.3 Organizational Training


The purpose of Organizational Training (OT) is to develop the skills and knowledge of
people so they can perform their roles effectively and efficiently.
This process area will be addressed in the future by a new or revised standard, guideline,
policy, or procedure.
4.3.1 Related Process Areas
Process Area Related Topic
Organizational Process Definition Organization’s process assets
Project Planning Specific training needs identified by projects
Decision Analysis and Resolution Applying decision-making criteria when
determining training approaches

4.3.2 Specific Goals and Practices


Specific Goals and Practices
SG1: Establish an Organizational Training Capability
A training capability, which supports the organization's management and technical roles, is
established and maintained.
SP 1.1: Establish the Strategic Training Needs
Establish and maintain the strategic training needs of the organization.
Typical Work Products
• Training needs
• Assessment analysis
SP 1.2: Determine Which Training Needs Are the Responsibility of the Organization
Determine which training needs are the responsibility of the organization and which will be
left to the individual project or support group.
Typical Work Products
• Common project and support group training needs
• Training commitments
SP 1.3: Establish an Organizational Training Tactical Plan
Establish and maintain an organizational training tactical plan.
Typical Work Products
• Organizational training tactical plan
SP 1.4: Establish Training Capability
Establish and maintain training capability to address organizational training needs.
Typical Work Products
• Training materials and supporting artifacts

SG 2: Provide Necessary Training
Training necessary for individuals to perform their roles effectively is provided.
SP 2.1: Deliver Training
Deliver the training following the organizational training tactical plan.
Typical Work Products
• Delivered training course
SP 2.2: Establish Training Records
Establish and maintain records of the organizational training.
Typical Work Products
• Training records
• Training updates to the organizational repository
SP 2.3: Assess Training Effectiveness
Assess the effectiveness of the organization’s training program.
Typical Work Products
• Training-effectiveness surveys
• Training program performance assessments
• Instructor evaluation forms
• Training examinations

4.4 Project Planning


The purpose of Project Planning (PP) is to establish and maintain plans that define project
activities.
This process area is currently supported by the PG&P and Task Force Handbook. Project
Planning will be further addressed in the future by a new or revised standard, guideline,
policy, or procedure.
4.4.1 Related Process Areas
Process Area Related Information
Requirements Development Developing requirements that define the
product and product components
Requirements Management Managing requirements needed for planning
and re-planning
Risk Management Identifying and managing risks
Technical Solution Transforming requirements into product and
product component solutions

4.4.2 Specific Goals and Practices


Specific Goals and Practices
SG 1: Establish Estimates
Estimates of project planning parameters are established and maintained.
SP 1.1: Estimate the Scope of the Project
Establish a top-level work breakdown structure (WBS) to estimate the scope of the project.
Typical Work Products
• Task descriptions
• Work package descriptions
• Work breakdown structure
SP 1.2: Establish Estimates of Work Product and Task Attributes
Establish and maintain estimates of the attributes of the work products and tasks.

Typical Work Products
• Technical approach
• Size and complexity of tasks and work products
• Estimating models
• Attribute estimates
SP 1.3: Define Project Lifecycle
Define the project lifecycle phases on which to scope the planning effort.
Typical Work Products
• Project lifecycle phases
SP 1.4: Determine Estimates of Effort and Cost
Estimate the project effort and cost for the work products and tasks based on estimation
rationale.
Typical Work Products
• Estimation rationale
• Project effort estimates
• Project cost estimates
SG2: Develop a Project Plan
A project plan is established and maintained as the basis for managing the project.
SP 2.1: Establish the Budget and Schedule
Establish and maintain the project’s budget and schedule.
Typical Work Products
• Project schedule
• Schedule dependencies
• Project budget
SP 2.2: Identify Project Risks
Identify and analyze project risks.
Typical Work Products
• Identified risks
• Risk impacts and probability of occurrence
• Risk priorities
SP 2.3: Plan for Data Management
Plan for the management of project data. Note: Data refers to the various deliverable and
non-deliverable documents and data (minutes, research results, notes, working papers, action
items, etc.). The data can take the form of reports, spreadsheets, manuals, notebooks, charts,
drawings, specifications, files, emails, correspondence, and any other medium used to support
the project.
Typical Work Products
• Data management plan
• Master list of managed data
• Data content and format description
• Data requirements lists for acquirers and for suppliers
• Privacy requirements
• Security requirements
• Security procedures
• Mechanism for data retrieval, reproduction, and distribution
• Schedule for collection of project data
• Listing of project data to be collected
SP 2.4: Plan for Project Resources
Plan for necessary resources to perform the project.

Typical Work Products
• WBS work packages
• WBS task dictionary
• Staffing requirements based on project size and scope
• Critical facilities/equipment list
• Process/workflow definitions and diagrams
• Program administration requirements list
SP 2.5: Plan for Needed Knowledge and Skills
Plan for knowledge and skills needed to perform the project.
Typical Work Products
• Inventory of skill needs
• Staffing and new hire plans
• Databases (e.g., skills and training)
SP 2.6: Plan Stakeholder Involvement
Plan the involvement of identified stakeholders.
Typical Work Products
• Stakeholder involvement plan
SP 2.7: Establish the Project Plan
Establish and maintain the overall project plan content.
Typical Work Products
• Overall project plan
SG3: Obtain Commitment to the Plan
Commitments to the project plan are established and maintained.
SP 3.1: Review Plans That Affect the Project
Review all plans that affect the project to understand project commitments.
Typical Work Products
• Record of the reviews of plans that affect the project
SP 3.2: Reconcile Work and Resource Levels
Reconcile the project plan to reflect available and estimated resources.
Typical Work Products
• Revised methods and corresponding estimating parameters (e.g., better tools and use of off-
the-shelf components)
• Renegotiated budgets
• Revised schedules
• Revised requirements list
• Renegotiated stakeholder agreements
SP 3.3: Obtain Plan Commitment
Obtain commitment from relevant stakeholders responsible for performing and supporting plan
execution.
Typical Work Products
• Documented requests for commitments
• Documented commitments

4.5 Project Monitoring and Control


The purpose of Project Monitoring and Control (PMC) is to provide an understanding of the
project’s progress so that appropriate corrective actions can be taken when the project’s
performance deviates significantly from the plan.
This process area is currently supported by the PG&P and Task Force Handbook. Project
Monitoring and Control will be further addressed in the future by a new or revised standard,
guideline, policy, or procedure.

4.5.1 Related Process Areas


Process Area Related Information
Project Planning Project plan, including how it specifies the
appropriate level of project monitoring, the
measures used to monitor progress, and known
risks
Measurement and Analysis Process of measuring, analyzing, and recording
information

4.5.2 Specific Goals and Practices


Specific Goals and Practices
SG1: Monitor Project Against Plan
Actual performance and progress of the project are monitored against the project plan.
SP 1.1: Monitor Project Planning Parameters
Monitor the actual values of the project planning parameters against the project plan.
Typical Work Products
• Records of project performance
• Records of significant deviations
SP 1.2: Monitor Commitments
Monitor commitments against those identified in the project plan.
Typical Work Products
• Records of commitment reviews
SP 1.3: Monitor Project Risks
Monitor risks against those identified in the project plan.
Typical Work Products
• Records of project risk monitoring
SP 1.4: Monitor Data Management
Monitor the management of project data against the project plan.
As noted above in Project Planning, data refers to the various deliverable and non-deliverable
documents and data (minutes, research results, notes, working papers, action items, etc.).
Typical Work Products
• Records of data management
SP 1.5: Monitor Stakeholder Involvement
Monitor stakeholder involvement against the project plan.
Typical Work Products
• Records of stakeholder involvement
SP 1.6: Conduct Progress Reviews
Periodically review the project's progress, performance, and issues.
Typical Work Products
• Documented project review results
SP 1.7: Conduct Milestone Reviews
Review the accomplishments and results of the project at selected project milestones.
Typical Work Products
• Documented milestone review results
SG2: Manage Corrective Action to Closure
Corrective actions are managed to closure when the project's performance or results deviate
significantly from the plan.
SP 2.1: Analyze Issues
Collect and analyze the issues and determine the corrective actions necessary to address the
issues.

Typical Work Products
• List of issues needing corrective actions
SP 2.2: Take Corrective Action
Take corrective action on identified issues.
Typical Work Products
• Corrective action plan
SP 2.3: Manage Corrective Action
Manage corrective actions to closure.
Typical Work Products
• Corrective action results

4.6 Supplier Agreement Management


The purpose of Supplier Agreement Management (SAM) is to manage the acquisition of
products from suppliers.
This process area is currently supported by the PG&P, Task Force Handbook, and AASHTO
contracting and payment processes. Supplier Agreement Management will be further
addressed in the future by a new or revised standard, guideline, policy, or procedure.
4.6.1 Related Process Areas
Process Area Related Information
Project Monitoring and Control Monitoring projects and taking corrective action
Requirements Development Defining requirements
Requirements Management Managing requirements, including the
traceability of requirements for products
acquired from suppliers
Technical Solution Determining the products and product
components that may be acquired from
suppliers

4.6.2 Specific Goals and Practices


Specific Goals and Practices
SG1: Establish Supplier Agreements
Agreements with the suppliers are established and maintained.
SP 1.1: Determine Acquisition Type
Determine the type of acquisition for each product or product component to be acquired.
Typical Work Products
• List of the acquisition types that will be used for all products and product components to be
acquired
SP 1.2: Select Suppliers
Select suppliers based on an evaluation of their ability to meet the specified requirements and
established criteria.
Typical Work Products
• Market studies
• List of candidate suppliers
• Preferred supplier list
• Trade study or other record of evaluation criteria, advantages and disadvantages of
candidate suppliers, and rationale for selection of suppliers
• Solicitation materials and requirements
SP 1.3: Establish Supplier Agreements
Establish and maintain formal agreements with the supplier.

Typical Work Products
• Statements of work
• Contracts
• Memoranda of agreement
• Licensing agreement
SG2: Satisfy Supplier Agreements
Agreements with the suppliers are satisfied by both the project and the supplier.
SP 2.1: Execute the Supplier Agreement
Perform activities with the supplier as specified in the supplier agreement.
Typical Work Products
• Supplier progress reports and performance measures
• Supplier review materials and reports
• Action items tracked to closure
• Documentation of product and document deliveries
SP 2.2: Monitor Selected Supplier Processes
Select, monitor, and analyze processes used by the supplier.
Typical Work Products
• List of processes selected for monitoring or rationale for non-selection
• Activity reports
• Performance reports
• Performance curves
• Discrepancy reports
SP 2.3: Evaluate Selected Supplier Work Products
Select and evaluate work products from the supplier of custom-made products.
Typical Work Products
• List of work products selected for monitoring or rationale for non-selection
• Activity reports
• Discrepancy reports
SP 2.4: Accept the Acquired Product
Ensure that the supplier agreement is satisfied before accepting the acquired product.
Typical Work Products
• Acceptance test procedures
• Acceptance test results
• Discrepancy reports or corrective action plans
SP 2.5: Transition Products
Transition the acquired products from the supplier to the project.
Typical Work Products
• Transition plans
• Training reports
• Support and maintenance reports
4.7 Requirements Development
The purpose of Requirements Development (RD) is to produce and analyze customer,
product, and product component requirements.
This process area is currently supported by the Requirements Standard.
4.7.1 Related Process Areas
Process Area Related Topic
Requirements Management Managing customer and product requirements,
obtaining agreement with the requirements
provider, obtaining commitments with those
implementing the requirements, and
maintaining traceability
Technical Solution How the outputs of the requirements
development processes are used, and the
development of alternative solutions and
designs used in refining and deriving
requirements
Product Integration Interface requirements and interface
management
Verification Verifying that the resulting product meets the
requirements
Validation How the product built will be validated against
the customer needs
Risk Management Identifying and managing risks that are related
to requirements
Configuration Management Ensuring that key work products are controlled
and managed
4.7.2 Specific Goals and Practices
Specific Goals and Practices
SG1: Develop Customer Requirements
Stakeholder needs, expectations, constraints, and interfaces are collected and translated into
customer requirements.
SP 1.1: Elicit Needs
Elicit stakeholder needs, expectations, constraints, and interfaces for all phases of the product
lifecycle.
Typical Work Products
• List of needs, expectations, enhancements, etc.
SP 1.2: Develop the Customer Requirements
Transform stakeholder needs, expectations, constraints, and interfaces into customer
requirements.
Typical Work Products
• Customer requirements
• Customer constraints on the conduct of verification
• Customer constraints on the conduct of validation
SG2: Develop Product Requirements
Customer requirements are refined and elaborated to develop product and product component
requirements.
SP 2.1: Establish Product and Product Component Requirements
Establish and maintain product and product component requirements, which are based on the
customer requirements.
Typical Work Products
• Derived requirements
• Product requirements
• Product component requirements
SP 2.2: Allocate Product Component Requirements
Allocate the requirements for each product component.
Typical Work Products
• Requirement allocation sheets
• Provisional requirement allocations
• Design constraints
• Derived requirements
• Relationships among derived requirements
SP 2.3: Identify Interface Requirements
Identify interface requirements.
Typical Work Products
• Interface requirements
SG3: Analyze and Validate Requirements
The requirements are analyzed and validated, and a definition of required functionality is
developed.
SP 3.1: Establish Operational Concepts and Scenarios
Establish and maintain operational concepts and associated scenarios.
Typical Work Products
• Operational concept
• Product or product component installation, operational, maintenance, and support concepts
• Disposal concepts
• Use cases
• Timeline scenarios
• New requirements
SP 3.2: Establish a Definition of Required Functionality
Establish and maintain a definition of required functionality.
Typical Work Products
• Functional architecture
• Activity diagrams and use cases
• Object-oriented analysis with services or methods identified
SP 3.3: Analyze Requirements
Analyze requirements to ensure that they are necessary and sufficient.
Typical Work Products
• Requirements defects reports
• Proposed requirements changes to resolve defects
• Key requirements
• Technical performance measures
SP 3.4: Analyze Requirements to Achieve Balance
Analyze requirements to balance stakeholder needs and constraints.
Typical Work Products
• Assessment of risks related to requirements
SP 3.5: Validate Requirements
Validate requirements to ensure the resulting product will perform as intended in the user's
environment.
Typical Work Products
• Record of analysis methods and results
4.8 Requirements Management
The purpose of Requirements Management (REQM) is to manage the requirements of the
project’s products and product components and to identify inconsistencies between those
requirements and the project’s plans and work products.
This process area is currently supported by the Requirements Standard.
4.8.1 Related Process Areas
Process Area Related Information
Requirements Development Transforming stakeholder needs into product
requirements and deciding how to allocate or
distribute requirements among the product
components
Technical Solution Transforming requirements into technical
solutions
Project Planning How project plans reflect requirements and
need to be revised as requirements change
Configuration Management Baselines and controlling changes to
configuration documentation for requirements
Project Monitoring and Control Tracking and controlling the activities and work
products that are based on the requirements
and taking appropriate corrective action
Risk Management Identifying and handling risks associated with
requirements
4.8.2 Specific Goals and Practices
Specific Goals and Practices
SG1: Manage Requirements
Requirements are managed and inconsistencies with project plans and work products are
identified.
SP 1.1: Obtain an Understanding of Requirements
Develop an understanding with the requirements providers on the meaning of the
requirements.
Typical Work Products
• Lists of criteria for distinguishing appropriate requirements providers
• Criteria for evaluation and acceptance of requirements
• Results of analyses against criteria
• An agreed-to set of requirements
SP 1.2: Obtain Commitment to Requirements
Obtain commitment to the requirements from the project participants.
Typical Work Products
• Requirements impact assessments
• Documented commitments to requirements and requirements changes
SP 1.3: Manage Requirements Changes
Manage changes to the requirements as they evolve during the project.
Typical Work Products
• Requirements status
• Requirements database
• Requirements decision database
SP 1.4: Maintain Bidirectional Traceability of Requirements
Maintain bidirectional traceability among the requirements and work products.
Typical Work Products
• Requirements traceability matrix
• Requirements tracking system
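As an illustration of this practice, the forward and backward links of a traceability matrix can be cross-checked mechanically. The sketch below is hypothetical and not part of the standard; all requirement and work product names are invented for the example.

```python
# Hypothetical bidirectional traceability check (illustrative only).
# Forward links: requirement -> work products that implement it.
forward = {
    "REQ-1": ["design-A", "test-A"],
    "REQ-2": ["design-B"],
    "REQ-3": [],                          # no implementing work product
}

# Backward links: work product -> requirements it traces to.
backward = {
    "design-A": ["REQ-1"],
    "test-A":   ["REQ-1"],
    "design-B": ["REQ-2"],
    "design-C": [],                       # traces to no requirement
}

def untraced_requirements(forward):
    """Forward gap: requirements with no implementing work product."""
    return sorted(req for req, wps in forward.items() if not wps)

def orphan_work_products(backward):
    """Backward gap: work products that trace to no requirement."""
    return sorted(wp for wp, reqs in backward.items() if not reqs)
```

Run against the sample data, the two checks flag REQ-3 and design-C as inconsistencies to be documented and resolved.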
SP 1.5: Identify Inconsistencies Between Project Work and Requirements
Identify inconsistencies between the project plans and work products and the requirements.
Typical Work Products
• Documentation of inconsistencies including sources, conditions, and rationale
• Corrective actions
4.9 Technical Solution
The purpose of Technical Solution (TS) is to design, develop, and implement solutions to
requirements. Solutions, designs, and implementations encompass products, product
components, and product-related lifecycle processes either singly or in combination as
appropriate.
This process area is currently supported by several standards and guidelines that deal with
design, construction, and implementation. Technical Solution will be further addressed in
the future by new or revised standards, guidelines, policies, or procedures.
4.9.1 Related Process Areas
Process Area Related Information
Requirements Development Requirements allocations, establishing an
operational concept, and interface requirements
definition
Verification Conducting peer reviews and verifying that the
product and product components meet
requirements
Decision Analysis and Resolution Formal evaluation
Requirements Management Managing requirements
Organizational Innovation and Deployment Improving the organization’s technology
4.9.2 Specific Goals and Practices
Specific Goals and Practices
SG1: Select Product Component Solutions
Product or product component solutions are selected from alternative solutions.
SP 1.1: Develop Alternative Solutions and Selection Criteria
Develop alternative solutions and selection criteria.
Typical Work Products
• Alternative solution screening criteria
• Evaluation reports of new technologies
• Alternative solutions
• Selection criteria for final selection
• Evaluation reports of COTS products
SP 1.2: Select Product Component Solutions
Select the product component solutions that best satisfy the criteria established.
Typical Work Products
• Product component selection decisions and rationale
• Documented relationships between requirements and product components
• Documented solutions, evaluations, and rationale
SG2: Develop the Design
Product or product component designs are developed.
SP 2.1: Design the Product or Product Component
Develop a design for the product or product component.
Typical Work Products
• Product architecture
• Product component designs
SP 2.2: Establish a Technical Data Package
Establish and maintain a technical data package.
Note: A technical data package should include the following, as appropriate for the type of
product being developed:
• Product architecture description
• Allocated requirements
• Product component descriptions
• Product-related lifecycle process descriptions
• Key product characteristics
• Required physical characteristics and constraints
• Interface requirements
• Bills of material
• Verification criteria used to ensure that requirements have been achieved
• Conditions of use (environments) and operating/usage scenarios
• Modes and states for operations, support, training, disposal, and verifications throughout the
life of the product
• Rationale for decisions and characteristics (requirements, requirement allocations, and
design choices)
Typical Work Products
• Technical data package
SP 2.3: Design Interfaces Using Criteria
Design product component interfaces using established criteria.
Typical Work Products
• Interface design specifications
• Interface control documents
• Interface specification criteria
• Rationale for selected interface design
SP 2.4: Perform Make, Buy, or Reuse Analyses
Evaluate whether the product components should be developed, purchased, or reused based
on established criteria.
Typical Work Products
• Criteria for design and product component reuse
• Make-or-buy analyses
• Guidelines for choosing COTS product components
SG3: Implement the Product Design
Product components, and associated support documentation, are implemented from their
designs.
SP 3.1: Implement the Design
Implement the designs of the product components.
Typical Work Products
• Implemented design
SP 3.2: Develop Product Support Documentation
Develop and maintain the end-use documentation.
Typical Work Products
• End-user training materials
• User's manual
• Operator's manual
• Maintenance manual
• Online help
4.10 Product Integration
The purpose of Product Integration (PI) is to assemble the product from the product
components, ensure that the product, as integrated, functions properly, and deliver the
product.
This process area will be addressed in the future by a new or revised standard, guideline,
policy, or procedure.
4.10.1 Related Process Areas
Process Area Related Information
Requirements Development Identifying interface requirements
Technical Solution Defining the interfaces and the integration
environment (when the integration environment
needs to be developed)
Verification Verifying the interfaces, the integration
environment, and the progressively assembled
product components
Validation Performing validation of the product
components and the integrated product
Risk Management Identifying risks and the use of prototypes in
risk mitigation for both interface compatibility
and product component integration
Decision Analysis and Resolution Using a formal evaluation process for selecting
the appropriate integration sequence and
procedures and for deciding whether the
integration environment should be acquired or
developed
Configuration Management Managing changes to interface definitions and
the distribution of information
Supplier Agreement Management Acquiring product components or parts of the
integration environment
4.10.2 Specific Goals and Practices
Specific Goals and Practices
SG1: Prepare for Product Integration
Preparation for product integration is conducted.
SP 1.1: Determine Integration Sequence
Determine the product component integration sequence.
Typical Work Products
• Product integration sequence
• Rationale for selecting or rejecting integration sequences
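One common way to determine an integration sequence is a dependency-based (topological) ordering: each component is integrated only after the components it depends on. The sketch below is illustrative only; the component names and dependencies are hypothetical.

```python
from graphlib import TopologicalSorter

# Hypothetical component dependencies: each component lists the
# components that must be integrated before it.
depends_on = {
    "gui":      ["services"],
    "services": ["database", "core"],
    "database": ["core"],
    "core":     [],
}

# static_order() yields components with all dependencies first,
# giving one valid product integration sequence.
sequence = list(TopologicalSorter(depends_on).static_order())
```

For the sample data, "core" is integrated first and "gui" last; the rationale for the chosen ordering would be recorded alongside the sequence, as the work products above indicate.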
SP 1.2: Establish the Product Integration Environment
Establish and maintain the environment needed to support the integration of the product
components.
Typical Work Products
• Verified environment for product integration
• Support documentation for the product integration environment
SP 1.3: Establish Product Integration Procedures and Criteria
Establish and maintain procedures and criteria for integration of the product components.
Typical Work Products
• Product integration procedures
• Product integration criteria
SG2: Ensure Interface Compatibility
The product component interfaces, both internal and external, are compatible.
SP 2.1: Review Interface Descriptions for Completeness
Review interface descriptions for coverage and completeness.
Typical Work Products
• None
SP 2.2: Manage Interfaces
Manage internal and external interface definitions, designs, and changes for products and
product components.
Typical Work Products
• Table of relationships among the product components and the external environment (e.g.,
main power supply, fastening product, and computer bus system)
• Table of relationships among the different product components
• List of agreed-to interfaces defined for each pair of product components, when applicable
• Reports from the interface control working group meetings
• Action items for updating interfaces
• Application program interface (API)
• Updated interface description or agreement
SG3: Assemble Product Components and Deliver the Product
Verified product components are assembled and the integrated, verified, and validated product is
delivered.
SP 3.1: Confirm Readiness of Product Components for Integration
Confirm, prior to assembly, that each product component required to assemble the product has
been properly identified, functions according to its description, and that the product component
interfaces comply with the interface descriptions.
Typical Work Products
• Acceptance documents for the received product components
• Delivery receipts
• Checked packing lists
• Exception reports
• Waivers
SP 3.2: Assemble Product Components
Assemble product components according to the product integration sequence and available
procedures.
Typical Work Products
• Assembled product or product components
SP 3.3: Evaluate Assembled Product Components
Evaluate assembled product components for interface compatibility.
Typical Work Products
• Exception reports
• Interface evaluation reports
• Product integration summary reports
SP 3.4: Package and Deliver the Product or Product Component
Package the assembled product or product component and deliver it to the appropriate
customer.
Typical Work Products
• Packaged product or product components
• Delivery documentation
4.11 Verification
The purpose of Verification (VER) is to ensure that selected work products meet their
specified requirements.
This process area is currently supported by the Testing Standard.
4.11.1 Related Process Areas
Process Area Related Information
Validation Confirming that a product or product component
fulfills its intended use when placed in its
intended environment
Requirements Development Generation and development of customer,
product, and product component requirements
Requirements Management Managing requirements
4.11.2 Specific Goals and Practices
Specific Goals and Practices
SG1: Prepare for Verification
Preparation for verification is conducted.
SP 1.1: Select Work Products for Verification
Select the work products to be verified and the verification methods that will be used for each.
Typical Work Products
• Lists of work products selected for verification
• Verification methods for each selected work product
SP 1.2: Establish the Verification Environment
Establish and maintain the environment needed to support verification.
Typical Work Products
• Verification environment
SP 1.3: Establish Verification Procedures and Criteria
Establish and maintain verification procedures and criteria for the selected work products.
Typical Work Products
• Verification procedures
• Verification criteria
SG2: Perform Peer Reviews
Peer reviews are performed on selected work products.
SP 2.1: Prepare for Peer Reviews
Prepare for peer reviews of selected work products.
Typical Work Products
• Peer review schedule
• Peer review checklist
• Entry and exit criteria for work products
• Criteria for requiring another peer review
• Peer review training material
• Selected work products to be reviewed
SP 2.2: Conduct Peer Reviews
Conduct peer reviews on selected work products and identify issues resulting from the peer
review.
Typical Work Products
• Peer review results
• Peer review issues
• Peer review data
SP 2.3: Analyze Peer Review Data
Analyze data about preparation, conduct, and results of the peer reviews.
Typical Work Products
• Peer review data
• Peer review action items
SG3: Verify Selected Work Products
Selected work products are verified against their specified requirements.
SP 3.1: Perform Verification
Perform verification on the selected work products.
Typical Work Products
• Verification results
• Verification reports
• Demonstrations
• As-run procedures log
SP 3.2: Analyze Verification Results
Analyze the results of all verification activities.
Typical Work Products
• Analysis report (e.g., statistics on performances, causal analysis of nonconformances,
comparison of the behavior between the real product and models, and trends)
• Trouble reports
• Change requests for the verification methods, criteria, and environment
4.12 Validation
The purpose of Validation (VAL) is to demonstrate that a product or product component
fulfills its intended use when placed in its intended environment.
This process area is currently supported by the Testing Standard.
4.12.1 Related Process Areas
Process Area Related Information
Requirements Development Requirements validation
Technical Solution Transforming requirements into product
specifications and for corrective action when
validation issues are identified that affect the
product or product component design
Verification Verifying that the product or product component
meets its requirements
4.12.2 Specific Goals and Practices
Specific Goals and Practices
SG1: Prepare for Validation
Preparation for validation is conducted.
SP 1.1: Select Products for Validation
Select products and product components to be validated and the validation methods that will
be used for each.
Typical Work Products
• Lists of products and product components selected for validation
• Validation methods for each product or product component
• Requirements for performing validation for each product or product component
• Validation constraints for each product or product component
SP 1.2: Establish the Validation Environment
Establish and maintain the environment needed to support validation.
Typical Work Products
• Validation environment
SP 1.3: Establish Validation Procedures and Criteria
Establish and maintain procedures and criteria for validation.
Typical Work Products
• Validation procedures
• Validation criteria
• Test and evaluation procedures for maintenance, training, and support
SG2: Validate Product or Product Components
The product or product components are validated to ensure that they are suitable for use in their
intended operating environment.
SP 2.1: Perform Validation
Perform validation on the selected products and product components.
Typical Work Products
• Validation reports
• Validation results
• Validation cross-reference matrix
• As-run procedures log
• Operational demonstrations
SP 2.2: Analyze Validation Results
Analyze the results of the validation activities.
Typical Work Products
• Validation deficiency reports
• Validation issues
• Procedure change request
4.13 Configuration Management
The purpose of Configuration Management (CM) is to establish and maintain the integrity of
work products using configuration identification, configuration control, configuration status
accounting, and configuration audits.
Configuration management of deliverables and work products is required in many of the
existing standards; however, no formal configuration management process exists. This
process area will be further addressed in the future by a new or revised standard, guideline,
policy, or procedure.
4.13.1 Related Process Areas
Process Area Related Information
Project Planning Developing plans and work breakdown
structures, which may be useful for determining
configuration items
Project Monitoring and Control Performance analyses and corrective actions
4.13.2 Specific Goals and Practices
Specific Goals and Practices
SG1: Establish Baselines
Baselines of identified work products are established.
SP 1.1: Identify Configuration Items
Identify the configuration items, components, and related work products that will be placed
under configuration management.
Typical Work Products
• Identified configuration items
SP 1.2: Establish a Configuration Management System
Establish and maintain a configuration management and change management system for
controlling work products.
Typical Work Products
• Configuration management system with controlled work products
• Configuration management system access control procedures
• Change request database
SP 1.3: Create or Release Baselines
Create or release baselines for internal use and for delivery to the customer.
Typical Work Products
• Baselines
• Description of baselines
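Conceptually, a baseline is a labeled, frozen snapshot of configuration item versions, and a difference report compares two such snapshots. The sketch below illustrates this idea only; the item names and version labels are hypothetical, not prescribed by any AASHTOWare standard.

```python
from types import MappingProxyType

def create_baseline(label, item_versions):
    """Record a baseline: an immutable snapshot of item versions."""
    return {"label": label, "items": MappingProxyType(dict(item_versions))}

def baseline_diff(old, new):
    """Report items whose versions differ between two baselines."""
    items = set(old["items"]) | set(new["items"])
    return {i: (old["items"].get(i), new["items"].get(i))
            for i in items
            if old["items"].get(i) != new["items"].get(i)}

# Hypothetical release baselines:
rel_1 = create_baseline("REL-1.0", {"module-A": "v12", "user-manual": "v3"})
rel_2 = create_baseline("REL-1.1", {"module-A": "v13", "user-manual": "v3"})
```

The diff of the two sample baselines reports only module-A as changed, which corresponds to the "differences between baselines" record listed under SP 3.1 below.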
SG 2: Track and Control Changes
Changes to the work products under configuration management are tracked and controlled.
SP 2.1: Track Change Requests
Track change requests for the configuration items.
Typical Work Products
• Change requests
SP 2.2: Control Configuration Items
Control changes to the configuration items.
Typical Work Products
• Revision history of configuration items
• Archives of the baselines
SG3: Establish Integrity
Integrity of baselines is established and maintained.
SP 3.1: Establish Configuration Management Records
Establish and maintain records describing configuration items.
Typical Work Products
• Revision history of configuration items
• Change log
• Copy of the change requests
• Status of configuration items
• Differences between baselines
SP 3.2: Perform Configuration Audits
Perform configuration audits to maintain integrity of the configuration baselines.
Typical Work Products
• Configuration audit results
• Action items
4.14 Process and Product Quality Assurance
The purpose of Process and Product Quality Assurance (PPQA) is to provide staff and
management with objective insight into processes and associated work products.
This process area is currently supported by the Quality Assurance Standard.
4.14.1 Related Process Areas
Process Area Related Information
Project Planning Identifying processes and associated work
products that will be objectively evaluated
Verification Satisfying specified requirements
4.14.2 Specific Goals and Practices
Specific Goals and Practices
SG 1: Objectively Evaluate Processes and Work Products
Adherence of the performed process and associated work products and services to applicable
process descriptions, standards, and procedures is objectively evaluated.
SP 1.1: Objectively Evaluate Processes
Objectively evaluate the designated performed processes against the applicable process
descriptions, standards, and procedures.
Typical Work Products
• Evaluation reports
• Noncompliance reports
• Corrective actions
SP 1.2: Objectively Evaluate Work Products and Services
Objectively evaluate the designated work products and services against the applicable process
descriptions, standards, and procedures.
Typical Work Products
• Evaluation reports
• Noncompliance reports
• Corrective actions
SG 2: Provide Objective Insight
Noncompliance issues are objectively tracked and communicated, and resolution is ensured.
SP 2.1: Communicate and Ensure Resolution of Noncompliance Issues
Communicate quality issues and ensure resolution of noncompliance issues with the staff and
managers.
Typical Work Products
• Corrective action reports
• Evaluation reports
• Quality trends
SP 2.2: Establish Records
Establish and maintain records of the quality assurance activities.
Typical Work Products
• Evaluation logs
• Quality assurance reports
• Status reports of corrective actions
• Reports of quality trends
4.15 Measurement and Analysis
The purpose of Measurement and Analysis (MA) is to develop and sustain a measurement
capability that is used to support management information needs.
This process area will be addressed in the future by a new or revised standard, guideline,
policy, or procedure.
4.15.1 Related Process Areas
Process Area Related Information
Project Planning Estimating project attributes and other planning
information needs
Project Monitoring and Control Monitoring project performance information
needs
Configuration Management Managing measurement work products
Requirements Development Meeting customer requirements and related
information needs
Requirements Management Maintaining requirements traceability and
related information needs
Organizational Process Definition Establishing the organization’s measurement
repository
Quantitative Project Management Understanding variation and the appropriate
use of statistical analysis techniques
4.15.2 Specific Goals and Practices
Specific Goals and Practices
SG1: Align Measurement and Analysis Activities
Measurement objectives and activities are aligned with identified information needs and
objectives.
SP 1.1: Establish Measurement Objectives
Establish and maintain measurement objectives that are derived from identified information
needs and objectives.
Typical Work Products
• Measurement objectives
SP 1.2: Specify Measures
Specify measures to address the measurement objectives.
Typical Work Products
• Specifications of base and derived measures
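To illustrate the distinction, a base measure is collected directly (for example, a defect count or a size count), while a derived measure is computed from base measures. The sketch below uses defect density as a common example; the figures are hypothetical and not drawn from any AASHTOWare project.

```python
def defect_density(defect_count, size_ksloc):
    """Derived measure: defects per thousand source lines of code."""
    return defect_count / size_ksloc

# Base measures collected for a hypothetical release:
defects = 18       # base measure: count of confirmed defects
size_ksloc = 12.0  # base measure: product size in KSLOC

# Derived measure computed from the base measures above:
density = defect_density(defects, size_ksloc)
```

A measure specification would additionally record the collection procedure, units, and analysis procedure for each base and derived measure, as the remaining practices in this goal describe.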
SP 1.3: Specify Data Collection and Storage Procedures
Specify how measurement data will be obtained and stored.
Typical Work Products
• Data collection and storage procedures
• Data collection tools
SP 1.4: Specify Analysis Procedures
Specify how measurement data will be analyzed and reported.
Typical Work Products
• Analysis specifications and procedures
• Data analysis tools
SG2: Provide Measurement Results
Measurement results, which address identified information needs and objectives, are provided.
SP 2.1: Collect Measurement Data
Obtain specified measurement data.
Typical Work Products
• Base and derived measurement data sets
• Results of data integrity tests
SP 2.2: Analyze Measurement Data
Analyze and interpret measurement data.
Typical Work Products
• Analysis results and draft reports
SP 2.3: Store Data and Results
Manage and store measurement data, measurement specifications, and analysis results.
Typical Work Products
• Stored data inventory
SP 2.4: Communicate Results
Report results of measurement and analysis activities to all relevant stakeholders.
Typical Work Products
• Delivered reports and related analysis results
• Contextual information or guidance to aid in the interpretation of analysis results
4.16 Advanced Process Areas
4.16.1 Organizational Process Performance
The purpose of Organizational Process Performance (OPP) is to establish and maintain
a quantitative understanding of the performance of the organization’s set of standard
processes in support of quality and process-performance objectives, and to provide the
process-performance data, baselines, and models to quantitatively manage the
organization’s projects.
4.16.2 Organizational Innovation and Deployment
The purpose of Organizational Innovation and Deployment (OID) is to select and deploy
incremental and innovative improvements that measurably improve the organization’s
processes and technologies. The improvements support the organization’s quality and
process-performance objectives as derived from the organization’s business objectives.
4.16.3 Integrated Project Management
The purpose of Integrated Project Management (IPM) is to establish and manage the
project and the involvement of the relevant stakeholders according to an integrated and
defined process that is tailored from the organization’s set of standard processes.
4.16.4 Risk Management
The purpose of Risk Management (RSKM) is to identify potential problems before they
occur so that risk-handling activities can be planned and invoked as needed across the
life of the product or project to mitigate adverse impacts on achieving objectives.
4.16.5 Quantitative Project Management
The purpose of Quantitative Project Management (QPM) is to quantitatively manage the
project’s defined process to achieve the project’s established quality and process-
performance objectives.
4.16.6 Decision Analysis and Resolution
The purpose of Decision Analysis and Resolution (DAR) is to analyze possible decisions
using a formal evaluation process that evaluates identified alternatives against
established criteria.
4.16.7 Causal Analysis and Resolution
The purpose of Causal Analysis and Resolution (CAR) is to identify causes of defects
and other problems and take action to prevent them from occurring in the future.
Standards and Guidelines Glossary 05.020.02R

AASHTOWare Standards and Guidelines Glossary


1. General Definitions:
The following definitions are general and apply to the majority of standards and
guidelines.
Item Definition
Standard Describes mandatory procedures that must be followed, results that
must be produced, and technologies and technical specifications
that must be used or adhered to during the development and
maintenance of AASHTOWare products. AASHTOWare standards
are created and implemented in order to ensure a consistent
approach is used to develop, maintain and deliver software
products.
Guideline Describes procedures, results, technical specifications and/or
technologies that are considered good practices to follow, produce,
or use; however, these are not required. A proposed standard or
standard process may be initially implemented as a guideline with
future plans to implement it as a requirement.
Standards and The published document and repository of all approved
Guidelines Notebook AASHTOWare standards and guidelines.
Project/Product Work This term refers to the activities, schedule, and resource costs
Plan (PWP) proposed and contracted to satisfy the defined user requirements.
This plan is developed in the Tactics / Solicitation phase of the
AASHTOWare Lifecycle and is established in the Contract phase. A
Project/Product Work Plan is usually an annual plan, and the work
described within it is scheduled to correspond to the AASHTO fiscal
year.
Work Product For the purposes of AASHTOWare Standards and Guidelines, a
work product is defined as a result or artifact of the software
development or project management process.
Deliverable A deliverable is also a work product; however, deliverables must be
planned and tracked in the project/product work plan and must be
formally submitted to the task force for approval or rejection.
Deliverable or Work A deliverable or work product definition is used to define the
Product Definition purpose, format, content, usage, and responsibilities of a work
product or deliverable.
New Development A new development project includes the addition of major new
Project functional requirements to an existing product line or to an existing
product module; or the creation of a new product line or product
module. New development projects are formally identified and
approved through user groups, technical advisory committees,
project task forces, and SCOJD.
Item Definition
Enhancement Project An enhancement project includes the addition of new features to an
existing product module; or the correction of limited-scope, non-
critical inconsistencies or inadequacies of a product module.
Enhancement projects are formally identified and approved through
user groups, technical advisory committees, project task forces, and
SCOJD.
Major Enhancement An enhancement project that requires significant funding and effort
Project to implement.
Minor Enhancement A very small enhancement effort that requires minimum funding and
Project effort to implement.
Major Maintenance A project that includes the scheduled repair of an existing product
module or the product's technical operating environment which is
required to enable successful execution of the product as
prescribed by business requirements.
Minor Maintenance A project to provide a temporary fix or repair of an existing product
module. The temporary fix or repair results must not add to, change
nor delete from the functionality of a product module.
2. Requirements Definitions
The following definitions are associated with the Requirements Standard (3.010.nnS). Refer
to this standard for additional information.
Item Definition
User Requirement A user requirement describes what a user or business stakeholder
expects from a proposed product (what the user wants the product to
do).
User Requirements The URS is a required deliverable which contains all of the
Specification (URS) approved user requirements that are to be accomplished in a
specified contract period for a specified product. The URS is
normally incorporated in or referenced by the project/product work
plan; however, in some cases, a separate document is created.
System Requirement A system requirement describes what the proposed product must
do in order to satisfy one or more user requirements (how the product will
do it). These may describe functionality or impose constraints on the
design or implementation (such as performance requirements,
security, or reliability). System requirements are documented in the
language of the software developer or integrator with the
appropriate detail needed to design the proposed product.
System Requirements The SRS is a required deliverable which contains all of the system
Specification (SRS) requirements. The SRS should describe all functional, non-
functional, technical, role, and data requirements of the proposed
system in sufficient detail to support system design.
Item Definition
Requirements The RTM is a required deliverable that describes the backward
Traceability Matrix traceability and forward traceability of the requirements in the URS.
(RTM) The RTM documents that each system requirement is traced to a
source user requirement. The RTM also documents that each
requirement is traced to a design object and a testing procedure.
Deliverable This is a work product which is used to document the task force
Acceptance acceptance of deliverables. A separate Deliverable Acceptance
must be created for each accepted deliverable and may be
documented in various formats (email, letter, form, etc.).
Change Control A documented procedure that provides the ability to monitor
Procedure requests that add, change, or remove functionality or requirements
documented in the approved URS. The procedure includes
activities to submit, review, analyze, approve, and reject change
requests, and the communication of the approval/rejection decision.
Change Request This work product is the record of a change request submittal,
Acceptance impact analysis, and the task force approval or rejection decision. A
separate Change Request Acceptance must be created for each
change request and may be documented in various formats (email,
letter, form, etc.).
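The backward and forward traceability relationships defined for the RTM above can be sketched as a simple data structure. This is an illustrative sketch only; the standard does not prescribe any implementation, and all identifiers are invented for the example.

```python
# Illustrative sketch of RTM relationships: each system requirement
# traces backward to a source user requirement and forward to a design
# object and a testing procedure. Identifiers are made up for the example.
rtm = [
    {"sys_req": "SR-01", "user_req": "UR-01", "design": "D-01", "test": "T-01"},
    {"sys_req": "SR-02", "user_req": "UR-01", "design": "D-02", "test": "T-02"},
    {"sys_req": "SR-03", "user_req": "UR-02", "design": "D-03", "test": "T-03"},
]

def untraced(rows):
    """Return system requirements missing backward or forward traceability."""
    return [r["sys_req"] for r in rows
            if not (r["user_req"] and r["design"] and r["test"])]

print(untraced(rtm))  # an empty list means full traceability
```

A review of the matrix then reduces to checking that `untraced` reports nothing; any entry it returns identifies a requirement whose trace links are incomplete.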
3. Testing Definitions
The following are definitions of work products and deliverables associated with the Testing
Standard (3.080.nnS). Refer to this standard for additional information.
Item Definition
Test Plan This plan describes the testing methodology, what will be tested,
testing schedule, and testing deliverables. The test plan is required
and may be included or referenced in the project/product work plan
or submitted as separate deliverable.
Alpha Testing This report is a required deliverable that documents the results from
Acceptance Report Alpha Testing (what was tested, results, problems found,
corrections made, outstanding issues, etc.). The Alpha Testing
Acceptance Report is submitted to the task force with a request to
accept the completion of Alpha Testing.
Distribution Test This contains all of the materials needed to release a product for
Materials Beta Testing. The Distribution Test Materials includes the product,
instructions, installation procedures, methods to record
testing/results and report problems, etc.
Beta Testing This report is a required deliverable that documents the results from
Acceptance Report Beta Testing (what was tested, who participated, results, problems
found, corrections made, outstanding issues, etc.) The Beta
Testing Acceptance Report is submitted to the task force with a request
to accept the completion of Beta Testing and to acknowledge that
the product is ready for implementation.
Installation Materials This contains all procedures, executables, and documentation needed to
install, implement, and operate the product at the user agency site.
4. AASHTOWare Lifecycle Framework Definitions
The AASHTOWare Lifecycle Framework (ALF) is a framework created to improve AASHTOWare
software development and maintenance processes and, subsequently, improve AASHTOWare
products. Process improvement projects are implemented to develop new or revised standards
and guidelines that are based on goals and practices within the framework. ALF is based on
the CMMI-DEV model (see definition below).
The following are definitions associated with the AASHTOWare Lifecycle Framework (ALF).
Refer to the AASHTOWare Lifecycle Framework document in the appendices for additional
information regarding ALF.
Item Definition
Capability Maturity A process improvement maturity model that provides a
Model Integration for comprehensive integrated solution for development and
Development (CMMI- maintenance activities applied to products and services. CMMI-DEV
DEV) was developed by the Software Engineering Institute (SEI) of
Carnegie Mellon University. The CMMI-DEV model consists of best
practices that address development and maintenance activities that
cover the product lifecycle from conception through delivery and
maintenance.
Process Areas A process area is a cluster of related practices in an area that, when
implemented collectively, satisfy a set of goals considered important
for making improvement in that area. The ALF model currently
includes the 15 process areas that are classified as “Basic” and 7
that are classified as “Advanced”. For the time being, development
of process areas under ALF will be limited to the “Basic” process
areas.
Example process areas include Project Planning, Requirements
Development, Requirements Management, Verification, Validation,
and Process and Product Quality Assurance. A full list of process
areas and their purpose is provided in the ALF document.
Process Area These are groups of related ALF process areas. The Standards
Categories and Guidelines Notebook uses the same categories to group
standards and guidelines. The ALF categories are listed below:
• Process Management process areas contain the cross-project
activities related to defining, planning, deploying, implementing,
monitoring, controlling, appraising, measuring, and improving
processes.
• Project Management process areas cover the project
management activities related to planning, monitoring, and
controlling the project.
• Software Engineering process areas cover the development and
maintenance activities that are shared across engineering
disciplines.
• Support process areas cover the activities that support product
development and maintenance. The Support process areas
address processes that are used in the context of performing
other processes.
Item Definition
Specific Goals and Specific goals and practices apply to a given process area. A
Practices: specific goal describes the unique characteristics that must be
present to satisfy the process area. A specific practice is the
description of an activity that is considered important in achieving
the associated specific goal. Each process area includes one or
more specific goal, and each specific goal includes one or more
specific practices.
Generic Goals and Generic goals and practices apply to multiple process areas. A
Practices: generic goal describes the characteristics that must be present to
institutionalize the processes that implement a process area. A
generic practice describes an activity that is considered important in
achieving the associated generic goal. ALF includes 3 generic
goals, and 13 generic practices. Each goal includes one or more of
the generic practices.
Capability Levels A capability level is a process improvement achievement within an
individual process area. As an organization satisfies each generic
goal (1-3) and its generic practices, the equivalent capability level
(1-3) is achieved. The ALF capability levels are listed below:
• Capability Level 0 (Incomplete). One or more of the specific goals
of the process area are not satisfied.
• Capability Level 1 (Performed). The process satisfies the specific
goals and specific practices of the process area; however, it is not
institutionalized.
• Capability Level 2 (Managed). The process which was performed
at capability level 1 becomes a managed process when:
■ There is a policy that indicates the process will be performed,
■ It is planned and executed in accordance with policy,
■ There are resources provided to support and implement the
process and produce the required work products,
■ Training is provided on how to perform the process,
■ The process and work products are monitored, controlled, and
reviewed, and
■ The process and work products are evaluated for adherence to
the standard process.
• Capability Level 3 (Defined). The process which was managed at
capability level 2 becomes a defined process when:
■ Tailoring guidelines are established that allow a specific
project to customize the standard process to suit the needs of
that particular project,
■ The process contributes work products, measures, and other
process improvement information to the organizational process
assets, and
■ The process clearly states the purpose, inputs, entry criteria,
activities, roles, measures, verification steps, outputs, and exit
criteria.
5. Lifecycle Phases
The table below shows the AASHTOWare Lifecycle phases. Although these phases
are depicted consecutively, they actually overlap each other, even when a waterfall
methodology is being employed. Where iterative techniques are used, the Planning
through Verification phases are repeated for as many cycles as are needed. In addition,
there may be multiple parallel threads of development occurring concurrently. In some
methodologies, Design, Construction, and Verification may be a single unified operation
which iterates with the Requirements/Analysis phase.

Strategy/Proposal, Tactics/Solicitation, Contract, Planning, Requirements/Analysis,
Design, Construction, Verification/Validation, Implementation, Product Maintenance

The AASHTOWare Product Lifecycle Phases are defined as follows:
Lifecycle Phases Phase Definitions
Strategy / Proposal During this phase, strategy planning is performed to identify beneficial
opportunities, which may be realized through development and
technology initiatives. Goals/Practices, Organization, Technology,
Information, and Application Architectures are planned to support
proposals designed to realize these opportunities.
Tactics / Solicitation During this phase, tactical work plans, guided by strategic plans and
aimed at satisfying identified requirements, are developed. The Project
or Product Work Plan (PWP) is developed in this phase and is used to
define the scope of a project. The AASHTOWare Tactical Plan (ATP) is
also developed in this phase.
Contract AASHTOWare contracts are developed, submitted, negotiated, and
approved in this phase. The phase is completed when AASHTO and
the contractor agree on its terms. The contract includes the approved
provisions of the Project or Product Work Plan (PWP).
Planning This phase includes all of the planning activities required to manage the
project.
Requirements / This phase includes all of the requirements and analysis activities
Analysis needed to specify and track the user and system requirements of
product development, product enhancement, or product maintenance.
All requirements to be Verified and Validated are developed and
managed during this phase.
Design This phase includes all of the activities needed to explore alternative
solutions and build the external/internal design of the product or
product enhancement.
Construction This phase includes all of the activities needed to construct the product,
product enhancement, or product maintenance.
Verification / This phase is composed of verification and validation activities.
Validation Verification includes all of the activities needed to compare the
developed product or product enhancements with the developed and
managed requirements. Validation includes all of the activities needed
to determine if the product or product enhancements work acceptably
in the intended environment. This phase ends with acceptance by the
user of the product.
Implementation This phase consists of preparing the product or product enhancement
for distribution and implementing it at customer sites.
Product After the product is accepted it goes into the maintenance phase.
Maintenance
6. CLIENT/SERVER
A computing architecture in which application functions are decomposed and data bases are
segmented so that processing and information can be distributed over multiple platforms across
a network to precisely meet information and processing requirements while optimally utilizing
available computing resources.
Definition from the I/S Resource Group's presentation on Client/Server Development Principles.
The key benefit to a client/server environment is the flexibility it provides which allows maximum
utilization of both staff and computing resources. Therefore, a Standard or Guideline for
Client/Server Architecture should ensure that the computing elements of a client/server
application are properly identified and defined. The client/server computing elements are
distributed processing, distributed data, and the network. The distributed processing elements
must ensure that the application is decomposed into component functions. This decomposition
will allow application processes to be performed on the most appropriate client or server
platform at the most appropriate network location. Distributed data requires that the database be
physically organized into logical data segments. This segmentation then allows each
individual data segment to be placed on the most appropriate network location. The network
provides the connectivity to link the clients and servers via communications protocols so that
programs can interact to share data and processing.
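As a minimal sketch of this decomposition (illustrative only; the data, protocol, and names below are invented for the example), a client can request processing of a server-hosted data segment over a network socket:

```python
# Sketch of client/server decomposition: the server owns a data segment
# and performs the processing; the client holds only request/presentation
# logic and reaches the server over the network. The data and the one-shot
# text protocol are invented for this example.
import socket
import threading

ROUTE_MILEAGE = {"I-40": 2556, "I-80": 2899}  # illustrative data segment

def serve_one_request(server_sock):
    """Server side: accept one connection, process the request, reply."""
    conn, _ = server_sock.accept()
    with conn:
        route = conn.recv(1024).decode("utf-8")
        miles = ROUTE_MILEAGE.get(route, -1)   # processing stays on the server
        conn.sendall(str(miles).encode("utf-8"))

def lookup_mileage(route, host, port):
    """Client side: send a request and return the server's answer."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(route.encode("utf-8"))
        return int(sock.recv(1024).decode("utf-8"))

# Wire the two halves together on the loopback network for demonstration.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))                  # let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
worker = threading.Thread(target=serve_one_request, args=(server,))
worker.start()
result = lookup_mileage("I-40", "127.0.0.1", port)
worker.join()
server.close()
print(result)
```

The same client code would work unchanged if the server and its data segment were moved to another network location, which is the flexibility the definition above describes.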
7. PORTABILITY
If we wish to sell a program to many different users, or if we wish to use a program over a long
period of time, we must be concerned with the extent to which the program can be easily and
effectively operated in a variety of computing environments. The goal is development and
implementation of software and database code which will execute in different technical
environments (hardware platforms and operating systems) with a minimum of modification, if
any.
To port an application from one technical environment to another involves changing every
specification in the application which is specific to a particular environment.
This implies that at least one, and probably all, of the following statements should be true.

1. The product is constructed with a CASE tool which supports all of the target environments
and allows easy porting by regeneration of the application.

2. The specifications of the application which are environment specific are minimal, are well
documented, and are easy to change.

3. The specifications of the application which are environment specific are isolated to modules
or components which can easily be re-written or are replaced with commercially available
equivalents targeting the new environment.
The primary benefit of developing software with maximum portability is to broaden the base of
potential customers and to protect the AASHTO software investment over time.
A checklist is helpful in accomplishing this goal.

1. Is the program written in a machine-independent language?

2. Is the program written in a widely used standardized programming language, and does the
program use only a standard version and features of that language?
3. Does the program use primarily standard, universally available library functions and
subroutines?

4. Does the program use operating system functions minimally, if at all?

5. Does the program isolate and document machine-dependent statements?

6. Is the program structured for phased (overlay) operation on a smaller computer?

7. Has dependency on internal bit representation of alphanumeric or special characters been
avoided or documented in the program?
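Checklist item 5, isolating and documenting machine-dependent statements, can be sketched as a small adapter module. This is a hypothetical illustration; the function names and the particular platform details chosen are invented for the example.

```python
# Hypothetical sketch of checklist item 5: every environment-specific
# detail is confined to one small adapter, so porting the application to
# a new environment means revisiting only this module.
import os
import sys
import tempfile

def native_line_ending():
    """Return the text line-ending convention of the host platform."""
    return "\r\n" if sys.platform.startswith("win") else "\n"

def scratch_dir():
    """Return a writable scratch directory without hard-coding a path."""
    return tempfile.gettempdir()

# The rest of the application calls only these functions; no other part
# of the code base needs to know which operating system it runs on.
report = "line1" + native_line_ending() + "line2" + native_line_ending()
out_path = os.path.join(scratch_dir(), "report.txt")
print(out_path)
```

Confining platform knowledge this way satisfies items 4 and 5 of the checklist at once: operating system dependencies are both minimized and documented in a single place.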
Reference material used included:

Martin, James, and Carma McClure. Software Maintenance: The Problem and Its Solutions. Prentice-Hall, 1983.