Modeling and Simulation
1
Course Objectives
2
Course Overview
• Introductory Material
1. Systems Analysis
2. Modeling & Simulation
• Systems Analysis
3. Systems Life Cycles
4. Systems Engineering Role of Systems Analysis
5. Model-Based Systems Engineering
3
Course Overview (cont.)
• Modeling Applications
10. Requirements Analysis & Validation
11. Effectiveness Analysis
12. Margin Modeling
13. Risk Analysis
4
Definitions
5
Lesson 1:
Introduction to Systems Analysis
6
Objectives
7
What is a System?
8
What is Analysis?
9
System Hierarchy
[Figure: system hierarchy tree: the System decomposes into m subsystems, each further decomposed into elements 1, 2, 3, …]
10
A Radar System Model
[Figure: radar system block diagram: antenna; transmit/receive switch; transmitter; power supply; receiver; signal processor; data processor & controller; displays.]
The system is more than the product – hence systems analysis must address
key processes including test, manufacturing, operations & disposal.
12
Complex Systems
[Figure: a complex system in context: the system (machines; workers, i.e., operators and maintainers) exchanges failed elements, repairs, and spares & supply with repair and supply functions; draws required skills from training and available skills; and interacts with external systems, all within the physical environment and the social & organizational environment.]
13
The “System of Interest”
16
References
• IEEE Standard for Application and Management of
the Systems Engineering Process, IEEE Std 1220-
2005, September 2005.
• NASA Systems Engineering Processes and
Requirements, NPR 7123, April 2013.
• Systems Engineering Fundamentals, Supplementary
Text Prepared by the Defense Acquisition University
Press, Fort Belvoir, VA 22060-5565, January 2001.
• Systems engineering – System life cycle processes,
ISO/IEC 15288, February 2008.
17
Lesson 2:
Introduction to
Modeling & Simulation
18
Objectives
19
What is Modeling?
20
Building a Model – the 4 Step Process
21
Types of Models
• Deterministic
– mathematical models
• lift of an airplane wing
• thrust of a rocket engine
• Stochastic
– random discrete event models
• wind velocities encountered by a flight vehicle during ascent
• component failures during system operation
• Hybrid
– elements of mathematical & random discrete event models
• ascent performance of a flight vehicle through the atmosphere
22
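The three model types can be made concrete with a small sketch (illustrative only: the lift formula is the standard L = ½ρv²SC_L, and the wind-gust distribution is an assumed stand-in, not from the course):

```python
import random

def lift(rho, v, s, cl):
    """Deterministic model: wing lift, L = 1/2 * rho * v^2 * S * CL."""
    return 0.5 * rho * v**2 * s * cl

def gust_velocity(rng, mean=8.0, sigma=3.0):
    """Stochastic model: wind velocity drawn from an assumed Gaussian."""
    return rng.gauss(mean, sigma)

def hybrid_lift(rng, rho, v, s, cl):
    """Hybrid model: deterministic lift evaluated with a random wind component."""
    return lift(rho, v + gust_velocity(rng), s, cl)

rng = random.Random(42)
deterministic = lift(1.225, 60.0, 20.0, 0.8)        # same answer every run
hybrid = hybrid_lift(rng, 1.225, 60.0, 20.0, 0.8)   # varies run to run (seeded here)
```

The deterministic call always returns the same number; the hybrid call differs on every draw unless the random stream is seeded.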
The Black Box View of a Model
[Figure: black-box view of a model: inputs (exogenous variables) and controls enter the transformation mechanisms, producing outputs (endogenous variables); neglected variables lie outside the model boundary.]
23
A Deterministic System Model --
Lorenz Model of Atmospheric Dynamics
Given initial conditions x0, y0, z0 at t0, the model outputs x(t), y(t), z(t) from:

dx/dt = σ(y − x)
dy/dt = ρx − xz − y
dz/dt = xy − βz
24
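A minimal numerical integration of the Lorenz system (fixed-step fourth-order Runge-Kutta; the classic parameter values σ = 10, ρ = 28, β = 8/3 and the step size are assumptions, not given on the slide):

```python
def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz equations."""
    x, y, z = state
    return (sigma * (y - x), rho * x - x * z - y, x * y - beta * z)

def rk4_step(f, state, dt):
    """One fixed-step fourth-order Runge-Kutta step."""
    k1 = f(state)
    k2 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = f(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

# Integrate from (x0, y0, z0) at t0 = 0, recording the trajectory.
state, dt = (1.0, 1.0, 1.0), 0.01
trajectory = [state]
for _ in range(5000):
    state = rk4_step(lorenz, state, dt)
    trajectory.append(state)
```

Although the model is fully deterministic, nearby initial conditions diverge rapidly, which is why the Lorenz system is the standard illustration of sensitive dependence.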
A Stochastic System Model --
The Random Walk
(Staggering Drunk)
[Figure: block diagram: a random-number stream feeds decision logic that updates the walker's position (x, y) from initial values (x0, y0) over N steps.]
25
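The staggering-drunk model can be coded directly from its block diagram: a seeded random-number stream drives decision logic that updates (x, y) for N steps (the unit N/S/E/W step choices and N are illustrative):

```python
import random

def random_walk(x0, y0, n_steps, seed=None):
    """2-D random walk: each draw from the random-number stream
    moves the walker one unit north, south, east, or west."""
    rng = random.Random(seed)
    x, y = x0, y0
    path = [(x, y)]
    for _ in range(n_steps):
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x, y = x + dx, y + dy
        path.append((x, y))
    return path

path = random_walk(0, 0, 1000, seed=1)
```

Each run with a different seed gives a different path; statistics (e.g., mean-square distance from the origin) are estimated by repeating the simulation many times.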
Properties of Models
26
What is Simulation?
27
Simulation versus Models
28
Conducting a Simulation – the 4 Step Process
29
Remarks
• Simplify
• Simplify
• Simplify
30
Key Points
31
References
33
Objectives
[Figure: DoD acquisition life cycle: Development and Operations spanning the Materiel Solution Analysis, Technology Development, Engineering and Manufacturing Development, Production & Deployment, and Operations & Support phases; Milestones A, B, and C (Program Initiation); Materiel Development Decision; Post-PDR and Post-CDR Assessments; FRP Decision Review; IOC and FOC.]
NSS formally tailored DoD 5000.2 to suit small production lots (<50) in highly complex product developments.
39
Comparison of Life Cycle Models
[Figure: comparison of life cycle models:
• System Definition; Subsystem Definition (Preliminary Design; Detailed Design); Fabrication, Assembly, Integration, and Test (FAIT); Production; Support.
• NASA: Formulation (Pre-Phase A Concept Studies; Phase A Concept Development; Phase B Preliminary Design) and Implementation (Phase C Detail Design; Phase D Fabrication, Assembly, Integration, & Test; Phase E Operations & Sustainment; Phase F Disposal), spanning Development and Operations.]
40
Systems Analysis Across Life Cycle
41
Systems Analysis Supports Entire Development
Cycle
[Figure: systems engineering “Vee” chart, labeled “Systems Development & Analysis” and “Design Verification”. Down the left leg: develop the system performance specification and system verification plan; expand performance specifications into CI “design-to” specifications and CI verification plans. Up the right leg: assemble CIs and perform CI verification to the “design-to” specifications; integrate the system and perform system verification to the performance specifications. Caption fragment: “a process-centric view of the systems engineering process, whereas …”]
42
Systems Analysis During Concept Development
[Figure: systems engineering “Vee” chart (Requirements Validation: demonstrate and validate the system to the user acceptance plan), overlaid with an example trade chart, “Impact of Technologies: An SSTO Metric Example” (ref.: Bob Ryan): vehicle dry weight (1000 lb, roughly 200 to 400) plotted against a technology parameter for a 25K-lb-to-LEO mission, comparing the baseline (Al tankage) with composites against a practical limit of vehicle size.]
• Delta IV H Launch
– Extensive mass margin for baseline mission;
– Dual manifest opens cheapest path to full system (lander, rover, Nav/Comm)
• LOX / LH2 main engine
– Link to potential ISRU; look-ahead to manned systems
• Transfer and capture phases: Lander Main Engine vs. SEP
– Potential payload increase with SEP is minimal (at best); transfer and capture phases extend to years.
• Powered Descent and Landing: modified RL-10 (5klb thrust, throttle to 10%) alone
– Alternative (off-ramp) is combination of unmodified RL-10 with lower thrust auxiliary for final descent
– Development of modified RL-10 deemed less risky than mission and design complexity for alternative
Critical Mission Trades bound the baseline
and point to key Phase 1 studies
45
Lander Capability-Current Mission
[Figure: lander configuration, 10.76 m]

Lander Payload
Element        Mass (kg)   Power (W)   Volume   Notes
Instrument 1   M1          P1          V1       xyz1
Instrument 2   M2          P2          V2       xyz2
Instrument 3   M3          P3          V3       xyz3
Instrument 4   M4          P4          V4       xyz4
Instrument n   Mn          Pn          Vn

Rover Payload (surface)
Element        Mass (kg)   Power (W)   Volume   Notes
Instrument 2   M2          P2          V2       xyz2
Instrument 4   M4          P4          V4       xyz4
Instrument n   Mn          Pn          Vn
TPM Application -- Example
Define (for each deviation from plan):
• What changed vs. expectations
• How recovered
• Sensitivity to parameter variations/changes
• Configuration (H/W, electronics, SCSI code, servo code)
[Figure: TPM tracking chart (ref.: Alan Ray): acoustic level in Bels (about 4.0 to 4.5) plotted against measurement events / calendar dates, comparing actual measurements with the design profile and the design requirement (or product spec); the baseline and test-phase-entry values are defined at entry. Overlaid on the systems engineering “Vee” chart.]
47
Systems Analysis During Integration
[Figure: systems engineering “Vee” chart (Systems Development & Analysis / Design Verification), from system performance specification development down through CI “design-to” and “build-to” documentation, and back up through inspection, CI verification, and system verification.]
Verification cross-reference matrix (excerpt):
• 376239 LAE Thrust Vector Alignment ±0.25 degrees (AXSC 3.2.9.1 Structures & Mechanical Subsystem). Source: DERVD 3.6 Spacecraft IPS Derived Requirement, IOC AXAF.95.350.121. Method: Analysis. Verification: verified by measurement at the engine level. Event: EQ.LAE.
• 376240 LAE Location ±3 inches (AXSC 3.2.9.1 Structures & Mechanical Subsystem). Source: DERVD 3.6 Spacecraft IPS Derived Requirement, IOC AXAF.95.350.121. Method: Analysis. Verification: verified by analysis. Event: SE30.TRW.
• 376241 RCS Minimum Impulse Bit, TBD (AXSC 3.2.9.2 Thermal Control Subsystem (TCS)). Source: DERVD 3.6 Spacecraft IPS Derived Requirement, IOC AXAF.95.350.121. Method: Analysis. Verification: demonstrated during thruster qualification testing. Event: EQ.PROP REM.
• 376242 RCS Thrust 21 lbf ±5% at 250 psia inlet pressure (AXSC 3.2.9.2.1 Heaters). Source: DERVD 3.6 Spacecraft IPS Derived Requirement, to be resolved. Method: Analysis. Verification: demonstrated during thruster qualification testing. Event: EQ.PROP REM.
• 376243 RCS Minimum Specific Impulse 225 sec (BOL steady state, inlet pressure = 250 psia) (AXSC 3.2.9.2.1 Heaters). Source: DERVD 3.6 Spacecraft IPS Derived Requirement, IOC AXAF.95.350.121. Method: Analysis. Verification: demonstrated during thruster qualification testing. Event: EQ.PROP REM.
• 376244 Total Pulses (each thruster) 50,000 (AXSC 3.2.9.2.1 Heaters). Source: DERVD 3.6 Spacecraft IPS Derived Requirement, IOC AXAF.95.350.121. Method: Analysis. Verification: demonstrated during thruster qualification testing. Event: EQ.PROP REM.
• 376245 Propellant Throughput 200 lbm (AXSC 3.2.9.2.1 Heaters). Source: DERVD 3.6 Spacecraft IPS Derived Requirement, IOC AXAF.95.350.121. Capability/Margin: predicted throughput 92.6 lbm; expected margin 107 lbm. Method: Analysis. Verification: demonstrated during thruster qualification testing. Event: EQ.PROP REM.
48
[Figure: covers of two governing documents:
• Mandatory Procedures for Major Defense Acquisition Programs (MDAPs) and Major Automated Information System (MAIS) Acquisition Programs, June 2001.
• NASA Procedures and Guidelines NPG 7120.5A, NASA Program and Project Management Processes and Requirements; effective date April 3, 1998; expiration date April 3, 2003; responsible office: Code AE/Office of Chief Engineer.]
49
Process Relations for Engineering a System
We will examine the role of Systems Analysis in the Systems Engineering Process, as defined in IEEE-1220, in the next lesson.
[Figure: systems engineering “Vee” chart, from system performance specification development down through CI “design-to” and “build-to” documentation, and back up through inspection, CI verification, and system verification.]
52
References
• Buede, Dennis M. The Engineering Design of Systems, 2nd
edition, Wiley, 2009.
• IEEE Standard for Application and Management of the Systems
Engineering Process, IEEE Std 1220-2005, September 2005.
• NASA Systems Engineering Processes and Requirements, NPR
7123, April 2013.
• National Security Space Acquisition Policy, NSS 03-01,
December 2004.
• Systems Engineering Fundamentals, Supplementary Text
Prepared by the Defense Acquisition University Press, Fort
Belvoir, VA 22060-5565, January 2001.
• Operation of the Defense Acquisition System, DoD Instruction
5000.2, December 2008 (Note: superseded by DoD 5000.02
with a current date of November 2013, but 5000.2 is the correct
reference for the information used).
53
Lesson 4:
Systems Engineering Role of
Systems Analysis
54
Objectives
55
Systems Analysis Process
The project shall perform the tasks of systems
analysis for the purpose of resolving conflicts
identified during requirements analysis,
decomposing functional requirements and allocating
performance requirements during functional
analysis, evaluating the effectiveness of alternative
design solutions and selecting the best design
solution during synthesis, assessing system
effectiveness, and managing risk factors throughout
the systems engineering effort. Systems analysis
provides a rigorous quantitative basis for
establishing a balanced set of requirements and
for ending up with a balanced design. The tasks
associated with systems analysis are identified in
Figure 16. Even if a trade-off analysis is not done, an
overall assessment of the system effectiveness
should be completed.
Ref: IEEE-1220, Figure 16
56
57
58
6.7.1 Assess Requirement Conflicts
78
Lesson 5:
Model-Based System Engineering
79
Objectives
80
Modeling & Simulation over the Life Cycle per
EIA 632
• Assessment of Opportunities
• Investment Decision
• System Concept Development
• Subsystem Design & Pre-Deployment
• Deployment, Operations, Support & Disposal
81
Advantages of Modeling and Simulation
83
Models and Modeling
84
Taxonomy of Models
(ref. Table 3.1, Buede)
85
Physical Models
86
Quantitative Models
Launch Systems Analysis
[Figure: model suite: technical models (POST and OPGUID trajectory models feeding vehicle performance; a reliability/safety risk model feeding vehicle losses) linked to cost & operations models and an economic model.]
87
Qualitative Model -- Schematic Block Diagram
(ref. Figure 6.5, Systems Engineering Fundamentals)
88
Notes on Modeling
[Figure: model elements (functions F1, F4, F5; requirements R1, R1-1, R2; equipment list; risk; issue) feed Product Evaluation & Document Generation, covering risk assessment, compliance & cost assessment, design verification & validation, test planning, design-solution selection, and document generation; outputs are analysis results and specifications, governed by technical rules, standards, and conventions. These primary concurrent/iterative activities are performed for each product/system architecture design layer.]
90
Cross-reference of IEEE 1220 SE Process to a
Model-based SE Process
Hierarchical refinement of functional & physical architectures.
[Figure: originating requirements feed the concurrent activities Develop Functional Architecture, Develop Physical Architecture, and Develop Allocated Architecture, producing the system definition & specifications, with Manage Process supporting all three.]
91
SE Models Are the Infrastructure
of the SE Process
• Missions are really top level functions from an operational point
of view.
• We acquire assets because we need them to accomplish a
mission.
– Not just hardware, but plans, procedures, etc.
• We specify requirements in order to acquire the assets we need.
• The functional architecture serves as the tie between the
operational missions and the design requirements.
• At any given level in the system engineering hierarchy:
– Start with the Functions allocated to your Component in the
allocated architecture.
– Refine the functional architecture model until each leaf-level
function can be allocated to a single child Component.
– Populate the physical architecture with your new child Components.
– Specify Requirements for each sub-Component. Link constraints
directly to the child Components they affect, and functional
requirements to Functions.
– Link new Requirements to their parents.
92
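The per-level procedure above can be sketched as a simple data check on an allocated architecture: every leaf-level function must be allocated to exactly one known child component (the component and function names below are invented for illustration, not from any specific program):

```python
# Illustrative allocated-architecture fragment: each leaf function is mapped
# to the single child component responsible for it (a dict enforces one each).
allocation = {
    "Provide electrical power": "Power Subsystem",
    "Condition bus voltage": "Power Subsystem",
    "Collect telemetry": "Avionics",
    "Format downlink frames": "Avionics",
}

components = {"Power Subsystem", "Avionics"}

def check_allocation(allocation, components):
    """Report leaf functions allocated to a component not in the physical
    architecture (i.e., a broken functional-to-physical allocation)."""
    problems = []
    for function, component in allocation.items():
        if component not in components:
            problems.append(f"{function!r} allocated to unknown component {component!r}")
    return problems

assert check_allocation(allocation, components) == []
```

In a real model-based tool this check runs over the model database; here it is just the idea in miniature.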
Top Level Systems Engineering Process
[Figure: originating requirements feed Develop Functional Architecture, Develop Physical Architecture, and Develop Allocated Architecture, producing the system definition & specifications, with Manage Process supporting all three.]
• Define the System Requirements, including background information to fill in all of the project requirements and the operational concept.
• Define the design solution one hierarchical level below the System level that satisfies the system requirements and all of the stakeholder objectives. This includes allocated and derived requirements, trade studies, physical models, etc.
93
Architecture Modeling is Key to Process
[Figure: example architecture models for a lunar mission. A functional flow block diagram (FFBD) for “ENV Mission 11”: Earth to LEO; LEO to LLO; LL & LM LLO to LS; surface ops; return to Earth, with AND constructs for the SM & CEV and LL & LM elements, plus lift-off from LEO. A physical interface architecture relating the crew transportation space system to external systems (ground support systems, launch & missile support, ETO transportation and cargo delivery, communication/navigation, science community, robotic precursor flight systems, launch service provider, the public, space weather, regulators) and environments (space, lunar, thermal, meteors). These models are produced by the Develop Functional, Physical, and Allocated Architecture activities.]
94
System Definition
[Figure: System Definition activities: Define the Problem; Develop Operational Concept; Develop and Refine Requirements; Develop Functional Architecture. Develop the Architecture functional requirements and start developing the Architecture mission/behavior model.]
95
Develop Functional Architecture
• Define the functionality for the level of design below the Architecture, based on the mission and scenarios modeled at the Architecture level.
• Develop Component Models.
• Link Constraints to Components: associate the non-functional constraints and physical characteristics with the applicable component in the physical architecture model.
• Define Physical Interfaces: identify physical interfaces between components based on the breakouts depicted.
• Analyze Design Alternatives: define alternative approaches for physical decompositions, functional decompositions, or functional-to-physical allocation, and choose the best approach based on set evaluation criteria, i.e., trade studies.
97
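At its simplest, "choose the best approach based on set evaluation criteria" is a weighted-sum trade study. The criteria, weights, alternatives, and scores below are invented for illustration:

```python
# Candidate design alternatives scored 1-10 against weighted evaluation criteria.
criteria_weights = {"performance": 0.4, "cost": 0.3, "risk": 0.3}

alternatives = {
    "Physical decomposition A": {"performance": 8, "cost": 5, "risk": 6},
    "Physical decomposition B": {"performance": 6, "cost": 8, "risk": 7},
    "Functional allocation C":  {"performance": 7, "cost": 7, "risk": 5},
}

def weighted_score(scores, weights):
    """Weighted-sum figure of merit for one alternative."""
    return sum(weights[c] * scores[c] for c in weights)

best = max(alternatives, key=lambda a: weighted_score(alternatives[a], criteria_weights))
```

Real trade studies add sensitivity analysis on the weights; a selection that flips when a weight moves a few percent is not robust.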
Develop Allocated Architecture
• Map the functional behavior onto the allowable set of components.
• Conduct Performance and Risk Analysis.
98
Let Engineering Products Drive Models
• Start with the products you want to produce:
– Spec Tree, Concept of Operations, Requirements Documents, Data
Dictionary, Functional Architecture Model, Risk List, etc.
• Think about the content of these products and how they are related:
– The Functional Model, for example, is an organizing structure for one
section of a Requirements Document.
– Every Risk in the Risk List should be associated with a Component or
Function.
• Use this information to define the structure and content of Models:
– Items
– Attributes
– Relationships
• Don’t Repeat Yourself
– Each piece of information should be kept in one place only.
• The model schema will grow with the product list.
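One way to sketch the item/attribute/relationship schema and the don't-repeat-yourself rule (the class and relation names here are hypothetical; real MBSE tools define their own schemas):

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    """A schema item (Component, Function, Risk, ...) with attributes
    and typed relationships to other items."""
    kind: str
    name: str
    attributes: dict = field(default_factory=dict)
    relationships: list = field(default_factory=list)  # (relation, target Item)

    def relate(self, relation, target):
        self.relationships.append((relation, target))

# Each fact lives in exactly one place; documents are generated by traversal.
pump = Item("Component", "Fuel Pump")
risk = Item("Risk", "Cavitation at low inlet pressure")
risk.relate("affects", pump)

def risks_for(item, items):
    """Generate the risk-list section for one component by traversing
    the relationships, rather than maintaining a second copy."""
    return [r.name for r in items
            if r.kind == "Risk" and any(t is item for _, t in r.relationships)]

print(risks_for(pump, [pump, risk]))  # ['Cavitation at low inlet pressure']
```

Because the risk-to-component link is stored once and every product is generated from it, the risk list and the component report can never disagree.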
[Figure: example model set for KPP #4.1 (System Compatibility, Block I): a functional model capturing functions, inputs/outputs, time sequence, and logic (determine ground mode, passive or active engagement; determine altitude alerts; generate and actuate minimum/maximum altitude alert tactor indications); verification-requirement models linking each requirement to verification events, responsible organizations (e.g., Lockheed Martin), and test-set configurations built from Bradley Fighting Vehicle and Comanche components connected by 1553 and 28V discrete links; and simulator interfaces (host computer, image generator, audio system, tactile simulator). The models support establishing source/originating requirements; managed traceability (Level I to derived requirements; requirements to simulation and verification elements); scenario development (operational and simulation); analyses definition; and validation of logic.]
101
References
102
Lesson 6:
Symbolic Models
103
Objectives
104
System Functional Breakdown
[Figure: System Requirements decompose into top-level functions (5.0 Function E, 6.0 Function F, …); Function E further decomposes into Function E-A through Function E-E (5.1 through 5.5).]
105
Working From the Operational Need
[Figure: an operational Need feeds a Feasibility Analysis; the results of the analysis select an approach (e.g., an airborne transportation capability).]
106
Progressive Refinement from Need to
Functional Analysis Model
107
Maintenance Functional Analysis
108
Top Level Capability/ Mars Mission Def. Trade Space
[Figure: top-level FFBD for the Mars mission definition trade space: 1.0 Launch Mission (exit on 1.0.A success), then AND constructs to 2.0 Earth to Orbit Phase and 12 CLV Recovery Phase. Trade dimensions include split SEP-chem propulsion, transit consumables deployment, and pre-deploy vs. all-up mission modes.]
112
Example – 1.0 Launch Mission
[Figure: FFBD for 1.0 Launch Mission: on lift-off, EXIT with 1.0.A success; on scrub, 1.8 Perform Scrub Turnaround and repeat.]
113
Example 1.3.0 – Perform Multi-element Testing
[Figure: FFBD for 1.3.0 Perform Multi-element Testing, including 1.3.1 Integrate CEV/CDV and 1.3.2 Integrate CLV through 1.3.7.]
114
Example – 1.3.2.0 Integrate CLV
[Figure: FFBD for 1.3.2.0 Integrate CLV: AND constructs combine 1.3.2.1 Integrate 1st Stage and 1.3.2.2 Integrate Upper Stage, followed by 1.3.2.3 Integrate Upper Stage to 1st Stage; in parallel, 1.3.2.4 Provide simulation of CLV environment and 1.3.2.6 Respond to simulated commands support 1.3.2.5 Conduct test via test commands.]
115
Example – 2.0 Earth to Orbit Phase
[Figure: FFBD for 2.0 Earth to Orbit Phase: concurrent (AND) functions 2.1 Monitor CLV for Abort/Escape Conditions, 2.2 Boost to CEV-CLV Separation, 2.5 Monitor CEV for Escape/Abort Conditions, and Range Safety monitoring (2.3.3 Separation OK) during 2.0.a Ascent; on an abort condition (OR, including a TT ARM launch command), 2.3 Perform Ascent Abort; otherwise 2.6 Perform Orbit Insertion Burn and 2.4 Manage CEV.]
116
Example – 12.0 CLV Recovery Phase
117
Introduction to IDEF0
(Integration Definition for Function Modeling)
118
IDEF0 Approach Characteristics
• Comprehensive and expressive
– Can graphically represent a wide variety of operations to any level
of detail.
• Coherent and simple language
– Provides rigorous and precise expression, promoting consistency of
usage and interpretation.
• Enhances communication between system analysts, developers
and users.
• Well-tested and proven
– Years of Air Force and other government agency use.
• Can be generated manually or through a wide range of software
packages.
– CORE
– DOORS
– Cradle
119
IDEF0 Purpose & Viewpoint
[Figure: IDEF0 function box A0: inputs enter from the left, controls from the top, mechanisms from the bottom, and outputs exit from the right.]
121
An IDEF0 Functional Decomposition
(ref. Figure 3.5, Buede)
[Figure: IDEF0 decomposition: the context diagram A-0 contains the top-level function A0, which decomposes into functions A1, A2, and A3.]
122
Functional Decomposition
123
Functional Decomposition in an IDEF0
Model (ref. Figure 3.6, Buede)
[Figure: IDEF0 box A0, “Transform I1 & I2 into O1, O2, & O3 as determined by C1, C2, and C3, using M1”: inputs I1 and I2; controls C1, C2, and C3; mechanism M1; outputs O1, O2, and O3.]
124
A 2-Level IDEF0 Functional
Decomposition (ref. Figure 3.6, Buede)
125
3 Elements of Feedback Control in
Functional Design
126
Closed Loop Control Process
(ref. Figure 7.5, Buede, abridged)
[Figure: closed-loop control: the output of the transformation process (input → output) is sensed; Compare Desired to Actual takes the desired output and produces a delta; the control process converts the delta into a control variable that drives the transformation process.]
127
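The sense/compare/control elements can be sketched as a proportional controller on a simple first-order process (the gain values and process model are illustrative, not from Buede):

```python
def simulate(setpoint, kp=0.5, steps=50, dt=0.1):
    """Closed loop: sense the output, compare it to the desired value,
    and apply a proportional control action each time step."""
    output = 0.0
    history = []
    for _ in range(steps):
        delta = setpoint - output        # compare desired to actual
        control = kp * delta             # control process -> control variable
        output += control * dt * 5.0     # transformation process (first-order)
        history.append(output)           # sense output
    return history

history = simulate(setpoint=1.0)
```

With these gains the output rises monotonically toward the setpoint; raising the loop gain too far would instead make the discrete loop overshoot or oscillate, which is exactly the stability question feedback design must answer.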
IDEF0 Feedback Semantics
(ref. Figure 3.4, Buede)
[Figure: IDEF0 feedback arrow conventions: control feedback is drawn “up & over” with its label; input feedback and mechanism feedback are drawn “down & under” with their labels.]
128
Mental Models – System Views
• The operational view describes how the system will serve its
users. It is useful when defining requirements of “how well” and
“under what conditions.”
129
A Process Flow from Two Viewpoints
Customer Functional Flow
[Figure: 2 Collect Payment; then, in parallel, 3.1 Get Hot Food and 3.2 Pour Cold Drinks; 3.3 Pack in Sack; 4 Deliver Order.]
130
Architecture Development
(ref. Figure 1.9, Buede)
Operational Concept
132
What is the Purpose of the Functional
Architecture Development?
[Figure: many user requirements map to many design elements; System Requirements and a Functional Definition mediate between the user requirements and the design.]
133
Functional Architecture Terminology
136
Concept Description Sheet
(ref. Figure 6.3, Systems Engineering Fundamentals)
137
Process Fast Food Order
Physical Architecture
• Servers
• Computer Register
• Cooks
• Machines
138
Operational Architecture
139
Functional/Physical Allocation Matrix
(ref. Figure 6.2, Systems Engineering Fundamentals)
140
Functional/Physical Allocation Matrix
[Table: allocation matrix mapping functions to physical elements (servers, computer register, cooks, machines): Take Fast Food Order and Collect Payment each allocate to two elements; Prepare Order to two elements; Deliver Order to one element.]
141
Systems Engineering use of IDEF0
Models
142
Process Fast Food Order
Operational Architecture
[Figure: IDEF0 operational architecture: A1 Take Fast Food Order (control: order entry procedures; outputs: entered order, customer bill); A2 Prepare Order (control: food preparation instructions; input: food and supplies inventory; outputs: packaged order, updated inventory); A3 Deliver Order (control: delivery instructions; output: delivered order); A4 Collect Payment (outputs: paid order, cash receipts, inventory update, possible additional order); mechanisms: servers, computer register, cooks, machines.]
143
Key Points
144
References
145
Lesson 7:
Mathematical Models
146
Objectives
147
Space Transportation System
148
Building a Model – the 4 Step Process
149
Exercise
150
Stage Mass Relations
[Figure: stage mass breakdown: initial vehicle mass m_initial = payload m_pl + guidance, telemeter, & equipment + propellant m_p + propulsion system mass (tank, structure, residual propellant, rocket engine). Brackets mark the bare vehicle, the full or loaded vehicle, and the empty mass.]
151
Exercise
152
Derivation of the Rocket Equation
[Figure: free-body diagram of an ascending rocket: thrust T, drag D, lift L, weight W = mg, acceleration dv/dt, flight-path angle γ.]

m dv/dt = T − D − mg sin γ
154
Integration Yields the Rocket Equation
Δv = g·Isp·[1 − D/T − (mg/T)·sin γ]·ln(m_initial / m_final)

where D/T represents the drag losses and (mg/T)·sin γ the gravity losses.
Δv = ve·ln(m0 / mf)

where:
Δv = change in velocity (delta-v) to perform the in-space maneuver
ve = exhaust velocity (engine)
m0 = stage initial mass (structure mass + propellant)
mf = stage final mass
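The ideal rocket equation is easy to exercise numerically; the stage masses and Isp below are illustrative, with ve taken as g0·Isp:

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def delta_v(isp, m0, mf):
    """Ideal rocket equation: dv = ve * ln(m0/mf), with ve = g0 * Isp."""
    return G0 * isp * math.log(m0 / mf)

# Illustrative stage: Isp = 450 s, 100 t initial mass, 25 t final mass.
dv = delta_v(450.0, 100000.0, 25000.0)
```

Because the mass ratio appears inside a logarithm, delta-v grows only slowly with added propellant, which is why high Isp matters so much for demanding missions.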
Example Delta-V for Mars Missions
[Figure: crew vehicle total delta-v (km/s) vs. total mission duration (days), opposition class, 2033 “good” opportunity: opposition-class “short-stay” missions (60-day one-way transits; 20, 40, 60, 80, or 100 day stays) require markedly more total delta-v than conjunction-class “long-stay” missions (200-day one-way transits; stay time varies, 550 to 730 days). Orbit assumptions: Earth departure orbit 400 x 400 km; Mars arrival and departure orbits 250 x 33,813 km; direct entry at Earth return. Planetary arrival assumptions: Mars propulsive capture (capture plane as is); direct Earth entry at 13 km/s; no Venus swing-by. Trajectory set: 27 January 2012.]
Drake, B. G., et al., “Trajectory Options for Exploring Mars and the Moons of Mars,” 2012.
157
Relationship of Delta-V and Isp

Isp (s)    Mp at Δv = 7 km/s    Mp at Δv = 11 km/s
360        6.27 mf              21.6 mf
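Inverting the rocket equation reproduces the table: for a given Δv and Isp, the propellant required per unit final mass is Mp/mf = exp(Δv/(g·Isp)) − 1. Using g = 9.8 m/s² (which appears to be the value behind the slide's numbers):

```python
import math

G = 9.8  # m/s^2; reproduces the slide's table values

def propellant_per_final_mass(dv, isp):
    """Mp/mf = exp(dv / (g * Isp)) - 1, from the ideal rocket equation."""
    return math.exp(dv / (G * isp)) - 1.0

print(round(propellant_per_final_mass(7000.0, 360.0), 2))   # 6.27
print(round(propellant_per_final_mass(11000.0, 360.0), 1))  # 21.6
```

Note the exponential penalty: going from 7 to 11 km/s at Isp = 360 s more than triples the propellant per unit of delivered mass.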
Exercise
• Step 3. Is it useful?
160
Black Box View of Rocket Equation
[Figure: black-box view of the rocket equation: thrust, drag, and mass properties in; orbit mass out.]
161
Exercise
162
Propulsion System Trades Examples
Chemical Stages Advanced Propulsion
LO2/LH2 Nuclear Thermal (LH2)
LO2/LCH4 Nuclear Electric
NTO/MMH Solar Electric
Transportation Options for Mars
Standard Chemical Propulsion (Exploration Upper Stage)
• Useable propellant = 118 mt; engine = RL10-C2; specific impulse = 462 s; total thrust = 99 klbf.
• Varying degrees of technology development required; leverages commonality with SLS (EUS) or other Mars elements (methane engines from the lander) where possible.
• EUS can provide the first TMI burn if it does not have to loiter in Earth orbit for a long duration (i.e., less than 1 week). This drives very aggressive operations assumptions for any multi-launch architecture.
• EUS cannot be launched full on SLS 2B, so this stage would be scaled to fit the mission and lift capabilities of the SLS. With near-ZBO propellant storage, higher specific impulse provides advantages over Lox/Methane.
• This Mars architecture balances long-term propellant storage with specific impulse by using a Lox/Methane propulsion stage for all other propulsive maneuvers.

Nuclear Thermal Propulsion
• Specific impulse = 896 s; all-LH2 fuel with zero boil-off; requires 20 K cryocoolers for LH2.
• Implementation requires a “core stage” with engines and nuclear reactors; the core stage is supplemented with in-line tanks and drop tanks to provide the required propellant for the mission.

Solar Electric Propulsion
• ARM-derived SEP can deliver 35-45 mt of cargo to Mars with 3-4 years of flight time. Other, more aggressive trajectories may enable increases in payload delivery but have not yet been fully vetted.

Other Mission Elements

SLS Launch Vehicle
• Configuration = Block 2B with advanced solid boosters and a 10 m fairing; performance data from the SLS Mission Planners Guide; delivery orbit optimized.
• A Block 2B SLS is required to provide the necessary lift capability to support human exploration of Mars. The 10 m fairing is required to package large hydrogen tanks for NTP and Mars landers for surface operations.

Mars Lander
• Oxidizer = liquid oxygen; fuel = liquid methane; specific impulse = 360 s.
• The mass of the lander can be tailored to fit within the constraints of the transportation systems selected; however, the 10 m diameter must be accommodated. Smaller landers will result in more landers required for a specified surface mission.

Orion
• Orion can potentially be used in two modes: first, as a means to deliver the crew to and from an aggregation point in Earth orbit; second, as a direct re-entry vehicle for crew return directly from a Mars-Earth trajectory.

Habitats
• Coordination between MSFC, LaRC, and JSC habitat teams to develop rules of thumb for consistent habitat sizing as a function of crew size and mission duration.
Trajectory Types
• Conjunction (“Long Stay Mission”): typical stay time ~500 days; lower energy trajectories.
• Opposition (“Short Stay Mission”): typical stay time ~30 days; higher energy trajectories; many involve a Venus swing-by.
Transportation Tech. Trajectories
• One-way trip times on the order of 250 days.
• One-way trip times on the order of 1400 days (the near-term electric option requires methane chemical propulsion stages for crew delivery; crew trajectories are high-thrust).
Mathematical Models: Further Exercises
168
Key Points
169
References
170
Lesson 8:
Integrated Models
171
Objectives
172
Integrated Models
• Often, we wish to know more about a system than a single model can tell us.
• In these cases, several models are linked so that the outputs of one feed the inputs of the next.
[Figure: integrated launch systems analysis: technical models (INTROS and CONSIZ for system weights & sizing; HAVOC for system weights, sizing, & trajectory) pass weights and vehicle descriptions to cost & operations models (NAFCOM/PRICE/SEER for vehicle acquisition and unit costs; NROC/AATE for facilities & operations costs), which pass $/lb to orbit, flight rate, and facilities & ops cost to the economic model (business case closure business model).]
173
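A model network like this can be sketched as a short pipeline in code; every function and constant below is an invented placeholder standing in for the real INTROS/CONSIZ, NAFCOM/PRICE/SEER, and business-case models:

```python
# Hypothetical sketch of an integrated model chain: a weights/sizing model
# feeds a cost model, which feeds a business-case metric ($/lb to orbit).
# None of these relationships are the actual INTROS/CONSIZ/NAFCOM models.

def sizing_model(payload_lb: float) -> float:
    """Toy weights & sizing: dry weight as a fixed multiple of payload."""
    return 4.0 * payload_lb

def cost_model(dry_weight_lb: float) -> float:
    """Toy cost model: unit cost scales with dry weight to the 0.7 power."""
    return 1.0e4 * dry_weight_lb ** 0.7

def business_model(unit_cost: float, payload_lb: float, flights: int) -> float:
    """Toy economics: amortized cost per pound of payload delivered."""
    return unit_cost / (payload_lb * flights)

dry = sizing_model(25_000.0)
cost = cost_model(dry)
dollars_per_lb = business_model(cost, 25_000.0, flights=100)
print(f"dry weight {dry:,.0f} lb, cost per lb to orbit ${dollars_per_lb:,.2f}")
```

The value of the integrated form is that a change in any upstream model (say, a heavier structure) propagates automatically into the business-case metric.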
Vehicle Mass Relations
(Mass buildup diagram: the initial vehicle mass m_initial consists of the payload m_pl; guidance, telemeter & equipment; the propellant m_p (full or loaded); and the propulsion system (tank, structure, residual propellant, and rocket engine), whose items make up the propulsion system mass. The bare vehicle is the vehicle less payload, and the vehicle less propellant is the empty mass.)
174
Equations of Motion
(1-D Spherical, non-rotating Earth)

dv/dt = T/m - D/m - (k/r^2)·sin γ

dγ/dt = (v/r - k/(v·r^2))·cos γ + L/(m·v) + T·sin α/(m·v)

dr/dt = v·sin γ

dm/dt = -T/(g0·Isp)

where k/r^2 = g, the local gravitational acceleration, r = radius from Earth center, γ = flight path angle, and α = thrust angle relative to the flight path.

Integrate until r = orbit radius, v = orbital speed, γ = 0 deg, solving for the Mass Ratio (MR).
175
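A minimal sketch of integrating these equations, assuming thrust along the flight path (α = 0), zero lift, a purely vertical trajectory (γ = 90 deg, so the pitch-over is not modeled), and invented vehicle numbers:

```python
import math

MU = 3.986e14      # Earth gravitational parameter k, m^3/s^2
G0 = 9.80665       # standard gravity for the mass-flow equation, m/s^2
R_EARTH = 6.378e6  # Earth radius, m

def ascent(m0, thrust, isp, drag_area, dt=0.1, t_end=200.0):
    """Forward-Euler integration of the 1-D ascent equations above."""
    r, v, gamma, m = R_EARTH, 1.0, math.radians(90.0), m0
    for _ in range(int(t_end / dt)):
        g = MU / r**2
        rho = 1.225 * math.exp(-(r - R_EARTH) / 7200.0)  # crude exponential atmosphere
        drag = 0.5 * rho * v**2 * drag_area              # Cd folded into drag_area
        v += dt * (thrust / m - drag / m - g * math.sin(gamma))
        gamma += dt * (v / r - g / v) * math.cos(gamma)  # inert at gamma = 90 deg
        r += dt * v * math.sin(gamma)
        m -= dt * thrust / (G0 * isp)                    # dm/dt = -T/(g0*Isp)
    return r, v, m

r, v, m = ascent(m0=500_000.0, thrust=7.5e6, isp=360.0, drag_area=10.0)
print(f"altitude {(r - R_EARTH)/1000:.0f} km, speed {v:.0f} m/s, "
      f"mass ratio {500_000.0 / m:.2f}")
```

In the full problem the loop would instead run until r reaches the target orbit radius and γ reaches 0 deg, and the propellant consumed then gives the mass ratio.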
Closure Model
176
Single-Stage-to-Orbit Concept
177
Launch Vehicle Spreadsheet Sizer (LVSS)
Modules
• Input
– Mission (# crew, duration, etc.)
– Geometry (lengths, areas, volumes)
– Mass Ratio, T/W
• Weights Model
– Scaling relationships, Weight Estimating Relationships, or analytical subroutines (macros)
– Scale T/W
• Volume Model
– Tank volume = f(body volume, fixed volume, tank efficiency)
– Propellant = f(tank volume, oxidizer/fuel ratio, propellant densities, ullage)
• Sizing Model
– Compute propellant required to meet MR
– Compute body volume (inverse of volume model)
– Compute scale geometry factors for new body volume
• The weights, volume, and sizing models are run in a loop; if the weights have not converged, the cycle repeats, and iteration stops when the weights converge.
178
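The convergence loop can be sketched as follows; the weight-estimating relationship, required mass ratio, and propellant density are invented stand-ins for the spreadsheet's models:

```python
# Minimal sketch of an LVSS-style sizing loop: guess propellant, compute
# weights, check the mass ratio, and iterate until the vehicle "closes".
MR_REQUIRED = 8.0          # from trajectory analysis (illustrative)
PAYLOAD = 25_000.0         # lb
PROP_DENSITY = 22.0        # lb/ft^3, bulk oxidizer/fuel mix (illustrative)

def dry_weight(propellant: float) -> float:
    """Toy WER: tankage/structure scale with propellant, plus fixed equipment."""
    return 0.08 * propellant + 15_000.0

propellant = 100_000.0     # initial guess, lb
for iteration in range(100):
    dry = dry_weight(propellant)
    gross = dry + PAYLOAD + propellant
    burnout = dry + PAYLOAD
    mr = gross / burnout
    # propellant that would give exactly the required mass ratio
    needed = (MR_REQUIRED - 1.0) * burnout
    if abs(needed - propellant) < 1.0:      # converged to within 1 lb
        break
    propellant = needed

tank_volume = propellant / PROP_DENSITY
print(f"closed in {iteration} iterations: propellant {propellant:,.0f} lb, "
      f"gross {gross:,.0f} lb, tank volume {tank_volume:,.0f} ft^3")
```

Because the dry weight grows with the propellant load, the loop must iterate; here it converges geometrically, since each pass scales the remaining error by a fixed factor.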
Launch Vehicle Spreadsheet Sizer (LVSS) Output

1.0 Wing 26545
2.0 Tail 2969
3.0 Body 53025 (LH2 tank 12793, LO2 tank 9541, basic structure 18680, secondary structure 12012)
4.0 Induced environment protection 23079 (TPS 21064, internal insulation 1075, purge, vent, drain & hazard gas detection 941)
5.0 Undercarriage and aux. systems 8851
6.0 Propulsion, main 72287
7.0 Propulsion, reaction control (RCS) 3536
8.0 Propulsion, orbital maneuver (OMS) 3040
9.0 Prime power 2968
10.0 Electric conversion and distr. 8710
11.0 Hydraulic conversion and distr. 0
12.0 Control surface actuation 3648
13.0 Avionics 6504
14.0 Environmental control 2839
15.0 Personnel provisions 0
16.0 Range safety 0
17.0 Ballast 3225
18.0 Payload provisions 0
19.0 Growth allowance 69116
20.0 Personnel 0
21.0 Payload accommodations 0
22.0 Payload 1840062
23.0 Residual and unusable fluids 2701
25.0 Reserve fluids 8629
26.0 Inflight losses 9536
27.0 Propellant, main 1663724
28.0 Propellant, reaction control 1070
29.0 Propellant, orbital maneuver 0
ESTIMATING RELATIONSHIPS
NAFCOM99:
• Cost = A * Wt ^ b * Complexity Factors
• Cost = C * Wt ^ w * New Design ^ x * Technology ^ y * Management ^ z
Other:
• Rocketdyne's Liquid Rocket Engine Cost Model
OUTPUTS
• DDT&E Cost
• First Unit Cost
182
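A power-law CER of this form is easy to sketch; the coefficients below are invented for illustration and are not actual NAFCOM99 values:

```python
# Hedged sketch of a weight-based cost-estimating relationship of the form
# Cost = A * Wt^b * complexity factors. Coefficients are illustrative only.
def cer_cost(weight_lb: float, a: float, b: float, complexity: float = 1.0) -> float:
    """Cost from a power-law CER, in whatever units the CER was fit in."""
    return a * weight_lb ** b * complexity

# Typical CER behavior: cost grows sub-linearly with weight (0 < b < 1),
# so doubling the weight less than doubles the cost.
c1 = cer_cost(10_000.0, a=5.0, b=0.6)
c2 = cer_cost(20_000.0, a=5.0, b=0.6)
print(f"cost ratio for doubled weight: {c2 / c1:.3f}")  # 2**0.6 ≈ 1.516
```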
Cost per Flight
Vehicle Cost
• Vehicle recurring cost (fabrication, assembly, and verification)
• Vehicle amortization share
• Refurbishment cost (including spares)
Direct Operations Cost
• Pre-launch ground operations cost; mission and flight operations cost
• Propellants, gases, and consumables
• Ground transportation cost
• Launch site user fee (per launch)
• Mission abort and premature vehicle loss charge
Indirect Operations Cost
• Program administration and system management
• Marketing, customer relations, and contracts office
• Technical support and vehicle improvements
• Development amortization and royalty or cost recovery of technical changes
• Profit, taxes, and fees
183
Production Costs
Production Cost = Theoretical First Unit (TFU) x Learning Factor
Where
Learning Factor L = (Number of Units)^S
And
S = 1 - ln(100/Learning curve slope)/ln 2
Where
Learning curve slope = the percentage to which the cumulative average cost falls each time the number of units produced is doubled.

Unit number | Production cost, TFU x L | Average cost | Unit cost
1 | 1.00 | 1.00 | 1.00
2 | 1.90 | 0.95 | 0.90
3 | 2.77 | 0.92 | 0.87
4 | 3.61 | 0.90 | 0.84
5 | 4.44 | 0.89 | 0.83

A 95% Learning curve example
184
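The 95% table can be reproduced directly from these formulas:

```python
import math

# Reproduces the 95% learning-curve example: cumulative production cost
# = TFU * N^S with S = 1 - ln(100/slope)/ln 2 (slope in percent).
def learning_exponent(slope_pct: float) -> float:
    return 1.0 - math.log(100.0 / slope_pct) / math.log(2.0)

def cumulative_cost(n: int, slope_pct: float, tfu: float = 1.0) -> float:
    return tfu * n ** learning_exponent(slope_pct)

for n in range(1, 6):
    cum = cumulative_cost(n, 95.0)                      # TFU x L column
    avg = cum / n                                       # average cost
    unit = cum - cumulative_cost(n - 1, 95.0)           # marginal unit cost
    print(f"{n}  {cum:.2f}  {avg:.2f}  {unit:.2f}")
```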
Safety Modeling Process
185
Example Master Logic Diagram
(Fault Tree)
(Fault tree: top event "Loss of Vehicle", with contributing events including engine failure (uncontained), engine failure (shutdown), variable nozzle fails, fuel fails to ignite, door/ramp fails to close, and significant cooling passages leak.)
186
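Quantifying a tree like this reduces to AND/OR gate algebra over basic-event probabilities; the probabilities and gate structure below are invented for illustration:

```python
# Illustrative top-event probability for a small fault tree: Loss of
# Vehicle if ANY contributing failure occurs (OR gate). All basic-event
# probabilities here are made up for the example.
def or_gate(probs):
    """P(at least one independent event) = 1 - prod(1 - p_i)."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(probs):
    """P(all independent events occur) = prod(p_i)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical branch: engine failure = uncontained failure OR failed shutdown
engine_failure = or_gate([1e-4, 5e-5])
# Top event: engine failure OR nozzle failure OR cooling passage leak
loss_of_vehicle = or_gate([engine_failure, 2e-5, 1e-5])
print(f"P(loss of vehicle) ≈ {loss_of_vehicle:.2e}")
```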
Example Event Tree
(Event tree for Loss of Vehicle: branch points include turbojet failure, turbopump failure, TPS debond (fuselage or wing), ramjet/scramjet failure, and LH2 actively cooled wall failure, conditioned on whether IVHM detects weak bonds prior to launch.)
(Diagram: vehicle performance models, the POST and OPGUID trajectory codes, feed a reliability/safety risk model that estimates vehicle losses.)
189
ModelCenter® for Integrated Analysis
191
ModelCenter® for Analysis
(Optimization)
• Optimization Methods
include:
– Variable Metric
– Conjugate Gradient
– Feasible Directions
– Sequential Linear
Programming
– Sequential Quadratic
Programming
192
Key Points
• Models can be networked together if necessary.
Excellent tools are available to facilitate model
integration.
– ModelCenter was outlined, but several other tools are
available (e.g., Isight from SIMULIA).
193
Lesson 9:
Systems Simulation
194
Objectives
195
Vehicle Simulation – Levels of Maturity
1- to 3-D 6-D
Trajectory Simulation Trajectory Sim
Total In-
Flight
Simulator
Specialized Simulators
196
Classes of Simulations
197
Types of Simulations
198
Recall the 4 Step Simulation Process
199
Uses of DOE and Response Surfaces
200
Design of Experiments
201
Full Factorial Array
(all possible combinations)
• 3 Parameters (A, B, C)
• 2 Variations (High, Low)
• 8 Experiments
(Diagram: the 8 experiments, numbered 1-8, sit at the corners of a cube whose axes A, B, and C each run from Low to High.)
202
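The same enumeration is a one-liner in code: 3 parameters at 2 levels each gives all 8 combinations.

```python
from itertools import product

# Enumerate the 2^3 full factorial array programmatically.
levels = ["Low", "High"]
runs = list(product(levels, repeat=3))
for i, (a, b, c) in enumerate(runs, start=1):
    print(f"Run {i}: A={a}, B={b}, C={c}")
print(len(runs), "experiments")
```

Raising `levels` to three entries (High, Medium, Low) gives the 27-run array of the next slide.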
Full Factorial Array for 2nd Order Effects
(all possible combinations)
• 3 Parameters (A, B, C)
• 3 Variations (High, Medium, Low)
• 27 Experiments
(Diagram: a 3x3x3 grid of design points on axes A, B, and C.)
203
Taguchi Methods
(Diagram: three orthogonal-array subsets of the 2^3 cube, each running only four of the eight corner experiments against parameters A, B, and C, as a fractional alternative to the full factorial.)
204
Design Point Requirements for a 2nd Order Model

Parameters | Full factorial | Central composite | D-optimal
3 | 27 | 15 | 11
4 | 81 | 25 | 16
5 | 243 | 27 | 22
7 | 2187 | 79 | 37
205
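The run counts in the table follow from simple counting rules; the half-fraction factorial cores for 5 and 7 parameters are my reading of the numbers shown, and the last value printed (minimum quadratic terms plus one) is an assumption about how the D-optimal column was built:

```python
from math import comb

def full_factorial(k: int) -> int:
    """Three-level full factorial: 3^k runs."""
    return 3 ** k

def central_composite(k: int, fraction: int = 0) -> int:
    """CCD runs: 2^(k-fraction) factorial points + 2k axial points + 1 center."""
    return 2 ** (k - fraction) + 2 * k + 1

def quadratic_coeffs(k: int) -> int:
    """Terms in a full 2nd-order polynomial: intercept, linear, square, cross."""
    return 1 + 2 * k + comb(k, 2)

# (k, factorial-core fraction): half fractions assumed for k = 5 and 7
for k, frac in [(3, 0), (4, 0), (5, 1), (7, 1)]:
    print(k, full_factorial(k), central_composite(k, frac), quadratic_coeffs(k) + 1)
```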
Example – Engine Simulation
206
Example (continued)
207
Example Run Matrix
208
Example Results – Average Horsepower
Valve Duration
Valve Lift 252/258 276/282 294/300
0.472/0.480 306.7 317.3 291.5
0.502/0.510 308.4 319.2 293.3
0.540/0.562 309.4 320.9 294.9
209
Example – Plot of Experimental Results
(Surface plot of horsepower (HP, 270-330) versus valve duration (levels 1-3) and valve lift (series S1-S3), peaking at the middle duration level.)
210
Use of Response Surface Methods (RSM)
211
Constructing the Response Surface Model
212
Fitting a Simple Model From DOE
The following mathematical model for the 2^3 array can be determined, with model-matrix columns (mean, A, B, C, AB, AC, BC, ABC):

y = X·b, where X =

[ 1 -1 -1 -1  1  1  1 -1 ]
[ 1 -1 -1  1  1 -1 -1  1 ]
[ 1 -1  1 -1 -1  1 -1  1 ]
[ 1 -1  1  1 -1 -1  1 -1 ]
[ 1  1 -1 -1 -1 -1  1  1 ]
[ 1  1 -1  1 -1  1 -1 -1 ]
[ 1  1  1 -1  1 -1 -1 -1 ]
[ 1  1  1  1  1  1  1  1 ]
214
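Because this matrix is orthogonal (X^T X = 8 I), a least-squares fit recovers the effect coefficients exactly; the response values y below are invented for the demonstration:

```python
import numpy as np

# Solve for model coefficients from the 2^3 design matrix above.
# Columns: [mean, A, B, C, AB, AC, BC, ABC].
X = np.array([
    [1, -1, -1, -1,  1,  1,  1, -1],
    [1, -1, -1,  1,  1, -1, -1,  1],
    [1, -1,  1, -1, -1,  1, -1,  1],
    [1, -1,  1,  1, -1, -1,  1, -1],
    [1,  1, -1, -1, -1, -1,  1,  1],
    [1,  1, -1,  1, -1,  1, -1, -1],
    [1,  1,  1, -1,  1, -1, -1, -1],
    [1,  1,  1,  1,  1,  1,  1,  1],
], dtype=float)

y = np.array([10.0, 12.0, 9.0, 11.0, 14.0, 16.0, 13.0, 15.0])  # invented responses
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# Orthogonality means beta is simply X.T @ y / 8
print(np.round(beta, 3))
```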
Example – Optimization
• Dyno Simulation
– 323.7 HP at 270/276 degrees duration & 0.540/0.562 inches
lift
215
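As a rough cross-check of that optimum, a quadratic fit through the three duration levels of the highest-lift row in the results table (0.540/0.562 in: 309.4, 320.9, 294.9 HP) puts the vertex between the first two levels, consistent with the 270/276-degree optimum:

```python
import numpy as np

# Levels 1, 2, 3 stand for valve durations 252/258, 276/282, 294/300.
levels = np.array([1.0, 2.0, 3.0])
hp = np.array([309.4, 320.9, 294.9])   # from the run-matrix results table

a, b, c = np.polyfit(levels, hp, 2)    # hp ≈ a*x^2 + b*x + c
x_opt = -b / (2.0 * a)                 # vertex of the parabola
print(f"best duration level ≈ {x_opt:.2f} (between levels 1 and 2)")
```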
Key Points
216
Lesson 10:
Requirements Analysis & Validation
217
Objectives
218
Requirements Analysis
219
Effect of Requirements Definition
Investment on Program Costs
(Scatter plot, Ref: NASA/W. Gruhl: target cost overrun in percent (0-200) versus requirements cost as a percentage of program cost (0-20) for NASA programs including GRO, OMV, TDRSS, IRAS, Galileo, HST, TETH, GOES I-M, Landsat, CEN, EDO, Mars Observer, ACTS, ERB, STS, COBE, SEASAT, UARS, Voyager, HEAO, EUVE/EP, DE, Ulysses, ISEE, SMM, Pioneer Venus, and IUE. Programs that invested little in requirements definition suffered the largest overruns: PAY NOW OR PAY LATER.)
220
Design Feasibility
(Diagram: customer needs & wants flow into requirements analysis, validated against design concepts and models & data; architecture needs & wants and technology performance predictions are likewise validated against candidate technologies.)
222
DAC Process Overview
• The essence of the Design Analysis Cycle process is the cyclical execution of a planned set of complementary engineering analyses and trade studies to characterize the capability and functionality of a system.
• The objective is to ensure that the system will perform the intended mission.
• The process is divided into four main functions: DAC planning, analysis, issue resolution, and documentation.
– DAC Planning products: Master Analysis Plan & Schedule (MAPS); Initialization Data List
– Analysis products: analysis results; design issues; tool & model validation; presentation of results
– Issue Resolution products: proposed design or operational scenario updates; action item data base; new design requirements; integration forum minutes; new operations scenarios
– Documentation products: Analysis Reports & Master Summary (ARMS)
223
Coverage Matrix
• Validate Coverage
– Are all requirements properly covered by this cycle? Enables a cognizant decision on each requirement.
• CM Elements
– Requirement – from respective Spec
– Analysis – task associated with TDS
• Coverage Stages
– Requirements (this example) – analysis to solidify requirements
– Design – analysis to develop and mature the design to a stated level
– Verification – analysis to verify that the design meets the requirements
• Example Issues
– (1) No coverage – no analysis being performed in support of a requirement – may be intentional (not required)
– (2) Duplicate coverage – multiple analyses working the same requirement – may be intentional (validation of results)
– (3) Unnecessary coverage – analysis against a requirement that needs no work to meet the goals of this cycle
– (4) Missing requirement – analysis cannot be mapped to a current requirement – may need to add
(Example coverage matrix: requirements #1-#5, each TBR or baselined, versus analyses AN #1-#5, with X marks showing coverage and numbered callouts for the example issues above.)
224
Example: CEV Project RAC 1 Coverage Matrix
Req. # | Requirement | Feasibility Method | Feasibility Approach (TDS) | Feasibility Organization | Requirement Definition (Correctness) Approach | Requirement Definition Organization
CV0001 The CEV shall transport crews of 2, 3, and 4 crew members between Earth Analysis 1 S/C SE&I CARD Level 2
and lunar orbit in accordance with Table 1, Total Lunar DRM Crew,
Destination Cargo, and Equipment Definition. [CV0001]
CV0002 The CEV shall have a lunar mission rate of no less than three per year. (TBR- Analysis 5 Operations CARD Level 2
002-106) [CV0002]
CV0003 The CEV shall be capable of conducting a lunar mission within 30 days (TBR- Assessment 7 Ground Ops CARD Level 2
002-003) following a failed mission attempt.[CV0003]
CV0034 The CEV hatch shall be capable of being opened and closed by ground Previous 7 Ground Ops 7 Ground Ops
personnel. [CV0034] Programs/
Engineering
CV0042 The CEV Launch Abort System shall provide a thrust of not less than 15 Analysis 1 S/C SE&I 9 AFSIG
(TBR-002-12) times the combined weight of the CM+LAS for a duration of 2
(TBR-002-149) seconds. [CV0042]
CV0062 The CEV shall provide pressurized transfer of crew and cargo between mated Agency 1 S/C SE&I CARD Level 2
habitable elements in orbit. [CV0062] Decision /
CARD
CV0063 The CEV shall equalize pressure between CEV pressurized volume and the Previous 1 S/C SE&I 4 Operations
transfer vestibule prior to opening the CEV transfer hatch. [CV0063] Programs/
Engineering
CV0064 The CEV shall monitor the pressure of the mated vestibule volume when the Previous 4 Operations 4 Operations
CEV docking hatch is closed. [CV0064] Programs/
Engineering
CV0065 The CEV shall provide not less than two (TBR-002-007) vestibule Analysis 1 S/C SE&I 6 Operations
pressurization cycles per mission. [CV0065]
225
Requirements Allocation & Flowdown
(Flowdown diagram: mission needs (contiguous area scan; global daily coverage/revisit time; relay and TT&C access constraints) pass through coverage and relay analysis studies to yield orbit altitude and inclination, satellite propellant weight, allowable satellite stowed length, relay location, spacecraft configuration, and TT&C antenna mounting geometry.)
226
Resource Allocation
Systems analysis enables prudent allocation of requirements/resources and margin from system to component.
(Error budget tree: Celestial Location Error, radial = 1.0 RMS radius; Celestial Location, 1-axis RMS = 0.707, allocated as the sum of random errors (0.133 RMS), margin (0.420 RMS), and systematic errors (0.154). Lower-tier contributors include FTS stability during observation (0.018), LOS temporal noise from FL positions (0.011), FTS stability since calibration (0.068), fiducial light stability during observation (0.034), LOS temporal noise from star positions (0.030), and boresight calibration residual (0.128).)
227
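Two combination rules are at work in a budget like this, quadrature for independent error sources and a plain sum for the top-level allocation, and both are easy to check:

```python
import math

def rss(*terms: float) -> float:
    """Root-sum-square of independent 1-sigma error contributors."""
    return math.sqrt(sum(t * t for t in terms))

# Two orthogonal 1-axis errors of 0.707 RMS combine to a 1.0 RMS radius,
# matching the top of the budget.
radial = rss(0.707, 0.707)

# At the next tier the 0.707 1-axis error is allocated as a plain sum:
# random errors + margin + systematic errors.
total = 0.133 + 0.420 + 0.154
print(f"radial {radial:.3f}, 1-axis allocation {total:.3f}")
```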
ISS Atmosphere Control Systems Analysis
• Planned, deliberate
evolution & maturation of
key systems models over
the life cycle
229
Verification Compliance Matrix
Performance Requirement Requirement Source Capability/Margins Planned Verification Verification
(Spacecraft Specification (Parent Requirement) (Physical, Functional, Method Requirements Event
Paragraph) Performance)
376239 LAE Thrust Vector DERVD 3.6 Spacecraft IPS Derived Analysis Verified by EQ.LAE
Alignment Component + Requirement IOC AXAF.95.350.121 measurement at the
0.25 degrees AXSC 3.2.9.1 Structures & engine level.
Mechanical Subsystem
376240 LAE Location +3 DERVD 3.6 Spacecraft IPS Derived Analysis Verified by analysis. SE30.TRW
inches Requirement IOC AXAF.95.350.121
AXSC 3.2.9.1 Structures &
Mechanical Subsystem
376241 RCS Minimum DERVD 3.6 Spacecraft IPS Derived Analysis Demonstrated during EQ.PROP
Impulse Bit TBD Requirement IOC AXAF.95.350.121 thruster qualification REM
AXSC 3.2.9.2 Thermal Control testing.
Subsystem (TCS)
376242 RCS Thrust 21 lbf + DERVD 3.6 Spacecraft IPS Derived Analysis Demonstrated during EQ.PROP
5% (at 250 psia inlet Requirement to be resolved AXSC thruster qualification REM
pressure) 3.2.9.2.1 Heaters testing.
376243 RCS Minimum DERVD 3.6 Spacecraft IPS Derived Analysis Demonstrated during EQ.PROP
Specific Impulse (inlet press. Requirement IOC AXAF.95.350.121 thruster qualification REM
= 250 psia) 225 sec (BOL AXSC 3.2.9.2.1 Heaters testing.
steady state)
376244 Total Pulses (each DERVD 3.6 Spacecraft IPS Derived Analysis Demonstrated during EQ.PROP
thruster) 50,000 Requirement IOC AXAF.95.350.121 thruster qualification REM
AXSC 3.2.9.2.1 Heaters testing.
376245 Propellant DERVD 3.6 Spacecraft IPS Derived Predicted Throughput: Analysis Demonstrated during EQ.PROP
Throughput 200 lbm Requirement IOC AXAF.95.350.121 92.6 lbm thruster qualification REM
AXSC 3.2.9.2.1 Heaters Expected Margin: 107 testing.
lbm
230
Sample Analysis Fidelity Definitions
Fidelity Level Geometry & Packaging Structures & Materials Sizing & Closure Trajectory & Performance
Beginning | Documented vehicle from literature with similar technology | Mass fraction estimate | Weight closure only | Rocket equation/mass ratio estimate
231
Sample Analysis Fidelity Definitions (con’t)
Fidelity Level Geometry & Packaging Structures & Materials Sizing & Closure Trajectory & Performance
232
Other Disciplines for Which Fidelity Levels have
been Defined for Launch System Development
233
Progressive Fidelity over Time
(Fidelity scorecards at MCR, PDR, and CDR, rating each systems analysis discipline from 0 to 4: Geometry & Packaging; Structures & Materials; Sizing & Closure; Trajectory & Performance; Aerodynamics & Aerothermal; Avionics & Software; Propulsion Design & Performance; Aerothermal & TPS Sizing; Thermal Management & Design; Airframe & Engine Subsystems; Safety & Reliability; Operability, Supportability & Maintainability; Cost & Economics.)
Note the progression of fidelity throughout the life cycle.
234
General Fidelity Level Descriptions
Fidelity Level | Assembly Level | Program Phase | Analyses Type | TPM Level
0 | System | Pre-Phase A | rapid assessment of system architectures | definition of system level TPM's
1 | Subsystem | Pre-Phase A | initial assessment of as-drawn system design | definition of subsystem level TPM's
2 | Assembly | Phase A | refined assessment of as-drawn system & subsystem design | definition of assembly level TPM's
3 | Component | Phase B | preliminary assessment of as-drawn system, subsystem & assembly design | definition of component level TPM's
4 | Part | Phase C | detailed assessment of as-drawn system, subsystem, component & part design | definition of part/material/property level TPM's
235
Key Points
236
Lesson 11:
Effectiveness Analysis
237
Objectives
238
Technical Metrics
• Metrics are measurements collected for the purpose of
determining project progress and overall condition by observing
the change of the measured quantity over time. Management of
technical activities requires use of three basic types of metrics:
239
Systems Analysis Cycle
(Diagram: the system and its requirements and design (and alternatives), design reference missions, and technology (and alternatives) feed systems analysis (models and simulations), which produces TPMs and MOPs (performance, cost, safety, etc.), MOEs, and risk; the results support verification and validation, and cost, schedule, and program control feed back to systems, technologies, requirements, and design reference missions.)
240
Technical Metrics – DAU Example
• Product metrics are those that track key attributes of the design to observe
progress toward meeting customer requirements. Product metrics reflect three
basic types of requirements: operational performance, life-cycle suitability, and
affordability. The key set of systems engineering metrics is the Technical
Performance Measurements (TPMs). TPMs are product metrics that track design
progress toward meeting customer performance requirements. They are closely
associated with the system engineering process because they directly support
traceability of operational needs to the design effort. TPMs are derived from
Measures of Performance (MOPs) which reflect system requirements. MOPs
are derived from Measures of Effectiveness (MOEs) which reflect operational
performance or stakeholder requirements.
• MOE: The vehicle must be able to drive fully loaded from Washington, DC, to
Tampa on one tank of fuel.
• MOPs: Vehicle range must be equal to or greater than 1,000 miles. Vehicle
must be able to carry six passengers and 300 pounds of cargo. The vehicle
must meet DOT requirements for driving on interstate and secondary highways.
• TPMs: Fuel consumption, vehicle weight, tank size, power train friction, etc.
241
Mars Mission Example FOMs & Metrics
CATEGORY | FOM | STD. METRIC
Safety/Risk | LOC / LOM | risk assessment (requires detailed risk assessment)
Safety/Risk | Crew Health | radiation exposure
Programmatic Risk | Tech Dev Risk | R&D Degree of Difficulty (high sensitivity of gear ratio = more risk)
Performance | Trip Time Reduction | add 1 launch & reduce trip time (requires extensive trajectory analysis)
Affordability | Development Cost | DDT&E $ (requires detailed cost analysis)
Affordability | Recurring Cost | Recurring $ (requires detailed cost analysis)

Example -- 2GRLV Product Metrics
• MOEs
– $/lb to Low Earth Orbit
– Safety
• MOPs
– Probability of Loss of Crew
– Probability of Loss of Vehicle
– Probability of Loss of Payload
– Probability of Loss of Mission
– Total Annual Operational Cost
– Acquisition Cost
– Operability
– Design Risk
244
Example -- 2GRLV Product Metrics (con’t)
• TPMs
– Acquisition Cost: Development Cost; Production Cost
– Operability: Turn-around Time; Dispatch Time; Integration Time; Maintenance Time
– etc.
245
Example -- 2GRLV Systems Analysis
(Diagram: performance models, including the POST and OPGUID trajectory codes and a reliability/safety risk model estimating vehicle losses, feed the cost & operations models and the economic model.)
246
Technical Performance Measurement –
The Concept
Ref. SE Fundamentals
247
Example – Chandra Weight History
(Weight history chart: allocation, margin, and control tracked over the program, from estimates early, through estimates & actuals mid-program, to actuals at the end.)
248
Case Study
TPM Application
HDD Development/Qualification
R Allan Ray
2/8/01
Application of Systems Engineering Tools
250
Hard Disk Drive Industry
• Market Segments--Personal and Enterprise
• Enterprise Products--Workstations, Servers, Storage Area
Networks, RAIDs, Network Attached Storage, Set-top
Appliances(“Smart VCRs”)
• Customer Set-- EMC, Sun, SGI, IBM, HP, Dell, Compaq,
Gateway
• Major Players/Competitors--Seagate, Maxtor, IBM, Fujitsu
• Market Survival--TTM, Price, Responsiveness
• Drive Info
– 9G-180G available, 500G in development
– 7200 - 15000 rpm available, 20000 rpm in development
– 4 - 10 msec seek times
– 100% Duty Cycle, 24/7
– Industry “sweet spot” 18 & 36G
– Design--US; Manufacture--Asia
– 5 year Warranty, Typical 3 year use in OEMs
251
HDD Development - to - Qualification Process
(Process flow: the Marketing/Customer MRD and lessons learned (TPMs) drive specs and requirements by phase; a reliability assessment produces TPMs and a preliminary metrics list; electrical & mechanical designers, servo/firmware code, manufacturing/test equipment & code, SRE, engineering, and QE all feed the flow. Risks are identified and assessed into a risk plan, requirements are revised, the design matures through builds, a test suite is defined from master test and measurement lists, and the EVT and qualification test plans are defined, with the PM tracking schedule, cost, and profit.)
252
Technical Performance Measurement (TPM)
• What
– A closed-loop process for continuing analysis, test and
demonstration of the degree of maturity of selected drive
technical parameters.
• Objective
– Provide a set of measurements that characterize drive
performance and “health” at any stage of design or
development.
253
TPM Application -- HDD
• WHY?
– Assess compliance with drive specifications and customer
expectations
– Assess drive design/development maturity
– Assess and mitigate technical risks
– Provide drive characteristics to Product Assurance Lab for test
equipment set-up
– Provide reliability requirements to design groups
– Reduce technical problems encountered in reliability demonstration
test
– Provide data for design assessment, readiness reviews, decisions
– Support shortened cycle times
– Reduce development costs
– Management visibility of drive program health
254
Test Phases
(Timeline: Development & Engineering Model, Integration, and Qualification phases across builds A0, A1, and A2 & A3, followed by Transfer; later activities cover depop capacity, alternate interfaces, and ship test.)
• Approach
– Select parameters that are performance based, measurable
indicators of drive maturity related to customer requirements,
design requirements, reliability and producability.
– Measure selected parameters of drive operation against
specs and expectations from Design Phase (A0) through
Qualification (A3).
– Leverage tests scheduled and conducted by design
engineering and manufacturing.
– Product Assurance (PA) Engineers and Core Team specify
required measurement data.
– Characterize and document drive configuration &
performance at each phase entry and at any deviation from
planned profile.
256
TPM Application -- HDD
257
TPM Application – HDD (con’t.)
258
Technical Performance Measures (TPM)
TPP Selection
(Diagram: field returns, line returns, designer interviews, PM inputs, and the DPPM model feed selection of drive-level Technical Performance Parameters (TPP).)
259
Technical Performance Measures (TPM)
(Detailed failure analysis: Paretos, interviews, and experience sort failures into failure-mode buckets by function, e.g. servo, firmware/code, HSA, flex circuit, PCBA, to guide TPP selection.)
261
Technical Performance Measures (TPM)
(Diagram: TPMs, e.g. non-op shock, are allocated across drive elements: base/cover, heads/media, motor, suspension, and code.)
262
SE Tools/Methods
• Failure Mode, Effect & Analysis (FMEA)
– Owned by PA Test Engr
– Tailor the process to quickly identify critical problems
– Account for failures in previous programs (development, factory,
field)
– Performed during pre-A0 and maintained/updated throughout
development
263
TPM Application -- HDD
• Parameter Characteristics
– Significant determinants of total drive performance
– A direct measure of value determined from test
– May represent areas of risk that require visibility
– Time-phase values can be predicted for
each parameter and substantiated during design,
development, test
– Select enough parameters to sufficiently describe drive
health
• Too many – may focus on non-essentials
• Too few – may miss important indicators
• 5 to 10 as a rule
264
TPM Application -- HDD
• Implementation
– Deploy through the Quality & Design Engr Core Team Reps
to Functional Depts
– Initiate on next generation product as Pathfinder
– Refine procedures and measurements
– DRs are those documented in the Phase 0 deliverables and
Qual Phase Entry requirements
– Documented as part of the Product Quality Plan
– PA Test Engr will supply measurements reqt’s and
coordinate with DE Core Team Rep to determine test
schedule detail
– PA Test Engr & Core Team work together to obtain data
– Collect cost-to-manufacture data at transfer for ROI
comparison
265
TPM Application -- HDD
Technical Performance Measure Process Example Data Sheet
(Example: a 23-task, 10-day test schedule (6/10/00-6/19/00) covering OpVar, ECC, SCSI FEQ, squeeze, performance, power-on-reset, cache stress, HW error injection, reassignment, and format size tests, each with its duration and dates.)
• Test plans are generated for each Qual configuration (e.g. A1, A2, A3)
• Testing is done per this plan. Results are monitored by the test
engineers & the Core Team
267
TPM Application -- Example
Acoustics
(TPM tracking chart for idle acoustics in Bels (4.0-4.5): actual measurements are plotted against the design profile and the design requirement (or product spec) over measurement events/calendar dates, with a baseline defined at test phase entry.)
Define at entry, and for each deviation from plan:
• What changed vs. expectations
• How recovered
• Sensitivity to parameter variations/changes
• Configuration (H/W, SCSI code, servo code, electronics, other)
• Test equipment
268
Key Points
269
References
270
Lesson 12:
Margin Modeling
271
Objectives
272
Life Cycle Cost Gets Locked In Early
273
The Knowledge Curve
Improved systems
analysis can
accelerate the
knowledge curve,
leading to better
decisions earlier.
274
Types of Design Margins
• Weight
– Development weight increment
– Growth in system capability
275
Weight Margin
(Spacecraft weight growth history: weight growth factor (0.9-1.4) versus time in years (0-6) for X-20, Mercury, X-15, Apollo LM, Apollo CSM, and Gemini.)
277
Weight Allotment Convention Definition
• Define allotment
– Structure
– Systems
278
NASA Design Margins for Spacecraft
• Pre-Phase A -- 25-35%
• Phase A -- 25-35%
• Phase B -- 20-30%
• Phase C -- 15-25%
279
Technical Resource Management
Allocation (estimates) (estimates & actuals) (actuals)
Margin
Control
280
Space Shuttle Weight Growth
Phase C/D (1972-1983)
Wing 0.27
Tail 0.14
LH Tank 0.13
LOX Tank 0.13
Body 0.03
Gear 0.06
TPS 0.01
Propulsion 0.12
Subsystems 0.50
Isp, sec -2.5
281
Single-Stage-to-Orbit Weight Distribution
• Propulsion 27%
• Body 17%
• TPS 12%
• Subsystems 12%
• Wing 10%
• LH Tank 10%
• LOX Tank 6%
• Gear 4%
• Tail 2%
• Growth 0%
282
Historical Weight Estimating Relationship
(Wing)
(Log-log plot of wing weight, lbs (10,000-100,000) versus design weight x maneuver load x safety factor x structural span / root chord (1-1000):

Wing Weight = 3079.7·x^0.5544, with historical scatter of +20% / -17%

Data points include the Shuttle, the H-33 and NAR Phase B Shuttle designs, and the 747, C-5, L-1011, 737, 727-200, 707, and DC-8.)
283
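The relationship and its scatter band translate directly into code; x here is the chart's combined load/geometry parameter, in whatever units the regression was fit in:

```python
# Historical wing weight-estimating relationship from the chart, with its
# +20% / -17% scatter band.
def wing_weight(x: float) -> float:
    return 3079.7 * x ** 0.5544

def wing_weight_band(x: float) -> tuple[float, float]:
    w = wing_weight(x)
    return 0.83 * w, 1.20 * w          # -17% / +20% historical scatter

w = wing_weight(100.0)
lo, hi = wing_weight_band(100.0)
print(f"nominal {w:,.0f} lb, band {lo:,.0f} to {hi:,.0f} lb")
```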
Conceptual Weight Estimation Uncertainties
Historical weight estimation relationship errors (low / high):
• Wing: -0.17 / +0.20
• Tail: -0.70 / +1.06
• LH Tank: -0.18 / +0.41
• LOX Tank: -0.51 / +0.49
• Body: -0.36 / +0.64
• Gear: -0.15 / +0.21
285
Examples of
Probabilistic Uncertainty Models
(Distribution plots: a uniform model, Uniform(-0.17, 0.20), with 5%/95% points at -0.1515 and 0.1815, and triangular models such as Triang(-0.17, 0, 0.2), with 5%/95% points at -0.11392 and 0.13917, assigned to the wing, tail, LH tank, and LOX tank weight-estimation errors. Conducting the Monte Carlo experiment propagates these input distributions into an output dry-weight distribution spanning roughly 200-500 Klbs, with 90% of outcomes between 273.26 and 423.17 Klbs.)
Dry Weight = 339Klbs ± 25% with 90% Confidence
288
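The Monte Carlo roll-up can be sketched as follows: each group's WER error is drawn from a triangular distribution using the low/high values from the historical-error table, applied to an invented baseline weight statement (the baselines below are placeholders, not the course's actual vehicle):

```python
import random

random.seed(1)

# (baseline weight lb, low error, high error); baselines are invented,
# error bounds are the historical WER errors from the table above.
groups = {
    "Wing":     (27_000, -0.17, 0.20),
    "Tail":     ( 3_000, -0.70, 1.06),
    "LH Tank":  (13_000, -0.18, 0.41),
    "LOX Tank": (10_000, -0.51, 0.49),
    "Body":     (31_000, -0.36, 0.64),
    "Gear":     ( 9_000, -0.15, 0.21),
}

trials = []
for _ in range(20_000):
    total = 0.0
    for base, lo, hi in groups.values():
        # triangular error with mode 0 (no bias in the nominal estimate)
        total += base * (1.0 + random.triangular(lo, hi, 0.0))
    trials.append(total)

trials.sort()
mean = sum(trials) / len(trials)
p95 = trials[int(0.95 * len(trials))]
print(f"mean dry weight {mean:,.0f} lb, 95th percentile {p95:,.0f} lb")
```

Ranking the variance contribution of each group then reproduces the kind of uncertainty-impact ranking shown a few slides later.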
Weight Uncertainty Impacts
(Cumulative probability distribution of dry weight over 250,000-450,000 lbs: mean = 340Klbs; 95th percentile = 426Klbs.)
289
Weight Uncertainty Impacts
Case | Cum. Probability | Dry Weight, lbs | ΔDry Weight / Mean Dry Wt. | ΔDry Weight / Payload

SSTO: Mean 339,884 (0.00 / 0.0); 60% 348,204 (0.02 / 0.3); 70% 361,014 (0.06 / 0.8); 80% 376,621 (0.11 / 1.5); 90% 401,815 (0.18 / 2.5); 95% 426,189 (0.25 / 3.5)

SSTO with Isp = ±5%, Drag = ±15%, Volume = ±5%: Mean 349,035 (0.00 / 0.0); 60% 355,326 (0.02 / 0.3); 70% 371,730 (0.07 / 0.9); 80% 392,926 (0.13 / 1.8); 90% 430,897 (0.23 / 3.3); 95% 456,781 (0.31 / 4.3)

Payload = 25,000 lbs
290
Ranking of Weight Uncertainty Impacts
%Dry Weight
(Bar chart ranking each element's weight-uncertainty impact as a percent of dry weight, from 27% down to 2%.)
291
(Chart: probability distributions of a technology metric between the current state-of-the-art and the theoretical limit. In Pre-Phase A (TRL 0-3) the distribution is broad; technology development to Phase B (TRL 6) narrows it and shifts the expected value.)
292
Estimates without Technology Investment
(Dry-weight distribution: mean = 339,646.8 lbs; 90% of outcomes between 273.26 and 423.17 Klbs, i.e. Dry Weight = 339Klbs ± 25% with 90% confidence.)

Estimates with Technology Investment
(Distribution narrowed: 90% of outcomes between 360.09 and 393.25 Klbs.)
Dry Weight = 377Klbs ± 5% with 90% Confidence
The Technology Program should be structured to reduce uncertainties to an acceptable level for a given confidence.
294
Benefit of Strategic Technology Investment
(Cumulative probability of dry weight, lbs (250,000-450,000): the Phase C (TRL 6) curve is much steeper than the Pre-Phase A (TRL 0-3) curve, reflecting reduced uncertainty after technology investment.)
295
Key Points
296
References
297
Lesson 13:
Risk Analysis
298
Objectives
299
Risk Classification
(5x5 risk matrix: likelihood 1-5 versus consequence 1-5, with a legend classifying each cell by risk level.)
300
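A matrix like this is straightforward to encode; the green/yellow/red score thresholds below are illustrative, not taken from the chart:

```python
# Simple 5x5 risk-matrix classifier. Thresholds are assumptions for the
# sketch; real programs define the cell colors explicitly.
def classify(likelihood: int, consequence: int) -> str:
    if not (1 <= likelihood <= 5 and 1 <= consequence <= 5):
        raise ValueError("scores must be 1-5")
    score = likelihood * consequence
    if score >= 15:
        return "red"        # high risk: mitigate or escalate
    if score >= 6:
        return "yellow"     # moderate risk: watch and plan mitigation
    return "green"          # low risk: accept and monitor

print(classify(5, 5), classify(3, 3), classify(1, 2))
```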
Sample Definitions of Likelihood & Consequence
301
Risk Information & Mitigation Planning
302
Technology Risk Analysis Model
(Diagram: an integration model rolls technology-level success factors, cost factors, safety factors, and performance factors up into integrated risk, cost, safety, and performance estimates.)
303
Technology Audit & Assessment
Technology Audit
• Define technologies and metrics
• Establish projection: basis of projection, projection confidence, TRL, current status, theoretical best
• Forecast: expected performance, best case, worst case
Metric Assessment
• How does the current status compare with the projection? Will the technology mature to meet the performance expectation on schedule and within budget?
• Feedback: analytical tools and databases; technical issues; TRL; assessment of technical issues; current capabilities (baseline); CRL; shortfall (very specific); advancement degree of difficulty; mitigating tasks and recommended priorities; cost and schedule
(Example: red/yellow assessments for each element of the vehicle system WBS relative to requirements: 1.0 Vehicle Systems Engineering & Integration (1.1, 1.2, etc.: requirements allocation, risk management, configuration control, etc.); 2.0 Stage 1 and 3.0 Stage 2 (airframe, propulsion, avionics, software, etc.); 4.0 Vehicle and Flight Operations (mission planning, ground ops, etc. to run the spaceline); 5.0 Ground Infrastructure (5.x, 5.xx, etc.: manufacturing and launch & landing facilities, equipment, etc.).)
January 2003 305
Capability Readiness
306
Materials and Process Capability Readiness Level (CRL) (Example)

CRL 6
  Materials Readiness Level (MRL): Material routinely available and used in components
  Process Readiness Level (PRL): Process applied has produced defect-free components; process parameter ranges identified
CRL 5
  MRL: Material applied to shapes of the size and type of the objective component, with verified properties
  PRL: Process has been applied to shapes of the size and type of the objective component
CRL 4
  MRL: Material applied to objective shape with verified properties
  PRL: Process has been modified to apply to objective shape
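When materials and process maturity are tracked separately like this, the combined capability readiness is often taken to be gated by the less mature of the two. That min() convention is an assumption here, not stated on the slide:

```python
def capability_readiness(mrl: int, prl: int) -> int:
    """Combined CRL, assuming (for illustration) that capability is
    gated by the less mature of materials (MRL) and process (PRL)."""
    if not (1 <= mrl <= 9 and 1 <= prl <= 9):
        raise ValueError("readiness levels assumed to be 1-9")
    return min(mrl, prl)

print(capability_readiness(mrl=6, prl=4))  # 4: process maturity limits the CRL
```

Under this convention a fully mature material gains nothing until the process that shapes it catches up, which is the point of auditing both scales.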
[Slide residue: 2nd Gen RLV design reference mission (DRM) capability assessment across contractor concepts: Boeing (CS-1), Lockheed Martin (VentureStar), Orbital Sciences, Kelly Space & Technology, Space Access, Starcraft Boosters]

Desired Capabilities:
• Crew Rescue
• Polar Orbit
• CTV & Upper Stage Delivery
• Satellite Return
• Space Platform Assembly & Servicing
• Science Platform

Assessment:
• Significant RLV mission drivers are HEO, satellite return, crew rescue, and polar orbit missions (major architectural impacts).
• DRM 7.2.2 needs clarification; contractors interpreted it differently (HEO vs. LEO mission). ISAT assessments assumed HEO mission.
• NASA req'ments not met for:
  7.2.2 CTV & Upper Stage Delivery for HEO: multiple burns of larger (>45 klb) upper stage
  7.2.2.1 CTV Return from HEO: ballistic capsule w/ land or water landing
  7.2.3 Crewed Satellite Servicing: use VentureStar with CTV and custom mission module for servicing
  7.2.4 Satellite Return: potential downmass limitation
  7.2.6 Science Platform
  7.3.1 Space Platform Assembly & Servicing: use VentureStar with custom mission module with extended kit and extensive EVA
  7.3.2 Crew Rescue / Polar Orbit: may not meet 2-day rescue timeline
[Letter-grade ratings per DRM not recoverable]

Design Criteria (Discipline Teams):
• FAR margin (factors of safety) criteria advocated. General SF = 1.5; other specific factors much higher.
308
Technology Risk Analysis Example
Metrics
SSTO Metric Example: Impact of Technologies - Risk Analysis (25 Klb to LEO)
[Figure residue: vehicle dry weight (1000 lb) for the baseline (Al tankage) and composite-tankage designs, with a practical limit of vehicle size near 400 Klb; x-axis (5-35) label not recoverable]
• The use of composites for tankage significantly increases vehicle dry weight margin above that of the baseline (Al tankage).
• But composite applications in cryogenic tankage are novel, and unforeseen problems could significantly increase cost & slip schedules.
310
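The composite-tankage trade (weight margin gained vs. a chance of cost and schedule problems) can be framed as a simple expected-value comparison. Every number in this sketch is hypothetical; the slide gives no quantities for the trade:

```python
# Hypothetical trade between Al and composite cryogenic tankage.
margin_gain_klb = 40.0     # dry-weight margin gained with composites (assumed)
p_problem = 0.30           # chance of unforeseen cryo-composite issues (assumed)
margin_penalty_klb = 60.0  # margin lost if issues force redesign (assumed)

expected_gain = margin_gain_klb - p_problem * margin_penalty_klb
print(f"Expected margin gain with composites: {expected_gain:+.1f} Klb")
```

Even a large nominal margin gain can be eroded once the downside probability and penalty are priced in, which is the point of pairing the metric model with a risk analysis.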
References
311