
ASCI ALLIANCE CENTERS

VIRTUAL PROTOTYPING OF SOLID PROPELLANT ROCKETS

Michael T. Heath and William A. Dick
Center for Simulation of Advanced Rockets, University of Illinois at Urbana-Champaign

Researchers seek a detailed, whole-system simulation of solid propellant rockets under
normal and abnormal operating conditions. A virtual-prototyping tool for solid propellant
rocket motors, based on first-principles models of rocket components and their dynamic
interactions, meets this goal.

Safety and reliability are paramount concerns in rocket motor design because of
the enormous cost of typical payloads and, in the case of the Space Shuttle and other
manned vehicles, the crew's safety. In the spring of 1999, for example, a series of three
consecutive launch failures collectively cost more than US $3.5 billion. The most notorious
launch failure, of course, was the tragic loss of the Space Shuttle Challenger and its seven
crew members. Thus, there is ample motivation for improving our understanding of solid
rocket motors (SRMs) and the materials and processes on which they are based, as well as
the methodology for designing and manufacturing them.
The use of detailed computational simulation in the virtual prototyping of products and
devices has heavily influenced some industries (automobile and aircraft design, for
example), but to date it hasn't made significant inroads in rocket motor design. Reasons
for this include the market's relatively small size and the lack of sufficient computational
capacity. Traditional design practices in the rocket industry primarily use top-down, often
1D, modeling of components and systems based on gross thermomechanical and chemical
properties, combined with engineering judgment based on many years of experience,
rather than detailed, bottom-up modeling from first principles. Moreover, there has been a
tendency to study individual components in isolation, with relatively little emphasis on the
often intimate coupling between the various components. For example, SPP [1], an
industry-standard code for analyzing solid propulsion systems, includes a fairly detailed
model of the propellant thermochemistry but no structural analysis and no detailed model
of internal flow.
One of our primary goals at the Center for Simulation of Advanced Rockets (CSAR) is to
develop a virtual prototyping tool for SRMs based on detailed modeling and simulation of
their principal components and the dynamic interactions among them. Given a design
specification (geometry, materials, and so on), we hope to predict the entire system's
resulting collective behavior with sufficient fidelity to determine both nominal performance
characteristics and potential weaknesses or failures. Such a tool could explore the space
of design parameters much more quickly, cheaply, and safely than traditional
build-and-test methods. Of course, we must validate such a capability through rigorous
and extensive comparison with data for known situations to have confidence in its
predictions for unknown situations.
Although it is unlikely that simulation will ever totally replace empirical methods, it can
potentially dramatically reduce the cost of those methods by identifying the most
promising approaches before building actual hardware.

[Figure 1. Idealized solid propellant rocket. Major components (solid propellant, igniter,
interior cavity, combustion interface, case, insulation, nozzle, and exhaust plume) are
shown along the top; our initial simulation models (combustion-injection boundary layer,
compressible-turbulence LES, thermoviscoelastic, elasticity/ablation, LES,
thermomechanical foam, and thermoelastic models) are shown along the bottom.]

Challenges in rocket simulation


Solid propellant boosters are the heavy lifters of the space launch industry. Most of the
world's large, multistage launch vehicles (including the Ariane, Delta, Titan, and Space
Shuttle) employ two or more solid rocket boosters (SRBs) in the initial stage to provide
80% or more of the immense thrust needed to lift a payload in excess of 10,000 pounds
off the launch pad and propel it the first few tens of miles above Earth. Beyond this point,
subsequent stages, typically liquid-fueled, take over into orbit and beyond.
SRMs are notably simpler than liquid rocket engines [2]. The latter have far more moving
parts (pumps, valves, and so on) and require storing and handling of liquids that might be
cryogenic or potentially hazardous. SRMs, though, have almost no moving parts (often
only a gimballed nozzle for thrust vector control), and the composite solid propellant
(containing both fuel and oxidizer) forms the combustion chamber. The main disadvantage
of SRMs is that once ignited, combustion is essentially uncontrollable: the propellant burns
at maximum rate until exhausted. Thus, solid motors are ideal for the initial stages of
flight, when raw power is more important than finesse; liquid-propellant rockets then take
over for the portions of flight requiring more delicate maneuvering. Despite their relative
simplicity, SRMs are still fiendishly complex in terms of the chemical and
thermomechanical processes that take place during firing, as well as the design and
manufacturing processes required to make them reliable, safe, and effective.
Figure 1 shows a schematic drawing of a typical SRM; the major parts are indicated along
with the types of mathematical models that might be used to represent them. Reality, of
course, is considerably more complex than this 2D picture, and we note the following
major challenges in achieving our goals:

• The complex behavior of SRMs requires fully 3D modeling to capture the essential
physics adequately. Examples include the combustion of composite energetic materials;
the turbulent, reactive, multiphase fluid flows in the core and nozzle; the global structural
response of the propellant, case, liner, and nozzle; and potential accident scenarios such
as pressurized crack propagation, slag ejection, and propellant detonation.
• The coupling between components is strong and nonlinear. For example, the loading due
to fluid pressure deforms the solid propellant, which changes the geometry of the fluid
flow, which in turn affects the pressure, and so on. Similarly, the burn rate increases with
pressure and vice versa.
• The geometry is complex and changes dynamically as the rocket consumes propellant.
The inclusion of slots and fins, which forms a star-shaped cross section, enhances the
amount of burning surface area. Whatever its initial shape, the propellant surface
regresses at a pressure-dependent rate as the propellant burns, and discrete
representations of the solid and fluid components, as well as the interface between them,
must adapt accordingly.
• The spatial and temporal scales are extremely diverse. For example, processes such as
combustion and crack propagation occur on micron length scales and microsecond time
scales, or less, at which it is entirely infeasible to treat a two-minute burn of a
125-foot-long rocket.
• Manufacturing and transportation constraints necessitate the use of numerous joints,
including field joints where motor segments are assembled at the launch site. This significantly
complicates the geometry and structural response of the motor and introduces potential
points of failure.
• Modeling and simulating each component is challenging both methodologically and
computationally. Although there is considerable experience in the field in modeling the
various rocket motor components, a more fundamental understanding of the constitutive
and energetic properties of materials, and of the processes they undergo, requires much
greater detail along with terascale computational capacity.
• Modeling and simulating component coupling is even more demanding, because it not
only requires still greater computational capacity but also demands that the
corresponding software modules interact in a manner that is physically, mathematically,
and numerically correct and consistent. When data are transferred between components,
they must honor physical conservation laws, mutually satisfy mathematical boundary
conditions, and preserve numerical accuracy, even though the corresponding meshes
might differ in structure, resolution, and discretization methodology.
• Integrated, whole-system SRM simulation requires enormous computational capacity,
currently available only through massively parallel systems with thousands of processors.
Thus, the software integration framework, mesh generation, numerical algorithms,
input/output, and visualization tools necessary to support such simulations must be
scalable to thousands of processors.

[Figure 2. Our code development follows a staged approach, with increasing complexity in
component models and coupling. One axis spans physical complexity, from weakly coupled
to fully coupled, detailed models; the other spans geometric complexity, from 1D and 2D
through 3D, star grain, joints, and accidents. The GEN0, GEN1, and GEN2 code families lie
successively farther along the diagonal.]

In September 1997, CSAR embarked on an ambitious plan to tackle these daunting
challenges and produce a virtual-prototyping tool for SRMs [3]. This article is a progress
report almost two years into our five-year plan.
Our initial plans seemed audacious, but the substantial resources our sponsor (the US
Department of Energy's Accelerated Strategic Computing Initiative program) provided let
us assemble a team of over 100 researchers, including roughly 40 faculty, 40 graduate
students, and 20 staff (research scientists, programmers, and postdoctoral associates),
representing 10 departments across our university. This diverse group provides the broad
expertise needed in combustion, fluid dynamics, structural mechanics, and computer
science, but it also presents the additional challenge of coordinating a large
collaborative project that cuts across traditional departmental boundaries. We organized
our effort along basic disciplinary lines without regard to the academic departments of the
participants. This has had the salutary effect of inducing collaboration among faculty and
students in a given discipline, such as fluid dynamics, regardless of which department
they might occupy. Crosscutting teams, such as System Integration and Validation and
Specification, draw members from all four disciplinary groups and require an additional
level of collaboration.
Staged approach

We realized from the outset that a project of this complexity would require a staged
approach: we would need to learn to walk before we could run (much less fly). The
primary axes of complexity in our problem are physical and geometric (see Figure 2).
Physical complexity refers to the detail and sophistication of the physical models employed
and the degree of coupling among them. Geometric complexity refers to the dimension of
the problem and the degree of detail and fidelity in representing a real SRM. In essence,
we wish to move along the diagonal of this diagram over time. In this spirit, we defined
three successive generations of integrated rocket simulation codes:
• GEN0: A 2D ideal rocket with steady-state burning at chamber pressure, a power law for
propellant regression (sketched in the code example following the sidebar), Euler
equations for fluid flow, a rigid case, linearly elastic propellant, and one-way coupling from
fluid to solid. We based its physical parameters on the Space Shuttle reusable solid rocket
motor (see sidebar). GEN0 was intended primarily as a warm-up exercise.
• GEN1: A fully 3D whole-system simulation code using relatively simple component
models, two-way coupling, and reasonably realistic geometry approximating that of the
Space Shuttle RSRM. The star grain of the Shuttle RSRM is included, but not joints,
inhibitors, or cracks. Solid components include viscoelastic propellant and a linearly elastic
case. The fluid component is an unsteady, viscous, compressible flow, with a large-eddy
simulation turbulence model but with no particles, radiation, or chemical reactions in the
flow. The combustion model assumes homogeneous surface burning and a
pressure-dependent regression rate. There is full, two-way aeroelastic coupling between
the fluid and solid components. The development of GEN1 was expected to span the first
three years of the five-year project.
• GEN2: A fully capable rocket simulation tool with detailed component models, complex
component interactions, and support for subscale simulations of accident scenarios such
as pressurized crack propagation, slag accumulation and ejection, and potential propellant
detonation. GEN2 includes more detailed geometric features, such as joints and inhibitors,
as well as more detailed and accurate models for materials and processes based on
separate subscale simulations. GEN2 was expected to span the last three years of the
five-year project, overlapping with the final year of GEN1.

Space Shuttle Reusable Solid Rocket Motor

We chose the Space Shuttle Reusable Solid Rocket Motor (RSRM) as our primary
simulation target for a variety of reasons, including its national importance, its public
visibility, its fairly typical design, and the availability of detailed specifications and
extensive test data. We outline here the basic technical facts about the Space Shuttle
RSRM [1,2]. Figure A shows a composite drawing of the RSRM.

Height: 126.11 ft.
Diameter: 12.16 ft.
Weight: 149,275 lb. empty; 1,255,415 lb. full
Case: High-strength D6AC steel alloy, 0.479 in. to 0.506 in. thick
Nozzle: Aluminum nose-inlet housing and steel exit cone, with carbon-cloth phenolic
ablative liners and glass-cloth phenolic insulators. The nozzle is partially submerged and is
movable for thrust vector control.
Insulation: Asbestos-silica-filled nitrile butadiene rubber
Propellant (percent by weight): ammonium perchlorate oxidizer, 70; powdered aluminum
fuel, 16; polybutadiene polymer (PBAN) binder, 12; epoxy curative agent, 2; ferric oxide
burn rate catalyst, trace
Propellant grain: 11-point star-shaped perforation in the head end of the forward
segment, aft-tapered cylindrical perforation in the remaining segments. Liquid and solid
ingredients are first thoroughly mixed into a thick paste; then the curative agent is added
before the mixture is vacuum-cast into a mold and cured in a slow oven for several days.
The consistency of the resulting composite solid is similar to that of a pencil eraser.
Igniter: Solid rocket pyrogen igniter mounted in the forward end, 47.5 in. long and
containing 134 lb. of TP-H1178 propellant
Total launch weight: 4.5 million lb. (including two SRBs, external tank, orbiter, and payload)
Maximum thrust: 3,320,000 lb. force (each SRB)
Acceleration: 1.6 g at lift-off (maximum 3.0 g)
Launch timeline:
Liquid engines fire: -6.0 sec
SRB igniter initiated: 0.0 sec
Lift-off pressure: 564 psia, reached at 0.23 sec
All exposed propellant ignited: 0.3 sec
Maximum operating pressure: 914 psia, reached at 0.6 sec
Roll program begins: 10 sec
Star grain burnout: 21 sec
Liquid engines throttled down: 30 sec
Mach 1 reached: 40 sec
Solid propellant burnout: 111 sec
SRB separation: 126 sec
Velocity at separation: 3,100 mph
Altitude at separation: 25 nmi

[Figure A. The Reusable Solid Rocket Motor (RSRM) is a primary booster for NASA's Space
Transportation System. Section A-A shows the 11-point slot-and-fin star grain structure in
the RSRM's forward segment. Propellant in the forward-center and aft-center segments
forms straight-walled cylinders; the aft-segment propellant tapers outward to a
submerged nozzle. Inhibitors between segments, made of asbestos-filled
carboxyl-terminated polybutadiene, are used to tailor the burning surface to meet the
motor's thrust requirements.]

References
1. Design Data Book for Space Shuttle Reusable Solid Rocket Motor, Publication No.
930480, TRW-16881, Revision A, Thiokol Space Operations, Brigham City, Utah, 1997.
2. A.J. McDonald, "Return to Flight with the Redesigned Solid Rocket Motor," Proc.
AIAA/ASME/SAE/ASEE 25th Joint Propulsion Conf., AIAA Paper No. 89-2404, AIAA Press,
Reston, Va., 1989, pp. 1-15.

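The power law mentioned in the GEN0 description is the classical Saint-Robert (Vieille)
burn-rate law, r = a p^n. A minimal sketch in Python, with illustrative coefficients rather
than calibrated RSRM values:

```python
def regression_rate(p_chamber, a=5.6e-5, n=0.35):
    """Power-law (Saint-Robert/Vieille) burn rate r = a * p**n.

    p_chamber : chamber pressure (Pa)
    a, n      : empirical coefficients; the defaults here are
                illustrative placeholders, not calibrated RSRM data.
    Returns the propellant surface regression rate (m/s).
    """
    return a * p_chamber ** n

# At roughly 6.3 MPa (about 914 psia, the maximum operating pressure
# quoted in the sidebar), these placeholder coefficients give a
# regression rate on the order of a centimeter per second.
print(f"{regression_rate(6.3e6):.4f} m/s")
```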
Progress to date

We assembled the integrated GEN0 code from existing in-house modules for the fluids,
solids, and combustion components, and we completed it in May 1998. We ran it with
modest levels of parallelism on a shared-memory SGI Power Challenge. The computed
results agreed reasonably well with predictions of classical 1D theory, but we didn't
extensively validate it because we never intended
GEN0 as a realistic, high-fidelity simulation; we saw it simply as a start-up system
integration exercise to get our team accustomed to working together. Visit
www.csar.uiuc.edu to view animations of our results.
We based the subsequent GEN1 code on newly written or substantially modified in-house
codes for the various modules. We completed a simplified, serial, but fully 3D version of it
in October 1998. Its principal simplifications included our use of a strictly cylindrical
geometry (no star grain, the slot-and-fin geometry used to increase the propellant's initial
burning surface area), no support for interface regression due to burning (not a significant
factor for short burn times, but obviously necessary for longer runs), the requirement that
the solid and fluid meshes match at the interface to simplify data transfer, and our use of
a linearly elastic (rather than viscoelastic) model for the propellant.
Computed results without the star grain in the propellant gave a head-end pressure at the
onset of steady burning of roughly half the empirically measured value for the Space
Shuttle RSRM. This was not surprising, as the whole point of the star grain is to increase
the exposed surface area of propellant early in the burn, which increases pressure and
thrust accordingly. Thus, implementing the star grain became a high priority. Another high
priority was a fully parallel implementation, not only because this was an important
general goal, but also for the practical reason that we could not otherwise make runs of
significant duration at a reasonable resolution. By May 1999, we had completed a fully
parallel implementation of GEN1 [4]. With the star grain geometry, the computed
head-end pressure was within a few percentage points of the Space Shuttle RSRM's
measured value.
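The sensitivity of head-end pressure to burning area follows from a standard steady-state
mass balance (a textbook relation, not CSAR's model): equating the mass generated by
the burning surface to the mass expelled through the nozzle gives

```latex
% Steady chamber pressure from mass balance (standard SRM relation).
% rho_p = propellant density, A_b = burning surface area,
% a, n  = power-law burn-rate coefficients, A_t = nozzle throat area,
% c^*   = characteristic velocity of the combustion gases.
\rho_p\, A_b\, a\, p_c^{\,n} \;=\; \frac{p_c\, A_t}{c^{*}}
\quad\Longrightarrow\quad
p_c \;=\; \left( \frac{\rho_p\, a\, A_b\, c^{*}}{A_t} \right)^{\!\frac{1}{1-n}} .
```

Because p_c grows like A_b^(1/(1-n)), with n typically between 0.3 and 0.5, adding
star-grain surface area raises chamber pressure faster than linearly, consistent with the
factor-of-two pressure deficit we observed without the star grain.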
Recent efforts have focused on implementing interface regression in both the fluid and
solid modules and on allowing for nonmatching meshes at the interface.
[Figure 3. The solids and fluids codes take different approaches to component simulation.
Rocsolid: finite elements; linear elastodynamics; unstructured hexahedral meshes; ALE
treatment of interface regression; implicit time integration; multigrid equation solver;
Fortran 90 with MPI parallelism.
Rocflo: finite volumes; unsteady, viscous, compressible flow; block-structured meshes;
ALE treatment of moving boundaries; second-order upwind TVD scheme; explicit time
integration; Fortran 90 with MPI parallelism.]

[Figure 4. Scaled speedup of the separate (a) Rocsolid and (b) Rocflo modules and of the
(c) integrated GEN1 code for the Space Shuttle RSRM. The problem size is fixed at 14,625
fluid grid points and 8,192 solid elements per processor as the number of processors
grows. Computers used include a Cray T3E, an SGI Origin 2000 (O2K), an IBM SP2, and
an Intel Pentium cluster (CLU).]

Both the fluid and solid modules now support interface regression using an arbitrary
Lagrangian-Eulerian (ALE) approach, in which the mesh moves to accommodate the
dynamically changing geometry. We have also devised general mesh-association and
conservative data-interpolation schemes that permit accurate data transfer between
components with nonmatching meshes at the interface.
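As a rough illustration of the ALE idea (a toy kinematic update, not the Rocsolid/Rocflo
mesh-motion solver), the burning surface recedes into the propellant at the regression
rate while interior nodes follow according to a blending weight:

```python
import numpy as np

def ale_mesh_update(coords, normals, weight, r, dt):
    """Toy ALE mesh motion for a regressing burn surface.

    coords  : (N, 3) node coordinates
    normals : (N, 3) unit vectors in the direction of regression
              (from the gas cavity into the propellant) at each node
    weight  : (N,) blending weight: 1.0 on the burning surface,
              decaying to 0.0 at the fixed case wall; this stands in
              for the mesh-motion solve of a real ALE scheme
    r       : surface regression rate (m/s), e.g. from a power law
    dt      : time step (s)
    Returns updated coordinates: the surface recedes by r*dt and the
    interior nodes follow smoothly, so elements deform gradually
    instead of tangling at the moving interface.
    """
    return coords + (r * dt) * weight[:, None] * normals
```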
The main features of the solids code (Rocsolid) include finite elements, unstructured
hexahedral meshes, linear elastodynamics, ALE treatment of regression, implicit time
integration, a multigrid equation solver, and Fortran 90 with MPI parallelism. The main
features of the fluids code (Rocflo [5]) include finite volumes; block-structured meshes;
unsteady, viscous, compressible flow; ALE treatment of moving boundaries; a
second-order upwind total variation diminishing (TVD) scheme; explicit time integration;
and Fortran 90 with MPI parallelism. Figure 3 shows how the two compare.
Parallel scalability of Rocsolid, Rocflo, and the GEN1 code that integrates them has been
excellent. We've run Rocflo on up to 2,048 processors, and we've run Rocsolid and GEN1
on up to 512 processors on a variety of platforms, including the SGI Origin at NCSA, the
Cray T3E at the Pittsburgh Supercomputing Center, and all three ASCI platforms at the US
Department of Energy laboratories. Figure 4 shows scaled speedup, meaning that the
problem size per processor is held constant as processors are added. The largest mesh
sizes have about four million elements for the solid and seven million zones for the fluid.
Figures 5 and 6 show visualizations of some computational results.
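For readers unfamiliar with the metric, scaled (weak-scaling) speedup compares the
measured time on p processors against the estimated single-processor time for the
p-times-larger problem; ideal behavior is a flat runtime. A small illustrative calculation
with made-up timings (not CSAR measurements):

```python
def scaled_speedup(t1, tp, p):
    """Scaled speedup for a weak-scaling run.

    t1 : time (s) for one processor on the base problem
    tp : time (s) for p processors on a problem p times larger
    Since the work per processor is fixed, p * t1 approximates the
    (hypothetical) single-processor time for the large problem.
    """
    return p * t1 / tp

# Hypothetical per-step timings; ideal would be tp == t1 for every p.
for p, tp in [(1, 42.0), (64, 44.1), (512, 47.6)]:
    print(f"{p:4d} processors: speedup {scaled_speedup(42.0, tp, p):6.1f} "
          f"(ideal {p})")
```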
The features we still have to implement to
complete the full GEN1 code include viscoelasticity, large deformations, and thermal effects in the solid; a large-eddy simulation turbulence model in the fluid; a more detailed
model of combustion and interface regression;
and a flame-spreading model to capture ignition
transients.


[Figure 5. Fully coupled 3D simulation of the star grain in the Space Shuttle RSRM,
showing stress in the propellant and gas pressure isosurfaces in the slots and core region.
Executed on a 256-processor SGI Origin 2000 and visualized with Rocketeer, our in-house
visualization tool.]

System integration issues

A number of technical issues arise in building an integrated, multicomponent code such as
GEN1. First is the overall integration strategy, where the fundamental choice is between
modular and monolithic approaches. In building the GEN1 code, we chose a modular, or
partitioned, approach: we used separately developed component modules and created an
interface to tie them together. In such an approach, separately computed component
solutions might require subiterations back and forth between components to attain
self-consistency. This approach contrasts with a more monolithic strategy in which all the
physical components are incorporated into a single system of equations and all the
relevant variables are updated at the same time, thereby obviating the need to iterate to
self-consistency. Although it has some theoretical advantages, a monolithic approach
impedes separate development and maintenance of individual component modules by
specialists in the respective areas. The modular approach not only expedites separate
development and maintenance, it also allows swapping of individual modules without
replacing the entire code or even affecting the other modules. The modular strategy
seemed to offer clear practical advantages in our somewhat dispersed organizational
setting, as well as potentially letting users include commercial modules when appropriate.

[Figure 6. Gas temperature computed by Rocflo in the star grain region of the Space
Shuttle RSRM near the onset of steady burning, visualized by Rocketeer. Values range
from 3,364 K (magenta) to 3,392 K (red). Temperature is represented as (a) a tint on the
interior surface of the propellant and as (b) a series of translucent colored isosurfaces in
the interior at a slightly later time. The rocket is cut in half along its lateral axis to
improve visibility.]


However, even less tightly coupled approaches are commonly used in practice, in which
entirely independent codes interact only offline (often with human intervention), perhaps
through the exchange of input and output data files. By contrast, in our modular GEN1
code, the component modules are compiled into a single executable, and they exchange
data throughout a run with subroutine calls and interprocessor communication.
Another important issue is the physical, mathematical, and geometric description of the
interface between components, which in our case includes the jump conditions that
combustion induces. A careful formulation of the interface boundary conditions is
necessary to satisfy the relevant conservation laws for mass and linear momentum, as
well as the laws of thermodynamics.
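As an illustration of such jump conditions (the standard Rankine-Hugoniot-type balances
for a surface regressing with normal speed, written here in generic textbook form rather
than CSAR's exact formulation), mass and normal momentum must balance across the
burning surface:

```latex
% Interface balances across a burning propellant surface moving with
% normal speed \dot{s}; subscripts s and g denote solid and gas, and
% u_n is the normal velocity component (generic form, notation assumed).
\begin{align}
  \dot m &= \rho_s \,(u_{s,n} - \dot s) \;=\; \rho_g \,(u_{g,n} - \dot s), \\
  p_s + \dot m \, u_{s,n} &= p_g + \dot m \, u_{g,n}.
\end{align}
% With the solid nearly at rest, the mass flux reduces to
% \dot m \approx \rho_s r, where r is the surface regression rate.
```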
Time-stepping procedures are another significant issue in component integration. Here,
the time steps for the fluid are significantly smaller than those for the solid. We employ a
predictor-corrector approach in which the fluid is explicitly stepped forward by several
(say, 10) time steps, based on the current geometry determined by the solid. The
resulting estimate of the fluid pressure at the future time is then available for taking an
implicit time step for the solid. However, the resulting deformation and velocity of the
solid change the fluid's geometry, so the time-stepping of the fluid repeats, and so on,
until we attain convergence, which usually requires only a few subiterations. Unless
iterated to convergence, this scheme is only first-order accurate, and it is serial in that
only one component computes at a time. Parallel time-stepping schemes that are
second-order accurate without subiterations are possible [6], and we plan to investigate
them. But because we map both fluid and solid components onto each processor, our
current scheme does not prevent us from utilizing all processors.
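In pseudocode terms, the subiteration loop looks roughly like the following Python sketch.
The fluid and solid objects and their advance/rewind methods are hypothetical stand-ins
for the GEN1 modules, which actually exchange data through subroutine calls:

```python
import numpy as np

def coupled_step(fluid, solid, dt, n_sub=10, tol=1e-6, max_iters=8):
    """One staggered predictor-corrector step (illustrative sketch).

    Assumed interfaces (not the real GEN1 API):
      fluid.rewind()          -- reset the fluid state to the step's start
      fluid.advance(geom, dt) -- explicit step; returns interface pressure
      solid.advance(p, dt)    -- implicit step; returns interface geometry
    """
    geom = solid.interface_geometry()
    for _ in range(max_iters):
        fluid.rewind()
        for _ in range(n_sub):            # predictor: explicit fluid subcycles
            p = fluid.advance(geom, dt / n_sub)
        new_geom = solid.advance(p, dt)   # corrector: implicit solid step
        # Stop when the interface geometry is self-consistent; in practice
        # only a few subiterations are needed.
        if np.linalg.norm(new_geom - geom) <= tol * np.linalg.norm(new_geom):
            break
        geom = new_geom
```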
As we mentioned briefly earlier, data transfer between the disparate meshes of different
components is another highly nontrivial issue in component integration. In our approach,
we let the meshes differ at the interface in structure, resolution, and discretization
methodology, and indeed this is the case in GEN1, because the fluid mesh is
block-structured, relatively fine, and based on cell-centered finite volumes, whereas the
solid mesh is unstructured, relatively coarse, and based on node-centered finite elements.
Although in principle the two interface meshes should abut because they discretize the
same surface, in practice we can't assume this because of discretization or rounding
errors. Thus, we have developed general mesh-association algorithms that efficiently
determine which fluid points are associated with each element (facet) of the solid
interface mesh [7]; we then use the local coordinates of the associated element to
interpolate the relevant field values in a physically conservative manner.
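A brute-force toy version of that association-and-interpolation step follows (the production
algorithms [7] are far more efficient and enforce conservation; everything here is a
simplified stand-in using triangular facets):

```python
import numpy as np

def associate_and_interpolate(points, nodes, tri_conn, nodal_vals, tol=1e-8):
    """Toy mesh association and interpolation across an interface.

    points     : (P, 3) fluid interface points
    nodes      : (N, 3) solid interface node coordinates
    tri_conn   : (T, 3) triangle connectivity indexing into nodes
    nodal_vals : (N,) field stored at solid nodes (e.g., displacement)
    Each point is tested against every facet; the facet whose barycentric
    coordinates are all nonnegative (within tol) owns the point, and those
    local coordinates weight the nodal values.
    """
    out = np.empty(len(points))
    for i, x in enumerate(points):
        for tri in tri_conn:
            a, b, c = nodes[tri]
            # Least-squares solve gives barycentric coordinates of the
            # in-plane projection of x onto the facet's plane.
            T = np.column_stack((b - a, c - a))
            s, t = np.linalg.lstsq(T, x - a, rcond=None)[0]
            if s >= -tol and t >= -tol and s + t <= 1 + tol:
                w = np.array([1 - s - t, s, t])   # local coordinates
                out[i] = w @ nodal_vals[tri]
                break
        else:
            raise ValueError(f"point {i} not associated with any facet")
    return out
```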
Yet another thorny issue in component integration is partitioning the component meshes
for parallel implementation in distributed memory. The block-structured fluid mesh is
relatively easy to partition in a highly regular manner, but the unstructured solid mesh is
partitioned by a heuristic approach, currently using Metis, which often yields irregular
partitions (visit www-users.cs.umn.edu/~karypis/metis for further information). Moreover,
because we partition the two component meshes separately, there is no way to maintain
locality at the interface: adjacent partitions across the interface might not be placed on
the same or nearby processors. In our current approach, this effect complicates the
communication pattern and might increase communication overhead, but it has not been
a serious drag on parallel efficiency so far. Nevertheless, we plan to explore more global,
coordinated partitioning strategies that will preserve locality and perhaps simplify
communication patterns.
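As a small illustration of heuristic graph partitioning of an unstructured mesh (using the
pymetis Python binding to Metis purely for illustration; our codes call the Metis library
itself):

```python
import pymetis  # Python binding to the Metis graph partitioner

# Toy dual graph of an unstructured mesh: entry i lists the elements
# that share a face with element i (the adjacency is symmetric).
adjacency = [[1, 2], [0, 3], [0, 3], [1, 2, 4], [3, 5], [4]]

# Partition six elements among two processors, heuristically minimizing
# the number of cut edges: faces whose two elements land on different
# processors and therefore require interprocessor communication.
n_cuts, membership = pymetis.part_graph(2, adjacency=adjacency)
print("edge cuts:", n_cuts)
print("element -> processor:", membership)
```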
Software integration framework

Our overarching goal in CSAR is not only to develop a virtual prototyping tool for SRMs
but also to develop a general software framework and infrastructure to make such
integrated, multicomponent simulations much easier. Toward this end, we have initiated
research and development efforts in several relevant areas of computer science, including
parallel programming environments, performance monitoring and evaluation, parallel
input/output, linear solvers, mesh generation and adaptation, and visualization.

Our work in parallel programming environments has focused on creating an adaptive
software framework for component integration based on decomposition and encapsulation
through objects. This environment provides automatic adaptive load balancing in response
to dynamic change or refinement, as well as high-level control of parallel components. It
is purposely designed to maintain compatibility with component modules in conventional
languages such as Fortran 90, and it provides an automated migration path for the
existing parallel MPI code base. This work is in part an extension of the previously
developed Charm++ system, which was successfully used to build a large parallel code for
molecular dynamics, NAMD [8]. Although this framework is not yet mature enough to
serve as the basis for the current GEN1 code, results for pilot implementations of the
GEN1 component modules using the framework show great promise.
In performance evaluation, we are leveraging ongoing development of the Pablo
performance environment [9], which provides capabilities for dynamic, multilevel
monitoring and measurement; real-time adaptive resource control; and intelligent
performance visualization and steering of distributed, possibly geographically dispersed,
applications.


Rocketeer
by Robert A. Fiedler and John Norris

We need a powerful scientific visualization tool to analyze the large, complex 3D data sets
our whole-system and subscale rocket simulations generate. After an extensive review of
existing packages, we decided to develop our own tool, which we call Rocketeer. (Visit
www.csar.uiuc.edu/F_software/rocketeer to download a user guide and software.) The tool
has a number of features that make it ideal for visualizing data from multicomponent
simulations, including its support for both structured and unstructured grids, cell-centered
and node-centered data, ghost cells, seamless merging of multiple data files, automated
animation, and a smart reader for HDF (hierarchical data format). A particularly useful
feature for visualizing field data in the interior of an SRM is Rocketeer's ability to depict
translucent isosurfaces, which lets us view isosurfaces of temperature or pressure, for
example, without blocking the view of other isosurfaces deeper inside (see Figure B).
Although our need to visualize data from SRM simulations specifically motivated many of
these features, Rocketeer is a broadly useful, general-purpose visualization tool.

Rocketeer is based on the Visualization Toolkit (VTK) [1], which is in turn based on
OpenGL to take advantage of graphics hardware acceleration. It currently runs on
Microsoft Windows and Unix/Linux. Planned enhancements include implementing a
client-server model so that we can perform most compute-intensive operations (in
parallel) on a remote supercomputer while interactive control and rendering are
performed locally on a graphics workstation.

[Figure B. Rocketeer visualization of the temperature profile computed by Rocflo in the
star grain region of the RSRM at the onset of steady burning. Values range from 3,364 K
(magenta) to 3,392 K (red). Temperature is represented by the tint on the propellant fin
surfaces and by a series of translucent colored isosurfaces inside the slot. The image is
cut in half along the rocket's axis to enhance visibility. The hottest point in this imaged
flow is in the stagnation region in the middle of the rocket's head end; the coolest point is
in the middle of the star grain region's open end.]

Reference
1. W. Schroeder, K. Martin, and W.E. Lorensen, The Visualization Toolkit: An
Object-Oriented Approach to 3D Graphics, Prentice Hall, New York, 1997.

We used these tools to instrument the GEN1 code, view the resulting performance data,
and relate it back to the application code's call graph to identify performance bottlenecks.
We also updated the popular ParaGraph performance visualization tool [10] to display the
behavior and performance of parallel programs using MPI, and we used it to analyze the
GEN1 component modules.
Parallel I/O is often a bottleneck in large-scale simulations, both in terms of performance
and of the programming effort required to manage I/O explicitly. For large-scale rocket
simulations, we need good performance for collective I/O across many processors for
purposes such as periodically taking snapshots of data arrays for visualization or
checkpoint and restart. Because we use geographically dispersed machines, we also need
efficient and painless data migration between platforms and back to our home site.
Panda [11] provides all these services: it runs on top of MPI and uses the standard file
systems provided with the platforms we use. Its library offers server-driven collective I/O,
support for various types of data distributions, self-tuning of performance, and integrated
support for data migration that exploits internal parallelism. We've already incorporated
Panda into Rocflo and are doing the same in Rocsolid. Again, early results are promising,
and we plan to use Panda to handle parallel I/O within our main visualization tool,
Rocketeer (see sidebar).
Validation

Comparison of simulation results with known test cases and laboratory data is essential to
establish this approach's validity for science and engineering. With our GEN1 integrated
code rapidly maturing, we have begun an aggressive series of computational experiments
to verify and validate its efficacy and fidelity. Fluid-solid interaction problems we use for
validation include flow over an elastic panel, flow over a wing (AGARD Wing 445.6), and a
model of inhibitor deformation in an SRM. We hope also to be able to make
comparisons with test data for small laboratory-scale rockets through collaboration with
various government laboratories. A larger-scale test we are pursuing is to predict the
propellant slumping that led to failure in the original design of the Titan IV SRB. Our
ultimate test will be comparison with the immense amount of test data for the Space
Shuttle RSRM taken during static firing tests after its redesign. These data include literally
thousands of readings from strain gauges, pressure curves, and so on, for a liberally
instrumented test version of the Shuttle RSRM.
New research directions
Our second-generation rocket simulation code,
GEN2, will require significantly more detailed
and sophisticated component models than those
in GEN1. Moreover, GEN2 will also support accident scenarios that will require even greater detail and finer resolution. To have these new models ready by the time we need them in GEN2, we
have already begun extensive research into these
modeling issues, some of which we outline here.
Heterogeneous propellant flames

The propellant in the Shuttle SRM consists of a high-density packing of ammonium
perchlorate (AP) oxidizer particles embedded in a matrix of powdered aluminum fuel and
polymeric binder. The aluminum particles burn in the gas-phase products of AP-binder
combustion. The propellant uses an initial bimodal distribution of AP particle sizes of 20
and 200 microns, and the primary combustion field is located within a few tens of microns
of the propellant surface. Flow disturbances originating in the chamber, due to
acoustic-wave or mean-flow interactions and turbulence, can affect this field, leading to
what is known as erosive burning.
Some of the heat generated in the combustion field is conducted to the propellant
surface. This heat is responsible for the surface regression and the conversion of solid
propellant to combustible gases. The resulting burning surface is not flat, because the
instantaneous regression rates of AP and binder differ, and if cracks form in the
propellant, the increase in propellant surface area can lead to a sharp increase in
combustion intensity. To describe the surface regression and to generate boundary
conditions for the chamber flow, we must resolve the 3D combustion field, couple it with
physical processes such as heat conduction in a thin surface layer of the propellant, and
allow for pressure and thermal feedback from the chamber flow (see Figure 7).

[Figure 7. (a) The 2D propellant surface model employs two sizes of ammonium
perchlorate particles embedded in a fuel/binder matrix. (b) 3D flames supported by AP
and binder decomposition product gases for the configuration in the midline region of (a).]
Crack propagation in solid propellant

The initiation and propagation of one or more cracks in the solid propellant or along the
grain-case interface can dramatically affect rocket performance. Cracks create additional
burning surfaces in the solid propellant, so the propagation of cracks can greatly affect
the pressure history in the rocket chamber, leading in some cases to the rocket's
destruction. Modeling crack propagation in burning solid propellant is quite complex,
because the problem is characterized by a tight coupling between structural dynamics,
combustion, and fluid mechanics, along with a rapidly evolving geometry. Using a fully
coupled, explicit aeroelastic finite-element/finite-volume code, we are investigating
potential accident scenarios associated with the presence of preexisting cracks at various
locations in the solid booster, with special emphasis on propellant-liner interfacial failures.
We're using a novel cohesive-volumetric finite-element scheme to capture the
spontaneous motion of the crack, allowing for the possibility of crack arrest or branching.
As the crack propagates and the reacting gas pressurizes the newly created crack faces,
the fluid domain undergoes complex changes that an adaptive unstructured finite-volume
scheme can capture. As illustrated in Figure 8, the reactive gas flow emanating from a
preexisting radial crack in the solid propellant interacts with the core flow in the rocket
motor. This generates a region of high pressure on the leeward face of the crack and a
region of low pressure in the downstream vicinity of the crack, which lead to substantial
deformation of the propellant grain.
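A cohesive-volumetric scheme inserts interface elements governed by a
traction-separation law between the volumetric elements; once the opening displacement
exceeds a critical value, the element carries no traction and a free (burning) crack surface
is born. A minimal sketch of one common bilinear law follows (the specific law and values
used at CSAR may differ; all parameters here are illustrative):

```python
def bilinear_cohesive_traction(delta, delta_0=1e-6, delta_c=1e-4, t_max=2.0e6):
    """Bilinear traction-separation law for a cohesive interface element.

    delta   : current opening displacement (m)
    delta_0 : opening at peak traction (m)        -- illustrative value
    delta_c : critical opening, full failure (m)  -- illustrative value
    t_max   : peak cohesive strength (Pa)         -- illustrative value
    Traction rises linearly to t_max, then softens to zero at delta_c;
    the area under the curve is the fracture energy of the material.
    """
    if delta <= 0.0:
        return 0.0                        # contact/compression handled separately
    if delta < delta_0:
        return t_max * delta / delta_0    # elastic loading branch
    if delta < delta_c:
        return t_max * (delta_c - delta) / (delta_c - delta_0)  # softening
    return 0.0                            # fully cracked: traction-free surface
```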

[Figure 8. The effect of a preexisting radial crack on the motor's core flow: colored
contours denote pressure in the core flow and hydrostatic pressure in the solid propellant.
Note the region of high pressure (red) in the deformed crack and of low pressure (blue)
downstream from the crack.]

Aluminum particle combustion

The solid propellants in modern rockets use aluminum (Al) particles as fuel. As combustion
proceeds, these particles melt and agglomerate on the combustion interface. A complex
process follows as Al droplets detach from the surface and are injected into the core flow.
The droplets, whose initial size varies from 20 to 300 microns, are injected into a strong
cross-flow, and they provide a significant source of heat as they burn to form Al oxides.
Near-wall turbulence plays an important role in the dispersion of Al droplets, and as a
result, the heat release is volumetrically distributed, although dominant mainly in the
near-wall region. Al2O3 is the primary product of combustion, and it appears either as a
fine powder of micron size or as larger residual particles. The deposition of Al2O3 in the
form of slag in the submerged nozzle can adversely affect motor performance.

The combustion of Al droplets strongly couples the dispersed phase (droplets and Al2O3
particles) with the continuous phase (the surrounding core flow). We expect the GEN2
version of Rocflo to include an Eulerian implementation of the core flow and a Lagrangian
implementation of the Al droplets and the larger oxide particles. The simulation will
introduce tens of millions of Al droplets into the flow at the combustion interface according
to a specified probability distribution and local mass injection rate. The position, velocity,
temperature, and species concentration of each droplet will be tracked over time by
solving a set of ordinary differential equations. The effect of the surrounding flow will be
parameterized in terms of lift and drag coefficients, heat and mass transfer coefficients,
and droplet burn rate. So far, we have developed a detailed time-dependent 3D
subsystem simulation of the flow, evaporation, and burning of an isolated Al droplet to
obtain accurate state-of-the-art parameterizations (see Figure 9).
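A minimal sketch of that Lagrangian tracking follows (forward Euler, drag only; the drag
correlation and all values are illustrative stand-ins for the detailed parameterizations
being developed):

```python
import numpy as np

def advance_droplet(x, v, d, rho_p, gas_velocity, rho_g, mu_g, dt):
    """One explicit step for a Lagrangian droplet in the core flow.

    Only drag is modeled here; the GEN2 treatment will also
    parameterize lift, heat and mass transfer, and burn rate.

    x, v         : droplet position and velocity, (3,) arrays
    d            : droplet diameter (m), e.g. 20e-6 to 300e-6
    rho_p, rho_g : droplet and gas densities (kg/m^3)
    gas_velocity : callable mapping position -> (3,) local gas velocity
    mu_g         : gas dynamic viscosity (Pa*s)
    """
    u_rel = gas_velocity(x) - v
    re = rho_g * np.linalg.norm(u_rel) * d / mu_g     # droplet Reynolds number
    # Schiller-Naumann correction to Stokes drag (valid for Re below ~800).
    f = 1.0 + 0.15 * re ** 0.687
    tau_p = rho_p * d**2 / (18.0 * mu_g)              # particle response time
    a = f * u_rel / tau_p                             # drag acceleration
    return x + dt * v, v + dt * a
```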

The individual problems just discussed are themselves examples of fluid-solid
interactions, albeit on finer scales, for which we will be able to use the same software
integration framework as for the global simulation of the SRM. In this way, we hope to
leverage much of our current software development effort in component integration, and
to be able to spin off these subscale simulations from the larger-scale system simulation
in a highly compatible manner.

Acknowledgments
We thank our many colleagues at CSAR for their research contributions to this article.
This program is truly a collaborative effort based on the technical strengths of many
people. We thank Amit Acharya, Prosenjit Bagchi, S. Balachandar, Dinshaw Balsara, John
Buckmaster, Philippe Geubelle, Changyu Hwang, Thomas L. Jackson, and Biing-Horng Liou
for their contributions to the "New research directions" section. The CSAR research
program is supported by the US Department of Energy through the University of
California under subcontract B341494.


[Figure 9. Results from a time-dependent 3D simulation of flow and heat transfer past a
spherical droplet in cross-flow. The simulation employs a resolution of 81 × 96 × 32
points along the radial, circumferential, and azimuthal directions at Reynolds number
Re = UD/ν = 350, based on the free-stream velocity (U), droplet diameter (D), and
kinematic viscosity (ν). At this Reynolds number, the flow is unsteady, with time-periodic
vortex shedding. (a) Azimuthal velocity contours at an instant in time, showing an imprint
of vortex shedding. (b) Temperature contours, where the approaching flow is hotter (red)
than the droplet (blue), which is considered isothermal.]

References
1. G.R. Nickerson et al., The Solid Propellant Rocket Motor Performance Prediction
Computer Program (SPP), Version 6.0, Tech. Report AFAL-TR-87-078, US Air Force
Materials Lab., Edwards Air Force Base, Calif., 1987.
2. G.P. Sutton, Rocket Propulsion Elements, 6th ed., John Wiley & Sons, New York, 1992.
3. M.T. Heath and W.A. Dick, "Virtual Rocketry: Rocket Science Meets Computer Science,"
IEEE Computational Science & Eng., Vol. 5, No. 1, Jan.-Mar. 1998, pp. 16-26.
4. I.D. Parsons et al., "Coupled Multi-Physics Simulations of Solid Rocket Motors," Proc.
Parallel and Distributed Processing Techniques and Applications Conf., Vol. VI, CSREA
Press, 1999.
5. P.V.S. Alavilli, D. Tafti, and F. Najjar, "The Development of an Advanced Solid-Rocket
Flow Simulation Program ROCFLO," Proc. 38th AIAA Aerospace Sciences Meeting and
Exhibit, AIAA Press, Reston, Va., 2000.
6. C. Farhat and M. Lesoinne, "Two Efficient Staggered Procedures for the Serial and
Parallel Solution of Three-Dimensional Nonlinear Transient Aeroelastic Problems,"
Computer Methods in Applied Mechanics and Eng., Vol. 182, Nos. 3-4, 2000.
7. X. Jiao, H. Edelsbrunner, and M.T. Heath, "Mesh Association: Formulation and
Algorithms," Proc. Eighth Int'l Meshing Roundtable, Tech. Report 99-2288, Sandia Nat'l
Labs., Albuquerque, N.M., 1999, pp. 75-82.
8. L. Kale et al., "NAMD2: Greater Scalability for Parallel Molecular Dynamics,"
J. Computational Physics, Vol. 151, No. 1, May 1999, pp. 283-312.
9. L. DeRose et al., "An Approach to Immersive Performance Visualization of Parallel and
Wide-Area Distributed Applications," Proc. Eighth IEEE Symp. High-Performance
Distributed Computing, IEEE Computer Soc. Press, Los Alamitos, Calif., 1999, pp. 247-254.
10. M.T. Heath and J.A. Etheridge, "Visualizing the Performance of Parallel Programs,"
IEEE Software, Vol. 8, No. 5, Sept. 1991, pp. 29-39.
11. Y. Cho et al., "Parallel I/O for Scientific Applications on Heterogeneous Clusters: A
Resource-Utilization Approach," Proc. 13th ACM Int'l Conf. Supercomputing, ACM Press,
New York, 1999, pp. 253-259.

Michael T. Heath is the director of the Center for Simulation of Advanced Rockets at the
University of Illinois, Urbana-Champaign. He is also a professor in the Department of
Computer Science, the director of the Computational Science and Engineering Program,
and a senior research scientist at the National Center for Supercomputing Applications at
the university. His research interests are in numerical analysis, particularly numerical
linear algebra and optimization, and in parallel computing. He wrote Scientific Computing:
An Introductory Survey (McGraw-Hill, 1997) and has served as editor of several journals
in scientific and high-performance computing. He received a BA in mathematics from the
University of Kentucky, an MS in mathematics from the University of Tennessee, and a
PhD in computer science from Stanford University. Contact him at CSAR, 2262 Digital
Computer Lab., 1304 West Springfield Ave., Urbana, IL 61801; m-heath@uiuc.edu;
www.csar.uiuc.edu.


William A. Dick is the managing director of the Center for Simulation of Advanced Rockets
at the University of Illinois, Urbana-Champaign. He received a BS in mechanical
engineering from the University of Delaware and an MBA from the University of Illinois.
Prior to coming to the University of Illinois, he was a deputy director and composites
engineer in the National Science Foundation's Engineering Research Center for Composites
Manufacturing Science and Engineering. Contact him at CSAR, 2266 Digital Computer
Lab., 1304 West Springfield Ave., Urbana, IL 61801; w-dick@uiuc.edu; www.csar.uiuc.edu.
