
Functional Testing of Digital Systems

Kwok-Woon Lai


Bell Laboratories
Murray Hill, New Jersey 07974
Daniel P. Siewiorek
Carnegie-Mellon University
Pittsburgh, Pennsylvania 15213

ABSTRACT

Functional testing is testing aimed at validating the correct operation of a digital system with respect to its functional specification. We have designed and implemented a practical test generation methodology that can generate tests directly from a system's high-level specification. Solutions adopted include multi-level fault models and multi-stage test generation. Tests generated from the methodology were compared against test programs supplied by a computer manufacturer and were found to detect more faults with much better efficiency. The experiment demonstrated that functional testing can be both practical and efficient. Automatic generation of design validation tests is now closer to reality.

* Research conducted while the principal author was a graduate student at Carnegie-Mellon University. This research was sponsored by the Defense Advanced Research Projects Agency (DOD) and monitored by the Air Force Avionics Laboratory under contract F33615-78-C-1551. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the Defense Advanced Research Projects Agency or the U.S. Government.

1. Introduction

Functional testing is testing aimed at validating the correct operation of a digital system with respect to its functional specification. Conventional testing methodologies, by contrast, generate tests solely based on the physical structure (e.g. interconnections, components used) of a system. Their aim is to diagnose hardware failures and they offer little assurance about whether a system has been designed correctly in the first place.

1.1 Design Faults

Design faults are non-physical faults caused by imperfections introduced into a system during its design stage. Our original motivation for research into automatic test generation (ATG) at the functional level was the problem of architecture validation - validating that a prototype correctly implements the instruction set of a specified computer architecture. A second motivation is the high cost of design errors in VLSI circuits, where field engineering changes are no longer possible.

Currently, the functional correctness of most new systems is checked by ad hoc methods. Test programmers are often given the tedious task of writing validation programs with vague guidelines such as "exercise each instruction at least once" and left to their own ingenuity. For a new member of an existing computer family, these programs may be supplemented by operating systems and application programs which have not been written with testing in mind.

It is hardly surprising that such ad hoc approaches often do not give satisfactory quality assurance. Good evidence of the inadequacy of present testing techniques is the expensive design errors discovered in the field for many computer families. A functional testing methodology is very much needed.

1.2 Physical Fault Detection

The usefulness of functional testing, however, goes beyond design validation. It may even be able to supplant traditional circuit testing in detecting physical faults. With the number of components per chip doubling every one to two years, traditional test generation methods are apparently coming to a dead end. Many ATG systems based on circuit testing techniques have become unusable due to combinatorial explosions in their computer run times. When users generate tests for their systems piecemeal, they find that the complexity has lowered fault coverage while requiring ever larger numbers of test vectors.

This growing complexity can be made more manageable by specifying a system at the functional level in a hierarchical manner and generating tests at this level using efficient heuristics. Functional tests may actually be more efficient than tests generated from a circuit diagram using traditional approaches. Whereas implementation-specific tests are more thorough in detecting individual physical faults, functional tests exercise a system at a higher level, perhaps activating many physically distant components simultaneously, and can potentially detect more faults in shorter time than tests that focus on one spot of the hardware at a time. As in many other areas of computer science, global efficiency may prove to be far more important than local efficiency.

1.3 Advantages of Functional Testing

Automatic test generation at the functional level offers the following advantages over ad hoc approaches:

• Better quality - tests would be generated according to scientifically established fault models rather than the personal judgment of test programmers. The tests should be much more thorough.

• Automation - test generation time and cost can be drastically reduced, freeing skilled manpower. Tests can
be generated as soon as the functional specification of the system becomes available, even before logic design is underway, and keep up with any changes. Such an automatic bridge between specification and implementation is extremely useful in development and debugging and can significantly shorten the development cycle. Project managers can count on tests of guaranteed quality being generated on schedule, rather than living in constant fear of unsatisfactory tests and schedule delays.

1.4 Review of Literature

Akers [1] has proposed the use of binary decision diagrams in generating "experiments" for simple digital devices such as flip-flops and gates. His work is among the first attempts to generate tests based on formal functional descriptions other than boolean equations and flow tables. Binary decision diagrams, however, are not adequate for the description of more complicated digital systems. Breuer and Friedman [2] have extended traditional circuit testing approaches to handle "functional level primitives" such as adders and shift registers. Their main concern is hardware fault diagnosis and tests are still generated from the circuit diagram of the hardware to be tested. Because the structure of their algorithms is the same as in the original methods, their extension inherits circuit testing's problem of combinatorial explosion. Thatte and Abraham [3] have proposed a methodology for testing microprocessors. Their chief concern is again hardware fault detection. Their model describes data transfer among registers and the main memory but does not include control and data transformation functions. Based on this model they have proposed a number of test generation algorithms. Other researchers interested in hardware fault detection have also looked into the problem of ATG for microprocessors. In the interest of space, their work will not be discussed here.

2. Methodology

Our methodology is an ambitious one. It generates tests directly from the functional specification of a digital system using multi-level fault models and multi-stage test generation. Major components of the methodology are shown in Figure 1. The functional analyzer is the heart of the test generation system. Its inputs are the digital system's graph description and the graph level fault models chosen by the user. For each fault model and each graph primitive covered by the model, it generates a parameterized test that detects the fault at that primitive. The test case synthesizer then substitutes the formal parameters by bit patterns obtained from the test pattern database. These test vectors can then be mapped by the test program synthesizer into test program segments that would actually carry out the tests.

[Figure 1. Components of the Methodology - a functional specification (graph description) and graph level fault models feed the functional analyzer, which produces parameterized tests; the test case synthesizer combines these with primitive level fault models and the test pattern database to produce test cases, which the test program synthesizer turns into a test program.]
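As an illustration of the three-stage flow just described, the following Python sketch walks a toy graph through the pipeline. All names and data structures here are hypothetical assumptions, not the authors' implementation (which, as noted in Section 3, was written in LISP); the sketch only mirrors the hand-offs between the functional analyzer, the test case synthesizer, and the test program synthesizer.

```python
# A minimal, self-contained sketch of the three-stage flow described above.
# All names and data structures are illustrative assumptions.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ParameterizedTest:
    primitive: str         # graph primitive the test targets
    parameters: List[str]  # formal parameter names, e.g. ["P0"]
    apply_at: str          # where test data is applied
    observe_at: str        # where the test result is observed

def functional_analyzer(primitives: List[str]) -> List[ParameterizedTest]:
    """Stage 1: one parameterized test per covered primitive (toy version)."""
    return [ParameterizedTest(p, ["P0"], f"{p}.input", f"{p}.output")
            for p in primitives]

def test_case_synthesizer(tests: List[ParameterizedTest],
                          patterns: List[int]) -> List[dict]:
    """Stage 2: substitute formal parameters with concrete bit patterns."""
    return [{"primitive": t.primitive,
             "apply": {t.apply_at: bits},
             "expect": {t.observe_at: bits}}  # toy: primitive assumed transparent
            for t in tests for bits in patterns]

def test_program_synthesizer(cases: List[dict],
                             template: Callable[[dict], str]) -> List[str]:
    """Stage 3: map each test case onto a test program segment."""
    return [template(c) for c in cases]

if __name__ == "__main__":
    tests = functional_analyzer(["adder", "shifter"])
    cases = test_case_synthesizer(tests, patterns=[0o0000, 0o7777])
    program = test_program_synthesizer(
        cases, lambda c: f"APPLY {c['apply']}; RUN; CHECK {c['expect']}")
    print("\n".join(program))
```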
Any practical functional testing methodology must necessarily be a compromise between generality and practicality. Functional approaches that assume nothing about how a system is implemented would require astronomical numbers of tests to cover every possibility. For test generation to be feasible, one has to make assumptions about the implementation. Since the choice of fault models directly determines the cost of test generation and the effectiveness of the tests generated, our approach is to let users decide for themselves - they can choose predefined fault models or define their own.

2.1 Graph Description Language

A functional specification language suitable for test generation was needed, so we designed an extensible graph description language called state transformation graphs (STG). STG can describe digital systems at levels of detail ranging from user-defined primitives of arbitrary complexity down to logic gates. While an STG graph serves as a system's functional specification, it also facilitates the expression of structural assumptions about the system which are vital for practical test generation. This is a tradeoff between generality and practicality. By allowing the user to include as few implementation assumptions as is practical, as much generality as possible in the tests generated is preserved.

A user interested in physical fault detection is free to construct a description whose nodes and paths have a one to one correspondence with physical components in the system. With such a description and appropriate fault models, tests generated by our methodology would actually be identical to those generated by a traditional method from the system's circuit diagrams. In other words, our methodology actually contains traditional circuit testing as a special case. We should emphasize, however, that STG is intended to be a functional specification language used by designers to specify their systems.

Associated with each STG graph is a set of state variables which can be read and written by read/write operators; together they represent a finite state machine. Even though STG is intended to be a functional specification language used by designers, it can also describe arbitrary
combinational and sequential circuits. An STG graph is a directed graph in which paths represent (logical) data/control paths while nodes represent data transformation operators. Data is passed along in the form of tokens, with at most one token per path. A node can fire when it has the required tokens on its input paths. The firing of a node results in the removal of its input tokens and the placement of new tokens on its outputs. Every graph has two special nodes: begin and end. At the beginning of each cycle, a token is fired from the begin node, which subsequently activates other nodes depending on the current state. Eventually a token arrives at the end node, signalling the end of the cycle. Some basic STG primitives are shown in Figure 2. As an example, the top level graph description of the PDP-8 minicomputer is given in Figure 3 and the graphs for two of its instructions, AND (logical and) and JMS (jump to subroutine), are shown in Figure 4. A detailed definition of STG can be found in [5].

[Figure 2. Some STG Primitives - data operator, decider, write cell, read cell, read memory, write memory, junction, and data flow gates.]

[Figure 3. STG Graph of the PDP-8: Instruction Decoding - the opcode is decoded through a demultiplexer that routes control to the individual instruction execution graphs.]

[Figure 4. PDP-8 Instructions: AND, JMS]
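To make the token-flow semantics concrete, here is a minimal Python sketch of one execution cycle, assuming a simplified model in which a node fires as soon as all of its input paths hold tokens. The node set and helper names are invented for illustration; real STG descriptions also involve state variables, read/write operators, constraints, and the begin/end convention described above.

```python
# Small sketch of the token-flow model: nodes fire when all input paths hold
# tokens, consume them, and place results on their output paths.
class Node:
    def __init__(self, name, inputs, outputs, fn):
        self.name, self.inputs, self.outputs, self.fn = name, inputs, outputs, fn

def run_cycle(nodes, tokens):
    """tokens: dict mapping path name -> value (at most one token per path)."""
    fired = True
    while fired:
        fired = False
        for node in nodes:
            if all(p in tokens for p in node.inputs):
                args = [tokens.pop(p) for p in node.inputs]   # consume inputs
                for path, value in zip(node.outputs, node.fn(*args)):
                    tokens[path] = value                      # produce outputs
                fired = True
    return tokens

# Toy graph: begin -> increment -> end (stands in for PC <- PC + 1).
nodes = [Node("inc", ["begin"], ["end"], lambda pc: ((pc + 1) % 4096,))]
print(run_cycle(nodes, {"begin": 0o177}))   # {'end': 128}
```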
2.2 Graph Level Fault Models

Graph level fault models can be viewed as "macro" fault models that model faults at the graph level and are therefore functional fault models. Each model specifies the kind of graph primitives covered by the model and, given such a primitive, where the test data should be applied and where the test results can be found.

The single-path model models faults in the data paths of a graph description with the assumption that at most one data path can be faulty at any one time (note that a path is a logical entity and may not correspond to any physical wires). It covers any fault that affects a path's integrity. A path can be tested by applying test data to its input and examining the result at its output. During test generation, symbols representing test data and test results are propagated backward and forward respectively. The model also specifies fault dominance and fault equivalence relationships among the inputs and outputs of each node type. Similar to the single-path model, the single-node model models faults in the nodes of a graph. This model specifies that test parameters are to be applied to all inputs of a node, with the node's outputs then carrying the test results to be inspected.

The double-path model assumes that up to two paths may be faulty at the same time. For each pair of paths selected for test generation, a test parameter is applied to the beginning of each path with the corresponding test result coming out at the end of the path. Both parameters are backward propagated while both results are forward propagated. The
double-path model is useful for detecting interference between two data paths. The number of tests required is proportional to n^2, where n is the number of paths in the description. One can go even further and allow any number of paths to be faulty at the same time. The number of tests to be generated then becomes O(2^n). A graph level fault model can specify any combination of paths and nodes - the possibilities are endless.

The more parameters and results to propagate, the higher the probability that a test either does not exist or cannot be generated by the heuristics. Practical experience in other fields of testing has shown that multiple faults which cannot be detected by tests generated for single faults occur infrequently. With costs much higher than those of the single-fault models, the potential benefit of multiple-fault models is probably marginal.
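The growth rates quoted above are easy to see in a small sketch that enumerates the test targets a graph level fault model would hand to the functional analyzer. The function names are illustrative assumptions; a real model also specifies, for each target, where test data is applied and where results are observed.

```python
# Sketch: how a graph-level fault model turns a graph into a list of test targets.
from itertools import combinations

def single_path_targets(paths):
    # one target per path: at most one path faulty at a time
    return [(p,) for p in paths]

def double_path_targets(paths):
    # one target per unordered pair of paths: up to two faulty at once,
    # so the number of targets grows roughly as n^2 / 2
    return list(combinations(paths, 2))

paths = [f"path{i}" for i in range(235)]   # the PDP-8 graph in Section 3 has 235 paths
print(len(single_path_targets(paths)))     # 235
print(len(double_path_targets(paths)))     # 27495 = 235 * 234 / 2
```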
2.3 Primitive Level Fault Models

Primitive level fault models model faults at the level of individual graph primitives. At this level, assumptions about the implementation are inevitable unless exhaustive or random testing is to be used. For each likely implementation of the same functional primitive, a different model may be developed and entered into the test pattern database. A user can choose a suitable fault model or simply use all available models to cover as wide a range of implementations as possible. This is a tradeoff between cost and generality.

If a model assumes a specific circuit realization, traditional circuit testing methods can be called upon to generate test patterns based on that realization. If the graph primitive is in turn defined in terms of lower-level primitives, the functional test generation system can be called upon recursively to generate tests for the higher-level primitive. This is made possible by the extensible graph language. In each case, the patterns generated are entered into the test pattern database under the name of the primitive, along with the implementations assumed and the number of test vectors required.

Example: The single-path model makes no assumption about what the actual fault in the data path may be. The task of selecting test data from the 2^w possibilities (where w is the width of the data path) is left to the primitive level fault model. To test for single-bit stuck-at faults in the data path, for example, it is only necessary to use a pair of test vectors that turn each bit on and off at least once, e.g. 00..00 and 11..11, or 01..01 and 10..10. To test for shorts between adjacent bits, however, would require more than two test vectors.
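A primitive level model of the kind used in this example can be captured in a few lines. The sketch below generates the complementary pair for single-bit stuck-at faults on a w-bit path, plus the alternating pair mentioned for adjacent-bit shorts; the function names are illustrative, and the exact pattern set needed for shorts depends on the assumed layout.

```python
# Sketch of a primitive-level pattern generator for the example above,
# assuming a plain w-bit data path.

def stuck_at_patterns(w):
    # the complementary pair 00..00 and 11..11 drives every bit to 0 and to 1
    return [0, (1 << w) - 1]

def adjacent_short_patterns(w):
    # shorts between adjacent bits additionally need patterns in which
    # neighbouring bits differ: the alternating pair 01..01 and 10..10
    alt = sum(1 << i for i in range(0, w, 2))
    return stuck_at_patterns(w) + [alt, ((1 << w) - 1) ^ alt]

print([f"{p:012b}" for p in adjacent_short_patterns(12)])
# ['000000000000', '111111111111', '010101010101', '101010101010']
```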
2.4 Functional Analyzer

The functional analyzer first consults the graph level fault model chosen to select a set of primitives. For each case, test generation is performed in four sequential stages (Figure 5): parameter introduction, backward propagation, forward propagation, and justification. Each stage has its own objective, and conditions are imposed on the graph as necessary to meet that objective. By successively constraining the graph, a parameterized test which detects the fault is generated at the end.

Each stage is further subdivided into test generation routines that work on one primitive at a time. These routines are called recursively until success. If a step fails, the state of the graph is rolled back to the point where a choice was last made and the next choice is attempted. Localized heuristics which only consider a primitive's immediate neighbors in making a decision are used whenever possible, so the number of primitives considered at each step is not affected by the size of the graph, although the number of steps needed in each stage may grow proportionally with the number of paths in a graph. At present, only loop-free graph descriptions can be handled by the functional analyzer.

[Figure 5. Operation of the Functional Analyzer - the graph description and graph-level fault models drive parameter introduction, backward propagation, forward propagation, and justification, drawing on the graph primitive database to produce parameterized tests.]
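The recursive, roll-back-on-failure control structure just described can be sketched as ordinary chronological backtracking. In the version below the evolving graph state is reduced to a plain dictionary and the heuristics to functions that propose choices; both are simplified stand-ins for the real routines, and the toy usage at the end is invented purely to exercise the sketch.

```python
# Sketch of the backtracking control structure described above.
import copy

def run_steps(state, steps):
    """steps: list of functions; each returns candidate choices for one step.
    A choice is a callable that updates the state and returns False if the
    step's consistency check fails."""
    if not steps:
        return state                         # every step satisfied
    first, rest = steps[0], steps[1:]
    for choice in first(state):
        trial = copy.deepcopy(state)         # checkpoint for rollback
        if choice(trial):
            result = run_steps(trial, rest)  # recurse on the remaining steps
            if result is not None:
                return result
        # otherwise: discard `trial` (roll back) and try the next choice
    return None                              # no choice worked -> this step fails

# Toy usage: find an assignment with x even and x + y == 10.
def choose_x(state):
    return [lambda s, v=v: s.update(x=v) or True for v in range(4)]

def choose_y(state):
    return [lambda s, v=v: (s["x"] % 2 == 0 and s["x"] + v == 10
                            and (s.update(y=v) or True))
            for v in range(11)]

print(run_steps({}, [choose_x, choose_y]))   # {'x': 0, 'y': 10}
```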
A constraint is a predicate function associated with a path. A path can only carry tokens that satisfy the constraint. A path can have any number of constraints as long as they are mutually consistent. Non-overlapping bit fields within a path can each have their own set of constraints. Cells in the machine state and symbols created during test generation may also have constraints. Symbols are used to represent test data, test results, bit fields, and a number of other things. Constraints involving symbols and their manipulation play a central role in the test generation process. During test generation, constraints are added to paths, cells, and symbols to create the conditions needed to generate a successful test. Constraint resolution is the process of determining whether a set of constraints is consistent and then possibly simplifying the set. Constraints are usually added one at a time to a set of constraints that has already been determined to be consistent. Constraint resolution can be an extremely involved problem if arbitrary predicate functions are to be considered. Fortunately, the descriptions of most computers require only simple predicate functions such as =, ≠, <, ≤, >, ≥. As a result, most constraints that arise in practice are very simple and can be resolved using relatively simple algorithms.
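For the simple predicate forms listed above, consistency checking can be done very directly. The brute-force sketch below intersects the satisfying value sets of the constraints on a single symbol; it is only meant to show why resolution stays cheap when the predicates are this simple, and it is not the symbolic resolver used in the actual system.

```python
# Brute-force sketch of constraint resolution for one symbol over a w-bit field.

def resolve(constraints, width=12):
    """constraints: list of (op, constant) pairs on a single width-bit symbol.
    Returns the set of satisfying values; an empty set means inconsistent."""
    ops = {"==": lambda x, c: x == c, "!=": lambda x, c: x != c,
           "<":  lambda x, c: x < c,  "<=": lambda x, c: x <= c,
           ">":  lambda x, c: x > c,  ">=": lambda x, c: x >= c}
    values = set(range(1 << width))
    for op, c in constraints:                 # constraints added one at a time
        values = {x for x in values if ops[op](x, c)}
    return values

print(len(resolve([(">", 0), ("<=", 7), ("!=", 4)])))   # 6 consistent values
print(resolve([("==", 3), (">", 5)]))                   # set(): inconsistent
```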
Parameter Introduction - this phase introduces symbols representing test data parameters and test results into the graph and sets the stage for subsequent phases. The graph level fault model selected is consulted to identify the next set of primitives for which a test is to be generated. Symbols representing the required parameters and results are then added to the graph.

Implication - each time a path receives a new constraint, it may have implications on other paths connected to the same nodes. At each test generation step, paths that have received new constraints are noted and the implication routine is called at the end of the step to check if any of the changes results in a contradiction, indicating failure of that step.

2.4.1 Backward Propagation

For test data to be applied to the required places, the machine state prior to the test cycle must contain the test data or some transformation of them. The objective of the backward propagation phase is to set up the conditions necessary for the application of test data parameters to the required places. This is done by backward propagating each parameter to a read node in the graph. Then, by initializing the machine state accordingly prior to the test cycle, it is guaranteed that the parameters will be applied to the correct spots.

Each step within the backward propagation phase considers only one path at a time. It tries to impose the necessary conditions on the inputs of the path's source node so that the current value of the path is assured. Since the value to be backward propagated is almost always a symbolic parameter or a symbolic expression, a one to one mapping of that value must appear somewhere on the inputs of the source node. This concept is formalized in the following definition.

Definition: A constant vector

    (c_0, c_1, ..., c_{i-1}, c_{i+1}, ..., c_{n-1})

is a backward propagation vector of the n-input function F iff

    F(c_0, ..., c_{i-1}, g(x), c_{i+1}, ..., c_{n-1}) = x

for all x in the output domain of F and iff g^{-1} exists. The function g is called the transformation function of the vector.

By imposing a backward propagation vector on the inputs of the source node, the objective of the current backward propagation step is satisfied. The backward propagation routine is then recursively called for the path connected to the i-th input of the node F. The process is successful if a read node is eventually reached.
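As a concrete (and hypothetical) instance of the definition, consider a two-input adder node working modulo 2^w: holding the other input at the constant 0 is a backward propagation vector with g the identity, and holding it at any constant c works with g(x) = x - c. The sketch below checks the defining property exhaustively for a 4-bit adder; the node and width are assumptions made only for illustration.

```python
# Checking the defining property F(g(x), const) = x for a toy 4-bit adder node.
W = 4
MASK = (1 << W) - 1

def adder(a, b):                    # the node function F
    return (a + b) & MASK

def is_backward_prop_vector(F, const, g=lambda x: x):
    # defining property: F(g(x), const) == x for every x in the output domain
    return all(F(g(x), const) == x for x in range(1 << W))

print(is_backward_prop_vector(adder, 0))                              # True
print(is_backward_prop_vector(adder, 5))                              # False (g = identity)
print(is_backward_prop_vector(adder, 5, g=lambda x: (x - 5) & MASK))  # True
```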

2.4.2 Forward Propagation

Test results must be saved in the machine state at the end of the test cycle so that they can be checked for correctness in subsequent cycles. They must therefore appear in either the contents or the addresses of the new machine state. The objective of the forward propagation phase is to bring these results to "write" nodes in the graph to ensure their observability after the test cycle.

Forward propagation is similar to backward propagation in operation except for its forward direction. The concept of propagation vectors can similarly be defined.

Definition: A constant vector

    (c_0, c_1, ..., c_{i-1}, c_{i+1}, ..., c_{n-1})

is a forward propagation vector for the i-th input of the n-input function F iff

    F(c_0, c_1, ..., c_{i-1}, x, c_{i+1}, ..., c_{n-1}) = g(x)

for all x in the i-th input domain of F and iff g^{-1} exists. The function g is called the transformation function of the vector.

Propagation is successful if the value is passed along to the output with no loss of information. For a multiple-output node represented by the set of functions {F_0, F_1, ..., F_{m-1}}, where m is the number of outputs, the above definition can be extended: a vector is a propagation vector for the i-th input if it can propagate the value at the i-th input to any one of the m outputs.

[Figure 6. Flow Chart for Forward Propagation - for each node connected to the path, propagation vectors for the node are obtained from the database and tried in turn; a vector that passes the implication check completes the propagation, and propagation fails when no nodes or vectors remain.]

Propagation vectors for a function can usually be determined simply by inspection. The most obvious and simple ones can be entered into the graph primitive database, which contains all definitions and heuristics associated with each type of graph primitive. Figure 6 shows a flow chart of the forward propagation phase. The backward propagation phase works in essentially the same manner.
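The flow of Figure 6 can be sketched as a lookup-and-try loop over a small propagation vector table. The node types, the 4-bit constants, and the table layout below are assumptions made for illustration; the real graph primitive database stores such entries per primitive type along with its other definitions and heuristics.

```python
# Sketch of the Figure 6 loop: try the propagation vectors stored for a node
# type until one is consistent with what earlier steps have already forced
# onto the node's other input.

PROP_VECTORS = {                      # node type -> [(other-input constant, g)]
    "and": [(0b1111, lambda x: x)],                  # AND with all-ones: g = identity
    "xor": [(0b0000, lambda x: x),                   # XOR with zeros: g = identity
            (0b1111, lambda x: x ^ 0b1111)],         # XOR with ones: g inverts the bits
}

def forward_propagate(node_type, forced_other_input):
    """forced_other_input: constant already required on the other input by an
    earlier step, or None if that input is still free."""
    for const, g in PROP_VECTORS.get(node_type, []):
        if forced_other_input in (None, const):      # simplified implication check
            return const, g
    return None                                      # propagation failed

for node, forced in [("xor", None), ("xor", 0b1111), ("and", 0b0000)]:
    result = forward_propagate(node, forced)
    print(node, "->", "failed" if result is None else f"constant {result[0]:04b}")
```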
Justification - During the course of backward and forward propagation, when necessary conditions for the test generation to be successful are assigned, many paths are often left unspecified. Justification fills in the unspecified values so that a parameterized test can actually be generated.
The test case synthesizer brings together parameterized tests produced by the functional analyzer and test patterns that have been developed through primitive-level fault models. It substitutes the formal parameters by actual test data and expected results obtained from the test pattern database. The user can specify the characteristics and cost constraints he wants to impose. Only patterns meeting the requirements will be chosen from the database.
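A simplified sketch of this substitution step is shown below. The database layout, the primitive name, and the cost criterion (a cap on the number of vectors) are all invented, and expected results are omitted; the point is only that each parameterized test is instantiated with database patterns that meet the user's requirements.

```python
# Sketch of the test case synthesizer's substitution step (illustrative names).

def synthesize_cases(parameterized_test, pattern_db, max_vectors=4):
    name = parameterized_test["primitive"]
    acceptable = [entry for entry in pattern_db.get(name, [])
                  if len(entry["vectors"]) <= max_vectors]    # cost constraint
    return [{"primitive": name,
             "bindings": dict(zip(parameterized_test["parameters"], vec))}
            for entry in acceptable for vec in entry["vectors"]]

pattern_db = {"data_path_12": [
    {"fault_model": "single stuck-at",
     "vectors": [(0o0000,), (0o7777,)]},
    {"fault_model": "adjacent short",
     "vectors": [(0o0000,), (0o7777,), (0o2525,), (0o5252,)]},
]}
test = {"primitive": "data_path_12", "parameters": ["P0"]}
print(len(synthesize_cases(test, pattern_db)))   # 6 test cases
```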
Test Program Synthesizer - If the system being tested is a computer, its instructions can usually be grouped by their operand fetching modes, and members of the same group can usually be tested with a similar sequence of instructions. Test program templates can be developed for each group. For each actual test case given, the synthesizer selects the applicable templates and tries to fill in the necessary "blanks" whenever possible, producing ready-to-run program segments and automating the test programming process.
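The template idea can be illustrated with a single, invented PDP-8-style template for one instruction group; the synthesizer's job is simply to pick the template matching the test case's group and fill in its blanks. The assembly sequence and label names below are hypothetical and are not taken from the programs used in the experiment.

```python
# Illustration of template-based program synthesis (all content invented).

TEMPLATES = {
    "memory_reference": (
        "    CLA\n"
        "    TAD PATTRN      / load test pattern {pattern:04o}\n"
        "    {opcode} OPERND / instruction under test\n"
        "    CIA             / negate the result\n"
        "    TAD EXPECT      / add the expected value\n"
        "    SZA CLA         / zero means the result matched\n"
        "    JMP FAIL        / otherwise record a failure\n"),
}

def synthesize_program(test_case):
    template = TEMPLATES[test_case["group"]]   # pick the template for the group
    return template.format(**test_case)        # fill in the blanks

print(synthesize_program({"group": "memory_reference",
                          "opcode": "AND", "pattern": 0o2525}))
```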

2.5 Design Considerations

Separation of Issues - a major goal of the research is to untangle the issues involved in functional testing, dividing them into smaller problems that can be attacked one at a time. This enables better focusing on the individual issues, makes the functional testing problem as a whole more manageable, and permits the development of more efficient solutions. This is done throughout the methodology whenever possible. The solutions adopted include:

1. Multi-level fault models - fault models are divided into graph and primitive levels.

2. Multi-stage test generation - tests are generated in two stages. Parameterized tests can be repeatedly used by the test case synthesizer to generate tests for different requirements.

Modularity and Flexibility - the methodology is designed to facilitate evolutionary changes and incremental improvement in every one of its building blocks. The following solutions were adopted:

• Test generation is performed in stages even within the functional analyzer; a heuristic can be "unplugged" and replaced by a new one with ease.

• Whenever possible, information is grouped into databases that can be easily modified by the user. The databases include all the fault models, the test pattern database, test program templates, and the definitions of the primitives used in graph descriptions.

User Control - the user is given maximum control over the whole test generation process because each application has its own fault distribution and cost considerations, and it is best to let the user specify the pertinent information rather than forcing unrealistic assumptions upon the user. A user can exert control over the test generation process by:

1. Selecting the level of detail at which the digital system or parts of the system is specified.

2. Defining new graph primitives.

3. Selecting or defining fault models.

4. Imposing cost and other restrictions during test generation.

3. Experiment

A fault simulation experiment utilizing ISPS [4], a register-transfer level hardware description language, was conducted to evaluate the effectiveness of our methodology. Tests were generated for the DEC PDP-8 minicomputer and compared against test programs supplied by the computer's manufacturer. An ISPS description of the PDP-8 was simulated with single stuck-at faults in all the data paths represented in the description, for a total of 1,438 faults. Design faults were not chosen because of their much larger fault space and because it is impossible to select a subset without using considerable subjective judgment.

On the other hand, single stuck-at faults have long been accepted as the starting point of many testing strategies at both high and low levels. They are the closest thing to a quantitative yardstick by which tests generated by various methodologies are measured. It is generally accepted that a test that detects a higher percentage of single stuck-at faults than another test is probably the better test, even though many of the faults that actually occur in practice are not single stuck-at faults. Single stuck-at faults were therefore selected for our experiment as the most objective and practical measure available. The design of the experiment is shown in Figure 7.

[Figure 7. Design of Experiments - the ISPS description of the PDP-8 is compiled and fault generation routines build the fault database; the ISPS simulator then runs the test programs against the faulted descriptions to produce the experimental results.]

An STG graph of the PDP-8 was developed based on the computer's handbook and has 158 nodes and 235 paths. The graph was run through our system to produce parameterized tests using the single-path fault model. The heuristics succeeded in generating tests for 97% of the paths selected by the model. The test patterns used in test case synthesis were simple: each bit of every parameter within a parameterized test needed to be turned on and off at least once. The test cases were given to a test programmer who semi-mechanically translated them into PDP-8 assembly code (a test program synthesizer was not written for the PDP-8). The test programming effort was straightforward and no in-depth understanding of the system
being tested was required at all. The same translation could have been performed automatically by simple test program synthesis techniques.

The test programs supplied by DEC exercise the machine at a functional level and halt whenever an error is discovered, pinpointing the instruction at fault. Our functional tests actually provide better diagnostic resolution since each test is designed to detect faults in a functional primitive.

Results of the experiment are summarized in Table 1. Our tests outperformed the manufacturer's programs by a substantial margin - we detected 98.5% of the stuck-at faults compared to 95.5%. Most surprisingly, tests produced by our methodology achieved the higher fault detection rate with far fewer instructions.

Test Program      Instructions Executed    Faults Detected
Manufacturer's    > 10,000                 95.5%
Ours              731                      98.5%

Table 1. Results of the Experiment
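For completeness, here is a sketch of how coverage figures of the kind reported in Table 1 are computed from a fault simulation run: a fault counts as detected when the faulted simulation produces results that differ from the fault-free reference. The miniature "machine" below is a stand-in invented for illustration, not the ISPS tool chain used in the experiment.

```python
# Sketch of fault coverage computation over a set of injected faults.

def fault_coverage(simulate, program, faults):
    good = simulate(program, fault=None)            # fault-free reference run
    detected = sum(1 for f in faults if simulate(program, fault=f) != good)
    return 100.0 * detected / len(faults)

def simulate(program, fault):
    """Toy machine: AND the program's patterns into a 12-bit accumulator,
    optionally with a stuck-at-0 fault on one accumulator bit."""
    acc = 0o7777
    for pattern in program:
        acc &= pattern
    return acc if fault is None else acc & ~(1 << fault)

program = [0o7777, 0o5252]
faults = list(range(12))                            # stuck-at-0 on each bit
print(f"{fault_coverage(simulate, program, faults):.1f}%")   # 50.0%
```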

The first version of the test generation system had over 5,000 lines of LISP code and required 600K bytes to run. Only the graph description language and the functional analyzer were fully implemented in the first version. Test generation for the PDP-8 took only 21 minutes of CPU time running with interpreted LISP code on a DEC-2060. Compilation of the LISP code alone can speed things up about five times.

4. Conclusions

We have taken a practical approach towards the problem of functional testing at the system level. We laid down the groundwork for a systematic assault on the problem and also provided a framework for dividing the overall problem into smaller, more manageable problems. An ambitious functional testing methodology which generates tests directly from the functional specification of a digital system has been designed, implemented, and evaluated. Automatic generation of design validation tests is now closer to reality. Test cases that would have taken test programmers man-months and hard-earned experience to develop can now be generated automatically in a matter of minutes.

Although the scope of the evaluation experiment is limited, its results are very promising. The quality and the efficiency of the tests generated further underscore the promise of functional testing. On the other hand, there are still many things that we cannot understand until more practical experience is gained. Large scale experiments are currently being planned. Improvement of the system will continue as we continue to learn.¹

¹ The interested reader will find a more detailed discussion of our results in my thesis [5].

References

1. S.B. Akers, "Functional Testing with Binary Decision Diagrams", in Proc. 8th Int'l Conference on Fault-Tolerant Computing (FTCS-8), pp. 75-82, IEEE Computer Society, June 1978.

2. M.A. Breuer and A.D. Friedman, "Functional Level Primitives in Test Generation", IEEE Transactions on Computers C-29(3), pp. 223-235, March 1980.

3. S.M. Thatte and J.A. Abraham, "Test Generation for Microprocessors", IEEE Transactions on Computers C-29(6), June 1980.

4. M.R. Barbacci, G.E. Barnes, R.G. Cattell, and D.P. Siewiorek, "Symbolic Manipulation of Computer Descriptions: The ISPS Computer Description Language", CS Dept. TR CMU-CS-79-137, Carnegie-Mellon University, August 1979.

5. Kwok-Woon Lai, "Functional Testing of Digital Systems", PhD thesis, Computer Science Dept. TR CMU-CS-81-148, Carnegie-Mellon University, Dec. 1981.
