
On Agile Performance

Requirements
Specification and Testing
Authors: C.W. Ho, M.J. Johnson, L. Williams, and E.M. Maximilien
From: AGILE 2006 Conference (AGILE’06)
Presented by: K.W. Lee, 17.01.2008

Outline
1 Introduction
2 Performance Requirements Evolution Model (PREM)
3 IBM Experience
4 Conclusion

1 Introduction (1/3)

• Performance:
– Difficult and expensive to fix in the later stages of the development process

• Performance engineering activities:
– Start with performance requirements (PRs) definition

1 Introduction (2/3)

• Problem:
– Agile methods are sometimes criticized for not having explicit practices for eliciting non-functional requirements (NFRs)

• Solution:
– Propose the Performance Requirements Evolution Model (PREM) for PR specification and validation

1 Introduction (3/3)

• About:
– An agile approach to the specification and testing of an important NFR: performance
– Focuses only on time-related performance issues
– The model is also applicable to space-related performance issues

2 Performance Requirements
Evolution Model (PREM) (1/8)
• PREM
– An evolutionary model for PR specification
– Specifies the details of the performance characteristics and the form of validation for the PR
– Development can start with quick, simple PRs and related performance test cases
– PRs and test cases can be made more detailed over time

2 Performance Requirements
Evolution Model (PREM) (2/8)
2.1 Model Description

2 Performance Requirements
Evolution Model (PREM) (3/8)
• Level 0
– Represents PRs with only qualitative, casual
descriptions
– Brings out the operations for which
performance matters in the eyes of the
customer
– “The authentication process shall complete
before the user loses his or her patience”

2 Performance Requirements
Evolution Model (PREM) (4/8)
• Level 1
– Associated with quantitative metrics
– “The authentication process shall complete
in 0.2 seconds”
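
A minimal sketch of how a Level 1 requirement like this could be written as a JUnit test, following the testing approach described later in the presentation. The AuthenticationService stub and its authenticate method are illustrative placeholders, not details from the paper:

import org.junit.Test;
import static org.junit.Assert.assertTrue;

public class AuthenticationPerformanceTest {

    // Hypothetical stand-in for the real authentication logic (not from the paper).
    static class AuthenticationService {
        boolean authenticate(String user, String password) {
            return user != null && password != null;   // placeholder work
        }
    }

    @Test
    public void authenticationCompletesWithin200Milliseconds() {
        AuthenticationService service = new AuthenticationService();

        long start = System.nanoTime();
        service.authenticate("user", "secret");
        long elapsedMillis = (System.nanoTime() - start) / 1_000_000;

        // Level 1 PR: "The authentication process shall complete in 0.2 seconds."
        assertTrue("Authentication took " + elapsedMillis + " ms", elapsedMillis <= 200);
    }
}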

2 Performance Requirements
Evolution Model (PREM) (5/8)

2 Performance Requirements
Evolution Model (PREM) (6/8)
• Level 1
– Insufficient on its own:
• Does not show how different processes interact with each other or how they respond to variations in the system workload -> system execution characteristics (SECs)
• SECs include:
– Information about which processes are in the system and how frequently each process executes

2 Performance Requirements
Evolution Model (PREM) (7/8)
• Level 2
– Quantitative performance objectives combined with the SECs of the system
– “On average, 20 requests arrive at the server per minute. The authentication process accounts for 10% of incoming requests. The authentication process shall complete in 0.2 seconds.”
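
A sketch of how such a Level 2 requirement could drive a workload-aware JUnit test. The request-handler stubs, the deterministic 10% request mix, and the compressed inter-arrival time are assumptions made for illustration, not details reported in the paper:

import org.junit.Test;
import static org.junit.Assert.assertTrue;

public class AuthenticationWorkloadTest {

    // Hypothetical stand-ins for the real request handlers (not from the paper).
    static void handleAuthentication() { /* authentication work would go here */ }
    static void handleOtherRequest()   { /* other request handling would go here */ }

    @Test
    public void authenticationMeets200msUnderLevel2Workload() throws InterruptedException {
        long slowestAuthMillis = 0;

        // SEC from the Level 2 PR: 20 requests per minute on average, with
        // authentication making up 10% of them (every 10th request here).
        // The ~3 s inter-arrival time is compressed to 50 ms to keep the test fast.
        for (int i = 0; i < 20; i++) {
            boolean isAuthentication = (i % 10 == 0);
            long start = System.nanoTime();
            if (isAuthentication) {
                handleAuthentication();
            } else {
                handleOtherRequest();
            }
            long elapsedMillis = (System.nanoTime() - start) / 1_000_000;
            if (isAuthentication) {
                slowestAuthMillis = Math.max(slowestAuthMillis, elapsedMillis);
            }
            Thread.sleep(50);
        }

        // Level 2 PR: the authentication process shall complete in 0.2 seconds.
        assertTrue("Slowest authentication: " + slowestAuthMillis + " ms",
                slowestAuthMillis <= 200);
    }
}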

2 Performance Requirements
Evolution Model (PREM) (8/8)
• Level 3
– Represents PRs with worst-case specifications
– “During the peak hours, 200 mobile shopping tablets
are in use. 8% of the shoppers are either being
served at the five checkout stations or are waiting in
lines. For the rest of the customers, the promotional
message shall display on the mobile tablet within 1
second after a customer enters a lane where the
promotional items are located.”

3 IBM Experience (1/3)

• Shows how IBM’s PRs are mapped to the PREM levels

3 IBM Experience (2/3)

3.1 Performance Requirements Specification


• IBM’s performance requirements were collected from three primary sources:
1. Customer feedback -> Level 0
2. Domain experts -> Level 1
3. A set of limiting factors
• E.g., internal USB hardware contributes to the latency from a command to the physical device’s reaction

3 IBM Experience (3/3)

3.2 Performance Testing


– Performance tests were designed using the JUnit framework
– Developers had early visibility into any severe problems caused by new code
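
As an illustration of this JUnit-based approach, a sketch of a performance test that could run with every build so regressions surface early. The issueCommand stub and the 100 ms latency budget are assumptions for illustration, not values reported by IBM:

import org.junit.Test;
import static org.junit.Assert.assertTrue;

public class DeviceCommandLatencyTest {

    // Hypothetical stand-in for issuing a command to the device (not from the paper).
    static void issueCommand() { /* device command would be sent here */ }

    @Test
    public void averageCommandLatencyStaysBelowBudget() {
        final int runs = 50;
        final long budgetMillis = 100;   // assumed latency budget, for illustration only
        long totalMillis = 0;

        for (int i = 0; i < runs; i++) {
            long start = System.nanoTime();
            issueCommand();
            totalMillis += (System.nanoTime() - start) / 1_000_000;
        }
        long averageMillis = totalMillis / runs;

        // Running this in every build gives developers early visibility into
        // performance regressions introduced by new code.
        assertTrue("Average latency " + averageMillis + " ms exceeds the budget",
                averageMillis <= budgetMillis);
    }
}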

4 Conclusion
• Using PREM as a guideline:
– A development team can identify and specify PRs incrementally
– Fits the “good enough” attitude of agile methods
– Helps developers write appropriate test cases for PRs
– Test-driven development (TDD) can be used to verify whether performance objectives have been met
