Response time
Workload
When to run performance tests
• Start of project
– Detect bad technology choices early on
– Ensure end-to-end testability
– Uncover performance requirements
• At regular intervals during development
– Detect performance problems as they arise, while there's still time to fix them
– Performance smoke tests as part of automated regression testing / continuous integration
• System and acceptance testing
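The smoke-test idea above can be sketched as an ordinary unit test with a response-time budget, so it runs in every CI build. The operation under test and the 0.5 s budget below are illustrative placeholders, not values from the slides:

```python
import time

RESPONSE_TIME_BUDGET = 0.5  # seconds; adjust to your own requirements

def operation_under_test():
    """Stand-in for a call against the deployed system."""
    time.sleep(0.01)

def test_smoke_response_time():
    # Time a single representative operation and fail the build
    # if it exceeds the budget.
    start = time.perf_counter()
    operation_under_test()
    elapsed = time.perf_counter() - start
    assert elapsed < RESPONSE_TIME_BUDGET, (
        f"response took {elapsed:.3f}s, budget is {RESPONSE_TIME_BUDGET}s")
```

A single-shot check like this will not find throughput problems, but it catches gross regressions early and cheaply.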
Preparing performance tests
• Plan ahead
• Obtain requirements
• Design for testability
• Obtain test data
• Set up test environment
• Choose testing tools
• Create test configuration
Plan ahead
• Part of contract?
Obtain requirements
• Generate from logs
• Ask users / business analysts
• Run functional tests, monitor system
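Generating a workload profile from logs can be sketched as below. This assumes the Apache/nginx "combined" access-log format; the sample lines are invented for illustration:

```python
from collections import Counter

sample_log = [
    '10.0.0.1 - - [10/Oct/2006:13:55:36 +0200] "GET /index.html HTTP/1.1" 200 2326',
    '10.0.0.2 - - [10/Oct/2006:13:55:36 +0200] "GET /search HTTP/1.1" 200 512',
    '10.0.0.1 - - [10/Oct/2006:13:55:37 +0200] "POST /login HTTP/1.1" 302 0',
]

def requests_per_second(lines):
    """Count requests per timestamp (second resolution)."""
    seconds = Counter()
    for line in lines:
        # The timestamp sits between '[' and ']' in the combined format.
        ts = line.split('[', 1)[1].split(']', 1)[0]
        seconds[ts] += 1
    return seconds

profile = requests_per_second(sample_log)
peak = max(profile.values())  # busiest second in the log
```

The peak requests-per-second figure, together with the URL mix, is exactly the kind of number a load-test scenario needs as input.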
Choose testing tools
• Commercial products
– Good, may be expensive (NOK 100,000+)
• Open source offerings
– Many options, varying maturity
• Rolling your own
– Possibly the only option for unusual or very new technology
– Investigate possibility of extending existing tools
Create test configuration
• Smoke test
– Is everything in place?
• System warm-up (if not part of test)
– Pre-heat caches
– Preload pages
• Enable logging and monitoring as required
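The warm-up step can be sketched as a script that requests each page once before measurement starts, so caches and compiled code paths are primed. `BASE_URL` and the page list are placeholders for your own system:

```python
import urllib.request

BASE_URL = "http://localhost:8080"
WARMUP_PAGES = ["/", "/search", "/products"]

def warm_up(base_url, pages, opener=urllib.request.urlopen):
    """Request each page once; return the pages that failed to load."""
    failed = []
    for page in pages:
        try:
            opener(base_url + page).read()
        except OSError:  # urllib's URLError subclasses OSError
            failed.append(page)
    return failed
```

Running this doubles as part of the smoke test: a non-empty failure list means something is not in place and the measurement run should not start.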
Run test
• Fire it up!
• Log start and end times
• Sample system behaviour manually during test
• Monitor test environment
– Catch any issues that would invalidate the test early on
– Client load peak
– External service failures
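A minimal run harness following the steps above might record start and end times and sample system behaviour at a fixed interval while the load runs. `run_load` and `sample_system` are hypothetical hooks standing in for your load tool and monitoring:

```python
import time

def run_test(run_load, sample_system, interval=1.0):
    """Run the load, sampling periodically; return timings and samples."""
    samples = []
    start = time.time()                # log start time
    done = run_load()                  # assumed to return an "is finished?" callable
    while not done():
        samples.append(sample_system())  # sample behaviour during the test
        time.sleep(interval)
    end = time.time()                  # log end time
    return {"start": start, "end": end, "samples": samples}
```

Keeping the samples alongside the timestamps makes it possible to correlate load peaks or external-service failures with the results afterwards.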
Document results
Improving performance
• Vertical scaling
– Increase memory, CPU, bandwidth
• Horizontal scaling
– More servers, load balancing
– Requires architecture support
– Terracotta, Azul
• Dedicated system
– Remove resource contention
• Configure platform (tuning)
– Consider OS tuning guidelines for app container, database
• Profile and rewrite
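When documenting results and re-testing after a change, a percentile summary is more robust than an average. A sketch, using the nearest-rank method and invented response times in milliseconds:

```python
import math

def percentile(values, pct):
    """Nearest-rank percentile (pct in 0-100) of measured times."""
    ordered = sorted(values)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

def summarise(times):
    return {"min": min(times), "median": percentile(times, 50),
            "p95": percentile(times, 95), "max": max(times)}

before = [120, 130, 135, 150, 400]  # ms, invented: run before the change
after = [100, 105, 110, 115, 120]   # ms, invented: run after the change
improved = summarise(after)["p95"] < summarise(before)["p95"]
```

Note how the single 400 ms outlier dominates the "before" p95: a mean would hide exactly the tail behaviour users complain about.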
Implement change and re-test
Tools
• JMeter
http://jakarta.apache.org/jmeter/
• Mercury LoadRunner
http://www.mercury.com/us/products/performance-center/loadrunner/