
Yes, but is it fast enough?

Performance testing with JMeter

Markus Krüger <markus.kruger@ergogroup.no>


Agenda
• What is performance testing?
• Preparing performance tests
• Running performance tests
• Solving performance issues
• JMeter
• Demos
What is performance testing?

• Testing to measure performance under a particular workload
• Verifying that system satisfies functional
requirements under realistic conditions
• Verifying that scalability requirements are met
• Can be combined with stress testing:
how system performs under extreme loads

[Chart: response time as a function of workload]
When to run performance tests

• Start of project
– Detect bad technology choices early on
– Ensure end-to-end testability
– Uncover performance requirements
• At regular intervals during development
– Detect performance problems as they arise, while
there’s still time to fix them
– Performance smoke tests as part of automated
regression testing / continuous integration
• System and acceptance testing
Preparing performance tests
• Plan ahead
• Obtain requirements
• Design for testability
• Obtain test data
• Set up test environment
• Choose testing tools
• Create test configuration
Plan ahead

• Include performance test in project plans
• Ensure qualified testers (or training time) are available
• Procure and configure test environment
– System administrator support for
recreating/duplicating test environment similar to
production
– Access to external test servers (firewall openings,
contact lists for external servers, etc.)
Obtain requirements

• Do performance requirements exist?
• SLA (service level agreement)
– Response time (min, max, avg, 90%)
– Requests per second
– Number of concurrent users/sessions
• Expectancies from existing systems
• Ensure sufficiently detailed requirements
– Clarify vague requirements
• ”GUI should be responsive”
• ”Should perform well under typical load”
– Concrete measures (50 transactions/second, max
1.5 second response time)
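Once concrete measures are agreed, checking them against measured samples is mechanical. A small sketch in Python, using the nearest-rank percentile method and a hypothetical 1.5-second limit on the 90th percentile:

```python
import math

def percentile(samples, pct):
    """pct-th percentile of samples, nearest-rank method."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(pct * len(ordered) / 100))
    return ordered[rank - 1]

def meets_sla(response_times, p90_limit=1.5):
    """Hypothetical SLA check: 90% of responses within p90_limit seconds."""
    return percentile(response_times, 90) <= p90_limit

# Example response times in seconds
times = [0.3, 0.4, 0.4, 0.5, 0.5, 0.6, 0.7, 0.8, 1.2, 2.9]
print(min(times), max(times), sum(times) / len(times), percentile(times, 90))
```

Note that the single 2.9 s outlier does not break this SLA, which is exactly why requirements should state whether they cover the average, a percentile, or the maximum.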
Design for testability

• Sufficient logging to measure where time is spent
• Configurable logging
– enable precise measurements where needed, while
avoiding excessive logging that hurts performance
• Easy configuration of parameters affecting
performance
– E.g., number of database connections, web service
URLs, cache sizes, and so on
• System statistics
– E.g., number of requests, amount of data
transmitted, error count
– Often available from application container
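As an illustration of configurable timing instrumentation (shown in Python for brevity; the same pattern applies with java.util.logging or Log4j), a decorator that only pays the measurement cost when its logger is enabled for DEBUG:

```python
import functools
import logging
import time

def timed(logger):
    """Log a call's duration, but only when the logger is enabled for
    DEBUG, so timing can be switched on per module via logging
    configuration without touching code."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if not logger.isEnabledFor(logging.DEBUG):
                return func(*args, **kwargs)
            start = time.perf_counter()
            try:
                return func(*args, **kwargs)
            finally:
                logger.debug("%s took %.1f ms", func.__name__,
                             (time.perf_counter() - start) * 1000)
        return wrapper
    return decorator

perf_log = logging.getLogger("perf.orders")  # hypothetical logger name

@timed(perf_log)
def place_order():
    time.sleep(0.005)   # stand-in for real work
```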
Obtain test data

• Part of contract?
• Generate from logs
• Ask users / business analysts
• Run functional tests, monitor system
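Generating test data from logs can be as simple as extracting the request lines from production access logs for later replay. A sketch assuming the common Apache/nginx log format:

```python
import re
from collections import Counter

# Assumed Apache/nginx combined access-log format; adjust to your logs.
LOG_LINE = re.compile(r'"(?P<method>[A-Z]+) (?P<path>\S+) HTTP/[\d.]+"')

def requests_from_log(lines):
    """Extract (method, path) pairs from access-log lines for replay."""
    hits = []
    for line in lines:
        m = LOG_LINE.search(line)
        if m:
            hits.append((m.group("method"), m.group("path")))
    return hits

log = [
    '10.0.0.1 - - [10/Oct/2007:13:55:36 +0100] "GET /index.html HTTP/1.1" 200 2326',
    '10.0.0.2 - - [10/Oct/2007:13:55:38 +0100] "POST /login HTTP/1.1" 302 0',
]
print(Counter(requests_from_log(log)))
```

The resulting frequency counts also feed directly into the request-distribution weights discussed later.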
Set up test environment

• Realistic test environment
– As close as possible to production environment
– Corresponding network configuration
– Same software versions
– Realistic database volumes
– Investigate and document any deviations that might
affect performance
• Dedicated servers / server time
• Test clients
– Number of clients
– Ensure client load doesn’t affect test results
– Simulation of network conditions, geographic
distribution
Choose testing tools

• Commercial products
– Good, may be expensive (NOK 100.000+)
• Open source offerings
– Many options, varying maturity
• Rolling your own
– Possibly the only option for unusual or very new
technology
– Investigate possibility of extending existing tools
Create test configuration

• Ensure repeatable tests
– Reserved data ranges
– Resettable test database
• Put configuration under version control
• Focus on most used features
– Often insufficient time to perform complete
performance tests
• Realistic distribution
– Too limited a set of requests → misleading cache hit rate
– Too evenly distributed → no resource contention
– Include peaks
– Emulate request rate accurately
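A weighted request mix is easy to generate; the sketch below (hypothetical pages and weights) uses Python's random.choices so the test traffic follows observed production frequencies rather than a uniform spread:

```python
import random

# Hypothetical request mix, weighted by observed production frequency.
REQUEST_MIX = {
    "/search":   60,   # most traffic hits search...
    "/product":  30,
    "/checkout": 10,   # ...few requests reach checkout
}

def pick_requests(n, rng=random):
    """Draw n requests matching the weighted mix, so cache hit rates
    and resource contention behave more like production."""
    pages, weights = zip(*REQUEST_MIX.items())
    return rng.choices(pages, weights=weights, k=n)

sample = pick_requests(1000, random.Random(42))  # seeded for repeatability
```

Seeding the random generator keeps the test repeatable, which matters when comparing runs before and after a change.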
Running performance tests
• Coordinate test
• Prepare environment for test run
• Run test
• Document results
Coordinate test

• Inform other users of test environment
– Ensure test data in required state
– Ensure external test servers are available
• Ensure system administrators and developers are
available for troubleshooting
• Check that no server maintenance, other
performance test, etc. are scheduled at the
same time
• Plan test to avoid resource contentions
Prepare environment for test run

• Smoke test
– Is everything in place?
• System warm-up (if not part of test)
– Pre-heat caches
– Preload pages
• Enable logging and monitoring as required
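Warm-up can be a trivial loop over the most important pages; sketched here with a stubbed fetch function (a real one would issue HTTP GETs):

```python
def warm_up(pages, fetch, repeat=2):
    """Pre-heat caches by fetching each page a couple of times before
    measurement starts; `fetch` would be a real HTTP GET in practice."""
    for _ in range(repeat):
        for page in pages:
            fetch(page)

# Stub fetch that records what was requested, for illustration only.
fetched = []
warm_up(["/", "/search", "/login"], fetched.append)
```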
Run test

• Fire it up!
• Log start and end times
• Sample system behaviour manually during test
• Monitor test environment
– Catch any issues that would invalidate the test early
on
– Client load peak
– External service failures
Document results

• Carefully document test run
– Start and end dates
– Software versions
– Test configuration version
– Summary of test results
– Any issues encountered
– Relevant log excerpts (or complete logs)
• For comparison with later test runs
• For uncovering test environment issues
– Collision with other testing activities, service
upgrades, etc.?
– Recent configuration changes?
Solving performance issues
• Locate bottlenecks
• Consider options
• Implement change, re-test
Locate bottlenecks

• Investigate system platform
– top, lsof, vmstat, Sysinternals tools, Task Manager,
perfmon
– Database tuning
• Monitor application container
– Admin console, JMX
• Check logs
• Profile code
– NetBeans, Eclipse TPTP, JProfiler, JProbe
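The profilers above are Java tools; the idea is the same in any language. A minimal illustration with Python's built-in cProfile, where a deliberately slow function shows up at the top of the cumulative-time report:

```python
import cProfile
import io
import pstats

def slow_concat(n):
    s = ""
    for i in range(n):
        s += str(i)          # quadratic string building: the bottleneck
    return s

def handle_request():
    return len(slow_concat(2000))

profiler = cProfile.Profile()
profiler.enable()
handle_request()
profiler.disable()

out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(5)
report = out.getvalue()      # top entries point at slow_concat
```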

Premature optimization is the root of all evil.
- C. A. R. Hoare
Consider options

• Vertical scaling
– Increase memory, CPU, bandwidth
• Horizontal scaling
– More servers, load balancing
– Requires architecture support
– Terracotta, Azul
• Dedicated system
– Remove resource contention
• Configure platform (tuning)
– Consider OS tuning guidelines for app container,
database
• Profile and rewrite
Implement change and re-test

• Implement one change at a time and re-test to
measure effect
– Avoid unnecessary work and system complexity
– Detect which changes actually improve the system
– Multiple changes at the same time might cancel
each other out, or even degrade performance
• Ensure repeatable tests
• Document changes, including failures
– Keep track of already attempted options, to avoid
repeated work
– Important for learning how to improve future
systems
JMeter
• Project overview
• Features
• Demo: Testing web applications
• Demo: Recording and parametrizing test scripts
• Demo: Distributed performance testing
• Demo: Testing web services
JMeter project overview

• 100% pure Java desktop application
• Designed to load test functional behavior and
measure performance
• First designed for testing web applications
– has since expanded to other test functions
• Originally developed by Stefano Mazzocchi
• Open source, part of Apache Jakarta project
• http://jakarta.apache.org/jmeter/
JMeter features

• Load and performance test various protocols
– HTTP, FTP, JDBC, JMS, LDAP, SOAP
• HTTP proxy server for recording test scripts
• Multithreading framework for concurrent
sampling
• Caching and offline analysis/replaying of test
results
• Distributed testing
• Extensible
– Pluggable samplers for implementing custom tests
– Data analysis and visualization plugins
– Functions for dynamic test script input, data
manipulation.
– Scriptable samplers (BeanShell++)
Application overview

• GUI, command line interface
• Tests can be run and analyzed interactively, or run in
batch mode and analyzed offline
• Test plans consist of
– Thread groups: organize threads of execution
– Samplers: send requests to a server
– Logic controllers: control flow of test plan (loops,
conditionals, ordering, etc.)
– Listeners: record, summarize and display request and
response data
– Timers: introduce delays in test plan
– Assertions: assert facts about responses, for functional
testing
– Configuration elements
– Pre-processors and post-processors
Demo: Testing web applications
Demo: Recording and parametrizing
test scripts
Demo: Distributed performance testing
Demo: Testing web services
References

• JMeter
http://jakarta.apache.org/jmeter/

• Mercury LoadRunner
http://www.mercury.com/us/products/performance-center/loadrunner/

• Alexander Podelko's Performance Testing Links
http://www.alexanderpodelko.com/PerfTesting.html
