A quick reference for users of the Team Testing performance features of Visual Studio
Visual Studio Performance Testing Quick Reference Guide 6/20/2011
Summary
This document is a collection of items from public blog sites, Microsoft internal discussion aliases (sanitized), and the experiences of various Test Consultants in the Microsoft Services Labs. The idea is to provide quick reference points around aspects of Microsoft Visual Studio performance testing features that may not be covered in core documentation, or may not be easily understood. The information covers questions such as:
How does this feature work under the covers?
How can I implement a workaround for this missing feature?
This is a known bug; here is a fix or workaround.
How do I troubleshoot issues I am having?
The document contains two Tables of Contents (high level overview, and list of every topic covered) as well as an index. The current plan is to update the document on a regular basis as new information is found.
The information contained in this document represents the current view of Microsoft Corporation on the issues discussed as of the date of publication. Because Microsoft must respond to changing market conditions, it should not be interpreted to be a commitment on the part of Microsoft, and Microsoft cannot guarantee the accuracy of any information presented after the date of publication. This document is for informational purposes only. MICROSOFT MAKES NO WARRANTIES, EXPRESS, IMPLIED OR STATUTORY, AS TO THE INFORMATION IN THIS DOCUMENT.
Microsoft grants you a license to this document under the terms of the Creative Commons Attribution 3.0 License. All other rights are reserved.
© 2010 Microsoft Corporation. Microsoft, Active Directory, Excel, Internet Explorer, SQL Server, Visual Studio, and Windows are trademarks of the Microsoft group of companies. All other trademarks are property of their respective owners.
Revision History
Version 2.0
o Released 2/16/09
o Available externally on CodePlex
o Major reformat of document
o Added comprehensive index
Version 3.0
o Release Candidate published 3/23/2010
o Added many VS 2010 performance testing articles
o Added and updated articles about VS 2010 how-to's, issues, etc.
o Added or updated articles for features "changed in 2010"
o Updated many articles on issues with VS 2008
o Added some deep dive articles about how VS performance testing works (both 2008 and 2010)
Version 3.0a
o Final release version for 3.0. This is the official release that should be used.
o Published on 4/1/2010
Version 3.5
o Added more content and updated some existing content.
o Added --NEW-- tag to all new article entries.
o Added --UPDATED-- tag to all article entries that were corrected or enhanced.
o Created a new section that contains full copies of some in-depth blog posts.
Version 3.6
o Added more content and updated some existing content.
NOTE
All items that are not marked with a version note should be considered to apply to both VS 2008 and VS 2010
List of Topics
NOTE FROM THE AUTHOR
HOW IT WORKS
How Web Tests Handle HTTP Headers
General Info (including order of execution) of load and web test plugins and rules
Client Code does not execute because Web Tests Work at the HTTP Layer
When is the "Run unit tests in application domain" needed?
How the "Test Iterations" Setting impacts the total number of tests executed
Test timeout setting for load test configuration does not affect web tests
How user pacing and "Think Time Between Test Iterations" work
Load test warmup and cool down behaviors
What is the difference between Unique, Sequential and Random Data Sources
Simulation of Browser Caching during load tests
Comparing new users to return users
How cache settings can affect your testing and app performance
Goal based user behavior after the test finishes the warmup period
Threading models in Unit tests under load
The difference between Load Test Errors and Error Details
How parameterization of HIDDEN Fields works in a webtest
Testing execution order in Unit Tests
How machines in the test rig communicate
How a Load Test Plugin is executed on a test rig
Sharing State across agents in a load test rig is not supported out of the box
--NEW-- Order of execution of components in a webtest
--NEW-- 401 Access Denied responses and "Requests per second" measurements
--UPDATED-- File Downloads, Download Size and Storage of files during Web Tests
--NEW-- Info on how VS records and generates web tests
--NEW-- IE9 and other browser emulation in VS2010
--NEW-- Load test distributions/Network emulation
--NEW-- SilverLight recording info
--NEW-- Unlimited agent licenses for certain MSDN subscribers
Data sources for data driven tests get read only once
Consider including Timing Details to collect percentile data
Consider enabling SQL Tracing through the Load Test instead of separately
How to collect SQL counters from a non-default SQL instance
How 90% and 95% response times are calculated
Transaction Avg. Response Time vs. Request Avg. Response Time
Considerations for the location of the Load Test Results Store
--UPDATED-- Set the recovery model for the database to simple
How to clean up results data from runs that did not complete
InstanceName field in results database are appended with (002), (003), etc.
Layout for VS Load Test Results Store
How to view Test Results from the GUI
SQL Server Reporting Services Reports available for download
How to move results data to another system
Load Test Results without SQL NOT stored
Unable to EXPORT from Load Test Repository
Web Test TRX file and the NAN (Not a Number) Page Time entry
Proper understanding of TRX files and Test Results directory
Understanding the Response Size reported in web test runs
WCF service load test gets time-outs after 10 requests
Loadtestitemresults.dat size runs into GBs
Content-Length=0 Header not sent resulting in HTTP 411 Length Required Error
Error that test could not run because the network emulation is required
Error/Crash in "Open and Manage Load Test Results" dialog
Calls to CaptchaGenerator.aspx fail during playback
Request failure with improperly encoded query strings calling SharePoint 2010
Network Emulation does not work in any mode other than LAN
Error that Browser Extensions are disabled when recording a web test
Error: Request failed: No connection could be made because the target machine actively refused it
MaxConnection value in App.Config is not honored when running a load test
--NEW-- Cannot change the "Content-Type" header value in a webtest
--NEW-- VS does not expose a method for removing specific cookies from requests
--NEW-- "File Upload" feature in VS does not allow you to use a stream to send file
--NEW-- Unit tests that consume assemblies requiring MTA will fail with default test settings
--NEW-- MSTest tests that consume assemblies requiring MTA will fail with default test settings
--NEW-- Load Test Agent Error "Failed to open the Visual Studio v10.0 registry key"
--NEW-- "Agent to use" property of a Load Test not acting as expected
--NEW-- Fiddler 2 not seeing application traffic
--NEW-- BUG: Microsoft.VisualStudio.TestTools.WebStress.LoadTestResultsCollector
--NEW-- ASP.NET Profiling sometimes does not report after a run
--NEW-- System.InvalidCastException: Unable to cast COM object
--NEW-- Assembly could not be loaded and will be ignored
--NEW-- Issue with webtest login when getting 307 Temporary Redirect
--NEW-- Data bound validation rule fails when set at the TEST level
--NEW-- "Could not read result repository"
--NEW-- Page response time counters disappear after test is completed
--NEW-- WebTestContext.Clear() not clearing cookies
--NEW-- SQL Tracing error "Could not stop SQL tracing"
--NEW-- LoadTestCounterCategoryNotFoundException
TROUBLESHOOTING
How to enable logging for test recording
--UPDATED-- Diagnosing and fixing Web Test recorder bar issues
How to enable Verbose Logging on an agent for troubleshooting
Troubleshooting invalid view state and failed event validation
Troubleshooting the VS Load Testing IP Switching Feature
--NEW-- Performance Counters in .NET 4.0 help with analysis of Agent machines
--UPDATED-- HOW TO: Handle 404 errors in dependent requests so the main request does not fail
--NEW-- HOW TO: Minimize the amount of data a webtest retains for Response Bodies
--NEW-- HOW TO: Schedule tests to execute
--NEW-- HOW TO: NOT send an "accept-language" in webtests
--NEW-- How to upload a file in a Web test
Gotcha: Check Your Validation Level in the Load Test Run Settings
Gotcha: Do not adjust goals too quickly in your code
Gotcha: Response body capture limit is set to 1.5 MB by default
Gotcha: Caching of dependent requests is disabled when playing back Web Tests
Gotcha: VS 2008 and out of memory
Gotcha: Timeout attribute in coded web test does not work during a load test
--NEW-- Gotcha: Cannot programmatically set .counterset mappings at runtime
Best Practice: considerations when creating a dynamic goal based load test plugin
Best Practice: Coded web tests and web test plug-ins should not block threads
Best Practice: Add an Analysis Comment
EXTENSIBILITY
New Inner-text and Select-tag rules published on Codeplex
How to Add Custom Tabs to the Playback UI
How to extend recorder functionality with plugins
OLDER ARTICLES
Content-Length header not available in Web Request Object
SharePoint file upload test may post the file twice
Some Hidden Fields are not parameterized within AJAX calls (FIX)
Unit Test threading models and changing them
Bug in VS 2008 SP1 causes think time for redirected requests to be ignored in a load test
New Load Test Plugin Enhancements in VS 2008 SP1
Four New Methods added to the WebTestPlugin Class for 2008 SP1
INDEX
Thanks to all of the people who have contributed articles and information. I look forward to hearing feedback as well as suggestions moving forward.
Sincerely,
Geoff Gray, Senior Test Consultant
Microsoft Testing Services Labs
How It Works
How Web Tests Handle HTTP Headers
There are three different types of HTTP headers handled by Web tests:
1) Recorded headers and headers explicitly added to the request. By default, the Web test recorder only records these headers: "SOAPAction", "Pragma", "x-microsoftajax", "Content-Type".
2) Headers you add to the recorder's list. You can change the list of headers that the Visual Studio 2008 and 2010 web test recorder records by using regedit to open:
HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\9.0\EnterpriseTools\QualityTools\WebLoadTest
Add a string value under this key with the name "RequestHeadersToRecord" and value "SOAPAction;Pragma;x-microsoftajax;Content-Type;Referrer". If you do this and re-record your Web test, the Referrer header will be included in the recorded requests.
3) Headers handled automatically by the engine. Two examples: (a) headers sent and received as part of authentication, which are handled in the Web test engine and can't be controlled by the test; and (b) cookies, which can be controlled through the API.
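As a sketch of how a recorded or explicitly added header (type 1 above) can be manipulated at run time, a plugin can add the header before each request is sent. The plugin name and header values here are illustrative, not from the original guide:

```csharp
using Microsoft.VisualStudio.TestTools.WebTesting;

// Hypothetical plugin that stamps a custom header on every outgoing request.
// Headers added this way fall into type 1 above (explicitly added headers).
public class AddCustomHeaderPlugin : WebTestPlugin
{
    public override void PreRequest(object sender, PreRequestEventArgs e)
    {
        // Add a header before the request goes out on the wire.
        e.Request.Headers.Add(new WebTestRequestHeader("x-custom-trace", "perf-run-01"));
    }
}
```

Attach the plugin to the web test via the Add Web Test Plug-in dialog (or in the constructor of a coded web test).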
General Info (including order of execution) of load and web test plugins and rules
WebTestPlugins get tied to a webtest at the main level of the test. The order of precedence is:
class WebTestPluginMethods : WebTestPlugin
{
    public override void PreWebTest(object sender, PreWebTestEventArgs e) { }
    public override void PreTransaction(object sender, PreTransactionEventArgs e) { }
    public override void PrePage(object sender, PrePageEventArgs e) { }
    public override void PreRequestDataBinding(object sender, PreRequestDataBindingEventArgs e) { }
    public override void PreRequest(object sender, PreRequestEventArgs e) { }
    public override void PostRequest(object sender, PostRequestEventArgs e) { }
    public override void PostPage(object sender, PostPageEventArgs e) { }
    public override void PostTransaction(object sender, PostTransactionEventArgs e) { }
    public override void PostWebTest(object sender, PostWebTestEventArgs e) { }
}
PreWebTest fires before the first request is sent.
PreTransaction fires before each user-defined transaction in the test.
PrePage fires before any explicit request in the webtest. It also fires before any PreRequest method.
PreRequestDataBinding fires before data from the context is bound into the request, giving you an opportunity to change the data binding.
PreRequest fires before ALL requests made, including redirects and dependent requests. If you want it to act only on redirects, or to skip redirects, use the e.Request.IsRedirectFollow property to control code flow.
All Post<method> events fire in the exact opposite order of the Pre<method> events.
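For example, a PreRequest override that skips redirect follow-ups might look like this (a minimal sketch; the per-request logic is a placeholder):

```csharp
public override void PreRequest(object sender, PreRequestEventArgs e)
{
    // Skip redirect follow-ups so the logic below runs only for primary requests.
    if (e.Request.IsRedirectFollow)
    {
        return;
    }

    // ... act on the primary request here ...
}
```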
WebTestRequestPlugins are set at an individual request level and only operate on the request(s) they are explicitly tied to, plus all redirects/dependent requests of those requests.
class WebTestRequestPluginMethods : WebTestRequestPlugin
{
    public override void PreRequestDataBinding(object sender, PreRequestDataBindingEventArgs e) { }
    public override void PreRequest(object sender, PreRequestEventArgs e) { }
    public override void PostRequest(object sender, PostRequestEventArgs e) { }
}
ValidationRules can be assigned at the request level and at the webtest level. If the rule is assigned at the webtest level, it will fire after every request in the webtest. Otherwise it will fire after the request it is assigned to.
public class ValidationRule1 : ValidationRule { public override void Validate(object sender, ValidationEventArgs e) { } }
ExtractionRules can be assigned at the request level. It will fire after the request it is assigned to.
public class ExtractionRule1 : ExtractionRule { public override void Extract(object sender, ExtractionEventArgs e) { } }
NOTE: If you have multiple items attached to a request, the order of precedence is:
1) PostRequest (request plugins fire before WebTestRequest plugins)
2) Extract
3) Validate
LoadTestPlugins are tied to the load tests directly. In VS 2005 and VS 2008 there can be only one plugin per load test, while VS 2010 allows more than one per test and adds LoadTestPlugin properties so that they are consistent with WebTestPlugins. The methods available are divided into three categories as shown below:
class LoadTestPlugins : ILoadTestPlugin
{
    void LoadTest_LoadTestStarting(object sender, EventArgs e) { }
    void LoadTest_LoadTestFinished(object sender, EventArgs e) { }
    void LoadTest_LoadTestAborted(object sender, LoadTestAbortedEventArgs e) { }
    void LoadTest_LoadTestWarmupComplete(object sender, EventArgs e) { }
    void LoadTest_TestFinished(object sender, TestFinishedEventArgs e) { }
    void LoadTest_TestSelected(object sender, TestSelectedEventArgs e) { }
    void LoadTest_TestStarting(object sender, TestStartingEventArgs e) { }
    void LoadTest_ThresholdExceeded(object sender, ThresholdExceededEventArgs e) { }
    void LoadTest_Heartbeat(object sender, HeartbeatEventArgs e) { }
}
1) The load-test-level methods (LoadTestStarting, LoadTestFinished, LoadTestAborted, LoadTestWarmupComplete) fire based on the load test, meaning each one fires only once during a full test run.
2) The test-level methods (TestStarting, TestFinished, TestSelected) fire once per test iteration, per vUser.
3) Heartbeat fires once every second, on every agent; ThresholdExceeded fires each time a given counter threshold is exceeded.
NOTE: Each method in section 1 will fire once PER physical agent machine, however since the agent machines are independent of each other, you do not need to worry about locking items to avoid contention. NOTE: If you create or populate a context parameter inside the LoadTest_TestStarting method, it will not carry across to the next iteration.
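To make the wiring explicit: a load test plugin subscribes to these events in its Initialize method, roughly like this (the class name is illustrative and the handler bodies are placeholders):

```csharp
using System;
using Microsoft.VisualStudio.TestTools.LoadTesting;

public class SampleLoadTestPlugin : ILoadTestPlugin
{
    public void Initialize(LoadTest loadTest)
    {
        // Subscribe only to the events you need; each handler corresponds
        // to one of the methods in the class shown above.
        loadTest.LoadTestStarting += (sender, e) => { /* fires once per agent */ };
        loadTest.TestStarting     += (sender, e) => { /* fires per iteration, per vUser */ };
        loadTest.Heartbeat        += (sender, e) => { /* fires once per second, per agent */ };
    }
}
```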
Changed in 2010
In VS 2010:
o You can have more than one LoadTest plugin, although there is no guarantee about the order in which they will execute.
o You can now control whether a validation rule fires BEFORE or AFTER dependent requests.
o At the end of recording a Web test, a Response Time Goal validation rule is now automatically added at the Web test level. This is most useful combined with the toolbar button that lets you edit the response time goal, Think Time, and Reporting Name for all recorded pages in a single grid.
Client Code does not execute because Web Tests Work at the HTTP Layer
The following blog outlines where and how web tests work. This is important to understand if you are wondering why client side code is not tested. http://blogs.msdn.com/slumley/pages/web-tests-work-at-the-http-layer.aspx
How the "Test Iterations" Setting impacts the total number of tests executed
In the properties for the Run Settings of a load test, there is a property called "Test Iterations" that tells VS how many test iterations to run during a load test. This is a global setting, so if you choose to run 5 iterations and you have 10 vusers, you will get FIVE total passes, not fifty. NOTE: you must enable this setting by changing the property "Use Test Iterations" from FALSE (default) to TRUE.
Test timeout setting for load test configuration does not affect web tests
The "Test Timeout" setting in the Test Run Configuration file (in the "Test -> Edit Test Run Configuration" menu) does not have an effect in all cases.
Uses the setting:
o Running a single unit test, web test, ordered test, or generic test by itself
o Running any of the above types of tests in a test run started from Test View, the Test List Editor, or MSTest
o Tests running in a load test (except Web tests)
Does not use the setting:
o Running a Web test in a load test
o The load test itself
This particular test timeout is enforced by the agent test execution code, but load test and Web test execution are tightly coupled for performance reasons and when a load test executes a Web test, the agent test execution code that enforces the test timeout setting is bypassed.
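For reference, the timeout values live in the test settings file as attributes of the Timeouts element. A fragment from a VS 2010 .testsettings file might look roughly like this (values are in milliseconds; exact element nesting can differ between VS versions, so treat this as a sketch):

```xml
<!-- Fragment of a .testsettings file.
     testTimeout applies per test; runTimeout bounds the whole test run. -->
<Execution>
  <Timeouts runTimeout="1800000" testTimeout="300000" />
</Execution>
```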
How user pacing and "Think Time Between Test Iterations" work
The setting "Think Time Between Test Iterations" is available in the properties for a load test scenario. When a user completes one test, this think time delay is applied before the user starts the next iteration. The setting applies to each iteration of each test in the scenario mix. If you create a load test that has a test mix model "Based on user pace", then the pacing calculated by the test engine will override any settings you declare for "Think Time Between Test Iterations".
In 2008: the Load test Terminate method does not fire unless you use a cool down period.
In 2010: the Load test Terminate method always fires.
What is the difference between Unique, Sequential and Random Data Sources
Single machine running tests:
Sequential - This is the default and tells the web test to start with the first row, then fetch rows in order from the data source. When it reaches the end of the data source, it loops back to the beginning and starts again, continuing until the load test completes. In a load test, the current row is kept for each data source in each web test, not for each user. When any user starts an iteration with a given Web test, they are given the next row of data and then the cursor is advanced.
Random - Rows are chosen at random, continuing until the load test completes.
Unique - Starts with the first row and fetches rows in order. Once every row has been used, the web test stops. If this is the only web test in the load test, the load test will stop.
Multiple machines running as a rig:
Sequential - This works the same as on one machine. Each agent receives a full copy of the data, and each starts with row 1 in the data source. Each agent then runs through each row in the data source and continues looping until the load test completes.
Random - This also works the same as on one machine. Each agent receives a full copy of the data source and randomly selects rows.
Unique - This one works a little differently. Each row in the data source is used only once. So if you have 3 agents, the data is spread across the 3 agents and no row is used more than once. As with one machine, once every row has been used, the web test stops executing.
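In a coded web test, the access method is specified on the DataSource attribute; a sketch in the style of the code VS generates is shown below (the data source name, file name, and column are placeholders):

```csharp
// Hypothetical CSV-backed data source using the Unique access method.
[DataSource("AccountData",
            "Microsoft.VisualStudio.TestTools.DataSource.CSV",
            "|DataDirectory|\\accounts.csv",
            Microsoft.VisualStudio.TestTools.WebTesting.DataBindingAccessMethod.Unique,
            "accounts#csv")]
[DataBinding("AccountData", "accounts#csv", "username",
             "AccountData.accounts#csv.username")]
public class MyCodedWebTest : WebTest
{
    // Bound values are read from the context, e.g.
    // this.Context["AccountData.accounts#csv.username"]
}
```

Swapping DataBindingAccessMethod.Unique for Sequential or Random selects the other two behaviors described above.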
Important Note: When running a Web test by itself (outside of the load test), the Cache Control property is automatically set to false for all dependent requests so they are always fetched; this is so that they can be displayed in the browser pane of the Web test results viewer without broken images.
The "Percentage of New Users" setting affects the following whether the tests contained within the load test are Web tests or unit tests:
o The value of the LoadTestUserId in the LoadTestUserContext object. This only matters for unit tests and coded Web tests that use this property in their code. On the other hand, if you set the number of test iterations equal to the user load, you should get a different LoadTestUserId regardless of the setting of "Percentage of New Users".
o If you are using the load test feature that allows you to define an "Initial Test" and/or a "Terminate Test" for a virtual user, the setting affects when the InitializeTest and TerminateTest are run: for "new users" (a more accurate name might be "one-time users"), the InitializeTest is run for the virtual user, the "Body Test" is run just once, and then the "Terminate Test" is run. For users who are NOT "new users", the InitializeTest is run once, the Body Test is run many times (until the load test completes), and then the TerminateTest runs (which might be during the cool-down period).
The "Percentage of New Users" setting affects the following Web test features that are not applicable for unit tests:
o The simulation of browser caching. The option affects how the vuser's virtual browser cache is maintained between iterations of tests. "New users" have an empty cache (note: the responses are not actually cached, only the URLs are tracked); "return users" have a cache. So if this value is 100%, all vusers starting a test will start with an empty browser cache; if this value is 0%, all vusers will maintain the state of the browser cache between iterations of Web tests. This setting affects the amount of content that is downloaded: if an object sits in a vuser's cache and has not been modified since the last time the vuser downloaded it, the object will not be downloaded. Therefore, new users will download more content than returning users with items in their browser cache.
o The handling of cookies for a Web test virtual user: new users always start running a Web test with all cookies cleared. When a user who is not a "new user" runs a Web test after the first one, the cookies set during previous Web tests for that virtual user are present.
The below graphs (taken from test runs in VS 2010) demonstrate the difference between a new user and a return user. The graphs are based on a 10 user / 50 iteration run, but with different percentages for "new users" on each run. NOTE: The graphs below are new to VS 2010, but the way in which users are simulated is the same as in VS 2008. For a better understanding of these graphs, go to the section called "--UPDATED-- Virtual user visualization now available".
Zero percent new users shows a graph where each of the 10 vusers is constantly reused.
Fifty percent new users shows a graph where each of the 10 vusers is constantly reused by half of the iterations, but the other half are split out among new vusers which never get reused.
One hundred percent new users shows a graph where none of the vusers is ever reused.
How cache settings can affect your testing and app performance
This article shows how changing the caching settings in your Visual Studio tests and on your web server can impact your test. It also shows a real world demonstration of the difference between NEW and RETURN users.
Comparing New Users to Return Users (WRT caching): New users are simulated by clearing the cache at the start of each new iteration, whereas the cache is carried from iteration to iteration for return users. This results in many more requests being cached with return users. NOTE: The total # of requests made by VS is a sum of the two VS values. In other words, Total Requests in the IDE does not include cached requests.
Looking at the impact of content expiration on the overall network and web server activity (For more information, see the section Add an Expires or a Cache-Control Header from http://developer.yahoo.com/performance/rules.html).
Notice that VS honors the content expiration (this is actually handled by the underlying System.Net component). However, VS still reports the cached file request, even though no call went out on the wire. This is expected behavior since the request was a part of the site. In order to see how many requests actually went on the wire, you need to use IIS logs or network traces.
Notes: All 4 tests above were run for the same duration with the same number of users executing the same test. Although the numbers do not match exactly, they are close enough to show the behavior of the tests. The discrepancy is due to a few things, including cool down of the test and the possible misalignment of the query I used to gather data from the IIS logs. The IIS Log items for "200 OK" and "304-Not Modified" were gathered using LogParser and the following query:
SELECT sc-status, COUNT(*) AS Total
FROM *.log
WHERE to_timestamp(date, time)
  BETWEEN timestamp('2010-02-12 02:13:22', 'yyyy-MM-dd hh:mm:ss')
  AND timestamp('2010-02-12 02:18:22', 'yyyy-MM-dd hh:mm:ss')
GROUP BY sc-status
Goal based user behavior after the test finishes the warmup period
1. The user load starts at the value specified by the Initial User Count property of the Goal Based Load Pattern.
2. At each sampling interval (which defaults to 5 seconds, but can be modified by the "Sample Rate" property in the load test run settings), the performance counter defined in the goal based load pattern is sampled. (If it can't be sampled for some reason, an error is logged and the user load remains the same.)
3. The sampled value is compared with the "Low End" and "High End" properties of the "Target Range for Performance Counter".
4. If the value is within the boundaries of the "Low End" and "High End", the user load remains the same.
5. If the value is not within those boundaries, the user load is adjusted as follows: the midpoint of the target range for the goal is divided by the sampled value for the goal performance counter to calculate an "adjustment factor". For example, if the goal is defined as "% Processor Time" between 50 and 70, the midpoint is 60. If the sampled value for % Processor Time is 40, then AdjustmentFactor = 60/40 = 1.5; if the sampled value is 80, then AdjustmentFactor = 60/80 = 0.75. The AdjustmentFactor is multiplied by the current user load to get the new user load. However, if the difference between the new user load and the current user load is greater than the "Maximum User Count Increase/Decrease" property (whichever applies), then the user load is only adjusted by as much as the max increase/decrease property allows. My experience has been that keeping these values fairly small is a good idea; otherwise the algorithm tends to cause too much fluctuation (the perf counter keeps going above and below the target range). The new user load also cannot be larger than the goal based pattern's Maximum User Count property or smaller than the Minimum User Count property.
Two more considerations based on special properties of the goal based load pattern:
o If the property "Lower Values Imply Higher Resource Use" is True (which you might use, for example, for a performance counter such as Memory\Available MBytes), then the user load is adjusted in the opposite direction: the user load is decreased when the sampled counter value is less than the Low End of the target range and increased when the sampled counter value is greater than the High End of the target range.
o If the property "Stop Adjusting User Count When Goal Achieved" is True, then once the sampled goal performance counter is within the target range for 3 consecutive sampling intervals, the user load is no longer adjusted and remains constant for the remainder of the load test.
Lastly, as is true for all of the user load patterns, in a test rig with multiple agents the new user load is distributed among the agents equally by default, or according to the "Agent Weightings" if these are specified in the agent properties.
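The adjustment arithmetic described above can be sketched in plain code. This is a simplified illustration of the described algorithm only (it ignores the "Lower Values Imply Higher Resource Use" and "Stop Adjusting" special cases), not the actual VS implementation:

```csharp
using System;

static class GoalBasedLoad
{
    // currentLoad: current vuser count; sample: sampled counter value;
    // goalLow/goalHigh: target range; maxStep: Maximum User Count Increase/Decrease;
    // minUsers/maxUsers: Minimum/Maximum User Count.
    public static int AdjustUserLoad(int currentLoad, double sample,
                                     double goalLow, double goalHigh,
                                     int maxStep, int minUsers, int maxUsers)
    {
        if (sample >= goalLow && sample <= goalHigh)
            return currentLoad;                        // within range: no change

        double midpoint = (goalLow + goalHigh) / 2.0;  // e.g. 60 for a 50-70 range
        double factor = midpoint / sample;             // e.g. 60/40 = 1.5, 60/80 = 0.75
        int newLoad = (int)(currentLoad * factor);

        // Clamp the change to the max step, then to the absolute bounds.
        newLoad = Math.Max(currentLoad - maxStep, Math.Min(currentLoad + maxStep, newLoad));
        return Math.Max(minUsers, Math.Min(maxUsers, newLoad));
    }
}
```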
Each of these is a separate type of error and gets its own quantity of errors (#1) and error details (#2). The number of errors is shown in the Count column. Clicking on one of the numbers will bring up the Load Test Errors dialog below. There is no count displayed for error details.
This is the display of the Errors table in the test results viewer.
Any error entry (#1) that has associated error details will have a link in one or both of the last columns. Click on these to get the details about that specific error instance.
Then on a subsequent post we see Field1 and Field2 posted; since this request and response match, a hidden field bucket will be created for them. The first available bucket number is assigned to the hidden field bucket. Once a bucket is "consumed" by a subsequent request via binding, that bucket is made available again. So if the test has a single frame, it will always reuse bucket 0:
Page 1: Extract bucket 0
Page 2: Bind bucket 0 params
Page 3: Extract bucket 0
Page 4: Bind bucket 0 params
If a test has 2 frames that interleave requests, it will use two buckets:
Frame 1, Page 1: Extract bucket 0
Frame 2, Page 1: Extract bucket 1
Frame 2, Page 2: Bind bucket 1 params
Frame 1, Page 2: Bind bucket 0 params
Or if a test uses a popup window, or ViewState, you would see a pattern similar to the frames page, where multiple buckets are used to keep the window state. Why are some fields unbound? Some hidden field values are modified in JavaScript, such as __EVENTARGUMENT. In that case, it won't work to simply extract the value from the hidden field in the response and play it back. If the recorder detects this is the case, it puts the actual value that was posted back as the form post parameter value rather than binding it to the hidden field. A single page will have just one hidden field extraction rule applied. If there are multiple forms on a given page, there is still just one down-stream post of form fields, resulting in one application of the hidden field extraction rule.
[TestClass]
public class UnitTest1
{
    private TestContext testContextInstance;
    public TestContext TestContext
    {
        get { return testContextInstance; }
        set { testContextInstance = value; }
    }

    [ClassInitialize]
    public static void ClassSetup(TestContext testContext) { Console.WriteLine("Class Setup"); }

    [TestInitialize]
    public void TestInit() { Console.WriteLine("Test Init"); }

    [TestMethod]
    public void Test1() { Console.WriteLine("Test1"); }

    [TestMethod]
    public void Test2() { Console.WriteLine("Test2"); }

    [TestMethod]
    public void Test3() { Console.WriteLine("Test3"); }

    [TestCleanup]
    public void TestCleanUp() { Console.WriteLine("TestCleanUp"); }

    [ClassCleanup]
    public static void ClassCleanUp() { Console.WriteLine("ClassCleanUp"); }
}
(This consists of 3 Test Methods, ClassInitialize, ClassCleanup, TestInitialize, TestCleanup and an explicit declaration of TestContext.) The execution order would be as follows:
Test1 [Thread 1]: new TestContext -> ClassInitialize -> TestInitialize -> TestMethod1 -> TestCleanup
Test2 [Thread 2]: new TestContext -> TestInitialize -> TestMethod2 -> TestCleanup
Test3 [Thread 3]: new TestContext -> TestInitialize -> TestMethod3 -> TestCleanup -> ClassCleanup
The output after running all the tests in the class would be:
Class Setup
Test Init
Test1
TestCleanUp
Test Init
Test2
TestCleanUp
Test Init
Test3
TestCleanUp
ClassCleanUp
Controller-Agent Communications
Sharing State across agents in a load test rig is not supported out of the box
The following is an excerpt from a discussion on a possible way to customize VS to handle sharing state across a rig:
Question: The load test in our scenario is driving integration tests (implemented using the VS unit
testing framework) so I want the data to be available to the unit test while it is running. I am thinking of writing a lightweight service that acts as the provider of shared state. I will use the ILoadTestPlugin.Initialize to initialize / reset the data source (using a filter for agent ID so that it runs only once) by calling the service, retrieve the data from the service in LoadTest.TestStarting event and then make this data available to the unit test using the test context. This way, the duration of the test run is not affected by the state retrieval process. However, I need to be careful in implementation of the shared state provider so that it doesn't have a major impact on the test run results (because of synchronisation / contention).
Answer: As you said, the service needs to be super-fast and simple. Maintaining a simple list of
name/value pairs would go a long way. The trickiest thing about the service is what locking to provide. For example, for a state variable keeping a count, we don't want agents setting the value, as they will step on each other and lose increments. A better design is to have a first-class Increment command that the service handles. There are similar questions about the integrity of string data, although that is probably not as important as providing a simple counter. Another common pattern is maintaining lists of items: one user adds things to the list, another consumes them. This is probably best implemented with a database.
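As a rough sketch of the "first class Increment command" idea, the shared state can live behind a store that owns the read-modify-write, so agents never set the raw value themselves. Everything below is illustrative — the class name is mine, and in practice this would sit behind whatever lightweight service transport you choose; only the locking pattern is the point:

```csharp
using System;
using System.Collections.Generic;

// Illustrative shared-state store for a load test rig (not part of the VSTT
// API). Agents would call Increment/Get over the wire via a thin service
// facade; because the store performs the read-modify-write under a lock,
// concurrent agents cannot lose increments.
public class SharedStateStore
{
    private readonly object _sync = new object();
    private readonly Dictionary<string, long> _counters = new Dictionary<string, long>();

    // First-class Increment command: callers never send a new value,
    // they send the operation.
    public long Increment(string name)
    {
        lock (_sync)
        {
            long value;
            _counters.TryGetValue(name, out value);
            _counters[name] = value + 1;
            return _counters[name];
        }
    }

    public long Get(string name)
    {
        lock (_sync)
        {
            long value;
            _counters.TryGetValue(name, out value);
            return value;
        }
    }
}
```

An agent-side load test plugin would then call Increment from an event such as TestStarting, rather than reading the counter, adding one, and writing it back.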
1) Any execution output lines below shown in blue with asterisks come from a coded version of the web test, to show how the different points of the code execute.
2) All other execution output lines come from the rules and plugins written for this example.
3) None of the built-in rules have output statements, but their order of execution follows the same pattern as the custom items.
INITIALIZATION
***** Entering WebTest Initialization Code
***** Exiting WebTest Initialization Code
***** Entering WebTest Initialization Code
***** Exiting WebTest Initialization Code
WebTest - PreWebTest
***** Entering Validation Initialization Code
***** Exiting Validation Initialization Code
NOTES: These items are accessible in coded web tests, but they still fire in declarative tests as well.
Entering a Transaction
NOTES: 1) Transactions do not fire any requests so they do not show any response info. Transactions do track the overall time for all requests and work inside the transaction.
2) PreTransaction events fire before all requests inside the transaction, so any settings/configuration made in a transaction event will be overridden by similar settings made in any of the page or request events.
***** Entering request1
***** Entering yield return request1
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage1.aspx
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage1.aspx
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage1.aspx
WebTest - PrePage: http://localhost/HOL/ParamPage1.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage1.aspx
Web Extraction Rule: http://localhost/HOL/ParamPage1.aspx
Web Extraction Rule Number 2: http://localhost/HOL/ParamPage1.aspx
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamPage1.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamPage1.aspx
WebTest - PostPage: http://localhost/HOL/ParamPage1.aspx
***** Entering request1 = null
***** Exiting request1
NOTES: 1) Notice that web request events do NOT fire for any of the dependent requests. 2) Notice that the custom extraction rules fire in the order they are listed in the code. This behavior holds true for all extraction/validation rules. In this test, I do not have output for the built-in hidden params, but in this request it comes first in the list, therefore it fires first in the actual test.
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage3.aspx
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage3.aspx
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage3.aspx
WebTest - PrePage: http://localhost/HOL/ParamPage3.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage3.aspx
Web Extraction Rule: http://localhost/HOL/ParamPage3.aspx
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamPage3.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamPage3.aspx
WebTest - PostPage: http://localhost/HOL/ParamPage3.aspx
NOTES: 1) Notice that the custom extraction rules fire in the order they are listed in the code. This order is opposite of the previous request (both in code and order executed).
1) This fires after all request events inside the transaction, even though the playback shows only the transaction beginning.
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage3.aspx
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage3.aspx
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage3.aspx
WebTest - PrePage: http://localhost/HOL/ParamPage3.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage3.aspx
Web Extraction Rule: http://localhost/HOL/ParamPage3.aspx
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamPage3.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamPage3.aspx
WebTest - PostPage: http://localhost/HOL/ParamPage3.aspx
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage3.aspx
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage3.aspx
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage3.aspx
WebTest - PrePage: http://localhost/HOL/ParamPage3.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage3.aspx
Web Extraction Rule: http://localhost/HOL/ParamPage3.aspx
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamPage3.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamPage3.aspx
WebTest - PostPage: http://localhost/HOL/ParamPage3.aspx
NOTES: 1) The normal series of events fires on each iteration that meets the loop condition. When the loop condition fails, there are NO MORE events fired as part of the embedded request(s)
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamRedirect.aspx
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamRedirect.aspx
WebRequest - PreWebRequest: http://localhost/HOL/ParamRedirect.aspx
WebTest - PrePage: http://localhost/HOL/ParamRedirect.aspx
WebTest - PreRequest: http://localhost/HOL/ParamRedirect.aspx
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamRedirect.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamRedirect.aspx
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage4.aspx
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage4.aspx
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage4.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage4.aspx
Web Extraction Rule: http://localhost/HOL/ParamPage4.aspx
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:True http://localhost/HOL/ParamPage4.aspx
WebTest - PostRequest: Redirect:True http://localhost/HOL/ParamPage4.aspx
WebTest - PostPage: http://localhost/HOL/ParamRedirect.aspx
NOTES: 1) When a redirect comes down the pipe, the webtest-request and web-request events fire before the redirect is sent. 2) However, Extraction and validation rules do NOT fire until after the redirect. If you want to validate against a request BEFORE it redirects, you need to use a plugin and simulate the work of a validation rule.
3) Notice that the Pre and Post Page events surround the entire set of requests in the main level request EXCEPT for DataBinding events AND a request level PreRequest event. Page level includes the main request, redirect requests and dependent requests.
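The note above says that validating a response BEFORE its redirect is followed requires a plugin that simulates a validation rule. A request plugin's PostRequest event is one place to do that, since it fires before the redirect is followed. The sketch below is only an outline against the VSTT API — the specific check (a redirect missing its Location header) and the failure message are illustrative, not a recommendation:

using Microsoft.VisualStudio.TestTools.WebTesting;

// Sketch only: inspect the un-redirected response in PostRequest, before the
// engine follows the redirect, and fail the test if it is not what we expect.
public class ValidateBeforeRedirect : WebTestRequestPlugin
{
    public override void PostRequest(object sender, PostRequestEventArgs e)
    {
        // When a redirect is pending, e.Response here is the 30x response itself.
        if (e.Response.StatusCode == System.Net.HttpStatusCode.Redirect &&
            e.Response.Headers["Location"] == null)
        {
            e.WebTest.AddCommentToResult("Redirect response missing Location header");
            e.WebTest.Outcome = Outcome.Fail;
        }
    }
}

Attach the plugin to the request whose pre-redirect response you care about; a validation rule on the same request would only ever see the post-redirect response.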
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage2.aspx
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage2.aspx
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage2.aspx
WebTest - PrePage: http://localhost/HOL/ParamPage2.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage2.aspx
Web Extraction Rule: http://localhost/HOL/ParamPage2.aspx
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamPage2.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamPage2.aspx
WebTest - PostPage: http://localhost/HOL/ParamPage2.aspx
WebTest - PostWebTest
Code used in this test
This code uses an internal team tool called RTMonitor. The only thing RTMonitor does is write a message to an output window; this is where the outputs from above were collected. To ensure that the proper order of execution was captured, I used breakpoints on every single RTMonitor line to allow the output queue to blast each message separately instead of in batches.

Load Test Plugins
[System.ComponentModel.DisplayName("BehaviorLoadTestRule")]
[System.ComponentModel.Description("TODO: Add a more detailed description of LoadTestRule1 Loadtest rule.")]
public class ExecutionOrderPluginClasses : ILoadTestPlugin
{
    LoadTest m_loadTest;

    public void Initialize(LoadTest loadTest)
    {
        m_loadTest = loadTest;
        // Here's the complete list of events that are fired. You should delete events you are not using.
        // To see a list of events, type "m_loadTest." on a new line in the constructor.
        m_loadTest.LoadTestStarting += new EventHandler(LoadTest_LoadTestStarting);
        m_loadTest.LoadTestFinished += new EventHandler(LoadTest_LoadTestFinished);
        m_loadTest.LoadTestAborted += new EventHandler<LoadTestAbortedEventArgs>(LoadTest_LoadTestAborted);
        m_loadTest.LoadTestWarmupComplete += new EventHandler(LoadTest_LoadTestWarmupComplete);
        m_loadTest.TestStarting += new EventHandler<TestStartingEventArgs>(LoadTest_TestStarting);
        m_loadTest.TestFinished += new EventHandler<TestFinishedEventArgs>(LoadTest_TestFinished);
        m_loadTest.TestSelected += new EventHandler<TestSelectedEventArgs>(LoadTest_TestSelected);
        m_loadTest.ThresholdExceeded += new EventHandler<ThresholdExceededEventArgs>(LoadTest_ThresholdExceeded);
        m_loadTest.Heartbeat += new EventHandler<HeartbeatEventArgs>(LoadTest_Heartbeat);
    }

    void LoadTest_LoadTestStarting(object sender, EventArgs e)
    {
        RTMonitor.Write("LoadTest_LoadTestStarting");
    }

    void LoadTest_LoadTestFinished(object sender, EventArgs e)
    {
        RTMonitor.Write("LoadTest_LoadTestFinished");
    }

    void LoadTest_LoadTestAborted(object sender, LoadTestAbortedEventArgs e)
    {
        RTMonitor.Write("LoadTest_LoadTestAborted");
    }
    void LoadTest_LoadTestWarmupComplete(object sender, EventArgs e)
    {
        RTMonitor.Write("LoadTest_LoadTestWarmupComplete");
    }

    void LoadTest_TestFinished(object sender, TestFinishedEventArgs e)
    {
        String str = e.UserContext["AgentId"].ToString();
        RTMonitor.Write("LoadTest_TestFinished " + str);
    }

    void LoadTest_TestSelected(object sender, TestSelectedEventArgs e)
    {
        RTMonitor.Write("LoadTest_TestSelected");
    }

    void LoadTest_TestStarting(object sender, TestStartingEventArgs e)
    {
        RTMonitor.Write("LoadTest_TestStarting");
    }

    void LoadTest_ThresholdExceeded(object sender, ThresholdExceededEventArgs e)
    {
        RTMonitor.Write("LoadTest_ThresholdExceeded");
    }

    void LoadTest_Heartbeat(object sender, HeartbeatEventArgs e)
    {
        RTMonitor.Write("LoadTest_Heartbeat");
    }
}
[System.ComponentModel.Description("TODO: Add a more detailed description of 'ValidationRule1' validation rule.")]
public class ValidationRule2 : ValidationRule
{
    public override void Validate(object sender, ValidationEventArgs e)
    {
        RTMonitor.Write(Color.SlateGray, "WebTest level Validation Rule");
    }
}
WebTest Plugin
[System.ComponentModel.DisplayName("BehaviorWebTestPlugin")]
[System.ComponentModel.Description("TODO: Add a more detailed description of 'WebTestPlugin1' WebTest Plugin.")]
public class WebTestPlugin1 : WebTestPlugin
{
    public override void PreWebTest(object sender, PreWebTestEventArgs e)
    {
        RTMonitor.Write(Color.Red, "WebTest - PreWebTest");
    }

    public override void PostWebTest(object sender, PostWebTestEventArgs e)
    {
        RTMonitor.Write(Color.Red, "WebTest - PostWebTest");
    }

    public override void PreRequestDataBinding(object sender, PreRequestDataBindingEventArgs e)
    {
        RTMonitor.Write(Color.Red, "\tWebTest - PreWebTestDataBinding: {0}", e.Request.UrlWithQueryString.ToString());
    }

    public override void PreRequest(object sender, PreRequestEventArgs e)
    {
        RTMonitor.Write(Color.Red, "\tWebTest - PreRequest: {0}", e.Request.UrlWithQueryString.ToString());
    }

    public override void PostRequest(object sender, PostRequestEventArgs e)
    {
        RTMonitor.Write(Color.Red, "\tWebTest - PostRequest: Redirect:{1} {0}", e.Request.UrlWithQueryString.ToString(), e.Request.IsRedirectFollow.ToString());
    }

    public override void PrePage(object sender, PrePageEventArgs e)
    {
        RTMonitor.Write(Color.Red, "\tWebTest - PrePage: {0}", e.Request.UrlWithQueryString.ToString());
    }

    public override void PostPage(object sender, PostPageEventArgs e)
    {
        RTMonitor.Write(Color.Red, "\tWebTest - PostPage: {0}", e.Request.UrlWithQueryString.ToString());
    }

    public override void PreTransaction(object sender, PreTransactionEventArgs e)
    {
        RTMonitor.Write(Color.Red, "\tWebTest - PreTransaction: {0}", e.TransactionName);
    }

    public override void PostTransaction(object sender, PostTransactionEventArgs e)
    {
        RTMonitor.Write(Color.Red, "\tWebTest - PostTransaction: {0}", e.TransactionName);
    }
}
WebRequest Plugin
[System.ComponentModel.DisplayName("BehaviorWebTestRequestPlugin")]
[System.ComponentModel.Description("TODO: Add a more detailed description of WebTestRequestPlugin1 extraction rule.")]
public class WebTestRequestPlugin1 : WebTestRequestPlugin
{
    public override void PreRequestDataBinding(object sender, PreRequestDataBindingEventArgs e)
    {
        RTMonitor.Write(Color.Black, "\t\tWebRequest - PreWebRequestDataBinding: {0}", e.Request.UrlWithQueryString.ToString());
    }

    public override void PreRequest(object sender, PreRequestEventArgs e)
    {
        RTMonitor.Write(Color.Black, "\t\tWebRequest - PreWebRequest: {0}", e.Request.UrlWithQueryString.ToString());
    }

    public override void PostRequest(object sender, PostRequestEventArgs e)
    {
        RTMonitor.Write(Color.Black, "\t\tWebRequest - PostWebRequest: Redirect:{1} {0}", e.Request.UrlWithQueryString.ToString(), e.Request.IsRedirectFollow.ToString());
    }
}
Coded webtest code to get the initialization events
The bulk of the output was gathered from the declarative test, but this code was used to capture execution points that are not exposed as events.
public WebTest1Coded()
{
    RTMonitor.Write(Color.Blue, "***** Entering WebTest Initialization Code");
    this.PreAuthenticate = true;
    this.PreWebTest += new EventHandler<PreWebTestEventArgs>(this.testPlugin0.PreWebTest);
    this.PostWebTest += new EventHandler<PostWebTestEventArgs>(this.testPlugin0.PostWebTest);
    this.PreTransaction += new EventHandler<PreTransactionEventArgs>(this.testPlugin0.PreTransaction);
    this.PostTransaction += new EventHandler<PostTransactionEventArgs>(this.testPlugin0.PostTransaction);
    this.PrePage += new EventHandler<PrePageEventArgs>(this.testPlugin0.PrePage);
    this.PostPage += new EventHandler<PostPageEventArgs>(this.testPlugin0.PostPage);
    RTMonitor.Write(Color.Blue, "***** Exiting WebTest Initialization Code");
}

public override IEnumerator<WebTestRequest> GetRequestEnumerator()
{
    RTMonitor.Write(Color.Blue, "***** Entering Validation Initialization Code");
    if ((this.Context.ValidationLevel >= Microsoft.VisualStudio.TestTools.WebTesting.ValidationLevel.High))
    {
        ValidationRule2 validationRule1 = new ValidationRule2();
        this.ValidateResponse += new EventHandler<ValidationEventArgs>(validationRule1.Validate);
    }
    this.PreRequestDataBinding += new EventHandler<PreRequestDataBindingEventArgs>(this.testPlugin0.PreRequestDataBinding);
    this.PreRequest += new EventHandler<PreRequestEventArgs>(this.testPlugin0.PreRequest);
    this.PostRequest += new EventHandler<PostRequestEventArgs>(this.testPlugin0.PostRequest);
    RTMonitor.Write(Color.Blue, "***** Exiting Validation Initialization Code");

    RTMonitor.Write(Color.Blue, "***** Entering this.BeginTransaction(\"Transaction1\")");
    this.BeginTransaction("Transaction1");

    RTMonitor.Write(Color.Blue, "***** Entering request1");
    WebTestRequest request1 = new WebTestRequest("http://localhost/HOL/ParamPage1.aspx");
    if ((this.Context.ValidationLevel >= Microsoft.VisualStudio.TestTools.WebTesting.ValidationLevel.High))
    {
        ValidationRule1 validationRule2 = new ValidationRule1();
        request1.ValidateResponse += new EventHandler<ValidationEventArgs>(validationRule2.Validate);
    }
    ExtractHiddenFields extractionRule1 = new ExtractHiddenFields();
    extractionRule1.Required = true;
    extractionRule1.HtmlDecode = true;
    extractionRule1.ContextParameterName = "1";
    request1.ExtractValues += new EventHandler<ExtractionEventArgs>(extractionRule1.Extract);
    ExtractionRule1 extractionRule2 = new ExtractionRule1();
    extractionRule2.ContextParameterName = "test1";
    request1.ExtractValues += new EventHandler<ExtractionEventArgs>(extractionRule2.Extract);
    ExtractionRule2 extractionRule3 = new ExtractionRule2();
    extractionRule3.ContextParameterName = "test1a";
    request1.ExtractValues += new EventHandler<ExtractionEventArgs>(extractionRule3.Extract);
    WebTestRequestPlugin1 requestPlugin1 = new WebTestRequestPlugin1();
    request1.PreRequestDataBinding += new EventHandler<PreRequestDataBindingEventArgs>(requestPlugin1.PreRequestDataBinding);
    request1.PreRequest += new EventHandler<PreRequestEventArgs>(requestPlugin1.PreRequest);
    request1.PostRequest += new EventHandler<PostRequestEventArgs>(requestPlugin1.PostRequest);
    RTMonitor.Write(Color.Blue, "***** Entering yield return request1");
    yield return request1;
    RTMonitor.Write(Color.Blue, "***** Entering request1 = null");
    request1 = null;
    RTMonitor.Write(Color.Blue, "***** Exiting request1");
    RTMonitor.Write(Color.Blue, "***** Entering this.EndTransaction(\"Transaction1\")");
    this.EndTransaction("Transaction1");
WebTest - PreWebTest
==========WebTest - PreWebTest Event, UserName.Name datasource IDIOT
WebTest - PreTransaction: Transaction1
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage1.aspx
==========WebRequest - PreRequestDataBinding Event, UserName.Name datasource IDIOT
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage1.aspx
==========WebTest - PreRequestDataBinding Event, UserName.Name datasource IDIOT
----------Advanced Data Table Cursor
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage1.aspx
WebTest - PrePage: http://localhost/HOL/ParamPage1.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage1.aspx
==========WebTest - PreRequest Event, UserName.Name datasource SILLY
Web Extraction Rule: http://localhost/HOL/ParamPage1.aspx
Web Extraction Rule Number 2: http://localhost/HOL/ParamPage1.aspx
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamPage1.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamPage1.aspx
WebTest - PostPage: http://localhost/HOL/ParamPage1.aspx
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage2.aspx?Parameter1={{DataSource1.UserNames%23csv.Name}}
==========WebRequest - PreRequestDataBinding Event, UserName.Name datasource SILLY
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage2.aspx?Parameter1={{DataSource1.UserNames%23csv.Name}}
==========WebTest - PreRequestDataBinding Event, UserName.Name datasource SILLY
----------Advanced Data Table Cursor
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage2.aspx?Parameter1=GOOSE
WebTest - PrePage: http://localhost/HOL/ParamPage2.aspx?Parameter1=GOOSE
WebTest - PreRequest: http://localhost/HOL/ParamPage2.aspx?Parameter1=GOOSE
==========WebTest - PreRequest Event, UserName.Name datasource GOOSE
Web Extraction Rule Number 2: http://localhost/HOL/ParamPage2.aspx?Parameter1=GOOSE
Web Extraction Rule: http://localhost/HOL/ParamPage2.aspx?Parameter1=GOOSE
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamPage2.aspx?Parameter1=GOOSE
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamPage2.aspx?Parameter1=GOOSE
WebTest - PostPage: http://localhost/HOL/ParamPage2.aspx?Parameter1=GOOSE
WebTest - PostTransaction: Transaction1
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage3.aspx
==========WebRequest - PreRequestDataBinding Event, UserName.Name datasource GOOSE
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage3.aspx
==========WebTest - PreRequestDataBinding Event, UserName.Name datasource GOOSE
----------Advanced Data Table Cursor
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage3.aspx
WebTest - PrePage: http://localhost/HOL/ParamPage3.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage3.aspx
==========WebTest - PreRequest Event, UserName.Name datasource ID10T
Web Extraction Rule: http://localhost/HOL/ParamPage3.aspx
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamPage3.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamPage3.aspx
WebTest - PostPage: http://localhost/HOL/ParamPage3.aspx
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage3.aspx
==========WebRequest - PreRequestDataBinding Event, UserName.Name datasource ID10T
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage3.aspx
==========WebTest - PreRequestDataBinding Event, UserName.Name datasource ID10T
----------Advanced Data Table Cursor
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage3.aspx
WebTest - PrePage: http://localhost/HOL/ParamPage3.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage3.aspx
==========WebTest - PreRequest Event, UserName.Name datasource IDGIT
Web Extraction Rule: http://localhost/HOL/ParamPage3.aspx
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamPage3.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamPage3.aspx
WebTest - PostPage: http://localhost/HOL/ParamPage3.aspx
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamRedirect.aspx
==========WebRequest - PreRequestDataBinding Event, UserName.Name datasource IDGIT
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamRedirect.aspx
==========WebTest - PreRequestDataBinding Event, UserName.Name datasource IDGIT
----------Advanced Data Table Cursor
WebRequest - PreWebRequest: http://localhost/HOL/ParamRedirect.aspx
WebTest - PrePage: http://localhost/HOL/ParamRedirect.aspx
WebTest - PreRequest: http://localhost/HOL/ParamRedirect.aspx
==========WebTest - PreRequest Event, UserName.Name datasource ID1OT
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamRedirect.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamRedirect.aspx
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage4.aspx
==========WebRequest - PreRequestDataBinding Event, UserName.Name datasource ID1OT
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage4.aspx
==========WebTest - PreRequestDataBinding Event, UserName.Name datasource ID1OT
----------Advanced Data Table Cursor
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage4.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage4.aspx
==========WebTest - PreRequest Event, UserName.Name datasource STUPID
Web Extraction Rule: http://localhost/HOL/ParamPage4.aspx
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:True http://localhost/HOL/ParamPage4.aspx
WebTest - PostRequest: Redirect:True http://localhost/HOL/ParamPage4.aspx
WebTest - PostPage: http://localhost/HOL/ParamRedirect.aspx
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage2.aspx
==========WebRequest - PreRequestDataBinding Event, UserName.Name datasource STUPID
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage2.aspx
==========WebTest - PreRequestDataBinding Event, UserName.Name datasource STUPID
----------Advanced Data Table Cursor
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage2.aspx
WebTest - PrePage: http://localhost/HOL/ParamPage2.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage2.aspx
==========WebTest - PreRequest Event, UserName.Name datasource IDIOT
Web Extraction Rule: http://localhost/HOL/ParamPage2.aspx
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamPage2.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamPage2.aspx
WebTest - PostPage: http://localhost/HOL/ParamPage2.aspx
WebTest - PostWebTest
200 0 0.032
- Shows the 401 and 404 failures
- Shows that all pages ran every time
- Matches the Overall measurement
From Direct Database Query (actual query used is below):

Overall  0.0319266666666667
Page 0   0.02092
Page 1   0.0498
Page 2   0.02506
W/O 401  0.03743
W/O 404  0.03536
DB Query Used:
SELECT 'Overall', AVG(ResponseTime)
FROM [LoadTest2010].[dbo].[LoadTestPageDetail]
WHERE LoadTestRunId = 48 AND PageId < 3
UNION
SELECT 'Page 0', AVG(ResponseTime)
FROM [LoadTest2010].[dbo].[LoadTestPageDetail]
WHERE LoadTestRunId = 48 AND PageId = 0
UNION
SELECT 'Page 1', AVG(ResponseTime)
FROM [LoadTest2010].[dbo].[LoadTestPageDetail]
WHERE LoadTestRunId = 48 AND PageId = 1
UNION
SELECT 'Page 2', AVG(ResponseTime)
FROM [LoadTest2010].[dbo].[LoadTestPageDetail]
WHERE LoadTestRunId = 48 AND PageId = 2
UNION
SELECT 'W/O 401', AVG(ResponseTime)
FROM [LoadTest2010].[dbo].[LoadTestPageDetail]
WHERE LoadTestRunId = 48 AND (PageId = 1 OR PageId = 2)
UNION
SELECT 'W/O 404', AVG(ResponseTime)
FROM [LoadTest2010].[dbo].[LoadTestPageDetail]
WHERE LoadTestRunId = 48 AND (PageId = 1 OR PageId = 0)
--UPDATED-- File Downloads, Download Size and Storage of files during Web Tests
The web test engine does not write responses to disk, so you don't need to specify a location for the file. It does read the entire response back to the client, but only stores the first 1.5 MB of the response in memory. You can override that using the WebTestRequest.ResponseBodyCaptureLimit property in the request's section of a coded web test. For a declarative web test, you can add the following code to a plugin:
public class IncreaseResponseSize : WebTestPlugin
{
    [DisplayName("Size to use - MB")]
    [Description("The maximum size to allow - defined in MB (e.g. 10 = 10 MB)")]
    [DefaultValue(5)]
    public int iSize { get; set; }

    public override void PreWebTest(object sender, PreWebTestEventArgs e)
    {
        // "$LoadTestUserContext" exists only when the web test is running
        // inside a load test, so this raises the limit only for standalone runs.
        if (!e.WebTest.Context.ContainsKey("$LoadTestUserContext"))
            e.WebTest.ResponseBodyCaptureLimit = iSize * 1024 * 1024; // iSize MB in bytes
    }
}
There is one other important part. VS spins up the same number of async connections for dependent requests as the browser does. So if you emulate IE7, you will see two separate TCP conversations when pulling down a web page with dependent requests, and if you emulate IE8 you will see up to six. More info: one of the big gotchas that may happen in VS, but is uncommon with browser emulation (especially IE), is the fact that VS uses the standard built-in .NET HttpWebRequest objects to control all traffic, whereas IE uses the native-mode WinINet library. There are some subtle differences there. I have only hit one or two cases where it mattered, but I just wanted to mention this difference.
You can also right-click on a form post or query string parameter in the request tab to start a search.
The calls from above without a reporting name. Even though they are the same requests, some have a GET method and some have a POST method.
How to view activity visualization
In VS 2010, you can view a map of the virtual users' activity AFTER a test run completes by clicking the "Details" button in the results window.
What is shown in the visualization window
There are 3 choices: 1) Test 2) Transaction 3) Page. The view shows users in relation to each other (Y-axis) and the duration of a single instance of each user's measured activity (X-axis). For complete details on this, see the entry New users versus One Time users.
Use the Zoom to time slider to control how much of the test details you wish to see. Hover the mouse pointer over an instance to get a popup of the info about that instance.
More Information
Here are the table definitions from the LoadTest2010 Results Store. For the LoadTestTestDetail table, the big differences are that you get the outcome of the tests, which virtual user executed each test, and the end time of the test.

[LoadTestRunId] [int] NOT NULL,
[TestDetailId] [int] NOT NULL,
[TimeStamp] [datetime] NOT NULL,
[TestCaseId] [int] NOT NULL,
[ElapsedTime] [float] NOT NULL,
[AgentId] [int] NOT NULL,
[BrowserId] [int],
[NetworkId] [int],
-- New to 2010:
[Outcome] [tinyint],
[TestLogId] [int] NULL,
[UserId] [int] NULL,
[EndTime] [datetime] NULL,
[InMeasurementInterval] [bit] NULL
For the LoadTestPageDetail table, you now get the end time of the page as well as the outcome of the page.

[LoadTestRunId] [int] NOT NULL,
[PageDetailId] [int] NOT NULL,
[TestDetailId] [int] NOT NULL,
[TimeStamp] [datetime] NOT NULL,
[PageId] [int] NOT NULL,
[ResponseTime] [float] NOT NULL,
[ResponseTimeGoal] [float] NOT NULL,
[GoalExceeded] [bit] NOT NULL,
-- New to 2010:
[EndTime] [datetime] NULL,
[Outcome] [tinyint] NULL,
[InMeasurementInterval] [bit] NULL
For the LoadTestTransactionDetail table, the big changes are that you get the response time of the transaction and the end time. Statistics for transactions such as Min, Max, Avg, Median, StdDev, 90%, 95% and 99% are calculated. These statistics are based on the ResponseTime column, not the ElapsedTime column. The difference between the two is that elapsed time includes think time, whereas response time does not.

[LoadTestRunId] [int] NOT NULL,
[TransactionDetailId] [int] NOT NULL,
[TestDetailId] [int] NOT NULL,
[TimeStamp] [datetime] NOT NULL,
[TransactionId] [int] NOT NULL,
[ElapsedTime] [float] NOT NULL,
-- New to 2010:
[EndTime] [datetime] NULL,
[InMeasurementInterval] [bit] NULL,
[ResponseTime] [float] NULL
Another change in VS 2010 is that the default for whether or not to collect details has changed. In VS 2005 and VS 2008 the default was to not collect this detail data. In VS 2010, the default is to collect the detail data. This is controlled by the Timing Details Storage property on the Run Settings node in a load test. So you can still run your own analysis on this data, but there is also a new view in VS that you can use to get a look at the data. The view is the Virtual User Activity Chart. When a load test completes, there will be a new button enabled on the load test execution toolbar. It is the detail button below:
When you click on this button you will be brought to the Virtual User Activity Chart. It looks like the following:
Here is what you are looking at: each horizontal row represents a virtual user, and each line in a row represents a test, page or transaction. If you look at the top left of this view, you will see a combo box that shows which type of detail you are looking at; in my case it is showing pages. Each color represents a different page in the test, and the length of the line represents the duration of the page, so you can quickly tell which pages are running long.

If you look at the bottom of the chart, you will see a zoom bar. The zoom bar allows you to change the range that you are looking at. The zoom bar overlays one of the graphs from the graph view, so whichever graph is selected in the graph view is what you will see on the zoom bar. This makes it very easy to correlate spikes in a graph with the tests/pages/transactions occurring during that spike.

The legend on the left also has some filtering and highlighting options. If you uncheck a page, all instances of that page are removed from the chart. If you click Highlight Errors, all pages that failed will have their color changed to red. If you look at the bottom part of the legend, you will see all the errors that occurred during the test. You can choose to remove pages with certain errors, or remove all successful pages so you only see errors.

There is one other very useful feature of this view: you can hover over any line to get more information about the detail, and possibly drill into the test that the detail belongs to. For example, this is what it looks like when you hover over a detail:
You see information about the user, scenario, test, URL, outcome, etc. For this detail, there is also a test log link. If you click it, you will see the actual test that the page was a part of. For example, when I click test log, I see the following:
You see the full set of details collected for the test in the usual web test playback view that you are used to. If it was a unit test, you would see the unit test viewer instead.
New Load Test and Load Test Rig Licensing and Configurations
This information was taken straight from a blog post by Ed Glas (http://blogs.msdn.com/edglas/archive/2010/02/07/configuration-options-for-load-testing-with-visualstudio-2010.aspx). Visual Studio Ultimate enables you to generate 250 virtual users of load. To go higher than 250 users, you need to purchase a Virtual User Pack, which gives you 1,000 users. You can use the 1,000 users on any number of agents. Note that if you install the Virtual User Pack on the same machine as Visual Studio Ultimate, you do not get 1,250 users on the controller: the 250 virtual users you get with Ultimate can only be used on "local" runs, not on a Test Controller. If you need to generate more than 1,000 users, you purchase additional Virtual User Packs, which aggregate or accumulate on the Test Controller. In other words, installing 2 Virtual User Packs on one controller gives you 2,000 virtual users, which can be run on any number of agents.

Configuration 1: "Local" Load Generation
This is what you get when you install Visual Studio Ultimate, which is the ability to generate load "locally" using the test host process on the same machine that VS is running on. In addition to limiting load to 250 users, it is also limited to one core on the client CPU. Note that purchasing Ultimate also gives you the ability to collect ASP.NET profiler traces by using a Test Agent as a data collector on the Web server.
This is a common configuration if you are scaling out your load agents. With this configuration, the Test Controller and each Test Agent are on separate machines. The advantage of this configuration is that the controller is easily shared by team members, and overhead from the controller does not interfere with load generation or operation of the client. Note that the Test Controller must have one or more Virtual User Packs installed to enable load testing. Load agents in this configuration always use all cores on the machine.
With configuration A, you install the Test Controller and Test Agent on the same machine as VS, then configure the Test Controller with Virtual User Packs. This enables you to generate more than 250 virtual users from the client machine, and unlocks all cores in the processor. Configuration B shows an alternative configuration, enabled if you configure the machine with Virtual User Packs using the VSTestConfig command line. Note that a Virtual User Pack can only be used on one machine at a time, and configuring it on a machine ties it to that machine for 90 days. So you can't have the same Virtual User Pack installed on both the VS client and a separate machine running the Test Controller. See the Virtual User Pack license for details.
In this configuration, the controller is running on the same machine as the test client, with distributed agents running as load generators. This configuration is recommended for a solo performance tester. If your test controller and test agents will be shared by a team, we recommend running the controller on a separate box. Note that test agents are tied to a single test controller; you can't have two test controllers controlling the same agent.
If you are using Visual Studio 2008, these options should look familiar to you as the VS 2008 load agents and controller offered the same configuration options. The new twist with VS 2010 is the Virtual User Packs, which offer you more flexibility in how you configure your load agents.
The Test Controller and Test Agent are "free" when you purchase Ultimate.
It is not recommended to use ordered tests in a load test: in the load test results, you do not get the pass/fail results, test timings, or transaction timings for any of the inner tests, just a pass/fail result and duration for the overall ordered test. To address this issue, there is a new test mix type in VS 2010 called the Sequential Test Mix. Here is what it looks like in the load test wizard:
For this mix type, you set the order of tests that each virtual user will run through. You can mix web and unit tests in the mix and you will get the individual test, page and transaction results. When a virtual user completes the last test in the mix, it will cycle back to the first test in the mix and start over.
If you just want to control the order of web tests, you can also use a main web test that calls all of the tests in order as "nested tests". This is called "Web Test Composition." For example, suppose I have WebTest1 and WebTest2 and I want 1 to run before 2. I would create a third web test that has no requests of its own, but references tests 1 and 2. To create this kind of test, first record web tests 1 and 2. Then add a third web test and just hit stop in the web test recorder. Back in the web test editor, right-click the root node and select "Add Call to Web Test..."
This launches a dialog; select WebTest1. Then repeat the same steps to add WebTest2. Now just run WebTest3 and you will execute both tests. Web test composition has been available since VS 2008.
When you choose to parameterize the web servers in a web test, you may see more web servers listed than your test actually calls. This is expected behavior: the parameter parser also finds web sites that reside inside query strings. Notice this in the .webtest file:
<QueryStringParameter Name="Source" Value="http%3A%2F%2Flocalhost%3A17012%2Fdefault%2Easpx" RecordedValue="http%3A%2F%2Flocalhost%3A17012%2Fdefault%2Easpx" CorrelationBinding="" UrlEncode="False" UseToGroupResults="False" />
This button causes VS to detect URLs and create parameters for them. This web test has only ONE request, but VS detects four web servers. The rules are:
- Any query string parameter whose value is a URL gets added to the server list.
- Any form post parameter whose value is a URL gets added to the server list.
- No added header value makes it into the list.
- If the form post or query parameter NAME is a URL (not the value, but the name of the parameter), it does NOT get added.
Agents to Use
The agent names entered should be the names of agents that are connected to the controller to which the load test will be submitted. They should be the simple computer names of the agents (as seen in the "Computer Name" field in the Control Panel). Unfortunately, at this time, if you switch to submitting the load test to a different controller, you will need to change the value for "Agents to Use": there is no way to parameterize this list to vary depending on the controller used. The list designates a subset of the agents that are connected to the controller, are in the Ready state when the load test starts (they may be running a different load test or other test run while the load test is queued, as long as they become Ready when the load test is taken out of the Pending state and starts running), and meet any agent selection criteria for the test run. The Scenario will run on all agents in the list that meet these criteria, and the user load for the Scenario will be distributed among these agents either evenly (the default) or according to any agent weightings specified in the Agent properties for the agents (from the "Administer Test Controllers" dialog in Visual Studio).
Delay Start Time: the amount of time to wait after the load test starts before starting any tests in this scenario.
Disable During Warmup: if true, the delay time does not begin until after warmup completes.
The context menu showing the loop and condition insert options
http://msdn.microsoft.com/en-us/vstudio/ff520697
1. Open QTAgentService.exe.config.
2. Add <add key="WorkingDirectory" value="<location to use>"/> under the <appSettings> node.
3. Create the <location to use> folder.
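Put together, the resulting QTAgentService.exe.config might look like the following sketch. The folder path shown is a hypothetical placeholder, not from the original guide; substitute your own location.

```xml
<configuration>
  <appSettings>
    <!-- Hypothetical path: substitute your own location and create the folder first -->
    <add key="WorkingDirectory" value="D:\AgentWorkingDir"/>
  </appSettings>
</configuration>
```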
In 2008: by default, web test playback ignores proxy servers set for localhost, so enabling a proxy for 127.0.0.1 (which is where Fiddler captures) will not result in any captured data. To make this work, either add a plugin with the following code, or put the following code in the class constructor of a coded web test:

this.Proxy = "http://localhost:8888";
WebProxy webProxy = (WebProxy)this.WebProxy;
webProxy.BypassProxyOnLocal = false;

In 2010: to get Fiddler to work in VS 2010, simply open Fiddler, then start playing the web test. There is no need to code anything.
Controlling the amount of memory that the SQL Server Results machine consumes
By default, SQL Server consumes as much memory as it thinks it can, and the workload on the machine may not allow SQL Server to correctly identify memory pressure and hence give back some memory. You can configure SQL Server with a max memory limit, which should be fine if all you are doing is inserting results. The following sets max memory to 512 MB; the right size will vary based on the machine, the testing, and how much memory you have.

sp_configure 'show advanced options', 1
RECONFIGURE
GO
sp_configure 'max server memory', 512
RECONFIGURE
GO
Change the values as needed and note that the time is in milliseconds.
How to set the number of Load Test Errors and Error Details saved
Load Test Errors: you can change the total number of errors stored for a run in the appropriate configuration file (depending on whether this is for local runs or for test rig runs):

Version  Run Type  File Name                 Location
2008     Local     VSTestHost.exe.config     <Program Files>\Microsoft Visual Studio 9.0\Common7\IDE\
2008     Remote    QTController.exe.config   <Program Files>\Microsoft Visual Studio 9.0 Team Test Load Agent\LoadTest\
2010     Local     DevEnv.exe.config         <Program Files>\Microsoft Visual Studio 10.0\Common7\IDE\
2010     Remote    QTController.exe.config   <Program Files>\Microsoft Visual Studio 10.0\Common7\IDE\
Add a key to the "appSettings" section of the file (add the "appSettings" section if needed) with the name "LoadTestMaxErrorsPerType" and the desired value.
<appSettings> <add key="LoadTestMaxErrorsPerType" value="5000"/> </appSettings>
Multi-proc boxes used as agents should have .NET garbage collection set to server mode
Changed in 2010
In 2008: to enable server GC, you need to modify either VSTestHost.exe.config or QTAgent.exe.config. If you are not using a controller and agent setup, modify VSTestHost.exe.config. If you are using a controller and agent, modify QTAgent.exe.config on each agent machine. Open the correct file; the locations are:
VSTestHost.exe.config: C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE
QTAgent.exe.config: C:\Program Files\Microsoft Visual Studio 9.0 Team Test Load Agent\LoadTest
To enable server GC, add the <gcServer enabled="true" /> line shown below to the runtime section:

<?xml version="1.0"?>
<configuration>
  <runtime>
    <gcServer enabled="true" />
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <probing privatePath="PrivateAssemblies;PublicAssemblies"/>
    </assemblyBinding>
  </runtime>
</configuration>

In 2010: the agent service in VS 2010 is set to server GC by default. No need to take any action here.
To retrieve a list of agents assigned to a controller without using the VS IDE, look in:
In 2008: <install point>\Microsoft Visual Studio 9.0 Team Test Load Agent\LoadTest\QTControllerConfig.xml
In 2010: <install point>\Microsoft Visual Studio 10.0\Common7\IDE\QTControllerConfig.xml
--NEW--Managing Test Controllers and Agents from VS *AND* from Lab Center
You can administer both the test agents and the test controller. If a test controller is registered with a team project, you can configure and monitor it and any registered test agents using the Test Controller Manager in the Lab Center of Microsoft Test Manager. Otherwise, to configure and monitor the test controller and any registered agents, click Test in Microsoft Visual Studio 2010 and point to Manage Test Controllers.
Make sure you pick the correct adapter here. Use the Network Connections properties built into Windows along with the IPCONFIG command to see which NIC is assigned to which subnet (see below).
The base address is 3 octets and should be representative of the subnet you are on. If you are using a class B subnet, you still need a third octet for the base.
The information as shown in the Network Connections dialog box in Windows. You may need to hover the mouse over the NIC to see the entire name of the NIC.
Ethernet adapter Secondary:

   Connection-specific DNS Suffix . :
   IP Address. . . . . . . . . . . . : 10.69.200.3
   Subnet Mask . . . . . . . . . . . : 255.255.0.0
   Default Gateway . . . . . . . . . : 10.69.0.1
Getting the proper IP address info for spoofing:

Ethernet adapter Primary:

   Connection-specific DNS Suffix . :
   IP Address. . . . . . . . . . . . : 10.99.3.3
   Subnet Mask . . . . . . . . . . . : 255.255.0.0
Setting up the Load Test

Once the test rig is set up, you can configure which load test will actually use IP switching by setting the correct property for the load test:
Where to enable IP Switching for the Load Test Itself (after configuring the agents to use it)
Startup: Multiple Network Cards can cause tests in a rig to not start
Problem: when running tests against a controller and test agents, the tests enter the Pending state but then nothing else happens.

Visual Studio Client Resolution: the problem is that you have two network adapters on the client machine. The following entries in the controller log confirm that this is the problem:
[I, 2972, 11, 2008/06/26 13:02:59.780] QTController.exe: ControllerExecution: Calling back to client for deployment settings.
[E, 2972, 11, 2008/06/26 13:06:51.155] QTController.exe: StateMachine(RunState): Exception while handling state Deploying: System.Net.Sockets.SocketException: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond 65.52.230.25:15533
This is exactly the type of error message we see when the controller's communication with Visual Studio fails because the client has multiple network cards. To configure your Visual Studio installation to communicate with the controller, try this in regedit:
1. Find the key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\VisualStudio\9.0\EnterpriseTools\QualityTools.
2. Add a new key under the above key named "ListenPortRange".
3. In the key "ListenPortRange", add a new string value with the name "BindTo" and the IPv4 address of the client (65.52.230.25 in this case) as the BindTo value.
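The registry steps above can also be captured in a .reg file for reuse across machines. This is a sketch: the key path and value name come from the steps above, while the IP address is the example from the log excerpt and must be replaced with your own client's IPv4 address.

```reg
Windows Registry Editor Version 5.00

; Sketch based on the steps above; substitute your client's IPv4 address for the example value
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\VisualStudio\9.0\EnterpriseTools\QualityTools\ListenPortRange]
"BindTo"="65.52.230.25"
```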
Test Rig Resolution: Read the following support article for the steps to resolve this issue on a test rig: http://support.microsoft.com/kb/944496
The error below may appear several times when running a load test that uses IP switching. In most cases, it can be ignored.
00:51:35 AGENT02 <none> <none> <none> Exception LoadTestException 151 Web test requests were not bound to either the correct IP address for IP switching, or the correct port number for network emulation, or both.
The one situation where the presence of this error may indicate a real issue with the test is when the application relies on a given iteration always coming through on the same IP address for purposes of maintaining a session (for example, a load balancer such as Microsoft ISA Server with the IP sticky setting turned on).
You might encounter timeouts when deploying load tests to agents if the deployment contains many or large files. In that case you can increase the deployment timeout; the default value is 300 seconds.

In 2010: you have to change the .testsettings file that corresponds to your active test settings in Visual Studio, because the deployment timeout setting is not exposed via the Visual Studio UI. Check via the menu Test | Select Active Test Settings which file is active; you can find the file in the Solution Items folder of your solution. Open it in the XML editor by right-clicking it, choosing "Open With", and selecting "XML (Text) Editor". The TestSettings element will have an Execution element. Add a child element called Timeouts to the Execution element, if not already present, and give it a deploymentTimeout attribute with the desired timeout value in milliseconds. For example:
<?xml version="1.0" encoding="UTF-8"?>
<TestSettings name="Controller" id="330da597-4a41-4ae7-8b95-60c32ab793fb" xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  ()
  <Execution location="Remote">
    <Timeouts deploymentTimeout="600000" />
IntelliSense should help you out when adding/editing this.

In 2008: you have to change the .testrunconfig file that corresponds to your active test run configuration. Check via the menu Test | Select Active Test Run Configuration which file is active; you can find the file in the Solution Items folder of your solution. Add a child element Timeouts under the TestRunConfiguration element if no such element is already present, and give it a deploymentTimeout attribute with the desired timeout value in milliseconds. For example:
<?xml version="1.0" encoding="UTF-8"?>
<TestRunConfiguration name="Controller" id="af5824b3-56fa-4534-a3f8-6e763a56869a" xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2006">
  <Timeouts deploymentTimeout="600000"/>
Visual Studio 2010 added the feature of handling network emulation within the test harness. This functionality is based on a toolkit that was unofficially released as NEWT (http://blog.mrpol.nl/2010/01/14/network-emulator-toolkit/).
The default profiles within Visual Studio are somewhat limited, but these can be enhanced by making additional emulation files or modifying the existing files. The location of the emulation files is:
The sample on the next page shows some of the items that can be set and changed: If you create a new file, save it as a "*.NETWORK" file in the above directory. The name you assign the profile in the XML is what will be displayed inside Visual Studio. If you already have custom profiles you created with NEWT, just make sure to add the
<NetworkEmulationProfile name="NAME_OF_PROFILE_HERE" xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
before the <Emulation> tag, and to close it after the </Emulation> tag.
<NetworkEmulationProfile name="NAME_OF_PROFILE_HERE" xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  <Emulation>
    <VirtualChannel name="VirtualChannel 1" DispatchType="packet">
      <FilterList>
        <Filter name="FILTER_1" not="0"/>
      </FilterList>
      <VirtualLink name="LINK_1" instances="1">
        <LinkRule dir="upstream"/>
        <LinkRule dir="downstream"/>
      </VirtualLink>
    </VirtualChannel>
    <VirtualChannel name="VirtualChannel 2" DispatchType="packet">
      <FilterList>
        <Filter name="FILTER_3" not="0">
          <IpVersion>IPv4</IpVersion>
          <Local>
            <IpAddress>255.255.255.255</IpAddress>
            <IpMask>0.0.0.0</IpMask>
            <PortBegin>0</PortBegin>
            <PortEnd>65535</PortEnd>
          </Local>
          <Remote>
            <IpAddress>127.0.0.1</IpAddress>
            <IpMask>255.255.255.255</IpMask>
            <PortBegin>0</PortBegin>
            <PortEnd>65535</PortEnd>
          </Remote>
        </Filter>
        <Filter name="FILTER_2" not="0">
          <IpVersion>IPv6</IpVersion>
          <Local>
            <IpAddress>FFFF:FFFF:FFFF:FFFF:FFFF:FFFF:FFFF:FFFF</IpAddress>
            <IpMask>0000:0000:0000:0000:0000:0000:0000:0000</IpMask>
            <PortBegin>0</PortBegin>
            <PortEnd>65535</PortEnd>
          </Local>
          <Remote>
            <IpAddress>0000:0000:0000:0000:0000:0000:0000:0001</IpAddress>
            <IpMask>FFFF:FFFF:FFFF:FFFF:FFFF:FFFF:FFFF:FFFF</IpMask>
            <PortBegin>0</PortBegin>
            <PortEnd>65535</PortEnd>
          </Remote>
        </Filter>
      </FilterList>
      <VirtualLink name="LINK_2" instances="1">
        <LinkRule dir="upstream">
          <Bandwidth>
            <Speed unit="kbps">100000</Speed>
            <QueueManagement>
              <NormalQueue>
                <Size>100</Size>
                <QueueMode>packet</QueueMode>
                <DropType>DropTail</DropType>
              </NormalQueue>
            </QueueManagement>
          </Bandwidth>
        </LinkRule>
        <LinkRule dir="downstream"/>
      </VirtualLink>
    </VirtualChannel>
  </Emulation>
</NetworkEmulationProfile>
The above setup is needed to use all TFS capabilities except lab management. Lab management introduces the following new components into the system:
1. System Center Virtual Machine Manager Server (VMM)
2. Hyper-V hosts managed by VMM
3. Virtual machines (VMs) that run on the hosts (these are part of the environment)

It doesn't matter where you put the VMs: dev or corp or some other domain or workgroup, all scenarios work fine. Based on where you place the VMM server and Hyper-V hosts, however, some scenarios will not work or will need workarounds. I will describe these two topologies (as T1 and T2) and which scenarios work where.

T1: VMM server in the corp domain; Hyper-V hosts in the corp domain (note: you can join VMs to any domain even if you put the hosts on corp).
T2: VMM server in the corp domain; Hyper-V hosts in the dev domain.
Summary: all components (TFS, VS, MTM, VMM, hosts) in the same domain is the best case. If you split the components across two un-trusted domains, the capabilities will still work (with some workarounds, called out below) if you set up the components as in T2. Some capabilities will not work in T1. So, if the customer wants to use lab management in a multi-domain setup, the recommendation is to use T2, and to be aware of the workarounds/additional steps that need to be done; they are called out below.

Test and Lab Management Capabilities:

1. Testing on local machine
   i. Running manual tests locally from MTM
   ii. Running automated tests locally from VS
2. Testing on virtual environments
   i. Running manual tests from MTM with remote data collection
   ii. Running automated tests on environments
3. Testing on physical environments
   i. Running manual tests from MTM with remote data collection
   ii. Running automated tests on environments
4. Build deployment
   i. Automated build-deploy-test workflow
5. Environment creation and management
   i. Create environment from VM templates
   ii. Start/stop/snapshot environment
   iii. Connect to environment using Environment Viewer
   iv. Clone environments using network isolation
6. Admin operations
   i. Add hosts to the VMM server

(The original table marks each capability under three columns, T1, T2, and all components in the same domain; the per-cell markers X, O*, O**, and O^ did not survive extraction and are not reproduced here.)

Legend: O^: supported out of the box (OOB); O: not supported OOB, but possible with workarounds; X: not supported.

Notes on the workarounds:
* Microsoft Test Manager (in the dev domain) needs to talk to the test controller (in the corp domain) in this case. For this authentication to work, you need to use cached credentials or shadow accounts (the same user name/password on both machines) on the client machine to talk to the test controller.
** Some extra steps are needed for test agent/controller communication to work across domains. Review the "Requirements for Workgroups and Multiple Domains" section here: http://msdn.microsoft.com/en-us/library/dd648127.aspx
^ Adding a host in an un-trusted domain involves extra steps, including a manually installed VMM agent.
These files are standard XML files and can be modified to allow quick and easy re-use of custom counter sets. It is recommended that you copy the counter set you wish to enhance and add the word CUSTOM to its name so you will always remember that it is a custom counter set; or you can create your own totally independent counter set. The following shows the layout of the file:
Notes called out in the original sample:
- The ThresholdRule Classname value must all be on one line; make sure you format it properly when putting it in the final file.
- New to 2010: Range specifies the graph range.
- New to 2010: HigherIsBetter is used for highlighting better or worse results in the Excel reports.
- New to 2010: RangeGroup uses a common range for all counters in that range group.

<?xml version="1.0" encoding="utf-8"?>
<CounterSet Name="Custom" CounterSetType="Custom Set">
  <CounterCategories>
    <CounterCategory Name="Memory">
      <Counters>
        <Counter Name="% Committed Bytes In Use" />
        <Counter Name="Available Mbytes" />
      </Counters>
    </CounterCategory>
    <CounterCategory Name="Processor">
      <Counters>
        <Counter Name="% Processor Time">
          <ThresholdRules>
            <ThresholdRule Classname="Microsoft.VisualStudio.TestTools.WebStress.Rules.ThresholdRuleCompareConstant, Microsoft.VisualStudio.QualityTools.LoadTest">
              <RuleParameters>
                <RuleParameter Name="AlertIfOver" Value="True" />
                <RuleParameter Name="WarningThreshold" Value="80" />
                <RuleParameter Name="CriticalThreshold" Value="95" />
              </RuleParameters>
            </ThresholdRule>
          </ThresholdRules>
        </Counter>
      </Counters>
      <Instances>
        <Instance Name="*" />
      </Instances>
    </CounterCategory>
    <CounterCategory Name="PhysicalDisk">
      <Counters>
        <Counter Name="% Disk Read Time" Range="100" />
        <Counter Name="% Idle Time" Range="100" HigherIsBetter="true">
          <ThresholdRules>
            <ThresholdRule Classname="Microsoft.VisualStudio.TestTools.WebStress.Rules.ThresholdRuleCompareConstant, Microsoft.VisualStudio.QualityTools.LoadTest">
              <RuleParameters>
                <RuleParameter Name="AlertIfOver" Value="False" />
                <RuleParameter Name="WarningThreshold" Value="40" />
                <RuleParameter Name="CriticalThreshold" Value="20" />
              </RuleParameters>
            </ThresholdRule>
          </ThresholdRules>
        </Counter>
        <Counter Name="Avg. Disk Bytes/Read" RangeGroup="DiskBytesRate" />
        <Counter Name="Avg. Disk Bytes/Transfer" RangeGroup="DiskBytesRate" />
        <Counter Name="Avg. Disk Bytes/Write" RangeGroup="DiskBytesRate" />
        <Counter Name="Avg. Disk Queue Length" RangeGroup="Disk Queue Length" />
        <Counter Name="Split IO/Sec" RangeGroup="Disk Transfers sec" />
      </Counters>
      <Instances>
        <Instance Name="*" />
      </Instances>
    </CounterCategory>
  </CounterCategories>
</CounterSet>
The values are in milliseconds, so 9000 is 9 seconds. If you make this change, also change the load test sample rate to be larger than this: at least 10, and preferably 15, seconds. With many agents located far from the controller, it is also recommended to delete most of the categories in the Agent counter set (perhaps leaving just Processor and Memory). The .NET API that is used to read the performance counters is PerformanceCounterCategory.ReadCategory(), so the entire category is read even if the counter set definition only includes one counter and one instance; this is a limitation at the OS level in the way performance counters are read. The defaults in VS are: LoadTestCounterCategoryReadTimeout: 2000 ms (2 seconds); LoadTestCounterCategoryExistsTimeout: 10000 ms (10 seconds).
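For illustration, and assuming these keys are set in the appSettings section of the controller's configuration file like the other load test keys shown in this guide (an assumption to verify against your own config file), raising the read timeout to 9000 ms might look like:

```xml
<appSettings>
  <!-- Key names taken from the text above; values are in milliseconds -->
  <add key="LoadTestCounterCategoryReadTimeout" value="9000"/>
  <add key="LoadTestCounterCategoryExistsTimeout" value="10000"/>
</appSettings>
```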
When you choose this, you will get the following dialog:
--NEW-- Controller and Agent countersets show up in test even after removing them
If you delete the Controller and Agent countersets, they will not be in the load test. HOWEVER, as soon as you manage countersets and change any other counters, VS will put the Controller and Agent counters back in the test:
After the addition, the counters are back in the countersets AND the counterset mappings.
Verifying saved results when a test hangs in the "In Progress" state after the test has finished
If you run a test and either the test duration or the number of iterations needed for completion has been reached, but the test stays in the "In Progress" state for a long time, you can check whether all of the results have been written to the load test results repository by running this SQL query against the LoadTest database:

select LoadTestName, LoadTestRunId, StartTime, EndTime
from LoadTestRun
where LoadTestRunId = (select max(LoadTestRunId) from LoadTestRun);

If the EndTime has a non-NULL value, the controller is done writing results to the load test results database and it should be safe to restart the rig (killing anything if needed). This doesn't necessarily mean that all results from all agents (if the agents got hung) were successfully written to the load test database, but it does mean that there's no point in waiting before killing the agents/tests.
The metrics seen during a test differ from the results loaded afterward.
Scenario 1: when you run load tests and look at the numbers while the tests are running, the values you see may not be the same values you get when you load the completed test results at a later point. This behavior is expected, and is a result of warmup and cooldown settings.
Comparison of a test with and without warmup. Notice the total number of tests run is different, but the recorded times are close enough to be valid for reporting.
Scenario 2: When you compare the summary page results to the detailed results values, there can be a difference in what is reported. This is due to the implementation of collecting the timing details, which are currently flushed when a test iteration ends. For iterations that are in progress with in-flight requests, we give the iteration 10 seconds (configurable via cooldown) to complete any in-flight requests. If they do not complete, the transactions in those iterations are not counted in the details, but are counted in the summary page.
Comparing New Users to Return Users (with respect to caching): new users are simulated by clearing the cache at the start of each new iteration, whereas for return users the cache is carried from iteration to iteration. This results in many more requests being served from cache with return users. NOTE: the total number of requests made by VS is the sum of the two VS values; in other words, Total Requests in the IDE does not include cached requests.
Comparing the same tests using HTML's content expiration setting

Test: TOR 12 - Caching - ReturnUsers - Content Expiration

IIS log entries by type: HTM 270, HTML 264, GIF 85, BMP 3330
IIS log entries by status: 200 OK: 3,874; 304 Not Modified: 75
VS Requests: 3,949
VS Requests Cached: 84,842
Looking at the impact of content expiration on the overall network and web server activity (for more information, see the section "Add an Expires or a Cache-Control Header" at http://developer.yahoo.com/performance/rules.html): notice that VS honors the content expiration (this is actually handled by the underlying System.Net component). However, VS still reports the cached file request, even though no call went out over the wire. This is expected behavior, since the request was part of the site. To see how many requests actually went over the wire, you need to use IIS logs or network traces.
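As an aside, one common way to set a far-future Expires/Cache-Control header for static content on IIS 7 is via web.config. This is a sketch, not part of the original guide; adjust the max age to your own content's needs:

```xml
<!-- IIS 7 web.config sketch: sends a Cache-Control: max-age header of 7 days for static content -->
<system.webServer>
  <staticContent>
    <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
  </staticContent>
</system.webServer>
```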
Notes: all four tests above were run for the same duration with the same number of users executing the same test. Although the numbers do not match exactly, they are close enough to show the behavior of the tests. The discrepancy is due to a few things, including cooldown of the test and the possible misalignment of the query I used to gather data from the IIS logs. The IIS log counts for "200 OK" and "304 Not Modified" were gathered using LogParser and the following query:
SELECT sc-status, COUNT(*) AS Total
FROM *.log
WHERE to_timestamp(date, time) BETWEEN timestamp('2010-02-12 02:13:22', 'yyyy-MM-dd hh:mm:ss')
  AND timestamp('2010-02-12 02:18:22', 'yyyy-MM-dd hh:mm:ss')
GROUP BY sc-status
Data sources for data-driven tests are read only once
When data-driven tests are initialized, the data is read ahead of time, and it is retrieved only once. Therefore, there is no need to optimize the connection to the data source.
Consider enabling SQL Tracing through the Load Test instead of separately
There is a set of properties on the Run Settings in the Load Test Editor that allow the SQL tracing feature of Microsoft SQL Server to be enabled for the duration of the load test. If enabled, this allows SQL trace data to be displayed in the load test analyzer on the "SQL Trace" table available in the Tables dropdown. This is a fairly easy-to-use alternative to starting a separate SQL Profiler session while the load test is running to diagnose SQL performance problems. To enable this feature, the user running the load test (or the controller user in the case of a load test run on a rig) must have the SQL privileges needed to perform SQL tracing, and a directory (usually a share) where the trace file will be written must be specified. At the completion of the load test, the trace file data is imported into the load test repository and associated with the load test that was run so that it can be viewed at any later time using the load test analyzer.
The calculation of the percentile data for transactions is based not on the sampled data shown in the graph, but on the individual timing-details data stored in the LoadTestTransactionDetail table. The calculation is done by a SQL stored procedure that orders the data by the slowest transaction times, uses the SQL "TOP 10 PERCENT" clause to find the slowest 10% of transactions, and then uses the MIN() function on that set of rows to get the value for the 90th percentile time. The stored procedure in the LoadTest database that does this is Prc_UpdateTransactionPercentiles.
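The logic described above can be sketched in T-SQL roughly as follows (the column name is an illustrative assumption; see Prc_UpdateTransactionPercentiles in the LoadTest database for the actual implementation):

```sql
-- Rough sketch of the 90th percentile calculation (ElapsedTime is an assumed column name).
-- Take the slowest 10 percent of transaction times; the fastest row in that
-- set is the 90th percentile value.
SELECT MIN(SlowestTimes.ElapsedTime) AS Percentile90
FROM (SELECT TOP 10 PERCENT ElapsedTime
      FROM LoadTestTransactionDetail
      ORDER BY ElapsedTime DESC) AS SlowestTimes
```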
Page 103
Page 104
How to clean up results data from runs that did not complete
If you have a Load Test Run that abnormally aborts and does not show data in a consistent manner (or does not show up in the list of runs as either completed or aborted), you can use the following query on the SQL repository server to clean up the database:
update LoadTestRun set Outcome='Aborted' where Outcome='InProgress'
The Outcome field holds the value 'InProgress' until the test either completes or is manually aborted. Test results in the database cannot be accessed through the GUI until the Outcome field holds one of the two values 'Completed' or 'Aborted'.
InstanceName field in the results database is appended with (002), (003), etc.
Question: In the LoadTest databases, the instance names are sometimes appended with "(002)", etc. For example, I have a transaction called "Filter Render Request" and in the load test database I have two transactions. Also, I have a URL pointing to RenderWebPartContent and I have several entries. Can someone give me a quick explanation?
Answer: To make a long story short, it is a unique identifier used mostly internally to distinguish between cases where you have the same test name in two different scenarios in the load test, or the same page name (simple file name) in different folders in two different requests.
Page 105
Once in the manager, you choose a controller name from the drop down list (or <local> if you want the results from the local database) and the manager will populate with the tests it finds. You can select whatever test results you wish to move, and then choose "export" to move them into a file (compressed with an extension of .ltrar). That file can be moved to another machine and then imported into a new results store.
Page 106
For most result types, all of the data needed to display the result can be exported into a TRX file. This is not true for load tests. The only thing a TRX file stores for a load test is the connection string to the database with the results and the run ID of the run to load. So if you do not run the load test with the storage type set to database, the exported TRX file is useless: it will contain no usable data for later analysis. So ALWAYS use a database when running load tests.
Notice the call to LoadTest.dbo.LoadTestRun is hardcoded, which is what causes the feature to break. In general, we recommend you use the LoadTest database name (or in the case of 2010, the database is named LoadTest2010).
Page 107
Web test TRX file and the NaN (Not a Number) page time entry
In VS 2008, if a web test TRX file is opened in an XML editor, you may notice a NaN page time for some of the responses.
<Response url="http://teamtestweb1/storecsvs/" contentType="text/html; charset=utf-8" statusLine="HTTP/1.1 200 OK" pageTime="NaN" time="0.006" statusCodeString="200 OK" contentLength="12609">
When and why does this happen? It only happens for non-top-level requests, i.e. redirects and dependent requests. At the end of web test execution, all results (objects and their members) are serialized to a TRX file, including the pageTime. NaN is the result of calling .ToString() on a float or double value that has not been initialized, which means the pageTime was not known at the time this entry was written to the TRX. The following screenshot of the web test result file, opened in the Playback window, shows how this property is set in the code.
The highlighted one is the top-level page. It is redirected, and the redirected-to page has some dependent requests. The 'Total Time' for the top-level page, i.e. the page time, is the time to send all requests and receive all responses (including the redirects and dependents) from the web server. It is only calculated and populated for the primary request, not for the redirected-to page or the dependents. This is why you see NaN page times in the XML file.
Page 108
TRX files are the result files created when you run a unit or web test in Visual Studio. There are two pieces here: the first describes how TRX files are constructed in VS 2008, and the second shows how things changed for VS 2010.
In 2008: If you run a web test outside a load test, the entire web test result is serialized to the TRX file, so each request and response in the test is serialized. If the test runs multiple iterations, the TRX file can get very large. We added optimizations to control the amount of data stored in the TRX for request/response bodies by storing only one copy of each unique response body (in multi-iteration runs you may end up with multiple identical responses). Also, the request and response bodies are compressed to dramatically reduce the amount of space they require in the TRX. A test context snapshot is stored before every request (including dependent requests); a really large VIEWSTATE in a test context can make these snapshots very large. The request/response headers and the test context snapshots are not compressed and duplicates are not eliminated, so they have the potential to become bloated.
In 2010: There is one major change in how the WebTestResultDetails class is persisted upon test completion. Instead of writing the WebTestResultDetails class to a TRX file, VS serializes the object to a *.webtestResult file. The relative path of this file (relative to the path of the corresponding TRX file) is added as an element to the TRX file.
The file only exists on the machine you run the web test from, i.e. the VS / MSTest machine. For a local run, the file goes to \TestResults\prefix_Timestamp\In\TestExecuId. For a remote run, the file goes to \TestResults\prefix_Timestamp\In\Agent\TestExecuId. When you open a web test TRX file from the Test Results window, VS reads the value of WebTestResultFilePath from the TRX file, and then loads the .webtestResult file from TrxDirectory\WebTestResultFilePath into the Web Test Result window.
Note about data collectors and TRX files: If you have data collectors turned on for a unit/web test, the collector data (e.g. the event log) goes to \TestResults\prefix_Timestamp\In\TestExecuId\Agent. For a load test, collector data goes to \TestResults\prefix_Timestamp\In\Agent.
Page 109
Page 110
Page 111
Page 112
After a webtest has been opened with the VS XML editor, it will no longer open in declarative mode.
Applies only to 2010
In the VS IDE, you can right-click on a webtest file and choose "Open in XML Editor". Once you do that and then close the window, the next time you double-click on the webtest to open it, the file should open in the default declarative view. However, in VS 2010 there is a known issue that causes the webtest to always be opened in XML mode. To work around this issue:
1) Open the test project file (e.g. the .csproj).
2) Look for the web test that is opened as XML.
3) Delete the line '<SubType>Designer</SubType>'.
4) Save the test project.
Example of the section needing to be changed:
---------------------------------------------------------------------
<None Include="WebTest1.webtest">
  <CopyToOutputDirectory>Always</CopyToOutputDirectory>
  <SubType>Designer</SubType>
</None>
---------------------------------------------------------------------
Possible DESKTOP HEAP errors when driving command line unit tests
When you run a large number of unit tests that call command-line apps on a test rig (this does not happen when running tests locally), several of the tests may fail due to running out of desktop heap. You need to increase the amount of heap allocated to services and decrease the amount allocated to the interactive user. See the following post for in-depth information, and consider changing the registry as listed below: http://blogs.msdn.com/ntdebugging/archive/2007/01/04/desktop-heap-overview.aspx
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\SubSystems
OLD SETTING: "Windows SharedSection=1024,3072,512"
NEW SETTING: "Windows SharedSection=1024,1024,2460"
Page 113
Goal based load tests in VS 2008 do not work after applying SP1
Changed in 2010
There is a QFE available that fixes the following bugs with goal-based load patterns that were introduced in VS 2008 SP1:
1) If you defined a goal-based load pattern using a performance counter from any of the "LoadTest:*" categories, an error would occur and the user load would not be adjusted according to the goal.
2) If you defined a goal-based load pattern using a "single instance" performance counter (for example Memory\Available MBytes), an error would occur and the user load would not be adjusted according to the goal.
3) If the Machine Name property entered for the goal-based performance counter did not exactly match the casing of the computer name, an error would occur and the user load would not be adjusted according to the goal.
The hotfix can be obtained from: http://support.microsoft.com/kb/957451 This is no longer an issue in VS 2010
Page 114
Page 115
Page 116
When you use the new VS 2010 feature "Save Log on Test Failure", you may get an "Out of disk space" error. Depending on the "Maximum Test Logs" setting and the size of data for each iteration, the logs being saved can be very large (for instance, a webtest that uploads and/or downloads large files).
When a particular request encountered an error in VS 2008 while running a load test (with "Timing Details Storage" set to "All Individual Details"), you could go to the details of the error and see the information specific to that request. This option is no longer available in VS 2010; it has been replaced by the new detailed logging feature that logs the entire web test or unit test result for a failed virtual user iteration.
If you are experiencing the bug, you can work around it by:
1) Generating a coded web test
2) Renaming or deleting the "Test Results" folder
3) Changing the test project's location for the "Test Results" folder
Page 117
--UPDATED-- Socket errors or "Service Unavailable" errors when running a load test
When running a load test, you might receive several errors similar to:
Exception SocketException: Only one usage of each socket address (protocol/network address/port) is normally permitted
HttpError 503 - ServiceUnavailable
These are often due to exhaustion of available connection ports, either on the VS machine(s) or on the machines under test. To see if this could be happening, open a CMD window on your VS machine(s) and on the machine(s) under test, and run the command "netstat -an -p tcp". If you see lots of connections in a TIME_WAIT state, you may be suffering from port exhaustion.
The TIME_WAIT state is a throwback to the old days (more accurately, the default of 4 minutes is the throwback). The idea is that when the client closes a connection, the server puts the socket into a TIME_WAIT state; that way, if the client decides to reconnect, the TCP negotiation does not need to occur again, saving a little time and overhead. The concept made sense when creating a TCP connection was a costly operation on the slow networks of years past. To get around this issue, you need to make more connections available and/or decrease the amount of time that a connection is kept in TIME_WAIT. In the machine's registry, open the following key and either add or modify the values for the two entries shown:
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters]
"TcpTimedWaitDelay"=dword:0000001e (30 seconds)
"MaxUserPort"=dword:0000fffe (port 65,534)
If you are experiencing the issue on one of the VSTT load machines, you may also need to change the load test connection model to "Connection Pooling" and increase the pool size considerably.
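The same registry change can be scripted from an elevated command prompt (a sketch using the values described above; apply it on the machine that is exhausting ports, and note that a reboot is typically required for these TCP parameters to take effect):

```shell
rem Sketch: shorten TIME_WAIT to 30 seconds and raise the ephemeral port ceiling.
rem Run from an elevated prompt; a reboot is typically required afterwards.
reg add "HKLM\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters" /v TcpTimedWaitDelay /t REG_DWORD /d 30 /f
reg add "HKLM\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters" /v MaxUserPort /t REG_DWORD /d 65534 /f
```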
Page 118
Error "Failed to load results from the load test results store"
You may see the error "Unable to cast object of type 'System.DBNull' to type 'System.Byte[]'" when trying to retrieve load test results from the database inside VS. This error occurs if there is a NULL value in the LoadTest column of the LoadTestRun table. To fix it, go to the table and delete the row that has the NULL value. This issue should be extremely rare.
A default Hidden extraction rule was added to the request. When the rule fired, the result was:
$HIDDEN2._ListSchemaVersion_{9fcdfcc2-6d4f-4a22-a379-8224954c1d9a
This is not a bug, but a side effect of how VS processes context parameters.
Test results iteration count may be higher than the max test iterations set
When a test run that defines a specific number of test iterations completes, you may see more tests run than the iteration count set in the run properties. This is rare and is caused by the load test process crashing and restarting. The issue exists in both VS 2008 and VS 2010. The reason is that the restart file used to resume a load test after QTAgent dies was never updated to include information about the tests already completed, so the run always executes the initial number of test iterations after a restart. Resolution: find out what is causing QTAgent to crash and fix that issue.
Page 119
NOTE: an example of this scenario is firing off a batch file that starts a NETCAP.EXE window to gather trace data during the test run. This NETCAP process must run asynchronously so that it does not block the web test, and it must complete on its own or the resulting trace file will not get written. Web tests should not start other processes or perform any blocking operations, as these cause problems with the load test engine. For the NETCAP example, a better solution is to write this as a VS 2010 data collector.
Page 120
Bug with LoadProfile.Copy() method when used in custom goal based load tests
If you create a custom goal-based load test plugin and use the LoadProfile.Copy() method, you will get an error saying: "A LoadTestPlugin attempted to set the 'MinTargetValue' property of the load profile for Scenario Scenario1 of Load Test LoadTest1; this not allowed after the LoadTestLoadProfile has been assigned to the LoadProfile property of the LoadTestScenario." This is due to a regression introduced by hotfix 957451. There is currently no fix, but there is a workaround: create your own copy method and use it to populate the custom LoadProfile. Make sure that you do NOT set the "ScenarioName" value, since this is where the bug lies. The following Copy() method usage will fail:
LoadTestGoalBasedLoadProfile newGoalProfile = _scenario.LoadProfile.Copy() as LoadTestGoalBasedLoadProfile;
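A hand-rolled copy along these lines avoids the bug (a sketch only; the property names shown are standard goal-based profile settings, so copy whichever ones your plugin actually uses, and deliberately leave ScenarioName untouched):

```csharp
using Microsoft.VisualStudio.TestTools.LoadTesting;

public static class ProfileHelper
{
    // Sketch of a manual replacement for LoadProfile.Copy().
    // Adjust the copied properties to the ones your goal profile uses;
    // do NOT copy or set ScenarioName, which is where the bug lies.
    public static LoadTestGoalBasedLoadProfile CopyGoalProfile(
        LoadTestGoalBasedLoadProfile source)
    {
        return new LoadTestGoalBasedLoadProfile
        {
            InitialUserCount = source.InitialUserCount,
            MinUserCount     = source.MinUserCount,
            MaxUserCount     = source.MaxUserCount,
            MinTargetValue   = source.MinTargetValue,
            MaxTargetValue   = source.MaxTargetValue,
            MachineName      = source.MachineName,
            CategoryName     = source.CategoryName,
            CounterName      = source.CounterName,
            InstanceName     = source.InstanceName
        };
    }
}
```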
Page 121
Errors in dependent requests in a Load Test do not show up in the details test log
The new detailed test logging feature will not let you see the details of errors that occur inside dependent requests during a load test (such as AJAX or JSON requests). The problem is that if a dependent request has an error, the test is flagged as failed and the log for that iteration is stored, but the log does not contain any details for dependent requests, so you get no details about why the failure occurred. To work around this issue, promote any dependent requests that are having problems to main requests, at least during the test debugging phase.
Page 122
The errors table in the results shows the exception count and allows you to drill into the details. The picture below shows how to display the full details log for this failed iteration.
Here you see the details log. It shows that there is a failure, but the request details do not show where the error occurred, nor can you get any details about the error.
Page 123
Page 124
Content-Length=0 Header not sent resulting in HTTP 411 Length Required Error
You may run into an issue where a web request fails with an HTTP 411 Length Required response. This happens on a POST request with no body. It will not always occur, since some web servers ignore the missing header; however, RFC 2616 specifies that even with a content length of zero, the header should still be sent (http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html). Visual Studio uses its own header collection class to allow for a single collection per request, which makes the code more efficient. The internal method used to build this collection first removes all headers that are restricted by the System.Net.HttpWebRequest class (http://msdn.microsoft.com/en-us/library/system.net.httpwebrequest.headers.aspx) and then adds back the appropriate headers. However, the internal code does not add a Content-Length header when the length is zero. Also, VS does not allow you to directly set any headers that are controlled by the system (such as Content-Type and Content-Length). To work around this issue, add a dummy body to your request. Here is an example:
WebTestRequest request1 = new WebTestRequest("http://localhost/");
request1.Method = "POST";
request1.Encoding = System.Text.Encoding.GetEncoding("utf-8");
request1.Headers.Add(new WebTestRequestHeader("Content-Length", "0"));
StringHttpBody request1Body = new StringHttpBody();
request1Body.ContentType = "";
request1Body.InsertByteOrderMark = false;
request1Body.BodyString = "";
request1.Body = request1Body;
yield return request1;
request1 = null;
Page 125
Error that a test could not run because network emulation is required
Applies only to 2010
You may receive the following error when trying to start a load test:
Warning 5/25/2010 4:58:53 PM Could not run load test 'LoadTest1' on agent 'PRITAMB1': Network emulation is required for load test 'LoadTest1' but the driver is not installed on agent PRITAMB1.
This is most likely caused by the network emulation drivers failing to install during VS setup. There are two methods you can try to resolve this issue:
1) Re-configure the agent using the GUI inside Visual Studio.
2) Follow these steps:
a. Launch VS with administrator privileges and create a default C# test project.
b. Open Local.testsettings in Solution Explorer.
c. Select "Data and Diagnostics", enable "Network Emulation" and press "Configure".
d. In the Network Emulation Detail dialog, select a network profile other than "LAN", such as "3G", then press OK.
e. The network emulation driver will be installed after a short network disconnection.
NOTE: If you install only VS and not the remote agent, the network emulation driver is not installed. You must run the command "VSTestConfig NETWORKEMULATION /install" from an elevated VS command prompt. This installs the driver so that you can use it from VS.
Page 126
Request failure with improperly encoded query strings calling SharePoint 2010
Applies only to 2010
When testing a site built on SharePoint 2010, requests may fail. When run in a 2010 web test, the query string is not encoded at all and the request fails:
1. POST to /global/pages/search.aspx
   a. Response: HTTP 302 with location header: /global/pages/Search.aspx?k=ALL(Developer OR Support Engineer)AND(prname="Engineering" OR prname="ITOperations")AND(lvl=59 OR lvl=60 OR lvl=61 OR lvl=62)
2. GET to /global/pages/Search.aspx?k=ALL(Developer OR SUPPORT ENGINEER)AND(PRNAME="ENGINEERING" OR PRNAME="ITOPERATIONS")AND(LVL=59 OR LVL=60 OR LVL=61 OR LVL=62) HTTP/1.1
   a. Response: HTTP 400 Bad Request
   b. Fiddler only shows the request as /global/pages/Search.aspx?k=ALL(Developer
   c. VS is set to follow redirects on the initial POST, so this request was automatic
Resolution: Visual Studio now has a property on requests called EncodeRedirectedUrl. Set it to true and the request should work as expected. This property is not available in the UI, so you need either a plugin or a coded test to set it.
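As a plugin, that could look roughly like the following sketch (EncodeRedirectedUrl is the request property named above; the plugin class name is arbitrary):

```csharp
using Microsoft.VisualStudio.TestTools.WebTesting;

// Sketch: turn on redirect-URL encoding for every request via a web test plugin.
public class EncodeRedirectedUrlPlugin : WebTestPlugin
{
    public override void PreRequest(object sender, PreRequestEventArgs e)
    {
        // EncodeRedirectedUrl is not exposed in the web test editor UI,
        // so it has to be set in code like this (or in a coded web test).
        e.Request.EncodeRedirectedUrl = true;
    }
}
```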
Network Emulation does not work in any mode other than LAN
Applies only to 2010
Say you have two NICs and two IP addresses assigned on the agent machine: one used to communicate with the controller (intranet) and the other to communicate with the external web site (extranet). The problem is that when network emulation is enabled (DSL, etc.), the load test code picks the wrong IP address and sets it as the source IP address, so all requests fail with a "501 Network Unreachable" socket exception.
As you may already know, for network emulation the load test has to specify a port number from the port range set for the network type being emulated. Unfortunately, it also has to specify a source IP address in the .NET call (HttpWebRequest.ServicePoint.BindIPEndPointDelegate), and it assumes the first IP address returned by System.Net.Dns.GetHostAddresses is the correct one. In this case, the intranet IP address comes back first, so HTTP requests end up bound to it. The solution that worked is to enable IP switching and specify an IP address range consisting of the single IP address that works. (To set this, open Test > Manage Test Controllers in VS, click Properties for the agent machine, and fill in the appropriate fields.) This lets the load test use the correct IP address when communicating with the web site.
Error that Browser Extensions are disabled when recording a web test
You might see the following error when trying to record a web test:
To fix this, go to "Tools" -> "Internet Options" and set the following:
Page 128
Error: Request failed: No connection could be made because the target machine actively refused it
A proxy and Forefront (anti-virus) are creating issues here. There are lots of 504 gateway errors and errors caused by Forefront, e.g.:
The number of HTTP requests per minute exceeded the configured limit. Contact your Forefront TMG administrator
Here is my sample test that writes the value of maximum connections to a file
[TestMethod]
public void TestMethod1()
{
    File.WriteAllText("c:\\out.txt",
        "The current connection limit is " +
        System.Net.ServicePointManager.DefaultConnectionLimit.ToString());
}
When run as a single-user test, I see the following output: "The current connection limit is 1500". When run in a load test with 1 iteration, I see: "The current connection limit is 100". The load test code sets DefaultConnectionLimit to 100 (otherwise it defaults to something very low), so the load test code overrides the config setting. If you add a line of code anywhere in your unit test (such as the TestInitialize or ClassInitialize method) to set DefaultConnectionLimit explicitly, it will override the load test setting, because the load test engine sets this value before running any unit test code.
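A minimal sketch of the ClassInitialize approach described above (the limit value of 1500 is illustrative, not a recommendation):

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class MyLoadDrivenTests
{
    // Runs once before any test in the class. Because the load test engine
    // sets DefaultConnectionLimit to 100 before running unit test code,
    // setting it here explicitly overrides the engine's value.
    [ClassInitialize]
    public static void ClassInit(TestContext context)
    {
        System.Net.ServicePointManager.DefaultConnectionLimit = 1500; // illustrative value
    }
}
```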
Page 129
--NEW-- VS does not expose a method for removing specific cookies from requests
The cookie collection that Visual Studio sends with requests is dynamically built based on the cookies sent back from the server. There is a method that allows you to add a new cookie to the collection, but there is no exposed way to delete an existing cookie. Further, the collection itself is marked as read-only, so if you create a new cookie collection using System.Net and then assign it to the request's cookie collection, the request will fail to execute.
--NEW-- "File Upload" feature in VS does not allow you to use a stream to send a file
If you want to upload a file as a form post item, Visual Studio 2010 provides a form post item to do this easily, and you can also create a plugin to do the same thing. However, neither method provides a way to get the file contents from a stream; the ONLY way to supply the file is by reading a static file on disk. This is not a limitation of the testing functionality of Visual Studio, but a limitation of the "File" class in the .NET Framework.
--NEW--Unit tests that consume assemblies requiring MTA will fail with default test settings
If you create a unit test that consumes an assembly that runs in an MTA, the test will most likely hang, or fail with an error similar to:
Test method SimpleLoadTest.SyncFullJobTest.Test_SyncFullJobTest_WithNoError threw exception: System.InvalidCastException: Unable to cast COM object of type 'System.__ComObject' to interface type 'ISyncKnowledge2'. This operation failed because the QueryInterface call on the COM component for the interface with IID '{ED0ADDC0-XXXX-XXXX-XXXX-45661D2XXXXX}' failed due to the following error: No such interface supported (Exception from HRESULT: 0x80004002 (E_NOINTERFACE))..
A good example of this is using the SQL Server Sync Framework in a unit test. The failure occurs because Visual Studio testing defaults to STA for test execution. To fix this, you must manually edit the ".testsettings" file (".testrunconfig" in 2008) used to execute the test(s). Add the line:
<ExecutionThread apartmentState="1" />
--NEW--MSTest tests that consume assemblies requiring MTA will fail with default test settings
This is the same issue as the previous article; more information can be found at this link: http://blogs.msdn.com/b/ploeh/archive/2007/10/21/running-mstest-in-an-mta.aspx
Page 130
--NEW--Load Test Agent Error "Failed to open the Visual Studio v10.0 registry key"
PROBLEM: I have a machine with VS Ultimate 2010 and the VS 2010 controller installed, and all appears to be good. On the next machine, I try to install the agent. The install program runs fine and completes, then launches the configuration wizard. No matter what I choose, "Apply Settings" fails with:
Further, the service does not show up in the list of installed services. RESOLUTION: When installing the agent software, there are two components: 1) the agent software, and 2) the VS performance tools (for development debugging/profiling, etc.). The error occurs if you install only the agent software and not the performance tools. The default install DOES install both items.
Then you can use attributes in the test settings to filter agents for your test run; see http://msdn.microsoft.com/en-us/library/ff426235.aspx (step 16) for details. You will have to use two different test settings for your two load test runs so that each run happens on a different set of agents. This will definitely work.
Page 131
--NEW--BUG: Microsoft.VisualStudio.TestTools.WebStress.LoadTestResultsCollector
PROBLEM: I'm running a goal-based load test. The goal is set to use a custom performance counter that records a number of operations/sec from the internals of my customer's code. I set up the load test using the correct category name, counter name, a blank instance name, and the appropriate machine name. The issue is that the counter is defined only on a remote machine. When I ran the load test, I received the following error:
LoadTestGoalCounterNotFoundException A goal based load pattern can not be applied because it specifies a performance counter (UKGC08\THGWebTestHarness: Availability Searches\Operations Executed / sec\) that could not be read: Category does not exist.
The specific counter my customer is using does not exist on the client machine (where Visual Studio is installed), nor does it exist on the controller. My workaround for the present is to install the counters on the client machine and the controller, which gets me past the exception so I can test their code without issue. This is a known issue and is being investigated by the product team.
Page 132
Page 133
Page 134
The problem is that the redirects can take up to 30-40 seconds for some reason, causing the first request to take over one minute to complete. It succeeds, but this timeframe is too long for a valid test. RESOLUTION: Force a pre-authentication to the proxy server in the webtest: http://blogs.msdn.com/b/rogeorge/archive/2009/06/23/how-to-authenticate-to-a-proxy-server-within-a-visual-studio-webtest-using-visual-studio-2008-sp1.aspx
--NEW--Data bound validation rule fails when set at the TEST level
If you create a validation rule at the webtest level and then bind a context parameter to it, the rule will not resolve the context parameter and will therefore fail. If, however, you move the exact same rule to the request level, it works.
Page 135
RESOLUTION:
Either change the
Page 136
RESOLUTION: This is a known issue with the Dev10 RTM build and is fixed in SP1. As a workaround, in case you don't want to move to SP1, you can reopen the result from the database using "Open and Manage Results" in the Load Test editor.
Page 137
A trace file was in fact successfully created, with a file size of 102400 KB, and on opening it in SQL Profiler I could see the trace. What I think happened is that the maximum file size was reached and tracing stopped. A maximum file size can be set up together with a rollover option when creating a new trace, so I assume Visual Studio sets the maximum file size to 102400 KB with no rollover enabled. Is there any way of configuring the maximum file size and rollover for the SQL trace integrated into load tests? Otherwise, the SQL trace integration is not going to work for real-world testing.
RESOLUTION: Currently there is no resolution. It has been added to the feature request list for the product team; there is no estimate of when it will be addressed.
--NEW--LoadTestCounterCategoryNotFoundException
PROBLEM: Hi, thanks for the replies. I've checked the list you sent and things seem to be OK; however, I'm unable to access the perf counters from the controller. For example, if I try to add perf counters on that Win7 box, I get:
Page 138
RESOLUTION: Hi, I've found the answer in this article: http://blogs.msdn.com/b/edglas/archive/2008/11/19/reading-performance-counters-on-vista-and-server-2008-machines.aspx. Basically, my issue was that the Remote Registry service is set to manual in Win7 by default (it is set to automatic in WinXP), so I enabled the service and things started working (my test agent is a local admin, so none of the other security settings were needed in my case).
Page 139
With each release of VS we have made major strides in web test authoring and debugging. In VS 2008, we added a number of features to address the most common challenges with web test authoring, the most important being a low-level HTTP recorder and an automatic correlation tool. This covered the most prevalent challenges outlined in Web Test Authoring and Debugging Techniques. Again with VS 2010 we have made major strides in web test authoring and debugging:
1) More HTTP recordings "just work".
2) New tools help you debug and fix the ones that don't.
3) New extensibility points for the recorder, editor, and results viewer let you, us, and our community release rich functionality "out of band" to handle custom applications and rich data types.
A New Name, But Under the Covers Still the Same
In this release we renamed "Web Test" to "Web Performance Test" to highlight the primary scenario for web tests: using them as scripts in a load test to model user actions. Load tests are used to drive load against a server and then measure server response times and errors. Because we want to generate high loads with a relatively small amount of hardware, we chose to drive web performance tests at the protocol layer rather than instantiating a browser. While web performance tests can be used as functional tests, this is not their primary focus (see my post Are Web Tests Functional Tests?). You will see that I still refer to "Web Performance Tests" as "Web Tests" for short. If you really want to test the user experience from the browser, use a Coded UI test to drive the browser. To be successful working with web performance tests, it is important to understand the fundamentals of how they work.
Web Performance Tests Work at the HTTP Layer
The most common source of confusion is that users do not realize web performance tests work at the HTTP layer. The tool adds to that misconception.
After all, you record in IE, and when running a Web test you can select which browser to use, and then the result viewer shows the results in a browser window. So that means the tests run through the browser, right? NO! The Web test engine works at the HTTP layer, and does not instantiate a browser. What does that mean? In the diagram below, you can see there are no browsers running when the engine is sending and receiving requests:
Page 140
Page 141
What Does This Mean for You?
This design has fundamental and far-reaching impact if you are working with Web tests. It is critical that you understand this if you are going to be successful authoring and debugging Web tests. This escapes even customers who have worked extensively with Web tests, and is a major source of confusion. The Web test engine:
1) Sends and receives data at the HTTP layer.
2) Does NOT run a browser.
3) Does NOT run JavaScript.
4) Does NOT host ActiveX controls or plugins.
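To make the protocol-level model concrete, here is a minimal coded Web performance test sketch. The WebTesting API types are real, but the URLs, parameter names, and values are invented for illustration: the engine simply enumerates WebTestRequest objects and issues them over HTTP, with no browser anywhere in the loop.

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

// A minimal coded Web performance test. The engine enumerates these
// WebTestRequest objects and issues them directly over HTTP -- no browser,
// no JavaScript engine, no plugins.
public class ProtocolLevelTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        // Placeholder URL -- substitute the site under test.
        WebTestRequest home = new WebTestRequest("http://localhost/app/default.aspx");
        yield return home;

        // A form post is just another HTTP request with a body.
        WebTestRequest login = new WebTestRequest("http://localhost/app/login.aspx");
        login.Method = "POST";
        FormPostHttpBody body = new FormPostHttpBody();
        body.FormPostParameters.Add("UserName", "user1");
        body.FormPostParameters.Add("Password", "pass@word1");
        login.Body = body;
        yield return login;
    }
}
```

A declarative .webtest recording is driven by the same engine, which is why anything a browser would normally do (running script, rendering, hosting plugins) simply does not happen.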
OK, so Web tests work at the HTTP layer. What about requests sent and received by JavaScript and/or browser plugins? The best example of JavaScript generating HTTP traffic is AJAX calls. The most common examples of browser plugins are Silverlight and Flash. The Web test recorder will record HTTP traffic from AJAX calls and from most (but not all) browser plugins.

Web Tests Can Succeed Even Though It Appears They Failed
A common source of confusion comes from the browser preview in the Web test results viewer. This browser control does not run JavaScript or host plugins, which is by design since the engine does not do this either, and for security reasons. A common technique in pages requiring JavaScript is to sense this and display an alternate page when the browser is not running JavaScript, such as "JavaScript is required on this page":
This page looks like it failed, when in fact it succeeded! Looking closely at the response, and at subsequent requests, it is clear the operation succeeded. As stated above, the browser control displays this message because JavaScript is disabled in the control. Another variant of this is plugins, such as this page that is using Silverlight:
Page 142
Again, it looks like the page failed, when in fact at the HTTP layer it succeeded.

A Common Challenge: Dynamic Parameters
One of the major challenges of working at the HTTP layer is "dynamic parameters". A dynamic parameter is a parameter whose value changes each time it is run. The most common case is a session parameter, such as a login session ID. Each time a user logs in to a site, he is given a new login session ID. To simulate this user's actions, the test cannot simply replay the recorded session ID; it must replay the new session ID. Web tests handle most patterns of dynamic parameters automatically, but there are still some patterns they do not handle automatically.

Huge Strides Forward with VS 2010
With ever more complicated applications being built on HTTP, it is getting harder and harder to develop scripts at the HTTP layer. With VS 2010, we have again made tremendous strides across the tool, in recording, editing, and debugging, to help you be successful doing this. Some of the high-level features are:
1) Finding and fixing dynamic parameters
2) An extensibility point in the recorder so that record/playback "just works" for any app (effectively enabling you to automate #1)
3) Extensibility points for editing and viewing results of rich data types
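To illustrate the dynamic parameter problem described above (the ReportSession parameter name matches the sample later in this section; the value is invented): a recorded request replays a stale, hard-coded session value, while the fixed-up request binds it to a context parameter that an extraction rule fills in on each run:

```text
Recorded (fails on replay - the session ID is stale):
  GET /Reports/render.aspx?ReportSession=k1jb2mi3vtyhq145w2o2l445

Fixed up (bound to a context parameter populated by an extraction rule):
  GET /Reports/render.aspx?ReportSession={{ReportSession}}
```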
Page 143
We have made a number of other improvements as well, most notably:

Editor Improvements
1) Support for looping and branching in Web tests
2) Request details editor
3) Create meaningful reports using the reporting name
4) Page goal validation rule
Recorder Improvements
1) Record/playback of file upload "just works"
2) Record all dependents by default
3) New recorder options in Tools Options
4) Improvements in hidden field and dynamic parameter correlation
Debugging a Web Test to Find and Fix Dynamic Parameters
The VS recorder automatically handles most classes of dynamic parameters: cookies, query string and form post parameter values, and hidden fields. In VS 2010 we have made incremental improvements in each of these. Yet there are still a few dynamic parameter patterns that cannot be deterministically detected and fixed up. Our goal with this release was to build tooling around the flow for debugging a Web test, mostly to help find and fix dynamic parameters. This flow is described in Sean's seminal post, How to Debug a Web Test. The flow is this:
1) Record a Web test and play it back. Playback fails.
2) Look at the form post and query string parameters in a failed request and determine whether any look dynamic.
3) Go to the Web test to determine whether they are bound to an extracted value.
4) If not, search through the log to find where the parameter is set. Or better yet, search through the recording log for the unique value to find where it is getting set.
5) Add an extraction rule and bind the parameter value to the extracted value.
In VS 2010, you'll find commands in Web test playback and the editor that seamlessly support this flow:
1) A new recorder log that enables you to see the HTTP traffic that was generated from IE. This is a huge new feature, critical for debugging tests. You can jump from a request, request parameter, or response in playback to the same context in the recording log to compare them.
2) Search in playback, and search and replace in the Web test editor. These features are super-important for quickly finding and fixing dynamic parameters.
3) Jump from a request in playback to that same request in the editor. This greatly increases the efficiency of the workflow.
4) Create an extraction rule directly from playback, automatically setting the correct parameters on the extraction rule. Again, this increases efficiency.
Together, these features really grease the debugging and fix-up workflow, and will make you much more efficient when working with Web tests. A quick overview of the flow: from the Web test results viewer, select a parameter from a failed request that looks like a dynamic parameter. Right-click the parameter on the Request tab to jump to the editor to confirm it was not bound.
In the editor, you can see this value is not bound to a context parameter:
Now go back to the results viewer. At this point, you want to find the dynamic value in the response of one of the previous requests, since the dynamic parameter value had to have come from a response body (that's how HTTP and browsers work). To do this, go to the recorder log; the reason you want to do this from the recorder log is that the recording will have the original recorded value in it. Click the recorder log icon (we really should have put this on the context menu too!).
This will take you to the same request with the same parameter selected. Now right-click the parameter and do a quick find to locate the parameter value in a previous response. Again, you want to do this from the recording log: since the parameter is dynamic, the value will be in the recording log but not the playback log.
Page 145
Search up in the response bodies to find the value. Note that if the dynamic string was constructed in JavaScript, you may need to search for only the dynamic part of the value:
Once the extraction rule is added, you also need to bind the parameter values. Choose Yes in the message box to launch search and replace from the Web test editor.
Page 146
You can see that we have added tooling to make finding and fixing dynamic parameters much easier in VS 2010!

Engineering the Solution
To engineer this solution, we made several important design changes to Web tests and Web test results:
1) We changed the persistence mechanism for Web test results to store results in a separate log file rather than in the .trx file.
2) We created a full public API for the Web test result.
3) We stamp request IDs in each HTTP request (this enables jumping between playback and the editor).
4) The recorder generates a Web test result file and saves it as part of the recording.

About the Web Performance Test Recorder Log
The recorder log is a file stored in the same directory the Web test is recorded into. You can get to the recorder log from the Web test results viewer as shown above. Or you can open it from VS: browse to the Web test folder and look for *.webtestresult files in your project folder. The name of the recorded result file is stored in the RecordedResultFile attribute in the Web test XML file. This file is not added to the project by default; if you wish to share it with team members, consider adding it to the solution so it will get checked in to source control. The recorder log is persisted in the same file format as a Web test result. There is a full API over this data (see the WebTestResult and WebTestResultDetails classes).

Adding Your Own Recorder Plugins to Make Record/Playback "Just Work"
Once you have found and fixed up the dynamic parameters in a test, consider writing a recorder plugin to do this for you automatically each time you record a new test for this web site. Recorder plugins are a new, super-powerful capability of the VS 2010 recorder. They are an extensibility hook that gives you full access to the recorded result and the recorded Web test, and lets you move seamlessly from a recorded request to the corresponding request in the Web test.
This enables you to make any modifications you see fit to the generated Web test. This is in effect a "catch-all", the ultimate power and productivity tool in your hands to save time fixing up Web tests.
Page 147
I really can't emphasize enough what a powerful solution this is. If you will be scripting a web site for any length of time, and it requires you to fix up the recordings, it will be worthwhile to invest in building a recorder plugin for it.
Recorder plugins can be used for any number of purposes: fixing up dynamic parameters (adding extraction rules and bindings), automatically adding validation rules, automatically adding data sources and data bindings, filtering out recorded dependents, etc. Recorder plugins are pretty straightforward to code up and install. They derive from the WebTestRecorderPlugin class. Once you have implemented a plugin, drop the assembly into either of these directories and restart VS:
%ProgramFiles%\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies\WebTestPlugins
%USERPROFILE%\My Documents\Visual Studio 10\WebTestPlugins
Page 148
Here's sample recorder plugin code that adds an extraction rule and binds query string parameters to the extracted value.
using System;
using System.Collections.Generic;
using System.Text;
using System.ComponentModel;
using Microsoft.VisualStudio.TestTools.WebTesting;
using Microsoft.VisualStudio.TestTools.WebTesting.Rules;
using System.Diagnostics;

namespace RecorderPlugins
{
    [DisplayName("Correlate ReportSession")]
    [Description("Adds extraction rule for Report Session and binds this to querystring parameters that use ReportSession")]
    public class CorrelateSessionId : WebTestRecorderPlugin
    {
        public override void PostWebTestRecording(object sender, PostWebTestRecordingEventArgs e)
        {
            // Loop through the responses in the recording, looking for the session Id.
            bool foundId = false;
            foreach (WebTestResultUnit unit in e.RecordedWebTestResult.Children)
            {
                WebTestResultPage recordedWebTestResultPage = unit as WebTestResultPage;
                if (recordedWebTestResultPage == null)
                {
                    continue;
                }

                // If we haven't found the session Id yet, look for it in this response.
                if (!foundId)
                {
                    // Look for the "ReportSession" string in the response body of a recorded request
                    int indexOfReportSession = recordedWebTestResultPage.RequestResult.Response.BodyString.IndexOf("ReportSession");
                    if (indexOfReportSession > -1)
                    {
                        // Find the corresponding page in the test; this is the page we want to add an extraction rule to
                        WebTestRequest requestInWebTest = e.RecordedWebTest.GetItem(recordedWebTestResultPage.DeclarativeWebTestItemId) as WebTestRequest;
                        Debug.Assert(requestInWebTest != null);
                        if (requestInWebTest != null)
                        {
                            foundId = true;
                            string startsWith = "?ReportSession=";
                            string endsWith = "&";
                            string contextParamName = "ReportSession";
                            AddExtractTextRule(requestInWebTest, startsWith, endsWith, contextParamName);
                            e.RecordedWebTestModified = true;
                        }
                    }
                }
                else
                {
                    // Once we have extracted the session id, bind any session id parameters to the context parameter.
                    // This call gets the corresponding request in the web test.
                    WebTestRequest requestInWebTest = e.RecordedWebTest.GetItem(recordedWebTestResultPage.DeclarativeWebTestItemId) as WebTestRequest;
                    Debug.Assert(requestInWebTest != null);
                    if (requestInWebTest != null)
                    {
                        BindQueryStringParameter(requestInWebTest, "SessionId", "SessionId");
                    }
                }
            }
        }

        /// <summary>
        /// Code to add an ExtractText rule to the request.
        /// </summary>
        private static void AddExtractTextRule(WebTestRequest request, string startsWith, string endsWith, string contextParameterName)
        {
            // Add an extraction rule to this request in the declarative Web test
            ExtractionRuleReference ruleReference = new ExtractionRuleReference();
            ruleReference.Type = typeof(ExtractText);
            ruleReference.ContextParameterName = contextParameterName;
            ruleReference.Properties.Add(new PluginOrRuleProperty("EndsWith", endsWith));
            ruleReference.Properties.Add(new PluginOrRuleProperty("StartsWith", startsWith));
            ruleReference.Properties.Add(new PluginOrRuleProperty("HtmlDecode", "True"));
            ruleReference.Properties.Add(new PluginOrRuleProperty("IgnoreCase", "True"));
            ruleReference.Properties.Add(new PluginOrRuleProperty("Index", "0"));
            ruleReference.Properties.Add(new PluginOrRuleProperty("Required", "False"));
            ruleReference.Properties.Add(new PluginOrRuleProperty("UseRegularExpression", "True"));

            request.ExtractionRuleReferences.Add(ruleReference);
        }

        public static void BindQueryStringParameter(WebTestRequest requestInWebTest, string queryStringParameterName, string contextParameterName)
        {
            // This code binds the SessionId parameter to the context parameter
            foreach (QueryStringParameter param in requestInWebTest.QueryStringParameters)
            {
                if (param.Name.Equals(queryStringParameterName))
                {
                    param.Value = "{{" + contextParameterName + "}}";
                }
            }
        }
    }
}
More Recorder Enhancements in VS 2010
In addition to lighting up these powerful new scenarios, the VS 2010 recorder does what it did in VS 2008, only better.

No More Empty Recorder Pane
With VS 2008, there were several cases in which the recorder would not record requests. Most of these involved the IE 7 and IE 8 process model, where these browsers start new processes when crossing security contexts (thus the need to run VS as Admin). These problems have been fixed in VS 2010, as the recorder now records across IE process boundaries.

More Apps "Just Work"
There were a few cases in which hidden field correlation and dynamic parameter detection did not work with VS 2008. You let us know about those cases, and we have improved the hidden field correlation and dynamic parameter detection tools to handle them in VS 2010. These were mostly around dynamic parameters in AJAX requests. Binary post bodies are now also handled correctly, which was not always the case with VS 2008. The recorder now also automatically handles file uploads so they "just work": uploaded files are automatically added to the project, and the file upload file name is dynamically generated to enable you to upload the same file under different names automatically.
Page 151
New Tools Options for the Recorder You asked for more control over the recorder, you got it with new recorder options in Tools Options:
Web Test Editor Enhancements in VS 2010 One of our goals with VS 2010 was to enable you to stay in "declarative" Web tests for more use cases without having to move to a coded Web test. One reason you had to go to code with VS 2005 or VS 2008 was to do looping or conditional execution in your Web test. Looping and Branching The VS 2010 declarative editor now supports powerful new looping and branching constructs. Looping and branching are based on conditional rules, which follow the exact same extensibility model as validation rules and extraction rules. There are many rules "in the box":
Page 152
You can see above that there are a lot of flexible rules already built in. A few scenarios this enables:
1) Conditional logins. In a load test, if you want to simulate a user logging in once and then doing many operations in the test, this can be accomplished easily with a conditional rule. Session IDs are typically handled by cookies, and you can easily set up a rule to only go to the login pages if the login has not happened yet.
2) Variability in your scripts. If you want users to occasionally skip steps in a script, or randomly repeat some steps, this is easily achieved with the probability rule, which will only execute some requests based on the probability you specify.
3) Loop until some operation succeeds. If an operation is expected to fail for some users, but will succeed on retry, and you need to model the retry, you can do this by looping while the operation is not successful. To do this, use an extraction rule to indicate whether the action was successful, then use the Context Parameter Exists rule to loop until it is successful.
You can debug your loops and conditions using the results viewer, which shows the results of conditional evaluations.
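When none of the built-in rules fits, conditional rules use the same extensibility model as validation and extraction rules. The sketch below is illustrative only: the ConditionalRule base class and ConditionalEventArgs come from the WebTesting API, but the rule name, properties, and logic are invented, so check the API documentation before relying on it.

```csharp
using System.ComponentModel;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Illustrative custom conditional rule (hypothetical): the branch or loop
// executes only while a numeric context parameter stays below a limit.
[DisplayName("Counter Below Limit")]
public class CounterBelowLimit : ConditionalRule
{
    // Both properties are invented for this sketch; they surface in the
    // rule's property grid in the Web test editor.
    public string ContextParameterName { get; set; }
    public int Limit { get; set; }

    public override void CheckCondition(object sender, ConditionalEventArgs e)
    {
        object value;
        if (e.WebTest.Context.TryGetValue(ContextParameterName, out value))
        {
            int current;
            // The condition is met while the counter is below the limit.
            e.IsMet = int.TryParse(value.ToString(), out current) && current < Limit;
        }
        else
        {
            e.IsMet = false;
        }
    }
}
```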
Page 153
A word of caution: do not use loops to execute many, many iterations within a given test. This will "gum up" the load test engine, since its function is to control virtual user execution. Also, an entire Web test result is stored in memory, including all the loops, so running a Web test with many loops can run your machine out of memory. You can still run such tests in a load test to avoid the memory issue, but for the reason stated above we recommend against it.
Page 154
More Editor Features I already talked about search and replace in the editor above. There is also a super-handy new Request Details editor that enables you to quickly see and edit the think times, reporting name, and goal for each page. You should use this view each time you record a new test.
Use the reporting name and response time goals to really light up your Excel load test reports, as both are propagated to the reports. Setting a response time goal will also help you find slow requests in a load test: by default, a new Response Time Goal validation rule is added to the test. This rule fails pages that exceed the goal by the threshold you specify (by default the tolerance is 0). Causing slow requests to fail enables you to collect logs on the failures, which may help you determine why the page is slow.

New Extensibility Points to Handle Rich Data
One area we did not address in VS 2010 is better editor and results viewer handling of rich data types. If you have AJAX code sending and receiving Web services, REST, or JSON requests, you know how difficult these are to work with. Like other releases, our mantra was that if we couldn't get it in the box, we wanted to expose extensibility points to enable us and the community to add tooling in this area. To this end, we have enabled two extensibility points that will enable us to address this out of band:
1) Web test editor request body editor plugins.
2) New tabs and menu items in the Web test results viewer.
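As an aside on extensibility: the Response Time Goal rule mentioned above is just a validation rule, and validation rules are an extensibility point you can use yourself. A hedged sketch of a custom rule follows; the ValidationRule base class and ValidationEventArgs are real WebTesting API types, but the rule name, property, and message text are invented for illustration.

```csharp
using System.ComponentModel;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Illustrative custom validation rule (hypothetical): fail any request slower
// than a configurable threshold in milliseconds.
[DisplayName("Response Time Under Threshold")]
public class ResponseTimeUnderThreshold : ValidationRule
{
    // Invented property name; shown in the rule's property grid.
    public int ThresholdMilliseconds { get; set; }

    public override void Validate(object sender, ValidationEventArgs e)
    {
        int elapsed = e.Response.Statistics.MillisecondsToLastByte;
        e.IsValid = elapsed <= ThresholdMilliseconds;
        if (!e.IsValid)
        {
            e.Message = string.Format(
                "Response took {0} ms, exceeding the {1} ms threshold.",
                elapsed, ThresholdMilliseconds);
        }
    }
}
```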
Page 155
We plan to release new editor and playback plugins around the time we RTM, so keep an eye on codeplex.com\teamtestplugins for new releases.

Web Test Editor Request Body Plugins
Web test request body plugins provide a way to plug a custom editor into VS for editing form post bodies. These plugins implement either IStringHttpBodyEditorPlugin or IBinaryHttpBodyEditorPlugin, and enable you to customize the edit pane for different post body content. The IStringHttpBodyEditorPlugin interface is super-simple:
public interface IStringHttpBodyEditorPlugin
{
    object CreateEditor(string contentType, string initialValue);
    string GetNewValue();
    bool SupportsContentType(string contentType);
}
Basically, SupportsContentType allows you to specify which content types your editor supports. When the editor encounters a particular content type, it scans the list of editor plugins for the first one that supports that type, then hosts the editor control. The CreateEditor call instantiates an instance of the control and provides the initial value to be edited, and GetNewValue is the way the plugin returns the result of the editing session. The IBinaryHttpBodyEditorPlugin is the same, except that it gets and puts byte arrays.
public interface IBinaryHttpBodyEditorPlugin
{
    object CreateEditor(string contentType, byte[] initialValue);
    byte[] GetNewValue();
    bool SupportsContentType(string contentType);
}
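A minimal implementation of the string interface might just host a multi-line text box. This is a sketch only: the interface members are the ones shown above, but the class name, the JSON content-type check, and the choice of a WinForms TextBox are invented for illustration; a real plugin would typically return a richer control.

```csharp
using System.Windows.Forms;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Illustrative plugin (hypothetical): edits JSON post bodies in a plain
// multi-line TextBox hosted in the request body edit pane.
public class JsonBodyEditorPlugin : IStringHttpBodyEditorPlugin
{
    private TextBox editor;

    public bool SupportsContentType(string contentType)
    {
        // Claim only JSON content; the editor picks the first plugin that matches.
        return contentType != null &&
               contentType.ToLowerInvariant().Contains("json");
    }

    public object CreateEditor(string contentType, string initialValue)
    {
        // The returned control is hosted in the request body edit pane.
        editor = new TextBox();
        editor.Multiline = true;
        editor.ScrollBars = ScrollBars.Both;
        editor.Text = initialValue;
        return editor;
    }

    public string GetNewValue()
    {
        // Called when the editing session ends; returns the edited body.
        return editor.Text;
    }
}
```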
We are working on creating new editors for the most common formats now, and will ship them "out of band" on codeplex.com\teamtestplugins around RTM. Here's a screen shot of the editor handling msbin1 data in a rich way (I rubbed out some URLs of this public-facing site):
Page 156
Web test editor plugins must be deployed to %ProgramFiles%\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies\WebTestPlugins.

Web Test Result Viewer Plugins
The Web test results viewer also supports design-time plugins. There are many scenarios for these plugins; here are some examples:
1) The coolest comes from dynaTrace.
2) Tools that automatically analyze the result to point out potential performance problems (see blogs.msdn.com\mtaute for an example).
3) Custom viewers for rich request and response data.
This third scenario is the one I want to delve into more in this section. Just as you want a rich editing experience for working with Web services, REST, or JSON requests, you want a rich way to view this data in the results viewer as well. The Web test results viewer plugins provide the perfect extensibility point for this.
Page 157
Result viewer plugins are a bit more involved to code up and install than the editor plugins. Like the response body editors, we are working on out-of-band plugins for the Web test results viewer. Here is a screen shot of the results viewer plugin for binary data:
Notice the tree views in the bottom panes, showing binary data as a tree.

Conclusion
Your takeaway after reading this blog post should be: "Wow, VS 2010 is fantastic and will save me tons of time creating and maintaining Web tests. I have to have it!" By working with you directly on the forums and through our blogs, we saw the types of problems you were hitting developing scripts. We also listened to your feedback and folded it back into the tool. In places we didn't have time to address, we've added extensibility points to enable us to deliver features to you out of band, and for you to create your own solutions. Now you can say: "I'm a performance tester, and Visual Studio 2010 was my idea!"
Page 158
Recently we had a customer support issue troubleshooting the Network Emulation driver in VS 2010 Ultimate while doing load testing. I thought a blog on how we troubleshot and isolated the problem would be helpful, so here it is. In this blog, I discuss the problem and symptoms, and also explain how network emulation works in 2010. I also suggest specific steps to consider to isolate and narrow down the problem.

Scope
This applies to Visual Studio 2010 Ultimate.

Customer Scenario
The troubleshooting in this document is applicable to situations where you are attempting to use the network emulation capability newly available in VS 2010 Ultimate while creating a new load test, and in the "Edit Network Mix" screen of the wizard you select any network type other than LAN.
Page 159
What is Network Emulation?
Microsoft Visual Studio 2010 uses software-based true network emulation for all test types, including load tests. True network emulation simulates network conditions by direct manipulation of network packets. The true network emulator can emulate the behavior of both wired and wireless networks over a reliable physical link, such as Ethernet. The following network attributes are incorporated into true network emulation:
Round-trip time across the network (latency)
The amount of available bandwidth
Queuing behavior
Packet loss
Reordering of packets
Error propagation
True network emulation also provides flexibility in filtering network packets based on IP addresses or protocols such as TCP, UDP, and ICMP. This can be used by network-based developers and testers to emulate a desired test environment, assess performance, predict the effect of change, or make decisions about technology optimization. Compared to hardware test beds, true network emulation is a much cheaper and more flexible solution.

How Network Emulation Works in VS 2010
Network emulation in VS 2010 Ultimate uses a network device driver that was designed and built by Microsoft Research and is productized in Visual Studio 2010. The technology has been around since 2005 and is widely used within Microsoft across many server product teams. To use network emulation, you will need to install the Visual Studio 2010 Ultimate SKU. Network emulation is configured as part of adding a new load test in Visual Studio and following the wizard screens (see above). Once you have set up network emulation following the instructions at http://msdn.microsoft.com/en-us/library/dd997557.aspx, you will run your load tests. When the load test starts, it allocates a range of available ports for each of the network profiles (DSL, 56K modem, etc.) that you have selected in your network mix. This port range is available to the network emulation driver, which is enabled at run time (by default the driver is disabled). During load testing, when the load generator sends a request to the application under test, it specifies a port from the port range. When the network emulation driver sees a port from the selected port range, it can associate the port with the network profile that this request should follow. This enables the driver to throttle the load in software to ensure it meets the network profile you have selected.
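The port-range bookkeeping described above can be sketched as a toy model. All port numbers here are invented for illustration; the real driver does this mapping inside the network stack, not in user code.

```csharp
using System;
using System.Collections.Generic;

// Toy model of the port-range bookkeeping described above: the load test
// assigns each emulated network profile its own range of local ports, and
// the driver maps an outgoing port back to the profile whose shaping rules
// it should apply. All numbers are invented for illustration.
public class PortProfileModel
{
    public static void Main()
    {
        // profile name -> (first port, last port) of its allocated range
        var ranges = new Dictionary<string, Tuple<int, int>>
        {
            { "LAN",       Tuple.Create(40000, 40999) },
            { "DSL",       Tuple.Create(41000, 41999) },
            { "56K Modem", Tuple.Create(42000, 42999) },
        };

        Console.WriteLine(ProfileForPort(ranges, 41234)); // DSL
        Console.WriteLine(ProfileForPort(ranges, 42001)); // 56K Modem
    }

    public static string ProfileForPort(Dictionary<string, Tuple<int, int>> ranges, int port)
    {
        foreach (var entry in ranges)
        {
            if (port >= entry.Value.Item1 && port <= entry.Value.Item2)
                return entry.Key;
        }
        return "unshaped"; // ports outside all ranges bypass emulation
    }
}
```

This is also why the IPsec issue noted later in this section breaks emulation: if the driver cannot read the port, it cannot recover the profile.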
Page 160
How to know Network Emulation is not working?
Often one of the symptoms you'll see is that the load test records socket exceptions in the log, such as the one below:
"The requested address is not valid in its context xx.xx.xx.xxx:80"
NOTE: There may be other conditions causing such socket exceptions as well. The load test may continue to work, but the socket exceptions get logged. The next section will help you isolate and troubleshoot where the problem lies.

How to troubleshoot Network Emulation
To troubleshoot and isolate problems effectively you must ensure that you have done the basic tests.
1. Ensure that you have full network connectivity across all the machines that are participating in your load test.
2. Ensure you have configured network emulation correctly by following the instructions, and make sure admin rights are available for the agent.
3. Ensure that any/all firewalls are disabled (at least while troubleshooting) to ensure that a firewall is NOT blocking specific ports or traffic on the lab network.
   a. Run TcpView (available here) to verify that socket connections are actually visible at run time (check for "red" highlights). You may also run your favorite port monitoring tool (Portmon is another example).
4. Ensure that there is no antivirus software on the load generator machine that may be obstructing this software.
5. To isolate whether the problem is with the network emulation driver or the load test components, you should:
   a. Eliminate the network emulation driver as a cause. Run the load test with network emulation configured correctly (even though you may be getting socket exceptions). Ping another host to see whether the output shows the network slowing down and/or higher latency, and check whether the delay value matches the selected network profile. If the latency values match the profile you have selected, then the network driver is working well. From the agent machine where you are running the load test, attempt a connection to an outside host (like your favorite web page). This verifies that, while the load test is running and the network driver is enabled, external or lab connectivity is NOT a problem, and isolates the network emulation driver from being a problem area.
   b. Eliminate the load test components as a cause. Download and run this sample test program (available as is, not Microsoft supported) on the same machine as the load generator (agent machine). This sample program simulates the exact set of socket connection calls used in the load testing components.
If this test program also displays socket exceptions (like in the image below), then this eliminates the load testing product as a cause of the socket exceptions and indicates the problem lies in the environment, machine, network, or something else external to the tooling. Please debug the external problem first before trying to run the load test again.
Page 161
If this sample program works correctly, you will see the output below, which confirms that the likely problem is in the load test components and the environment is not the likely cause. Please contact support or post your query in the forums for further help in this case.
Page 162
Known Issues
There is a known issue with Broadcom network cards where packets are dropped under heavy load. If you run into this, we recommend trying another network card until Broadcom addresses the problem. Also, if IPsec is enabled, the ports in the network packet are encrypted, and the network emulation driver will not be able to determine that the packets are from the designated port range set by the load test engine (described above in "How Network Emulation Works in VS 2010"). You must disable IPsec for network emulation to work.

Additional Resources:
http://msdn.microsoft.com/en-us/library/dd505008(VS.100).aspx
http://blogs.msdn.com/b/lkruger/archive/2009/06/08/introducing-true-network-emulation-in-visualstudio-2010.aspx
Page 163
The technology used to connect the remote test execution components is .NET Remoting over TCP ports. For incoming connections, by default, the Test Controller uses TCP port 6901 and the Test Agent uses TCP port 6910. The Client also needs to accept incoming connections in order to get test results from the Controller; by default, it uses a random port for that. For information on how to configure incoming ports, refer to the Tools section in the Appendix. For outgoing connections, random TCP ports are used. For all incoming connections, the Test Controller authenticates the calling party and checks that it belongs to a specific security group. All connectivity issues can be divided into two main groups: network issues and security/permission issues.
2.1. Network/Firewall issues (mainly implied by .NET Remoting technology):
Controller:
Listens on TCP port 6901 (can be configured to use a different port).
Needs to be able to make outgoing connections to Agents and to the Client.
Needs the incoming "File and Printer sharing" connection open.

Agent:
Listens on TCP port 6910 (can be configured to use a different port).
Needs to be able to make outgoing connections to the Controller.

Client:
Needs to be able to accept incoming calls. Usually you will get a Firewall notification the first time the Controller tries to connect to the Client. On Windows Server 2008 the notifications are disabled by default, and you will need to manually add a Firewall exception for the Client program (devenv.exe, mstest.exe, mlm.exe) so that it can accept incoming connections.
By default, a random TCP port is used for incoming connections. If needed, the incoming port can be configured (see the Tools section in the Appendix).
Needs to be able to make outgoing connections to the Controller.

Page 164
2.2. Permissions
There are two scenarios that differ in how the Test Controller operates, and the permissions used by the Controller differ depending on the scenario:
Test Controller runs standalone: physical environments (VS2008 or VS2010).
Test Controller is connected to a TFS server: virtual environments (VS2010 only).
2.2.1. Permissions: Test Controller not connected to a TFS server:
To run tests remotely, the Client user must belong to the TeamTestControllerUsers, TeamTestControllerAdmins, or Administrators local group on the Controller machine.
To manage the Controller/Agent, the Client user must belong to the TeamTestControllerAdmins or Administrators local group on the Controller machine.
The Agent service account must belong to the TeamTestAgentService or Administrators local group on the Controller machine.
The Controller service account must belong to the TeamTestControllerUsers or Administrators local group on the Controller machine.
Service accounts with empty/no passwords are not supported.
2.3. Connection Points: Summary
Reviewing the connections gives a high-level picture of what can fail in Test Controller/Agent connectivity. At this point you may already have a clear idea of which requirement is not met for your specific scenario. The next section provides step-by-step troubleshooting.
3. Step-by-step troubleshooting
Let's walk through the general troubleshooting procedure for Test Controller/Agent connection issues. For simplicity we'll do that in a step-by-step manner. Before following these steps you may want to look at the Known Issues section in the Appendix to see if your issue is one of the known common issues. The troubleshooting is based on the key connection points and in essence involves making sure that:
The services are up and running.
Permissions are set up correctly.
There are no network connectivity/Firewall issues.
There are two scenarios that differ in how the Test Controller operates, and the troubleshooting steps differ depending on the scenario; hence we will consider each scenario separately:
Test Controller runs standalone: physical environments (VS2008 or VS2010).
Test Controller is connected to a TFS server: virtual environments (VS2010 only).
3.1. Step-by-step troubleshooting: VS2008 or VS2010 physical environments
Pre-requisites. Make sure you have the necessary permissions.
Depending on what you need to troubleshoot, you may need Administrator permissions on the Agent and/or Controller machines.
Step 1. Make sure that the Controller is up and running and the Client can connect to the Controller.
Use Visual Studio or Microsoft Test Manager (see the Tools section in the Appendix) to view the Controller status. If you can't connect to the Controller, make sure that the Controller service is running:
On the Controller machine (you can also do this remotely), re/start the Controller service (see the Tools section in the Appendix).
(If you still can't connect) On the Controller machine, make sure that it can accept incoming connections through the Firewall:
Open port 6901 (or create an exception for the service program/executable).
Add a Firewall exception for File and Printer Sharing.
(If you still can't connect) Make sure that the user you run the Client under has permissions to connect to the Controller (see the Permissions section above).
(If you still can't connect) On the Client machine, make sure that the Firewall is not blocking incoming and outgoing connections:
Make sure that there is a Firewall exception for the Client program (devenv.exe, mstest.exe, mlm.exe) so that it can accept incoming connections.
Make sure that the Firewall is not blocking outgoing connections.
VS2010 only: the simplest fix at this time is to re-configure the Controller:
On the Controller machine, log on as a local Administrator, run the Test Controller Configuration Tool (see the Tools section in the Appendix) and re-configure the Controller. All steps should be successful.
(If you still can't connect) Restart the Controller service (see the Service Management commands under the Tools section in the Appendix).
Step 2. Make sure that there is at least one Agent registered on the Controller.
Use Visual Studio (Manage Test Controllers dialog) or Microsoft Test Manager (see the Tools section in the Appendix) to view the connected Agents. If there are no Agents on the Controller, connect the Agent(s).
VS2010 only:
On the Agent machine, log in as a user that belongs to TeamTestAgentServiceAdmins.
On the Agent machine, open a command line and run the Test Agent Configuration Tool (see the Tools section in the Appendix).
Check 'Register with Test Controller', type the controller machine name and click 'Apply Settings'.
VS2008 only:
In Visual Studio (Manage Test Controllers dialog), click Add Agent. You may need to restart the Agent service.
Step 3. Make sure that the Agent is running and Ready (for each Agent).
Agent status can be one of: Ready / Offline (temporarily excluded from the Test Rig) / Not Responding / Running Tests.
Use Visual Studio or Microsoft Test Manager (see the Tools section in the Appendix) to check the Agent status. If one of the Agents is not shown as Ready, make sure that the Agent service is running:
On the Agent machine (you can also do this remotely), re/start the Agent service (see the Tools section in the Appendix).
On the Agent machine, log on as a local Administrator, run the Test Agent Configuration Tool (see the Tools section in the Appendix) and re-configure the Agent.
(If the Agent is still not Ready) If the Agent is shown as Offline, select it and click the Online button.
On the Agent machine, make sure that the agent service can accept incoming connections on port 6910 (if the Firewall is on, there must be a Firewall exception either for the port or for the service program/executable).
Make sure that the Agent service account belongs to the TeamTestAgentService group on the Controller:
On the Controller machine, use Computer Management->Local Groups to add the Agent user to the TeamTestAgentService group. Then restart the services: stop the Agent service, stop the Controller service, start the Controller service, start the Agent service.
Make sure that the Agent machine can reach the Controller machine (use ping). Restart the Agent service (see the Service Management commands under the Tools section in the Appendix).
Step 4. If all of the above did not help, it is time to analyze diagnostics information.
(VS2010 only) Agent/Controller services by default log errors to the Application Event Log (see the Tools section in the Appendix). Check for suspicious log entries there.
Get traces for the components involved in your scenario, some or all of: Controller, Agent, Client, Test Agent/Controller Configuration Tool.
To enable tracing, see the Diagnostics section in the Appendix.
Make sure that the Controller/Agent service accounts have write access to the trace files. Check for entries starting with "[E".
Step 5. Take a look at the Known Issues section in the Appendix to see if your issue is similar to one of those.
Step 6. Collect the appropriate diagnostics information and send it to Microsoft (create a Team Test Forum post or a Microsoft Connect bug).
4. References
The following is a list of useful information sources related to Test Agent/Controller troubleshooting.
Troubleshooting Test Execution in MSDN.
Troubleshooting Controllers, Agents and Rigs (VS2008) in MSDN.
Installing and Configuring Visual Studio Agents (VS2010) in MSDN.
Understanding Visual Studio Load Agent Controller (Load Test team blog).
Troubleshooting errors in lab management (Team Lab blog).
Visual Studio Team System Test Forum.
Microsoft Connect: report bugs/suggestions.
Appendix 1. Tools
Visual Studio: Premium (VS2010 only), Team Test Edition (VS2008 only).
Manage Test Controllers dialog (Main menu->Test->Manage Test Controllers): see status of Controller and all connected Agents, add/remove Agents to Controller, restart Agents/the whole test rig, bring Agents online/offline, configure Agent properties.
Note: in VS2008 this dialog is called Administer Test Controllers.
VS2008: update the Test Run Configuration to enable remote execution (Main Menu->Test->Edit Test Run Configurations->(select run config)->Controller and Agent->Remote->provide Test Controller name), then run a test.
VS2010: update Test Settings to use remote execution role (Main Menu->Test->Edit Test Settings -> (select test settings)->Roles->Remote Execution), then run a test.
Lab Center->Controllers: see status of Controller and all connected Agents, add/remove Agents to Controller, restart Agents/the whole test rig, bring Agents online/offline, configure Agent properties. Note that Lab Center only shows controllers that are associated with this instance of TFS.
Test Controller Configuration Tool: it is run as the last step of Test Controller setup. You can use it any time after setup to re-configure the Controller. The tool has embedded diagnostics which make it easier to detect issues.
Test Agent Configuration Tool: you can use it any time after setup to re-configure the Agent. The tool has embedded diagnostics which make it easier to detect issues.
Diagnostics information
Both the Agent and the Controller can be configured to trace diagnostics information (from errors only to verbose) to the Application Event Log or to a trace file. Clients can also be configured to trace (from errors only to verbose) to a trace file.
Tracing can be enabled via .config file(s) or via the registry (VS2010 only); if both are set, the registry wins. Choose the method that is more convenient for your scenario.
Enable tracing via .config file(s):
One advantage of using config files is that you can enable tracing for each component separately, using trace settings specific to that component.
For the Controller Service/Agent Service/Agent Process, you need the following sections in the corresponding .config file (qtcontroller.exe.config, qtagentservice.exe.config, qtagent.exe.config, qtagent32.exe.config, which by default are located in C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE):
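The sections themselves are not reproduced at this point in the original; the following is a minimal sketch based on the Client snippet shown later on this page. The trace file path in initializeData is illustrative, and you should verify the exact schema against the .config files in your installation:

```xml
<system.diagnostics>
  <trace>
    <listeners>
      <!-- Write trace output to a text file; the path is illustrative -->
      <add name="EqtListener"
           type="System.Diagnostics.TextWriterTraceListener"
           initializeData="C:\EqtTrace.log" />
    </listeners>
  </trace>
  <switches>
    <!-- Off/Error/Warning/Info/Verbose -->
    <add name="EqtTraceLevel" value="Verbose" />
  </switches>
</system.diagnostics>
```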
For the Client, add the following section to the appropriate .config file (devenv.exe.config, mstest.exe.config, mlm.exe.config):
<system.diagnostics>
  <trace>
    <listeners>
      <add name="EqtListener" type="System.Diagnostics.TextWriterTraceListener" initializeData="C:\EqtTrace.log" />
    </listeners>
  </trace>
  <switches>
    <add name="EqtTraceLevel" value="Verbose" />
  </switches>
</system.diagnostics>
Trace file: trace will go to the file specified by the initializeData attribute.
Important: please make sure that the location is writable by controller/agent service/process.
One advantage of using the registry is that you can enable tracing for all components with just one setting; you don't have to modify multiple configuration files.
Enable tracing via the registry:
Create a file with the following content, give it a .reg extension, and double-click it in Windows Explorer:

Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\10.0\EnterpriseTools\QualityTools\Diagnostics]
"EnableTracing"=dword:00000001
"TraceLevel"=dword:00000004
"LogsDirectory"="C:\\"
Notes:
In the case of the Test Controller/Agent services, HKEY_CURRENT_USER refers to the registry hive of the user the services run under.
TraceLevel: 0/1/2/3/4 = Off/Error/Warning/Info/Verbose.
LogsDirectory is optional. If it is not specified, %TEMP% will be used.
The trace file name is <Process name>.EqtTrace.log, e.g. devenv.EqtTrace.log.
Tracing from Test Controller Configuration Tool and Test Agent Configuration Tool:
To get the trace file, click Apply, then in the "Configuration Summary" window click the view log hyperlink at the bottom.
Sysinternals' DebugView can also be used to capture diagnostics information.
Controller, Agent and Client use settings from application configuration files:
Controller service: qtcontroller.exe.config
Agent service: qtagentservice.exe.config
Agent process: qtagent.exe.config (neutral/64-bit agent), qtagent32.exe.config (32-bit agent)
Visual Studio: devenv.exe.config
Command line test runner: mstest.exe.config
By default these files are located in C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE.
The default ports used by the Controller/Agent/Client can be in use by other software, or there may be a firewall between the Controller and the Client. In that case you need to know which port to open in the firewall so that the Controller can send results to the Client.
Agent Service:
<appSettings><add key="AgentServicePort" value="6910"/></appSettings>
Client: add the following registry values (DWORD). The Client will use one of the ports from this range for receiving data from Controller:
Service Management commands:
Visual Studio Test Controller service: net start vsttcontroller
Visual Studio Test Agent service: net start vsttagent
Windows Firewall:
Start->Control Panel->Windows Firewall.
IP Security Policy:
Start->Run->rsop.msc (on both Agent and Controller machines). Go to Computer Configuration->Windows Settings->Security Settings->IP Security Policies and check if there are any policies that may prevent connections. By default there are no policies at all.
Computer Management / Local Groups:
Use Computer Management->Local Groups to manage the local groups used by the Controller and Agent.
Ping:
You can use ping to make sure that general TCP/IP network connectivity works.
Telnet:
You can use telnet to check that you can connect to the Agent/Controller, i.e. that a Firewall is not blocking the connection.
Appendix 2. Known issues
The following is a list of known issues and suggested resolutions.
2.1. The message or signature supplied for verification has been altered (KB968389)
Symptom: Agent cannot connect to Controller.
Affected scenarios: Windows XP/Windows 7 connecting to Windows Server 2003.
Additional information:
Event Log (Agent): The message or signature supplied for verification has been altered.
Trace file (Agent) contains:
I, <process id>, <thread id>, <date>, <time>, <machine name>\QTAgentService.exe, AgentService: The message or signature supplied for verification has been altered. I, <process id>, <thread id>, <date>, <time>, <machine name>\QTAgentService.exe, AgentService: Failed to connect to controller. Microsoft.VisualStudio.TestTools.Exceptions.EqtException: The agent can connect to the controller but the controller cannot connect to the agent because of following reason: An error occurred while processing the request on the server: System.IO.IOException: The write operation failed, see inner exception. ---> System.ComponentModel.Win32Exception: The message or signature supplied for verification has been altered at System.Net.NTAuthentication.DecryptNtlm(Byte[] payload, Int32 offset, Int32 count, Int32& newOffset, UInt32 expectedSeqNumber) at System.Net.NTAuthentication.Decrypt(Byte[] payload, Int32 offset, Int32 count,
Int32& newOffset, UInt32 expectedSeqNumber) at System.Net.Security.NegoState.DecryptData(Byte[] buffer, Int32 offset, Int32 count, Int32& newOffset) at System.Net.Security.NegotiateStream.ProcessFrameBody(Int32 readBytes, Byte[] buffer, Int32 offset, Int32 count, AsyncProtocolRequest asyncRequest) at System.Net.Security.NegotiateStream.ReadCallback(AsyncProtocolRequest asyncRequest) --- End of inner exception stack trace --at System.Net.Security.NegotiateStream.EndRead(IAsyncResult asyncResult) at System.Runtime.Remoting.Channels.SocketHandler.BeginReadMessageCallback(IAsyncResult ar) Server stack trace: at Microsoft.VisualStudio.TestTools.Controller.AgentMachine.VerifyAgentConnection(Int32 timeout)
Root cause: You installed KB968389, either via Windows Update or manually. Resolution: uninstall KB968389 from Start->Control Panel->Programs and Features->View Installed Updates.
2.2. Controller/Agent in untrusted Windows domains, or one is in a workgroup and the other is in a domain
Symptom: Agent cannot connect to Controller.
Affected scenarios: Test Controller and Agent are not in the same Windows domain. They are either in untrusted domains, or one of them is in a domain and the other is in a workgroup.
Additional information:
W, <process id>, <thread id>, <date>, <time>, <machine name>\QTController.exe, Exception pinging agent <agent name>: System.Security.Authentication.AuthenticationException: Authentication failed on the remote side (the stream might still be available for additional authentication attempts). ---> System.ComponentModel.Win32Exception: No authority could be contacted for authentication Server stack trace: at System.Net.Security.NegoState.ProcessReceivedBlob(Byte[] message, LazyAsyncResult lazyResult) at System.Net.Security.NegotiateStream.AuthenticateAsClient(NetworkCredential credential, ChannelBinding binding, String targetName, ProtectionLevel requiredProtectionLevel, TokenImpersonationLevel allowedImpersonationLevel) at System.Net.Security.NegotiateStream.AuthenticateAsClient(NetworkCredential credential, String targetName, ProtectionLevel requiredProtectionLevel, TokenImpersonationLevel allowedImpersonationLevel) at System.Runtime.Remoting.Channels.Tcp.TcpClientTransportSink.CreateAuthenticatedStream( Stream netStream, String machinePortAndSid) at System.Runtime.Remoting.Channels.BinaryClientFormatterSink.SyncProcessMessage(IMessage msg)
Root cause: Due to Windows security, Agent cannot authenticate to Controller, or vice versa. Resolution:
Mirror the user account on the Controller and Agent: create a user account with the same user name and password on both the Controller and the Agent machine, and run the Controller and Agent services under this mirrored account. If you are using a VS2010 RC+ version (i.e. RC or RTM, but not Beta2), add the following line under the <appSettings> node of the qtcontroller.exe.config file:
<add key="AgentImpersonationEnabled" value="no"/>
Restart the Controller/Agent services (see the Tools section in the Appendix). Make sure there is no IP Security Policy that prevents the connection (see IP Security Policy under the Tools section in the Appendix).
By default, domain machines use domain (Kerberos) authentication, but if it fails Windows falls back to workgroup (NTLM) authentication. This behavior can be, and often is, altered by IP Security policies; for instance, there could be a policy to block connections from machines that do not belong to the domain.
Best Practice: Blog on various considerations for web tests running under load
The following blog entry describes a number of features and settings to consider when running web tests under a load test in VSTT (the link to the blog entry is at the bottom of this topic). The following topics are covered:
General Load Test Considerations
  o Verify web tests and unit tests
  o Choose an appropriate load profile
    - Using a Step Load Profile
    - Using a Goal-Based Load Profile
  o Choosing the location of the Load Test Results Store
  o Consider including Timing Details to collect percentile data
  o Consider enabling SQL Tracing
  o Don't Overload the Agent(s)
  o Add an Analysis Comment
Considerations for Load Tests that contain Web Tests
  o Choose the Appropriate Connection Pool Model
    - ConnectionPerUser
    - ConnectionPool
  o Consider setting response time goals for web test requests
  o Consider setting timeouts for web test requests
  o Choose a value for the "Percentage of New Users" property
  o Consider setting the "ParseDependentRequests" property of your web test requests to false
http://blogs.msdn.com/billbar/articles/517081.aspx
Troubleshooting
How to enable logging for test recording
Changed in 2010
In 2008:
You can create a log file of each recording, which will show the request headers and post body as well as the returned headers and response. To enable this, add the following two registry values:
[HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\9.0\EnterpriseTools\QualityTools\WebLoadTest]
"CreateLog"=dword:00000001 (1 = create; 0 = do not create)
"RecorderLogFolder"="C:\\recordlogs"
In 2010:
The Web test recorder automatically logs the requests and responses. See the section "Recorder Log Available".
Fix: the web test recorder bar does not work with Internet Explorer Enhanced Security Configuration (IE ESC) enabled. On Windows Server 2003 and Vista, IE ESC can be removed from Control Panel -> Add/Remove Programs -> Windows Components by unchecking ESC. Windows Server 2008 requires a different process to disable this security feature: start the Server Manager, browse to the Security Information section and click Configure IE ESC; in the next window decide for whom you want to enable or disable this feature. For more details and screenshots: http://blogs.techrepublic.com.com/datacenter/?p=255
Issue: When recording a web test, a new browser window opens but the recorder controls are not present.
Fix: Enable the add-on:
Open IE -> Tools -> Internet Options -> Programs tab -> Manage Add-ons button.
Highlight Web Test Recorder 10.0.
Press the Enable button.
To enable verbose logging for the agent process:
Edit the file QTAgent.exe.config (located at <Program Files>\Microsoft Visual Studio 9.0 Team Test Load Agent\LoadTest\QTAgent.exe.config):
Change <add key="CreateTraceListener" value="no"/> to "yes".
Change <add name="EqtTraceLevel" value="3" /> to "4".
The file "VSTTAgentProcess.log" will be created in the same directory as QTAgent.exe.config.
Re-run the load test and look for lines in the log file that look something like: "Bound request on connection group M to IP address NNN.NNN.NNN.NNN". If verbose logging is enabled and these lines are present in the log file, IP Switching should be working.
6) If the number of unique IP addresses being used, as shown by the log entries in step 5, is less than the number in the configured range, it could be because your load test is configured to use a connection pool with fewer connections than the number of IP addresses specified. If this is the case, you can increase the size of the connection pool, or switch to "Connection per User" mode in the load test's run settings properties.
One example of how these counters can help with load testing is shown in this comment from the VS Test Product Team: "it could provide a better indication than any other existing performance counter to determine when an agent running a load test that contains Web tests is overworked", and a good indication of how much that is affecting the response times reported for the load test. If that's correct, then it would be very valuable to add this category and counter to the Agent counter set in your load test and define a threshold rule on it. We haven't experimented with these performance counters yet, but Yun plans to soon. Note that it doesn't help at all for load tests containing unit tests, unless those unit tests happen to be coded to use the HttpWebRequest class.
If you try to connect the two web tests before generating any code, your test will fail with the following error:
There is no declarative Web test with the name 'DrillDown_Coded' included in this Web test; the string argument to IncludeWebTest must match the name specified in an IncludeDeclarativeWebTest attribute.
How to use methods other than GET and POST in a web test
Summary FormPostHttpBody and StringHttpBody are the two built-in classes for generating HTTP request bodies. If you need to generate requests containing something other than form parameters and strings then you can implement an IHttpBody class. More information http://blogs.msdn.com/joshch/archive/2005/08/24/455726.aspx
In 2008:
Summary: It is possible to create a custom data binding to bind to something other than a table, such as a SELECT statement. This blog post describes one possible method: creating one class which manages the data, and creating a web test plug-in to add the data into the web test context.
More information: http://blogs.msdn.com/slumley/pages/custom-data-binding-in-web-tests.aspx
In 2010:
http://blogs.msdn.com/slumley/archive/2010/01/04/VS-2010-feature-data-source-enhancements.aspx
// Generate our random user name (sRndName and sRndExt are assumed to be
// string variables holding the fixed prefix and suffix of the user name)
Random randObj = new Random();
int x = randObj.Next();
int y = this.Context.WebTestUserId;
string sUserName = sRndName + Convert.ToString(x) + Convert.ToString(y) + sRndExt;
Or, in a declarative test this can be achieved by setting the username value to: UserName{{$Random(0,10000)}}{{$WebTestUserId}}UserNameExt
How to use Unit Tests to Drive Load with Command Line Apps
The following code can be used in a Unit Test to drive a command line tool (such as a testing tool). The Unit test can then be driven by a load test to emulate multiple copies of the app.
using System;
using System.Threading;
using System.Diagnostics;
using System.IO;
.......
[TestMethod]
public void TestMethod1()
{
    int x = 0;
    int iDuration = 10000;
    try
    {
        // Note: Process.Start takes the command-line arguments as a single string
        Process myProcess = Process.Start("c:\\temp\\conapp2.exe", "arg1 arg2");
        myProcess.WaitForExit(iDuration); // Wait at most iDuration milliseconds
        if (!myProcess.HasExited) // If the app has not exited, kill it manually
        {
            myProcess.Kill();
            Console.WriteLine("Application hung and was killed manually.");
        }
        else
        {
            x = myProcess.ExitCode;
            Console.WriteLine("Completed. Exit Code was {0}", x);
        }
    }
    catch (Exception e)
    {
        Console.WriteLine("The following exception was raised: " + e.Message);
    }
}
How to add Console Output to the results store when running Unit tests under load
The following link points to a write-up on how to allow unit tests to write custom output messages to the Load Test Results Store database from Unit tests while they are running in a load test: http://blogs.msdn.com/billbar/pages/adding-console-output-to-load-tests-running-unit-tests.aspx
How to create a webtest plugin that will only execute on a predefined interval
If you want to write a webtest plugin that will only fire on certain intervals (maybe for polling or reporting), then use the following as a starting point.
public class WebTestPluginActingInfrequently : WebTestPlugin
{
    public override void PostWebTest(object sender, PostWebTestEventArgs e)
    {
        if (e.WebTest.Context.WebTestIteration % 100 == 1)
        {
            // Do something
        }
    }
}
The WebTestIteration property is guaranteed to be unique, so there is no need to worry about locking. If you run this web test by itself it will "do something" because the WebTestIteration will be 1 (unless you run the web test by itself with multiple iterations or data binding). Rather than hard-coding the frequency as 1 in 100, you could make the frequency a property of the plugin that you set in the Web test editor, or a Web test context parameter, or a load test context parameter; a LoadTestPlugin would need to pass it down to the WebTestPlugin, either by setting it in the WebTestContext, or you could just make the frequency a property on the plugin. Note that the WebTestIteration property is incremented separately for each Scenario (on each agent) in the load test; if you want the frequency to apply across all Web test iterations on an agent, you could define a static int in the WebTestPlugin (and use Interlocked.Increment to atomically increment it).
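The static-counter variant described above could be sketched as follows (an illustration, not code from the original guide; the class name and the 1-in-100 threshold are placeholders):

```csharp
using System.Threading;
using Microsoft.VisualStudio.TestTools.WebTesting;

public class WebTestPluginActingInfrequentlyPerAgent : WebTestPlugin
{
    // Shared across all virtual users running in this agent process
    private static int s_count;

    public override void PostWebTest(object sender, PostWebTestEventArgs e)
    {
        // Interlocked.Increment makes the shared counter safe to update
        // from the many threads driving virtual users concurrently.
        if (Interlocked.Increment(ref s_count) % 100 == 1)
        {
            // Do something infrequently (e.g. polling or reporting)
        }
    }
}
```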
If you develop a plug-in or an extraction rule and you want to allow the properties you expose to be Context Parameters that the user specifies, you need to add some code to your plugin to check for the existence of a Context Parameter using the curly-brace '{{xyz}}' syntax. For example, suppose the user has a Context Parameter {{ComparisonEventTarget}} that they want to provide as the property value for the EventTarget property in your plugin (see the screen shot); then use the following code snippet to have your extraction rule/plugin check the value supplied to determine if it contains the "{{" syntax. Here is a partial code snippet:
public class DynamicFormFields : WebTestRequestPlugin
{
    // This is our property that is exposed in the Visual Studio UI.
    // We want to allow either a string literal or a context parameter name.
    public string EventTarget { get; set; }

    public override void PreRequest(object sender, PreRequestEventArgs e)
    {
        // Check whether EventTarget is a literal string or a context parameter reference
        if (this.EventTarget.Contains("{{"))
        {
            string contextParamKey = this.EventTarget.Replace("{{", string.Empty).Replace("}}", string.Empty);
            this.EventTarget = e.WebTest.Context[contextParamKey].ToString();
        }
        // . . . code to do your work starts here
    }
}
How To: Modify the ServicePointManager to force SSLv3 instead of TLS (Default)
If you need to force an SSLv3 connection instead of TLS (the default), you must modify the ServicePointManager.SecurityProtocol property to force this behavior. This can be necessary if you are working with a legacy server that requires the older SSLv3 protocol and cannot negotiate the newer TLS security protocol. In addition, you may need to write code in your test to handle the ServerCertificateValidationCallback to determine whether the server certificate provided is valid. A code snippet is provided below.
[TestMethod]
public void TestMethod1()
{
    // We're using SSL3 here and not TLS. Without this line, nothing works.
    ServicePointManager.SecurityProtocol = SecurityProtocolType.Ssl3;
    // Wire up the callback so we can override the behavior and force it
    // to accept the certificate from the server.
    ServicePointManager.ServerCertificateValidationCallback = RemoteCertificateValidationCB;
--------- <XX SNIPPED XX> ---------
public static bool RemoteCertificateValidationCB(Object sender,
    X509Certificate certificate, X509Chain chain, SslPolicyErrors sslPolicyErrors)
{
    // If it is really important, validate the certificate issuer here.
    // string resultsTrue = certificate.Issuer.ToString(true);
    // For now, accept any certificate
    return true;
}
Validation rule: First you will need to add a dummy request after the page you want to check. The URL is not important because you are going to change it based on the outcome of the validation rule. In your validation rule, set a context parameter that contains the URL you want to redirect to. Here is a very simple rule that does this: if the return code is 400 or greater, it adds the URL to the context. In this case, it just redirects to the home page of the site.
public class ErrorCheckValidationRule : ValidationRule
{
    public override void Validate(object sender, ValidationEventArgs e)
    {
        if (((int)e.Response.StatusCode) >= 400)
        {
            e.WebTest.Context.Add("ErrorUrl",
                e.WebTest.Context["WebServer1"].ToString() + "/storecsvs/");
        }
    }
}
WebTestRequestPlugin: First you will need to add a dummy request after the page you want to check. The URL is not important because you are going to change it in the plugin. Add a WebTestRequestPlugin to the dummy request. The plug-in will look for the parameter and, if it exists, it will change the URL of the request. If the parameter does not exist, it will set the skip instruction for the request. Here is a simple plug-in which does this:
public class ErrorCheckPlugin : WebTestRequestPlugin
{
    public override void PreRequest(object sender, PreRequestEventArgs e)
    {
        object errorUrl;
        if (e.WebTest.Context.TryGetValue("ErrorUrl", out errorUrl))
        {
            e.Request.Url = errorUrl.ToString();
        }
        else
        {
            // If it does not exist then skip the request
            e.Instruction = WebTestExecutionInstruction.Skip;
        }
    }
}
Here is what the web test looks like: The dummy request is http://localhost.
Here is what the result looks like when the request is skipped. You can see the status of Not Executed:
A solution for VS 2010 using the new conditional rule logic that works in the declarative editor. In VS 2010 you can now do branching and looping in the declarative editor, so instead of a web test request plug-in we can do the redirect with a conditional rule. You would do the following:
1. Add the validation rule to a request.
2. Still add the dummy request below the one that the validation rule is on.
3. Set the URL for this dummy request to {{ErrorUrl}}.
4. Right-click on this request and choose "Insert Condition".
5. Choose the Context Parameter Exists rule.
6. Set the context parameter Name to ErrorUrl. This rule will execute if the ErrorUrl parameter is in the context.
7. Click OK.
How to add a Web Service reference in a test project - testing services in Unit Tests
If you follow along Sean Lumley's blog (http://blogs.msdn.com/slumley/pages/load-testing-web-services-with-unit-tests.aspx), as referenced in the cheat sheet, you'll see that step 2 is to create a New Web Reference. Unfortunately, right-clicking on either the project or References does not give you the option for Add Web Reference. To add the reference, add a service reference instead: In 2008 http://blogs.msdn.com/slumley/pages/load-testing-web-services-with-unit-tests.aspx
In 2010
Page 200
You will get the following dialog and can add the reference there:
Page 201
D:\TcpView>Tcpvcon.exe -c smsvchost.exe | find "808" /c
TCPVCON is a Sysinternals tool that is part of "TCPView" and can be downloaded from http://technet.microsoft.com/en-us/sysinternals/bb795532.aspx. If you need to run this command (or others) remotely, you can also look at the tool "PsTools" at the same web page.
Page 202
You can use MSTEST.EXE to start your load test outside Visual Studio. In that case you might run into errors about missing DLLs for plug-ins, errors that you do not encounter when running your load test inside Visual Studio. Visual Studio looks at references to figure out what to deploy, while MSTEST.EXE does not. To fix this you have to manually add the DLLs as deployment items in the test settings file (VS 2010) or test run configuration file (VS 2008):
1. Select the test settings file that you want to use with MSTEST.EXE. This will be one of the files in the Solution Items folder of your solution with the .testsettings extension (in 2010) or .testrunconfig extension (in 2008).
2. Open it in the Test Settings Editor.
3. Go to the Deployment page.
4. Select "Add File" and select the DLLs you want to deploy.
Specify the test settings file you have edited on the MSTEST.EXE command line with the /testsettings switch (in 2010) or the /testrunconfig switch (in 2008).
Page 203
Page 204
VS 2010 also allows you to set the cursor to a specific row: this.MoveDataTableCursor("DataSource1", "Products",32);
Page 205
Page 206
Page 207
namespace TestProject1
{
    public class MyUserObject
    {
        public int UserId { get; set; }
        public string SomeData { get; set; }
    }

    [TestClass]
    public class UnitTestWithUserObjects
    {
        private static object s_userObjectsLock = new object();
        private static Dictionary<int, MyUserObject> s_userObjects = new Dictionary<int, MyUserObject>();
        private TestContext testContextInstance;

        public UnitTestWithUserObjects() { }

        public TestContext TestContext
        {
            get { return testContextInstance; }
            set { testContextInstance = value; }
        }

        [TestMethod]
        public void TestWithUserObjects()
        {
            MyUserObject userObject = GetUserObject();
            Console.WriteLine("UserId: " + userObject.UserId);
            DoSomeThingWithUser(userObject);
        }

        private MyUserObject GetUserObject()
        {
            int userId;
            if (this.TestContext.Properties.Contains("$LoadTestUserContext"))
            {
                LoadTestUserContext loadTestUserContext =
                    TestContext.Properties["$LoadTestUserContext"] as LoadTestUserContext;
                userId = loadTestUserContext.UserId;
            }
            else
            {
Page 208
                userId = 1;
            }

            MyUserObject userObject;
            lock (s_userObjectsLock)
            {
                if (!s_userObjects.TryGetValue(userId, out userObject))
                {
                    userObject = new MyUserObject();
                    userObject.UserId = userId;
                    s_userObjects.Add(userId, userObject);
                }
            }
            return userObject;
        }

        private void DoSomeThingWithUser(MyUserObject userObject)
        {
        }
    }
}
The solution proposed gives you a unique ID (a load test user ID) as an int. You would need to write code to map the integer value to a unique user name. There are several ways to do this. I would suggest that you use a DB table (or .csv file) where each row contains the load test user ID integer as well as the data you need for each user (username, password, anything else). You would then need to write code in your unit test (not using the unit test data binding feature) that reads a row from the database using the LoadTestUserId to get the correct row for that user. A more efficient and only slightly more complex solution would be to load all of the data from this user DB table into memory in the unit test's ClassInitialize method and store it in a static member variable of type Dictionary<int, UserObject> where the int key is the LoadTestUserId. Then as each test method runs, it gets the LoadTestUserId as shown in the code above and looks up the user data in this static Dictionary.
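As a minimal sketch of that lookup approach: the CSV layout (LoadTestUserId, username, password) and the names UserRecord and LoadUsers are illustrative assumptions, not part of the load test framework.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical per-user record; extend with whatever data each virtual user needs.
public class UserRecord
{
    public int UserId;
    public string UserName;
    public string Password;
}

public static class UserData
{
    // Parse lines of a hypothetical users.csv shaped "LoadTestUserId,username,password"
    // into a dictionary keyed by LoadTestUserId, as described above.
    public static Dictionary<int, UserRecord> LoadUsers(IEnumerable<string> csvLines)
    {
        var users = new Dictionary<int, UserRecord>();
        foreach (string line in csvLines)
        {
            string[] parts = line.Split(',');
            users[int.Parse(parts[0])] = new UserRecord
            {
                UserId = int.Parse(parts[0]),
                UserName = parts[1],
                Password = parts[2]
            };
        }
        return users;
    }
}
```

You would call LoadUsers once in ClassInitialize, store the result in a static field, and index it with the LoadTestUserId obtained from $LoadTestUserContext.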
Page 209
How to set default extensions that the WebTest recorder will ignore
The following registry entries will dictate the behavior of the webtest recorder:
[HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\10.0\EnterpriseTools\QualityTools\WebLoadTest] "WebTestRecorderMode"="exclude" "ExcludeMimeTypes"="image;application/x-javascript;application/x-ns-proxy-autoconfig;text/css" "ExcludeExtensions"=".js;.vbscript;.gif;.jpg;.jpeg;.jpe;.png;.css;.rss"
If you want to get the LoadTestRunId from a load test (this is the ID used in the results database), then you can use the following code inside a load test plugin:
public class LoadTestPlugin1 : ILoadTestPlugin
{
    LoadTest m_loadTest;

    public void Initialize(LoadTest loadTest)
    {
        m_loadTest = loadTest;
        m_loadTest.LoadTestStarting += new EventHandler(LoadTest_LoadTestStarting);
    }

    void LoadTest_LoadTestStarting(object sender, EventArgs e)
    {
        long x;
        long.TryParse(m_loadTest.Context["LoadTestRunId"].ToString(), out x);
    }
}
Page 210
Page 211
Page 212
--NEW--How to store and view transaction times for Unit and Coded UI tests
When running unit tests or Coded UI tests, the standard timer information is not viewable in the standard test results view; it is only viewable in the Results Summary, which you reach by clicking on the "test run completed" link. Here is a simple unit test:
[TestMethod]
public void TimerTestExample()
{
    this.TestContext.BeginTimer("Timer1");
    Thread.Sleep(2000);
    this.TestContext.EndTimer("Timer1");
    Console.WriteLine("Completed the Timer1 code");
}
When you run the test and click on the test results for that specific test, the following view appears.
Page 213
The timer information does not appear. To view the timer information, you must click on the test run completed link to see the timers.
--UPDATED--HOW TO: Handle 404 errors in dependent requests so the main request does not fail.
A common issue is that a dependent request getting a 404 will fail the main request and abort the run (more information is in the article "ERRORS IN DEPENDENT REQUESTS IN A LOAD TEST DO NOT SHOW UP IN THE DETAILS TEST LOG"). To get around this, you can either use the plugin or simply follow the steps below:
1. Select the failing dependent request in the playback log.
2. Copy the request (right click).
3. Go to the Web Test.
4. This should highlight the parent request.
5. Right click and choose Add Dependent Request.
6. Change the properties of the new dependent request to the URI you copied above.
7. Change HTTP Status from 0 to 404.
Page 214
--NEW--HOW TO: Minimize the amount of data a webtest retains for Response Bodies
If you wish to minimize the footprint of a webtest and want to make the test a bit faster, you can set the ResponseBodyCaptureLimit size to 0 or 1; the test run will then not store any of the response data and will not do any type of processing on it. The full response body WILL still be downloaded, so the timing of the request/response remains valid. To see how to implement this, look at the article "FILE DOWNLOADS, DOWNLOAD SIZE AND STORAGE OF FILES DURING WEB TESTS"
http://msdn.microsoft.com/en-us/library/ms182487(VS.80).aspx
SEAN LUMLEY'S BLOG ON HOW TO RUN FROM COMMAND LINE:
http://blogs.msdn.com/b/slumley/archive/2008/12/22/running-web-and-load-tests-from-the-command-line.aspx
MY TIP, HAVING DONE THIS BEFORE
Create .bat/.cmd files that contain the full mstest command line, and call those .bat/.cmd files from Scheduled Tasks. If you do this, you can then easily update the .bat/.cmd file and not have to mess with the tasks, which are cumbersome to set up initially. Also, you will be able to put those files under source control and keep them in your project.
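As a sketch of such a command file (all paths and file names here are illustrative assumptions; adjust them to your VS install and solution layout):

```bat
@echo off
rem RunNightlyLoadTest.cmd -- called from a Scheduled Task.
rem Hypothetical paths; adjust to your environment.
cd /d "C:\Projects\MyPerfTests"
"C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\mstest.exe" ^
  /testcontainer:LoadTest1.loadtest ^
  /testsettings:Local.testsettings ^
  /resultsfile:Results\nightly.trx
```

Keeping the .cmd in the solution folder means it rides along in source control, and the Scheduled Task only ever needs to point at one stable path.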
SCHEDULED TASKS:
http://support.microsoft.com/kb/324283
Page 215
NOTE: You can also modify this DIRECTLY in the .loadtest file; that way you do not change the built-in types everywhere, just for that load test. Below is a sample of a custom browser with headers removed.
Page 216
Existing:
<BrowserMix>
  <BrowserProfile Percentage="100">
    <Browser Name="Internet Explorer 7.0" MaxConnections="2">
      <Headers>
        <Header Name="User-Agent" Value="Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)" />
        <Header Name="Accept" Value="*/*" />
        <Header Name="Accept-Language" Value="{{$IEAcceptLanguage}}" />
        <Header Name="Accept-Encoding" Value="GZIP" />
      </Headers>
    </Browser>
  </BrowserProfile>
</BrowserMix>

Modified:
<BrowserMix>
  <BrowserProfile Percentage="100">
    <Browser Name="EMSCOMM" MaxConnections="2" />
  </BrowserProfile>
</BrowserMix>
Page 217
At runtime, the web test engine will look for a file with the name MyImage.gif in the test deployment directory. If I just go ahead and run this the test fails with: Request failed: Could not find file 'C:\Users\edglas.000\Documents\Visual Studio 2008\Projects\TestProject1\TestResults\edglas_EDGLAS-LT2 2008-08-05 08_57_58\Out\MyImage.gif'.
Of course the first thing you'll notice is the file path was not recorded in the web test. So how will this file be found, and why is it looking in that directory? When you run tests in VS, the files required to run the tests are "deployed" to a directory for the test run (see my post on deployment), in this case it is edglas_EDGLAS-LT2 2008-08-05 08_57_58. You'll get a different directory every time you run your tests. Notice this directory is available in the web test context:
Page 218
The best way to handle this in your test is to actually add the file to be uploaded to your test project, then add it as a deployment item on your test. That way it will be copied to the Out directory prior to running the test, and will be available during execution. This also has the advantage that if you are working with others, the test will also run on their machines (hard-coded full paths in a test are bad, as these tests will fail on another person's machine if their drive isn't set up the same way). Once you've added the file to your project, add it as a deployment item. There are two ways to do this: on the run config or on the test. Since this file is really associated with the test, I recommend putting it on the test. This is not discoverable at all. First, open Test View (Test menu | Windows | Test View) and select the web test. Then set the Deployment Items property for the web test's test element.
Page 219
Now add MyImage.gif as a deployment item by clicking ... next to the deployment items property. Since this is a property on the web test, the path is relative to the web test:
Page 220
Another approach is to create a folder in your project where you put all your file upload files. Then specify the relative path in the deployment item properties (relative to the web test in the project). So if my file to upload is in the FileUpload folder
Now in deployment items specify the relative path, which again is relative to the web test. Note the path in the file upload web test parameter is not relative since it will be published "flat" with no subdirectories (no changes required):
Page 221
Now the test runs successfully again. Another option is to deploy the files or entire directory using the run config settings. For the run config, go to the Deployment tab and use the Add Directory to add your folder with files to upload. Note that this path is solution relative, since the run config is in the solution directory. The <Solution Directory> macro is automatically inserted after you select the file or directory.
Page 222
Now if I go to the deployment dir using Explorer (the easiest way to find it is from the web test context parameter), I see that both of my images were deployed, and my test still runs successfully. Also, any new files I want to upload I can just drop into this folder in my solution without having to add them as deployment items to my test. Note that all files are published "flat", which means you can't have two deployment files with the same name in different folders.
Gotcha: Check Your Validation Level in the Load Test Run Settings
By default, all validation rules added to a web test are marked HIGH. By default, all load tests have a validation level of LOW. This means that NONE of the validation rules will run in a load test by default. You either need to lower the level in the web test, or raise the level in the load test.
Page 223
Gotcha: Caching of dependent requests is disabled when playing back Web Tests
Caching for all dependent requests is disabled when you are playing back a web test in Visual Studio. You will notice that if, for example, the same image file is used in multiple web pages in your web test, the image will be fetched multiple times from the web server.
This is not a Test Rig exception but a VS client exception. The resolution is to restart VS to release memory. It is fixed in 2010.
Gotcha: Timeout attribute in coded web test does not work during a load test
If you use the [Timeout()] attribute in a coded web test, it works as expected. However, if you then run that webtest inside a load test, the attribute is ignored. This is expected behavior. To set timeouts, use the request.Timeout property instead.
public class Coded01 : WebTest
{
    private RandomServerWorkTime testPlugin0 = new RandomServerWorkTime();

    [Timeout(1000)]
This attribute is ignored during a load test.
Page 224
Best Practice: considerations when creating a dynamic goal based load test plugin:
If there is a chance that the load test will be run on a test rig, be sure to limit the code to running on only one agent machine. Running the code on multiple agents will cause contention in the behavior and will yield unexpected results; you will not receive an error. The following code from a load test plugin's Initialize method will force the code to run on only one agent, and will work for rigs AND for locally run tests:
public void Initialize(LoadTest loadTest)
{
    // ONLY run this on one agent to avoid contention.
    if (loadTest.Context.AgentId == 1)
    {
        LoadTestGoalBasedLoadProfile goalLoadProfile = new LoadTestGoalBasedLoadProfile();

        // Since the heartbeat handler is hooked up inside the conditional, the event
        // will be set up on only one machine. All LoadProfile changes are sent to the
        // controller and propagated across the rig automatically.
        loadTest.Heartbeat += new EventHandler<HeartbeatEventArgs>(_loadTest_Heartbeat);
    }
}
Best Practice: Coded web tests and web test plug-ins should not block threads
http://blogs.msdn.com/billbar/archive/2007/06/13/coded-web-tests-and-web-test-plug-ins-should-not-block-the-thread.aspx
Page 225
Extensibility
New Inner-text and Select-tag rules published on Codeplex
Changed in 2010
In 2008: All of the rules in this release on CodePlex relate to the inner text of a tag. For example, for a select tag (list box and combo box), the option text is stored in inner text rather than an attribute:

<select name="myselect1">
  <option>Milk</option>
  <option>Coffee</option>
  <option selected="selected">Tea</option>
</select>

In order to extract the value of the list box, we need to parse out the inner text of the selected option. TextArea is another tag that does this, but there are also a lot of other examples in HTML where you might want to extract or validate inner text. The new project has these new rules as well as a parser for inner text and select tags:
1. ExtractionRuleInnerText
2. ExtractionRuleSelectTag
3. ValidationRuleInnerText
4. ValidationRuleSelectTag
Download location: http://codeplex.com
In 2010: Many of the features above are now built into VS 2010. Here is a list of these: http://msdn.microsoft.com/en-us/library/bb385904(VS.100).aspx
Page 226
Page 227
Page 228
Page 229
Reference the following assemblies directly in the project:
o Microsoft.VisualStudio.QualityTools.LoadTestFramework
o Microsoft.VisualStudio.QualityTools.WebTestFramework
o Any other assemblies or code you will need to do the functional work of your add-in.
Add a user control to the project (right click -> Add New -> User Control). This will house the items to be displayed on the tab. Add the necessary controls to the main user control. For my example, I needed a checkbox, a textbox and a listbox. Set the listbox Dock property to Fill. Note that when you do this, it will cause the listbox to cover the other controls. We will fix this next.
Set the margins for the main control. This will correct the size of the listbox from the previous step. Make sure the value for TOP is big enough to uncover the other controls.
Page 230
sReqName is the URL of the current request, sSize is the calculated size of the ViewState, and iTotalSize is the cumulative value. All of these properties are calculated and set in the Connect.cs code. The code here is solely for modifying the values displayed in the tab.
public void AddAValueToTheListView(string sReqName, string sSize, int iTotalSize)
{
    tbTotalSize.Text = iTotalSize.ToString();
    tbTotalSize.Update();
    ListViewItem item = new ListViewItem(new string[] { sReqName, sSize });
    listViewTagCounts.Items.Add(item);
}

private void cbShowNonViewState_CheckedChanged(object sender, EventArgs e)
{
    // Add code to handle hiding non viewstate pages
}
Page 231
particular addin. The highlighted lines need to be added to all web test addins.
public void OnConnection(object application, ext_ConnectMode connectMode, object addInInst, ref Array custom)
{
    _applicationObject = (DTE2)application;   // This method already exists. Delete
    _addInInstance = (AddIn)addInInst;
The highlighted method names correspond to the matching method definitions below.
Page 232
This is stock code. Copy all of it and simply change the user control name to whatever name you gave your control in the previous section.
//add the dictionary of open playback windows System.Diagnostics.Debug.Assert(!m_controls.ContainsKey(viewer.TestResultId)); Dictionary<Guid, UserControl> userControls = new Dictionary<Guid, UserControl>(); //add the summary Guid summaryGuid = Guid.NewGuid(); Guid responseGuid = Guid.NewGuid(); userControls.Add(responseGuid, c); m_controls.Add(viewer.TestResultId, userControls); //add tabs to playback control viewer.AddResultPage(responseGuid, "ViewState Info", c); }
The text here is the name that appears on the added tab
This is stock code. No modification is needed.

void WebTestResultViewerExt_TestCompleted(object sender, WebTestResultViewerExt.TestCompletedEventArgs e)
{
    foreach (UserControl userControl in m_controls[e.TestResultId].Values)
    {
    }
}
This is stock code. No modification is needed.

void WebTestResultViewerExt_WindowClosed(object sender, WebTestResultViewerExt.WindowClosedEventArgs e)
{
    if (m_controls.ContainsKey(e.WebTestResultViewer.TestResultId))
    {
        //process open windows
        foreach (Guid g in m_controls.Keys)
        {
            e.WebTestResultViewer.RemoveResultPage(g);
        }
        m_controls.Remove(e.WebTestResultViewer.TestResultId);
    }
}
Page 233
// Count the number of occurrences of each tag in the response
Dictionary<string, int> tagCounts = new Dictionary<string, int>(StringComparer.OrdinalIgnoreCase);
if (response != null && response.BodyBytes != null)
{
    string str1 = response.ResponseUri.ToString();
    string str2 = "No VIEWSTATE Detected";
    if (response.BodyString.Contains("__VIEWSTATE"))
    {
        //<input type="hidden" name="__VIEWSTATE" id="__VIEWSTATE" value="/wEPQUe...z+aZmiNA==" />
        int x = response.BodyString.IndexOf("id=\"__VIEWSTATE\" value=\"");
        int y = response.BodyString.IndexOf("\" />", x);
        if ((y - x - 24) > 0)
        {
            str2 = Convert.ToString(y - x - 24);
            iViewStateTotalSize = iViewStateTotalSize + (y - x - 24);
        }
    }
    // The call to my user control to populate the tab.
    userControl1.AddAValueToTheListView(str1, str2, iViewStateTotalSize);
}
}
}
}
}
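The size arithmetic above (the constant 24 is the length of the marker `id="__VIEWSTATE" value="`) can be sanity-checked in isolation. This standalone sketch mirrors that logic; the class name ViewStateSize is just for illustration:

```csharp
using System;

public static class ViewStateSize
{
    // Mirror of the addin's calculation: locate the __VIEWSTATE value between
    // the id/value marker and the closing quote, and return its length in characters.
    public static int Measure(string body)
    {
        const string marker = "id=\"__VIEWSTATE\" value=\""; // 24 characters
        int x = body.IndexOf(marker);
        if (x < 0) return 0; // no viewstate detected
        int y = body.IndexOf("\" />", x);
        int size = y - x - marker.Length; // same as the (y - x - 24) above
        return size > 0 ? size : 0;
    }
}
```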
Page 234
This is code that does the work for the addin. Here I get all of the data and then call my control to populate the tab.
void WebTestResultViewerExt_SelectionChanged(object sender, WebTestResultViewerExt.SelectionChangedEventArgs e)
{
    if (e.WebTestRequestResult != null)
    {
        foreach (UserControl userControl in m_controls[e.TestResultId].Values)
        {
            // UserControl1 is the user control you created.
            UserControl1 userControl1 = userControl as UserControl1;
            if (userControl1 != null)
            {
                WebTestResponse response = e.WebTestRequestResult.Response;
In this post I am going to talk about a new feature that can help with web test recording: extensible recorder plug-ins for modifying recorded web tests. Basically we are giving you the opportunity to modify the recorded web test after you click stop on the web test recorder bar but prior to the web test being fully saved back to the web test editor. So what problems does this help with? The main one is performing your own custom correlation. In VS 2008 we added a process which runs post recording that attempts to find dynamic fields. You can read this blog post for more information: http://blogs.msdn.com/slumley/pages/web-test-correlation-helper-feature-in-orcas.aspx This process still exists, but it does not always find all dynamic fields for an application. So if we did not find the dynamic fields in your application, you had to perform the correlation process manually. Here is a blog post that goes into detail about the manual process: http://blogs.msdn.com/slumley/pages/how-to-debug-a-web-test.aspx Also there are cases that our correlation process does not find the dynamic values, such as dynamic values in the URL. At a high level, you have to:
1) Determine what parameters are dynamic.
2) For each parameter, find the first occurrence of it in a response body.
3) Add an extraction rule to pull the value out of the response and add it to the context.
4) Modify each query string or form post parameter that needs this value by changing it to pull the value out of the context.
This new feature allows you to write your own plug-in which can perform correlation or modify the web test in many ways prior to it being saved back to the web test editor. So once you figure out that certain dynamic variables have to be correlated for each of your recordings, you can automate the process. To demonstrate how this works, I am going to write a recorder plug-in which will perform the correlation that I manually walked through in my previous post. Please quickly read that: http://blogs.msdn.com/slumley/pages/vs-2010-feature-web-test-playback-enhancements.aspx
Overview
Create the plug-in
Recorder plug-ins follow the same pattern as WebTestPlugins or WebTestRequestPlugins. To create a plug-in, you will create a class that extends WebTestRecorderPlugin and then override the PostWebTestRecording method:
Page 235
public class Class1 : WebTestRecorderPlugin
{
    public override void PostWebTestRecording(object sender, PostWebTestRecordingEventArgs e)
    {
        base.PostWebTestRecording(sender, e);
    }
}
Modify the web test
The event args give you two main objects to work with: the recorded result and the recorded web test. This allows you to iterate through the result looking for certain values and then jump to the same request in the web test to make modifications. You can also just modify the web test if you want to add a context parameter or parameterize parts of the URL. If you do modify the web test, you also need to set the RecordedWebTestModified property to true.
e.RecordedWebTestModified = true;
Deploy the plug-in After compiling the plug-in, you will need to place the dll in 1 of 2 spots:
1) Program Files\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies\WebTestRecorderPlugins
2) %USERPROFILE%\My Documents\Visual Studio 10\WebTestRecorderPlugins

Executing the plug-in
After you deploy the plug-in, you will need to restart VS for the plug-in to be picked up. Now when you create a web test, you will see a new dialog. The dialog will display all of the available plug-ins that can be executed. Select your plug-in and hit OK. Once you are done recording your web test, the plug-in will be executed.

Creating the Sample Plug-in
First a quick review of the correlation that we are going to automate. Here is the screen shot from the correlation tool after I recorded my web test against a Reporting Services site.
Page 236
We are going to correlate the ReportSession parameter.
1) Create a class library project.
2) Right click References and select Add Reference.
3) Choose Microsoft.VisualStudio.QualityTools.WebTestFramework.
4) Here is the code for my plug-in:
using System.ComponentModel;
using Microsoft.VisualStudio.TestTools.WebTesting;
using Microsoft.VisualStudio.TestTools.WebTesting.Rules;

namespace RecorderPlugins
{
    [DisplayName("Correlate ReportSession")]
    [Description("Adds extraction rule for Report Session and binds this to querystring parameters that use ReportSession")]
    public class CorrelateSessionId : WebTestRecorderPlugin
    {
        public override void PostWebTestRecording(object sender, PostWebTestRecordingEventArgs e)
        {
            //first find the session id
            bool foundId = false;
Page 237
foreach (WebTestResultUnit unit in e.RecordedWebTestResult.Children)
{
    WebTestResultPage page = unit as WebTestResultPage;
    if (page != null)
    {
        if (!foundId)
        {
            int indexOfReportSession = page.RequestResult.Response.BodyString.IndexOf("ReportSession");
            if (indexOfReportSession > -1)
            {
                //add an extraction rule to this request
                // Get the corresponding request in the declarative Web test
                ExtractionRuleReference ruleReference = new ExtractionRuleReference();
                ruleReference.Type = typeof(ExtractText);
                ruleReference.ContextParameterName = "SessionId";
                ruleReference.Properties.Add(new PluginOrRuleProperty("EndsWith", "&ControlID="));
                ruleReference.Properties.Add(new PluginOrRuleProperty("HtmlDecode", "True"));
                ruleReference.Properties.Add(new PluginOrRuleProperty("IgnoreCase", "True"));
                ruleReference.Properties.Add(new PluginOrRuleProperty("Index", "0"));
                ruleReference.Properties.Add(new PluginOrRuleProperty("Required", "True"));
                ruleReference.Properties.Add(new PluginOrRuleProperty("StartsWith", "ReportSession="));
                ruleReference.Properties.Add(new PluginOrRuleProperty("UseRegularExpression", "False"));

                WebTestRequest requestInWebTest = e.RecordedWebTest.GetItem(page.DeclarativeWebTestItemId) as WebTestRequest;
                if (requestInWebTest != null)
                {
                    requestInWebTest.ExtractionRuleReferences.Add(ruleReference);
                    e.RecordedWebTestModified = true;
                }
                foundId = true;
            }
        }
        else
        {
            //now update query string parameters
            WebTestRequest requestInWebTest = e.RecordedWebTest.GetItem(page.DeclarativeWebTestItemId) as WebTestRequest;
            if (requestInWebTest != null)
            {
                foreach (QueryStringParameter param in requestInWebTest.QueryStringParameters)
                {
                    if (param.Name.Equals("ReportSession"))
                    {
                        param.Value = "{{SessionId}}";
                    }
                }
            }
        }
Page 238
} } } } }
5) Let's review parts of the class.
a. Iterate through the result to find the first page with ReportSession. This code fragment iterates through each of the recorded objects and searches the response body for ReportSession.
foreach (WebTestResultUnit unit in e.RecordedWebTestResult.Children)
{
    WebTestResultPage page = unit as WebTestResultPage;
    if (page != null)
    {
        if (!foundId)
        {
            int indexOfReportSession = page.RequestResult.Response.BodyString.IndexOf("ReportSession");
            if (indexOfReportSession > -1)
            {
b. Now that we have found the response, we need to add an extraction rule. This code creates the extraction rule and then finds the correct request in the web test to add the extraction rule to. Each result object has a property called DeclarativeWebTestItemId, which is what we will use to get the correct request from the web test.
ExtractionRuleReference ruleReference = new ExtractionRuleReference();
ruleReference.Type = typeof(ExtractText);
ruleReference.ContextParameterName = "SessionId";
ruleReference.Properties.Add(new PluginOrRuleProperty("EndsWith", "&ControlID="));
ruleReference.Properties.Add(new PluginOrRuleProperty("HtmlDecode", "True"));
ruleReference.Properties.Add(new PluginOrRuleProperty("IgnoreCase", "True"));
ruleReference.Properties.Add(new PluginOrRuleProperty("Index", "0"));
ruleReference.Properties.Add(new PluginOrRuleProperty("Required", "True"));
ruleReference.Properties.Add(new PluginOrRuleProperty("StartsWith", "ReportSession="));
ruleReference.Properties.Add(new PluginOrRuleProperty("UseRegularExpression", "False"));

WebTestRequest requestInWebTest = e.RecordedWebTest.GetItem(page.DeclarativeWebTestItemId) as WebTestRequest;
if (requestInWebTest != null)
{
    requestInWebTest.ExtractionRuleReferences.Add(ruleReference);
    e.RecordedWebTestModified = true;
}
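The StartsWith/EndsWith/IgnoreCase behavior this rule is configured for can be illustrated with a small standalone sketch (a simplification for illustration only, not the actual ExtractText implementation):

```csharp
using System;

public static class ExtractBetween
{
    // Return the text between startsWith and endsWith in body, case-insensitively,
    // roughly as the ExtractText rule above is configured; null if not found.
    public static string Find(string body, string startsWith, string endsWith)
    {
        int s = body.IndexOf(startsWith, StringComparison.OrdinalIgnoreCase);
        if (s < 0) return null;
        s += startsWith.Length;
        int e = body.IndexOf(endsWith, s, StringComparison.OrdinalIgnoreCase);
        return e < 0 ? null : body.Substring(s, e - s);
    }
}
```

Against a response fragment like "ReportSession=abc123&ControlID=42", Find(body, "ReportSession=", "&ControlID=") yields "abc123", the value that then replaces the recorded ReportSession query string values as {{SessionId}}.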
c. Now we need to find all query string parameters that have ReportSession as the name and change the value to {{SessionId}}.
WebTestRequest requestInWebTest = e.RecordedWebTest.GetItem(page.DeclarativeWebTestItemId) as WebTestRequest;
Page 239
6) Now that we have our plug-in, compile and deploy it to one of the locations listed above.
7) Restart VS.
8) Open a test project and create a new web test. I now see the following dialog with my plug-in available:
Page 240
9) Select the plug-in.
10) Record the same web test against my Reporting Services site and click stop to end the web test.
11) Now when the correlation process runs, you will see that it does not find the ReportSession parameter. This is because we have already correlated it.
12) Now look at the first request in the web test and you will see the extraction rule.
Page 241
13) Now look at the other requests to see where we are referencing the extraction rule.
This is a slightly more advanced feature, but it provides a huge time savings for automating changes to your recorded web tests. If you have multiple people creating web tests, you can use this plug-in to make sure the same parameters or rules are added to each web test. And of course you can automate correlation of parameters or URLs which the built-in correlation tool does not find.
Page 242
Please feel free to download and use the emulator. Also, if you feel strongly enough, feel free to suggest or contribute new features.
Page 243
Page 244
Average Max and Min time taken for each page type
-i:IISW3C -recurse:-1 -Q:on "SELECT EXTRACT_EXTENSION(cs-uri-stem) AS Type, AVG(time-taken) AS Average, MAX(time-taken) AS Maximum, MIN(time-taken) AS Minimum INTO PageTimes.txt FROM ex*.log WHERE time-taken > 0 GROUP BY Type ORDER BY Average DESC"
Page 245
Pulling data from inside the body string of event viewer logs
logparser -i:evt "SELECT extract_prefix(extract_suffix(Strings,0,'left text'),0,'right text') as String INTO optimizer.txt FROM *.EVT WHERE Strings LIKE '%Optimizer Results%'" -q:ON
(variation) Pulling data from inside the body string of event viewer logs constrained by timeframe
logparser -i:evt -q:ON "SELECT Count(*) AS Qty, SUBSTR(extract_suffix(Message, 0, 'Message :'), 0, 75) as String FROM \\<machine name>\Application WHERE SourceName LIKE '%Enterprise%' AND Message LIKE '%Timestamp: %' AND TimeGenerated > TIMESTAMP ('2008-06-06 07:23:15', 'yyyy-MM-dd hh:mm:ss' ) GROUP BY String ORDER BY Qty DESC"
List of exceptions from saved event logs searching for keywords in the text output
-I:evt "SELECT QUANTIZE(TimeGenerated, 3600) AS Hour, COUNT(*) As Total, ComputerName FROM *.evt WHERE EventID = 100 AND strings like '%overflow%' GROUP BY ComputerName, hour"
Command to query Netmon file and list out data on each TCP conversation
LogParser -fMode:TCPConn -rtp:-1 "SELECT DateTime, TO_INT(TimeTaken) AS Time, DstPayloadBytes, SUBSTR(DstPayload, 0, 128) AS Start_Of_Payload INTO IE-Take2.txt FROM IE-Take2.cap WHERE DstPort=80 ORDER BY DateTime ASC" -headers:ON
Command to query Netmon and find frame numbers based on specific text in payload
LogParser -fMode:TCPIP -rtp:-1 "SELECT Frame, Payload INTO 3dvia.txt FROM 3dvia.cap WHERE DstPort=80 AND Payload LIKE '%ppContent%' " -headers:ON
Page 246
The hard part has always been getting your mouse over the right spot at the right time. <ctrl><period> is a keyboard shortcut that will drop down that box for you without requiring you to get your mouse cursor on that little spot.
Page 247
Older articles
Content-Length header not available in Web Request Object
Currently the web request header "Content-Length" is not available in the WebTestRequest object. This is expected to change in SP1.
Page 248
Bug in VS 2008 SP1 causes think time for redirected requests to be ignored in a load test
When a web test is run in a load test, any test requests that result in redirects suffer from a timing bug: any think time specified on the request is ignored. This is fixed in a post-SP1 hotfix: KB 956397 (http://support.microsoft.com/kb/956397/en-us) http://blogs.msdn.com/billbar/archive/2008/08/04/bug-in-VS-2008-sp1-causes-think-time-for-redirected-requests-to-be-ignored-in-a-load-test.aspx
Four New Methods added to the WebTestPlugin Class for 2008 SP1
http://blogs.msdn.com/billbar/pages/web-test-api-enhancements-available-in-VS-2008-sp1-beta.aspx
Page 249
Index
.NET Garbage Collection, 5, 75
AJAX, 9, 25, 46, 122, 142, 151, 155, 248
Application Domain, 14, 190
authentication, 7, 11, 89, 135, 174, 175, 177, 204, 248
Caching, 4, 5, 9, 17, 18, 20, 82, 100, 186, 224
caspol, 116
CodePlex, 3, 9, 226, 243
context, 8, 12, 13, 25, 31, 41, 45, 46, 51, 69, 71, 109, 112, 119, 135, 137, 144, 145, 150, 153, 161, 185, 186, 187, 188, 190, 192, 193, 194, 196, 197, 198, 205, 210, 218, 223, 225, 235, 236
correlation, 25, 140, 144, 151, 235, 236, 241, 242
CSV Files, 6, 111, 112
Data Collectors, 109
data source, 4, 6, 8, 16, 31, 42, 43, 44, 98, 101, 111, 112, 148, 186, 208, 212
declarative web test, 6, 8, 11, 32, 41, 45, 69, 70, 110, 113, 117, 150, 152, 184, 185, 187, 198, 206, 207, 224, 238
dependent requests, 6, 8, 9, 13, 17, 33, 37, 46, 47, 94, 108, 109, 122, 184, 186, 189, 214, 224
Deployment, 5, 73, 81, 82, 84, 89, 203, 218, 219, 220, 221, 222, 223
Execution Interleaving, 27
extract, 4, 6, 12, 25, 26, 33, 34, 35, 36, 37, 39, 41, 42, 43, 44, 51, 119, 144, 146, 148, 149, 150, 152, 153, 185, 194, 224, 226, 235, 237, 238, 239, 241, 242, 246
Fiddler, 5, 7, 72, 110, 127, 132
HIDDEN parameters, 9, 119, 185, 248
HTTP Headers, 4, 11, 48, 94, 109, 125, 178, 186, 216, 217, 246
    Content-Type, 7, 11, 129
    If-Modified-Since, 8, 186
    Pragma, 11
    Referrer, 11
    SOAPAction, 11
    x-microsoftajax, 11
Internet Explorer, 17, 72, 178, 186, 217
IP Address, 5, 77
Licensing, 4, 61, 117
Load Test Options
    Agents to Use, 68
    Delay Start Time, 68
    Disable During Warmup, 68
Logging, 6, 7, 117, 122, 153, 178, 180
lusrmgr.msc, 207
MSTest, 7, 15, 27, 109, 130, 164, 166, 170, 172, 215, 248
Network
    Firewall, 161, 164, 165, 166, 167, 172, 173
    Netmon, 110, 246
    Netstat, 118, 246
    TCP Parameters, 118
    TCPView, 202
    Tracing, 6, 7, 9, 103, 137, 138, 168, 170, 171, 176, 244
NUnit, 27
Parameter Data
    Data Source, 4, 6, 8, 16, 31, 98, 101, 111, 148, 208, 212
    Random, 4, 8, 16, 164, 165, 187
    Sequential, 4, 16, 65, 208
    Unique, 4, 8, 16, 77, 94, 105, 109, 144, 183, 187, 193, 208, 209
Parameters, 8, 13, 26, 46, 48, 49, 51, 67, 112, 118, 119, 135, 143, 144, 145, 146, 147, 148, 149, 150, 151, 153, 180, 184, 186, 187, 190, 192, 193, 194, 196, 197, 198, 205, 217, 221, 223, 235, 236, 237, 238, 239, 241, 242, 248
performance counters, 5, 7, 22, 82, 90, 92, 93, 94, 103, 114, 132, 183, 244
Performance Monitor, 5, 93, 94
Permissions, 165, 166
phishing, 113
processor, 6, 22, 63, 91, 92, 117
proxy server, 5, 8, 72, 129, 132, 134, 135, 204, 210
random, 4, 8, 16, 164, 165, 187
redirection, 8, 196
regedit, 11, 81
REGISTRY Settings
    HKEY_CURRENT_USER, 11, 171, 178, 210
    HKEY_LOCAL_MACHINE, 81, 113, 118, 172
Reporting Name, 4, 13, 52, 53, 155
RequestHeadersToRecord, 11
Results
    WebTestResult, 6, 107, 108, 109, 147
SOAP, 185
SSL
    Certificates, 8, 185, 195
    HTTPS, 6, 113, 115
    ServicePointManager, 8, 129, 195
    SecurityProtocol, 195
    SecurityProtocolType, 195
    ServicePointManager.ServerCertificateValidationCallback, 195
    SSLv3, 8, 195
    TLS, 8, 195
    X509Certificate, 185, 195
Symbols, 82
Sysinternals
Page 250
    PsTools, 202
    Sysinternals, 171
TeamTestAgentService, 165, 167, 207
test rig, 4, 5, 8, 22, 29, 30, 31, 61, 74, 76, 78, 80, 81, 82, 113, 117, 167, 169, 177, 190, 207, 224, 225
TIME_WAIT, 118
timeouts, 4, 5, 9, 15, 46, 73, 84, 92, 120, 174, 176, 224
Transactions, 6, 32, 57, 58, 99, 102, 103, 104, 105, 114
URL, 52, 67, 105, 116, 126, 185, 196, 197, 198, 211, 235, 236
validate, 7, 8, 9, 12, 13, 32, 33, 34, 35, 36, 37, 39, 40, 41, 42, 43, 44, 135, 144, 148, 152, 155, 177, 180, 188, 195, 196, 198, 223, 226
verbose logging, 180, 182, 183
VIEWSTATE, 25, 109, 180, 181, 228, 231, 233, 234, 248
Virtual User Pack, 61, 62, 63, 64, 71
VSTT 2010
    branching, 144, 152, 198
    conditional rule, 152, 153, 198, 199
    looping, 16, 144, 152, 153, 198
VSTT Classes
    FormPostHttpBody, 184
    IHttpBody, 184
    StringHttpBody, 125, 184, 207
    WebTest, 8, 32, 33, 34, 35, 36, 37, 38, 40, 41, 42, 43, 44, 45, 66, 72, 135, 188, 193, 194, 195, 196, 197, 204, 206, 210, 224
    WebProxy, 72, 204
    WebTestContext, 7, 137, 188, 193, 205
    WebTestPlugin, 9, 11, 40, 45, 186, 189, 193, 204, 224, 249
    WebTestRequest, 17, 41, 45, 94, 125, 149, 150, 185, 186, 188, 189, 196, 206, 238, 239, 248
        ClientCertificates, 185
    WebTestResponse, 188, 234
VSTT Configuration Files
    Counterset files
        DefaultCounter, 93
        DefaultCountersForAutomaticGraphs, 93
        HigherIsBetter, 91
        LoadTestCounterCategoryExistsTimeout, 92
        LoadTestCounterCategoryReadTimeout, 92
        Range, 91
        RangeGroup, 91
    QTAgent.exe.config, 75, 124, 180, 183
    QTAgentService.exe.config, 72, 182
    QTAgentServiceUI.exe.config, 180
    QTController.exe.config, 74, 92, 180
    Test Run Configuration files, 84
    Test Setting files, 84
    VSTestHost.exe.config, 74, 75, 92
VSTT Extraction Rules
    ExtractionRuleInnerText, 226
    ExtractionRuleSelectTag, 226
VSTT Methods
    Add Call to Web Test, 66
    adding a context parameter, 186, 188
    ClassCleanUp, 27, 28
    ClassInitialize, 27, 28, 98, 129, 209
    GetRequestEnumerator, 41, 188, 195
    PostRequest, 12, 33, 34, 35, 36, 37, 40, 41, 42, 43, 44, 186, 188, 189
    PostRequestEvent, 195
    StringHttpBody, 125, 184, 207
    System.Net.HttpWebRequest, 115, 125
    TestCleanUp, 28
    TestInitialize, 27, 28
    WebTestExecutionInstruction, 196, 197
VSTT Plugins
    LoadTestAborted, 13, 38
    LoadTestFinished, 13, 38
    LoadTestStarting, 13, 38, 210
    LoadTestWarmupComplete, 13, 38, 39
    PostPage, 12, 33, 34, 35, 36, 37, 40, 41, 42, 43, 44
    PostRequest, 12, 33, 34, 35, 36, 37, 40, 41, 42, 43, 44, 186, 188, 189
    PostTransaction, 12, 34, 40, 41, 43
    PostWebTest, 12, 38, 40, 41, 44, 188, 193
    PrePage, 11, 12, 33, 34, 35, 36, 37, 40, 41, 42, 43, 44
    PreRequest, 8, 12, 33, 34, 35, 36, 37, 40, 41, 42, 43, 44, 194, 195, 196, 197, 207
    PreTransaction, 11, 12, 32, 40, 41, 42
    PreWebTest, 11, 12, 32, 40, 41, 42, 45, 204, 224
    TestFinished, 13, 38, 39
    TestSelected, 13, 38, 39
    TestStarting, 13, 31, 38, 39
    ThresholdExceeded, 13, 38, 39
VSTT Properties
    All Individual Details, 54, 102, 117
    Cache Control, 17
    EventTarget, 194
    Follow Redirects, 185
    Goal Based Load Pattern, 22, 114
    Initial User Count, 22
    LoadTestMaxErrorsPerType, 74
    Lower Values Imply Higher Resource Use, 22
    MaximumUserCount, 22
    Percentage of New Users, 17, 18, 176
    ResponseBodyCaptureLimit, 45, 110, 215, 224
    Run unit tests in application domain, 4, 14
    Sample Rate, 22, 93
    Statistics Only, 102
Page 251
    Stop Adjusting User Count When Goal Achieved, 22
    Target Range for Performance Counter, 22
    Test Iterations, 4, 6, 14, 15, 18, 23, 119, 120
    Think Time, 4, 6, 8, 9, 13, 15, 23, 57, 104, 112, 155, 187, 192, 249
    TimingDetailsStorage, 6, 54, 57, 102, 117, 136, 176, 202
    Use Test Iterations, 14
    WebTestIteration, 193
VSTT Runtime
    QTAgent, 75, 119, 124, 180, 183
    QTController, 74, 81, 92, 174, 180
    VSTestHost, 74, 75, 92, 120
VSTT Settings
    Administer Test Controllers, 68, 81, 169, 207
    Analysis Comment, 9, 176, 226
    Follow Redirects, 185
VSTT Test Types
    Sequential Test Mix, 4, 65
    Web Test Composition, 66
VSTT Validation Rules
    ValidationRuleInnerText, 226
    ValidationRuleSelectTag, 226
Page 252