
TESTDIRECTOR 5.0
Developed by Mercury Interactive. A test management tool, used to store testing documents in a database. It works as a client/server application.

[Diagram: the Project Administrator and TestDirector components both connect to a project database: MS Access (default), Oracle, SQL Server, or Sybase.]

I. Project Administrator:
This part is used by the test lead to create a database for a new project's testing documents and to estimate the test status of ongoing projects.
a) Create Database: Navigation: Start → Programs → TestDirector 5.0 → Project Administrator → login as test lead → Project menu → New Project → enter project name → specify location (Private / Common) → Create → click OK.
b) Estimate Test Status: Start → Programs → TestDirector 5.0 → Project Administrator → login as test lead → select project name in the list → click Connect icon → click the expand symbol in front of the project name → select the required table name in the list → extend the SELECT statement if required → click Run SQL.
Note: In general, TestDirector maintains 26 tables and 9 views per project to store that project's testing documents.

II. TESTDIRECTOR: This part is used by test engineers to store their job-level documents in the database specified by the test lead. Navigation:

Start → Programs → TestDirector 5.0 → TestDirector → select project name → login as test engineer → Plan Tests (test design) → Run Tests (test execution) → Track Defects (test reporting).
1. Plan Tests: In general, a test engineer's job starts with test case selection, after completing the required training on the project requirements. If an organisation uses TestDirector for test management, test engineers use this Plan Tests part to store test cases.
a) Create Subject: Plan Tests → click New (under Folder) → enter folder name (subject / module) → click OK.
b) Create Sub-Subject: Plan Tests → select subject name → click Folder New → enter sub-subject name → click OK.
c) Create Test Case: Plan Tests → select subject → select sub-subject → click Test New → select test type (Manual / WinRunner Automated) → enter test case name → click OK.
d) Details: After creating a test case, the test engineer enters the required details for it, such as test case ID, test suite ID, priority, test environment, test duration, test setup, and test case pass/fail criteria. We can maintain these details in the given text area.

e) Design Steps: After completing the details entry for a test case, test engineers create a step-by-step procedure to execute that test case. Navigation: Plan Tests → select subject name → select sub-subject name → select test case → Design Steps → click Step New → enter step description with expected state → click New to create more steps, up to the end → click Close.
f) Test Script:

After receiving a stable build from development, test engineers prepare automated test scripts for the selected test cases. Navigation: Plan Tests → select subject → select sub-subject → select automated test case → click Test Script → click Launch → set the application in the base state for that test → click Start Recording → record the business operations with checkpoints w.r.t. the manual procedure → click Stop Recording → click Save → close WinRunner → click Refresh to get the test script in the TestDirector window.
g) Attachments: After entering details and design steps for every test case, test engineers maintain attachments if required. Example: test data files (.xls, .txt). Navigation: Plan Tests → select subject → select sub-subject → select test case → select Attachments → click File / Web → browse path → click OK.
2. Run Tests: After completing test design and possible test automation, TestDirector provides a facility to execute the tests, manually or as automation.
a) Create Batch: Navigation: Run Tests → click Test Set Builder → click New → enter test suite name / batch name → click OK → select the required dependent tests into a batch → click OK.
b) Execute Automated Tests: Navigation: Run Tests → select automated tests in the batch w.r.t. order → click Automated → click Run → Tools menu → Open → browse executed test → analyse results manually → close WinRunner → specify status as Passed / Failed → click Close.
c) Execute Manual Tests: Navigation: Run Tests → select manual tests in the batch w.r.t. order → click Manual → click Start Run → execute every step and specify the step status as Passed / Failed → click Close after executing the last step.
3. Track Defects:

During test execution, test engineers report mismatches as defects to developers. For this defect reporting, TestDirector provides a facility similar to the IEEE defect report. Navigation: Track Defects → click Add → fill in the fields of the defect report → click Create → click OK → click Close → click Mail → enter the To address → click OK.
Note: For the above defect submission, TestDirector requires an Internet connection or local mailing such as Microsoft Outlook.

TestDirector Icons:


1. Filter: We can use this icon to select specific tests and defects in a list. Navigation: click Filter icon → specify filter condition → click OK. Note: To delete an existing filter, we can use the Clear icon.
2. Sort: To arrange an existing list of defects or tests, we can use the Sort icon (ascending by default / descending). Navigation: click Sort icon → select the required field to sort → select the sort direction → click OK.
3. Report Icon: To create a hard copy of defect reports, we can use this icon. Navigation: click Report icon → select type of report (Complete Information / Tabular) → select printout type (Portrait / Landscape) → click OK → click Print icon → click Close.
4. Test Grid: This grid provides a list of all test cases under all sub-subjects and subjects.
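The Filter and Sort icons above apply a field condition and an ordering to a defect (or test) list. A minimal sketch of the same two operations in Python; the defect fields below are illustrative, not TestDirector's actual schema:

```python
# Hypothetical model of TestDirector's Filter and Sort icons applied to a
# defect list: filter by a field condition, then order by a field.

def filter_defects(defects, field, value):
    """Keep only defects whose `field` equals `value` (the Filter icon)."""
    return [d for d in defects if d.get(field) == value]

def sort_defects(defects, field, descending=False):
    """Arrange defects by `field` (the Sort icon; ascending by default)."""
    return sorted(defects, key=lambda d: d[field], reverse=descending)

defects = [
    {"id": 3, "severity": "High", "status": "Open"},
    {"id": 1, "severity": "Low",  "status": "Open"},
    {"id": 2, "severity": "High", "status": "Closed"},
]

# Filter twice: open defects, then only the high-severity ones.
open_high = filter_defects(filter_defects(defects, "status", "Open"),
                           "severity", "High")
ordered = sort_defects(defects, "id")  # ascending by id
```

Chaining two filters, as above, corresponds to specifying two conditions in the filter dialog.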

ADVANCED TESTING PROCESS


Testing process: Test Planning → Test Design → Test Execution → (Master Build) → Load Testing → Test Reporting → Test Closure. In the above advanced testing process, the testing team applies load testing on the master build. A master build is a build that provides correct functionality. The testing team applies load testing manually or through automation. Manual load testing is expensive, because it needs a huge multi-user environment. For this reason, organisations maintain automated load tests. Examples: LoadRunner, SQA LoadTest, Silk Performer, and JMeter.

LOADRUNNER 6.0
Developed by Mercury Interactive. A load testing tool, used to estimate performance under load virtually. It supports client/server, web, ERP, SAP, and legacy technologies (C, C++, COBOL, etc.).
Virtual Environment:

[Diagram: a client machine running the Controller Scenario, Port Mapper, Remote Command Launcher (RCL), and Virtual User Generator sends virtual requests through the application layer (AL), transport layer (TL), network layer (NL), and network interface card (NIC) to the server.]
Customer-expected configuration + 1.5 KB extra RAM per 1 virtual user.
AL: Application layer (foreground operation).
TL: Transport layer (background operation on the local host).
NL: Network layer / Internet layer (to connect to a remote host using an IP address).
NIC: Network interface card (to physically establish the connection).
RCL: Remote Command Launcher, to convert a local request into a remote request.
Vuser Gen: Virtual User Generator, to make multiple virtual requests from a single real request.
Port Mapper: To submit all virtual requests to a single server process port.
Controller Scenario: To return performance results while the server process executes to respond to the virtual requests.
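The sizing rule above says the load generator needs the customer-expected configuration plus 1.5 KB of extra RAM per virtual user. A trivial sketch of that arithmetic (the 1.5 KB figure is taken from these notes; real Vuser memory footprints vary by protocol and tool version):

```python
# Extra RAM needed on the load-generator machine, per the rule in the notes
# above: 1.5 KB per virtual user on top of the base configuration.

EXTRA_KB_PER_VUSER = 1.5  # figure stated in the notes; treat as illustrative

def extra_ram_kb(vusers):
    """Extra RAM in KB for the given number of virtual users."""
    return vusers * EXTRA_KB_PER_VUSER

print(extra_ram_kb(100))  # 150.0 KB extra for 100 Vusers
```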

I. Client/Server Load Testing: LoadRunner allows you to conduct load testing on two-tier applications using the components below.
1. Customer-expected configured server computer
2. Two-tier application build
3. Remote Command Launcher
4. Port Mapper
5. Virtual User Generator
6. Controller Scenario
7. Database server
Test Cases: Test engineers prepare test cases for client/server load testing depending on the background operations of the application build: Insert, Delete, Update, Select, Commit, Rollback.

Test Cases (Scenario): Any operation on the front end performs one of these six operations on the back end.

Time Parameters: To estimate the performance of a client/server application, we can use two time parameters.
a) Elapsed Time:
Elapsed Time = Request Transmission + Server Process + Response Transmission
[Diagram: the client transmits the request to the server, the server processes it, and the response is transmitted back to the client.]
Elapsed time is also known as process turn-around time or swing time.
b) Response Time:
Response Time = Request Transmission + Acknowledgement Time
[Diagram: the client receives an acknowledgement (ACK) from the server before the full response transmission.]
The time to get the first response from the server is called response time. Response time is a part of elapsed time. Examples: 1. Time to start a file download.

2. Time to activate the printer. Navigation: Assume a customer-accepted configured server. Start menu → Programs → LoadRunner → Remote Command Launcher. Start menu → Programs → corresponding database server (e.g. Oracle, SQL Server, Quadbase, Sybase, MySQL, etc.).
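The two time parameters defined above can be sketched directly from their formulas. Inputs are the component times in milliseconds; the names are illustrative, not LoadRunner API calls:

```python
# The two client/server time parameters from the notes above, as arithmetic.

def elapsed_time(request_tx, server_process, response_tx):
    """Elapsed Time = Request Transmission + Server Process + Response Transmission
    (a.k.a. process turn-around time / swing time)."""
    return request_tx + server_process + response_tx

def response_time(request_tx, ack_time):
    """Response Time = Request Transmission + Acknowledgement Time
    (time to get the first response from the server)."""
    return request_tx + ack_time

e = elapsed_time(20, 50, 30)  # 100 ms total
r = response_time(20, 5)      # 25 ms to first response
assert r < e                  # response time is a part of elapsed time
```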

Start menu → Programs → LoadRunner → Virtual User Generator → File menu → click New → select Client/Server → select database type (ODBC by default) → click OK → specify the path of the project (c:\PF\MI\LR\Samples\bin\flights.exe) → specify the working directory (C:\Windows\Temp) → select the section to record into (vuser_init / Actions / vuser_end) → click OK → record the business operations for one user → click Stop Recording → Tools menu → Create Controller Scenario → save the script → enter the number of Vusers → click OK → click the Run icon.
Transaction Point: LoadRunner returns performance results when you enclose your required operation as a transaction. Select a position in Actions → Insert menu → Start Transaction → enter transaction name → click OK → select a position at the end of the operation → Insert menu → End Transaction → click OK.
Rendezvous Point: It is an interrupt point. It stops the current Vuser's execution until the remaining Vusers execute up to that point. Navigation: select a position on top of the transaction (Action part) → Insert menu → Rendezvous → enter name → click OK.
Analyze Results: In general, LoadRunner returns performance results as a percentile (%) graph.
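The rendezvous point described above holds each Vuser until all of them reach it, so the transaction that follows starts under full concurrency. A minimal simulation with Python threads and a barrier (this is a concept sketch, not LoadRunner itself):

```python
# Simulating a rendezvous point: every "Vuser" thread blocks at the barrier
# until all VUSERS threads have arrived, then all proceed together.
import threading

VUSERS = 5
rendezvous = threading.Barrier(VUSERS)
released_together = []

def vuser(uid):
    # ... earlier vuser_init / action steps would run here ...
    rendezvous.wait()              # the rendezvous: wait for all Vusers
    released_together.append(uid)  # transaction body starts for all at once

threads = [threading.Thread(target=vuser, args=(i,)) for i in range(VUSERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

After the barrier releases, every Vuser has recorded its arrival, which is why the load on the transaction is simultaneous rather than staggered.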

[Graph: response time (Y axis) vs. percentage of work completed (X axis), from 0% to 100%.]
Average Response Time (ART) = (time at 100% work completion − time at 0% work starting) / Load
Increase Load: Navigation: Group menu → Add Vusers → specify the quantity to add → click OK.
Peak Load: When the application server rejects a user request, the load at that state is called peak load.
Submission of Performance Results: During load test execution, test engineers report performance results to project management in the table format below.

Scenario                   Load / Scale      Average Response Time (msec)
Select, Insert, Delete,    10                6
etc.                       15                8
                           20                10
                           25                11.5
                           :                 :
                           up to peak load

Benchmark Testing: After receiving performance results from the testing team, project management decides whether that performance is good or bad. In this review, project management depends on the issues below:
Performance results of old versions
Interest of customer-site people
Performance results of competitive products in the market
Interest of the product manager
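The submission table above can be held as a mapping from load level to measured ART, and scanned for the first load level that breaks an acceptable limit. The limit and data below are illustrative; the real peak load is the load at which the server starts rejecting requests, not a response-time threshold:

```python
# Load level -> average response time (msec), taken from the table above.
results = {10: 6.0, 15: 8.0, 20: 10.0, 25: 11.5}

def first_load_exceeding(results, limit_msec):
    """Smallest load whose ART crosses `limit_msec` (a crude proxy used for
    review; the true peak load is where the server rejects requests)."""
    for load in sorted(results):
        if results[load] > limit_msec:
            return load
    return None

print(first_load_exceeding(results, 10.0))  # -> 25
```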

If the performance of the application is bad, developers concentrate on changes in the coding structure and sometimes motivate customer-site people to improve the capability of the configuration.
Variant Operations Load Testing: LoadRunner allows you to run different operations under different loads to estimate performance. Navigation: Group menu → Add Group → enter group name → specify number of Vusers → browse script name → click OK.
Note:
1. For variant operations load testing, test engineers use the same rendezvous point names in the corresponding Vuser scripts.
2. LoadRunner maintains 30 seconds as the maximum time gap between every two consecutive groups.
3. LoadRunner allows a maximum of 25 Vusers per group.
II. WEB Load Testing: LoadRunner also allows you to conduct load testing on three-tier applications. In this load testing, test engineers use the components below.
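Given the 25-Vusers-per-group limit stated in the note above, a requested Vuser count has to be split across several groups. A hypothetical helper sketching that split (the cap comes from these notes; it is not a documented LoadRunner API):

```python
# Split a requested Vuser count into groups of at most 25, per the limit
# stated in the notes above.

MAX_VUSERS_PER_GROUP = 25  # limit from the notes

def split_into_groups(total_vusers, cap=MAX_VUSERS_PER_GROUP):
    """Return a list of group sizes, each at most `cap`, covering the total."""
    groups = []
    while total_vusers > 0:
        take = min(cap, total_vusers)
        groups.append(take)
        total_vusers -= take
    return groups

print(split_into_groups(60))  # [25, 25, 10]
```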

1. Customer-expected configured server computer
2. Web master build
3. Remote Command Launcher
4. Port Mapper
5. Virtual User Generator
6. Controller Scenario
7. Browser (IE / Netscape)
8. Web server
9. Database server

Test Cases:
[Diagram: the browser talks to the web server over TCP/IP, and the web server talks to the DB server through a DSN.]
Load test cases: URL open, text link, image link, form data submission, and data submission.

Time Parameters: In web load testing, test engineers use two extra time parameters.
a) Hits/sec (HPS): The number of web requests received by the web server in one second.
b) Throughput: The speed at which the web server responds to those requests, in KB per second (KBps).
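The two web parameters above can be computed from a simple request log. Each entry below is a (arrival_second, response_bytes) pair; the log data is illustrative:

```python
# Hits/sec and throughput (KBps) computed from a toy web-server request log.

def hits_per_sec(log):
    """Hits/sec: count of web requests received, grouped by arrival second."""
    counts = {}
    for sec, _size in log:
        counts[sec] = counts.get(sec, 0) + 1
    return counts

def throughput_kbps(log, window_secs):
    """Throughput: KB the web server returned per second over the window."""
    total_bytes = sum(size for _sec, size in log)
    return total_bytes / 1024 / window_secs

log = [(0, 2048), (0, 1024), (1, 4096), (1, 1024)]
hits = hits_per_sec(log)        # {0: 2, 1: 2} -> 2 hits in each second
kbps = throughput_kbps(log, 2)  # 8192 bytes over 2 s -> 4.0 KBps
```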

[Graph: hits/sec and throughput plotted over time.]
Note:
i. For web load testing, test engineers use E-Business Web (HTTP/HTML) as the Vuser type.
ii. By default, LoadRunner treats one action as one transaction.
1. URL Open: It emulates opening the home page of a web site under load. Navigation: select a position in Actions → Insert menu → New Step → select URL → click OK → enter step name → enter URL → enter target frame if required → click OK.
web_url("step name", "URL=*****", "TargetFrame=***", LAST);
Analyze Results: To estimate the performance of a web application, test engineers use two special graphs. Navigation: Graphs menu → Web Server Resource Graphs → select Hits/sec or Throughput.
a) Hits/sec:
[Graph: hits/sec (Y axis) vs. elapsed time (X axis).]
b) Throughput:
[Graph: throughput (Y axis) vs. elapsed time (X axis).]

2. Text Link: It emulates opening a middle page under load through a text link.

web_link("link text", "URL=*****", "TargetFrame=***", LAST);
3. Image Link: It emulates opening a middle page under load through an image link.
web_image("image file name", "URL=path of next page", "TargetFrame=***", LAST);
4. Form Submission: It emulates submitting form data to the web server under load.
web_submit_form("form name", attributes / hidden fields, ITEMDATA, field values, ENDITEM, LAST);

[Diagram of a login form submission: (1) the browser submits UID, PWD, and system date to login.asp on the web server over TCP/IP; (2) the web server sends an SQL request to the DB server through a DSN; (3) the DB server returns the SQL response; (4) the web server returns the next page to the browser.]

Syntax: web_submit_form("login", "Method=GET", "Action=http://localhost/vdir/login.asp", "sysdate", ITEMDATA, "UID=xxx", "PWD=xxx", "sysdate=xxx", ENDITEM, LAST);
5. Data Submission:

It emulates submitting form-less or context-less data to the web server under load. Examples: auto login, auto transaction, auto commit (save), auto rollback, etc. Syntax: web_submit_data("step name", attributes, ITEMDATA, field values, ENDITEM, LAST);
Benchmarking: According to World Wide Web Consortium standards, a quality web site takes 3 seconds for a link operation under normal load, and operations like form submissions take 12 seconds to complete.
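The benchmark above gives two limits: about 3 seconds for a link operation and about 12 seconds for a form submission under normal load. A small sketch that classifies measured step times against those limits (the thresholds come from these notes; treat them as rules of thumb rather than normative values):

```python
# Classify a measured step time against the benchmark limits stated above.

LINK_LIMIT_SEC = 3.0   # link operation limit, from the notes
FORM_LIMIT_SEC = 12.0  # form-submission limit, from the notes

def meets_benchmark(step_kind, seconds):
    """True if the measured time is within the limit for its step kind
    ("link" for link operations, anything else treated as a form step)."""
    limit = LINK_LIMIT_SEC if step_kind == "link" else FORM_LIMIT_SEC
    return seconds <= limit

print(meets_benchmark("link", 2.4))   # True: under 3 s
print(meets_benchmark("form", 14.0))  # False: over 12 s
```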