
'PRODUCT' Software Test Strategy Document Example

Version 1.0 (Initial Draft)
November --, 2000

Revision History

Date                 Author    Description of revisions    Version #
November --, 2000              Initial Draft               1.0
                                                           1.1

Table of Contents

Revision History
1. Introduction
   1.1 Purpose
   1.2 Software Functional Overview
   1.3 Critical Success Factors
   1.4 Software Testing Scope (TBD)
       Inclusions
       Exclusions
   1.5 Software Test Completion Criteria
2. Timeframe
3. Resources
   3.1 Software Testing Team
   3.2 Hardware Requirements
   3.3 Software Requirements
5. Application Software Testing Risks Profile
6. Software Test Approach
   6.1 Strategies
   6.2 General Test Objectives
   6.3 Application Functionality
   6.4 Application Interfaces
   6.5 Software Testing Types
       6.5.1 Stability
       6.5.2 System
       6.5.3 Software Regression Testing
       6.5.4 Installation
       6.5.5 Recovery
       6.5.6 Configuration
       6.5.7 Security
7. Business Areas for System Test
8. Software Test Preparation
   8.1 Software Test Case Development
   8.2 Test Data Setup
   8.3 Test Environment
       8.3.1 Database Restoration Strategies
9. Software Test Execution
   9.1 Software Test Execution Planning
   9.2 Software Test Execution Documentation
   9.3 Problem Reporting
10. Status Reporting
    10.1 Software Test Execution Process
    10.2 Problem Status
11. Handover for User Acceptance Test Team
12. Deliverables
13. Approvals
14. Appendixes
    14.1 Appendix A (Business Process Risk Assessment)
    14.2 Appendix B (Software Test Data Setup)
    14.3 Appendix C (Software Test Case Template)
    14.4 Appendix D (Problem Tracking Process)

1. Introduction

1.1 Purpose
This document describes the software test strategy for the 'PRODUCT' application and is intended to support the following objectives:
- Identify the existing project information and the software components that should be tested
- Identify the types of software testing to be performed
- Recommend and describe the software testing strategy to be employed
- Identify the required resources and provide an estimate of the test effort
- List the deliverables of the test project

1.2 Software Functional Overview
With the implementation of the 'PRODUCT' system, the user community will be able to manage sales contacts, turn sales contacts into sales opportunities, assign sales opportunities to sales team members, generate reports, forecast sales, etc.
The 'PRODUCT' application is a client/server system with an MS Access database (soon moving to SQL Server). It consists of the following:
1. Graphical User Interface (GUI) screens, running under Windows or NT/2000 client and Master machines in the MS Outlook 2000 environment
2. Reports produced using MS Excel and MS Word
3. E-mails ...(??)
4. Interfaces to MS Outlook 2000 and flat files for data import

1.3 Critical Success Factors
To support delivery of an application that meets its success criteria, the critical success factors for testing are:
Correctness - Assurance that the data entered, processed, and output by the application system is accurate and complete. Accuracy and completeness are achieved through control over transactions and data elements, which should commence when a transaction is originated and conclude when the transaction data has been used for its intended purpose.
File Integrity - Assurance that the data entered into the application system will be returned unaltered. The file integrity procedures ensure that the right file is used and that the data on the file, and the sequence in which the data is stored and retrieved, are correct.
Access Control - Assurance that the application prevents unauthorized access and prevents unauthorized users from destabilizing the work environment.
Scalability - Assurance that the application can handle the scaling criteria within the constraints of the performance criteria.

1.4 Software Testing Scope (TBD)
The software testing scope covered by this plan describes the activities that will cover the functions and interfaces of the 'PRODUCT' application. The following lists the specific items that are included in or excluded from the testing scope.
Inclusions:
- Opportunity Contact
- Opportunities
- Opportunity Journal
- Opportunity Comments
- Sales Setup
Exclusions:
- Outlook 2000 or other MS functionality
- Software testing under illegal hardware/software configurations

1.5 Software Test Completion Criteria

Software testing for a given release will be considered to be complete when the following conditions have been met:

Criterion: Signoff of test cases
Description: All test cases defined for the release have been reviewed by the appropriate stakeholders and signed off.

Criterion: Execution of the tests
Description: All test transactions have been executed successfully at least once.

Criterion: Closure of outstanding problems
Description: All problems found during the testing process have been reviewed and either closed or deferred by management agreement.

2. Timeframe

The critical dates are as follows:

3. Resources

3.1 Software Testing Team

Name    Position         Start Date    End Date    Days of Effort
        Test
        Tech Support
        Sales
        DBA

The Test Team staffing is based upon the following assumptions:
- Testing of the coming release is planned to be complete in ...
- System testing is planned to start after coding has been completed.
- The 'PRODUCT' version promoted into the System test environment will be properly unit and integration tested by the development team. Testers will supply a checklist for unit testing to the development team.

3.2 Hardware Requirements
- Personal computers with Pentium 233 MHz or higher: 2 clients
- RAM for Windows 95/98/WinMe: 32 MB of RAM for the operating system, plus an additional 8 MB for MS Outlook 2000 and higher
- RAM for Windows NT Workstation, Windows 2000: 48 MB of RAM for the operating system, plus an additional 8 MB for MS Outlook 2000 and higher
- 20 MB or more of available disk space for 'PRODUCT'
- The box for the database ... . I do not think that we will need a separate box; the space allocated for the Test Environment, PV, and backup could be enough.

3.3 Software Requirements
- Windows 95/98/WinMe, Windows NT version 4.0 Service Pack 3 or later, or Windows 2000
- Microsoft Outlook(r) 2000
- Microsoft Excel(r) 2000 and Word(r) 2000 for 'PRODUCT' reports; Access 2000
- Bug tracking system (TBD)

NOTE: 'PRODUCT' for Workgroups requires CDO 1.21 installed. This is on the Microsoft Office(r) 2000 CD.

5. Application Software Testing Risks Profile

Different aspects and features of 'PRODUCT' present various levels of risk, which can be used to determine the relative level of testing rigor required for each feature. In this ballpark risk analysis, the likelihood of defects is determined mainly by the complexity of the feature. Impact is determined by the critical success factors for the organization, such as dollar value and reputation.

Business Process Impact Ranking Criteria

1. High
   Dollar Value Impact: Direct or indirect (due to the loss of opportunity) loss of revenue, typically up to $ millions (or thousands) per month (or per year). Examples:
   Reputation Impact: High impact related to the loss of a client. Example:
2. Medium
   Dollar Value Impact: Typically up to $ millions (or thousands) per month (or per year). Examples:
   Reputation Impact: Major inconvenience to the customer. Example:
3. Low
   Dollar Value Impact: Typically up to $ millions (or thousands) per month (or per year). Examples:
   Reputation Impact: Minor inconvenience or no visible impact to a client. Example:

The "Business Process Risk Assessment" is in Appendix A. Based on the assessment, the following processes will receive high priority during testing:
1.

Business Process Likelihood Ranking Criteria

- Low: Feature set used by a particular company
- Medium: Used by a particular user group
- High: Core functionality that will be used by all user groups

The "Business Process Risk Assessment" is in Appendix A. Based on the likelihood, the following processes will have a high priority during testing:
1.
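
As a rough illustration of how the impact and likelihood rankings above could be combined into a single testing priority, the following Python sketch computes a simple multiplicative risk score. The process names and rankings here are hypothetical placeholders, not values from the actual assessment in Appendix A.

# Hypothetical sketch: combine impact and likelihood rankings into a
# test-priority score. Process names and rankings are illustrative only;
# the real values come from the Business Process Risk Assessment (Appendix A).

IMPACT = {"High": 3, "Medium": 2, "Low": 1}
LIKELIHOOD = {"High": 3, "Medium": 2, "Low": 1}

# (business process, dollar/reputation impact, usage likelihood)
assessment = [
    ("Managing Contacts and Opportunities", "High", "High"),
    ("Reporting", "Medium", "High"),
    ("Sales Setup", "High", "Low"),
    ("Views", "Low", "Medium"),
]

def risk_score(impact, likelihood):
    """Simple multiplicative risk model: score 1 (lowest) to 9 (highest)."""
    return IMPACT[impact] * LIKELIHOOD[likelihood]

# Sort processes so the highest-risk ones receive testing priority first.
for name, imp, like in sorted(assessment, key=lambda p: -risk_score(p[1], p[2])):
    print(f"{risk_score(imp, like)}  {name}  (impact={imp}, likelihood={like})")
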
6. Software Test Approach

6.1 Strategies
Several strategies are employed in this plan in order to manage risk and get maximum value from the time available for test preparation and execution.

6.2 General Test Objectives
- To find any bugs that have not been found in the unit and integration testing performed by the development team
- To ensure that all requirements have been met
Positive test cases designed to test for correct functions will be supplemented with negative test cases designed to find problems and to test for correct error and exception handling.

6.3 Application Functionality
All areas of the application will be tested for correct results, as documented in the project requirements document(s), supplemented with interfaces and ...

6.4 Application Interfaces
The following interfaces are included in the 'PRODUCT' test plan:
- Internal interface with Outlook 2000
- Reporting capabilities with MS Word and MS Excel
- Internal interface with MS Access to verify correct data storage and retrieval
- Text files to verify data importing capabilities (see the sketch below)
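
Since flat-file import is one of the interfaces in scope, part of the interface testing could be supported by a small data-validation pass along the following lines. The file layout and field names are assumptions for illustration only; the real import format would come from the 'PRODUCT' documentation.

import csv

# Assumed flat-file layout for illustration: comma-delimited contact records
# with a header row. Field names are hypothetical.
REQUIRED_FIELDS = ["contact_name", "company", "phone", "email"]

def validate_import_file(path):
    """Return a list of (line_number, problem) tuples for a contact import file."""
    problems = []
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        missing = [c for c in REQUIRED_FIELDS if c not in (reader.fieldnames or [])]
        if missing:
            return [(1, f"missing columns: {missing}")]
        for lineno, row in enumerate(reader, start=2):
            for field in REQUIRED_FIELDS:
                if not (row.get(field) or "").strip():
                    problems.append((lineno, f"empty required field: {field}"))
    return problems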

6.5 Software Testing Types

6.5.1 Stability
Stability testing (smoke testing or sanity checks) is intended to verify promotions into the test environment so as not to destabilize it.
Test Objective: Verify the stability of new builds before accepting them into the Test Environment.
Technique: Manually validate the new build by running a few simple tests in a separate environment. Stability testing usually runs for one or two hours.
Completion Criteria: The new build will not produce major delays for the testing group when it is ported into the Test Environment.
Special Considerations: A few questions should be asked with regard to stability testing: Should we prepare a special environment, such as PV (port verification), or run it in the Development environment? What would be the procedure if a new build is not accepted?
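
A stability check of this kind could also be partially automated. The sketch below shows what a smoke suite might look like using pytest; the product_driver module, the ProductApp class, and its methods are hypothetical stand-ins for whatever mechanism actually launches and exercises the 'PRODUCT' client.

import pytest

# Minimal smoke-test sketch: a handful of broad, shallow checks run against
# every new build before it is accepted into the Test Environment.
# "ProductApp" is a hypothetical test driver, not part of 'PRODUCT'.

@pytest.fixture
def app():
    from product_driver import ProductApp  # hypothetical test driver module
    application = ProductApp.launch()
    yield application
    application.close()

def test_application_launches(app):
    assert app.main_window_visible()

def test_create_and_open_contact(app):
    contact = app.create_contact(name="Smoke Test Contact")
    assert app.open_contact(contact.id).name == "Smoke Test Contact"

def test_report_generation_does_not_crash(app):
    app.generate_report("Opportunities by Owner")
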
6.5.2 System
System testing of the application should focus on any target requirements that can be traced directly to use cases (or business functions) and business rules. The goals of these tests are to verify proper data acceptance, processing, and retrieval, and the appropriate implementation of the business rules. This type of testing is based upon black box techniques, that is, verifying the application (and its internal processes) by interacting with the application via the GUI and analyzing the output (results). Identified below is an outline of the testing recommended for each application:
Test Objective: Ensure proper application navigation, data entry, processing, and retrieval.
Technique: Execute each use case, use case flow, or function, using valid and invalid data, to verify the following: the expected results occur when valid data is used; the appropriate error/warning messages are displayed when invalid data is used; each business rule is properly applied.
Completion Criteria: All planned tests have been executed. All identified defects have been addressed.
Special Considerations: (TBD) [Identify/describe those items or issues (internal or external) that impact the implementation and execution of the System test.]

6.5.3 Software Regression Testing
Software regression testing has three purposes. The first is to ensure that the reported problem is properly corrected. The second is to verify that the corrective actions did not introduce any additional problems. The third is to verify that new functionality promoted into the test environment did not break any previously working parts of the software. This usually means repeating a number of the tests in which the problems were originally found and running a few tests to verify the surrounding functionality.
Test Objective: Verify that the reported problems were fixed properly and that no additional problems were introduced by the fix.
Technique: Manually repeat, or develop automated scripts to repeat, the tests in which the problems were originally discovered. Run a few tests to verify the surrounding functionality.
Completion Criteria: 'PRODUCT' transactions execute successfully without failure.
Special Considerations: What is the extent of verification of the surrounding functionality?

6.5.4 Installation
Installation testing has two purposes. The first is to ensure that the software can be installed on all possible configurations (such as a new installation, an upgrade, and a complete or custom installation) and under normal and abnormal conditions. Abnormal conditions include insufficient disk space, lack of privilege to create directories, etc. The second purpose is to verify that, once installed, the software operates correctly. This usually means running a number of tests that were developed for function testing.
Test Objective: Verify and validate that the 'PRODUCT' client software properly installs onto each client under the following conditions: a new installation (a machine on which 'PRODUCT' was never installed); an update of a machine with the same 'PRODUCT' version previously installed; an update of a machine with an older 'PRODUCT' version previously installed.
Technique: Manually, or with automated scripts, validate the condition of the target machine (new: 'PRODUCT' never installed; 'PRODUCT' same version or older version already installed). Launch/perform the installation. Using a predetermined subset of the Integration or System test scripts, run the transactions.
Completion Criteria: 'PRODUCT' transactions execute successfully without failure.
Special Considerations: Which 'PRODUCT' transactions should be selected to comprise a confidence test that the 'PRODUCT' application has been successfully installed and no major software components are missing?
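
A post-installation confidence check could be scripted roughly as follows. The file paths and version value are placeholders, since the actual installed footprint of 'PRODUCT' is not specified in this document.

from pathlib import Path

# Hypothetical post-install confidence check: verify that the expected files
# landed on the client and that the installed version matches the build under
# test. Paths and version string are placeholders for illustration.

EXPECTED_FILES = [
    r"C:\Program Files\PRODUCT\product.exe",
    r"C:\Program Files\PRODUCT\product.chm",  # online help
]
EXPECTED_VERSION = "1.0.0"

def verify_installation(version_file=r"C:\Program Files\PRODUCT\version.txt"):
    missing = [f for f in EXPECTED_FILES if not Path(f).exists()]
    if missing:
        return False, f"missing files: {missing}"
    installed = Path(version_file).read_text().strip()
    if installed != EXPECTED_VERSION:
        return False, f"version mismatch: {installed} != {EXPECTED_VERSION}"
    return True, "installation verified"

if __name__ == "__main__":
    ok, message = verify_installation()
    print(message)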

6.5.5 Recovery
Failover/recovery testing ensures that an application or entire system can successfully recover from a variety of hardware, software, or network malfunctions without undue loss of data or data integrity. Failover testing ensures that, for those systems that must be kept running, when a failover condition occurs the alternate or backup systems properly "take over" for the failed system without loss of data or transactions.
Recovery testing is an antagonistic test process in which the application or system is exposed to extreme conditions (or simulated conditions) such as device I/O failures or invalid database pointers/keys. Recovery processes are invoked, and the application/system is monitored and/or inspected to verify that proper application, system, and data recovery has been achieved.
Test Objective: Verify that recovery processes (manual or automated) properly restore the database, applications, and system to a desired, known state. The following types of conditions are to be included in the testing:
- Power interruption to the client
- Power interruption to the server
- Communication interruption via network server(s)
- Interruption, communication, or power loss to DASD and/or DASD controller(s)
- Incomplete cycles (data filter processes interrupted, data synchronization processes interrupted)
- Invalid database pointers/keys
- Invalid/corrupted data elements in the database
Technique: Tests created for System testing should be used to create a series of transactions. Once the desired starting test point is reached, the following actions should be performed (or simulated) individually:
- Power interruption to the client: power the PC down.
- Power interruption to the server: simulate or initiate power-down procedures for the server.
- Interruption via network servers: simulate or initiate communication loss with the network (physically disconnect communication wires, or power down network server(s)/routers).
- Interruption, communication, or power loss to DASD and/or DASD controller(s): simulate or physically eliminate communication with one or more DASD controllers or devices.
Once the above conditions/simulated conditions are achieved, additional transactions should be executed, and upon reaching this second test point state, recovery procedures should be invoked.
Testing for incomplete cycles utilizes the same technique as described above, except that the database processes themselves should be aborted or prematurely terminated.
Testing for the remaining conditions requires that a known database state be achieved. Several database fields, pointers, and keys should be corrupted manually and directly within the database (via database tools). Additional transactions should be executed using the tests from System testing, and full cycles executed.
Completion Criteria: In all cases above, the application, database, and system should, upon completion of the recovery procedures, return to a known, desirable state. This state includes data corruption limited to the known corrupted fields, pointers/keys, and reports indicating the processes or transactions that were not completed due to interruptions.
Special Considerations: Recovery testing is highly intrusive. Procedures to disconnect cabling (simulating power or communication loss) may not be desirable or feasible. Alternative methods, such as diagnostic software tools, may be required. Resources from the Systems (or Computer Operations), Database, and Networking groups are required. These tests should be run after hours or on isolated machine(s). This may call for a separate test server.
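
One recovery condition, the incomplete cycle, can be illustrated at the database level without pulling any cables. The sketch below uses SQLite as a stand-in (the 'PRODUCT' MS Access or SQL Server database would need its own tooling) to show the expected outcome: an aborted transaction leaves the database in its prior known state.

import sqlite3

# SQLite stand-in for an interrupted-cycle test: abort a transaction midway
# and verify the database returns to its known pre-transaction state.
# The real test would target the 'PRODUCT' MS Access / SQL Server database.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE opportunities (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO opportunities (name) VALUES ('baseline record')")
conn.commit()

try:
    with conn:  # transaction scope: commits on success, rolls back on error
        conn.execute("INSERT INTO opportunities (name) VALUES ('partial record')")
        raise RuntimeError("simulated power loss mid-cycle")
except RuntimeError:
    pass  # the 'with' block rolled the transaction back on the exception

count = conn.execute("SELECT COUNT(*) FROM opportunities").fetchone()[0]
assert count == 1, "database did not return to its known state"
print("recovery check passed: only the baseline record remains")
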
6.5.6 Configuration
Configuration testing verifies the operation of the software on different software and hardware configurations. In most production environments, the particular hardware specifications for the client workstations, network connections, and database servers vary. Client workstations may have different software loaded (e.g., applications, drivers, etc.), and at any one time many different combinations may be active and using different resources.
Test Objective: Validate and verify that the 'PRODUCT' client application functions properly on the prescribed client workstations.
Technique: Use the Software Integration and System test scripts. Open/close various Microsoft applications, such as Excel and Word, either as part of the test or prior to the start of the test. Execute selected transactions to simulate user activities into and out of 'PRODUCT' and the Microsoft applications. Repeat the above process, minimizing the available conventional memory on the client.
Completion Criteria: For each combination of 'PRODUCT' and Microsoft applications, 'PRODUCT' transactions are successfully completed without failure.
Special Considerations: Which Microsoft applications are available and accessible on the clients? Which applications are typically used? What data are the applications using (e.g., a large spreadsheet opened in Excel, a 100-page document in Word)? The entire system configuration (netware, network servers, databases, etc.) should also be documented as part of this test.
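
The configuration combinations to cover can be enumerated mechanically. The sketch below builds a combination list from the operating systems in section 3.3 and a few illustrative "open application" states; the pairings themselves are assumptions, and real coverage decisions would weigh which combinations the user base actually runs.

from itertools import product

# Enumerate client configurations to cover, based on the software
# requirements in section 3.3. The axes are illustrative; real coverage
# decisions would weigh which combinations the user base actually runs.

operating_systems = ["Windows 95", "Windows 98", "Windows Me",
                     "Windows NT 4.0 SP3+", "Windows 2000"]
open_applications = ["none", "Excel 2000 (large spreadsheet)",
                     "Word 2000 (100-page document)", "Excel + Word"]

configurations = list(product(operating_systems, open_applications))
for index, (os_name, apps) in enumerate(configurations, start=1):
    print(f"Config {index:02d}: {os_name:20s} with {apps} open")
print(f"{len(configurations)} combinations total")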

6.5.7 Security
Security and access control testing focuses on two key areas of security:
- Application security, including access to the data or business functions, and
- System security, including logging into / remote access to the system.
Application security ensures that, based upon the desired security, users are restricted to specific functions or are limited in the data that is available to them. For example, everyone may be permitted to enter data and create new accounts, but only managers can delete them. If there is security at the data level, testing ensures that user "type" one can see all customer information, including financial data, whereas user type two sees only the demographic data for the same client.
System security ensures that only those users granted access to the system are capable of accessing the applications, and only through the appropriate gateways.
Test Objective: Function/Data Security: Verify that users can access only those functions and data for which their user type is granted permissions. System Security: Verify that only those users with access to the system and application(s) are permitted to access them.
Technique: Function/Data Security: Identify and list each user type and the functions and data for which each type has permissions. Create tests for each user type, and verify each permission by creating transactions specific to each user type. Modify the user type and re-run the tests for the same users. In each case, verify that the additional functions and data are correctly available or denied. System Access: see Special Considerations below.
Completion Criteria: For each known user type, the appropriate functions and data are available, and all transactions function as expected and as run in prior System tests.
Special Considerations: Access to the system must be reviewed and discussed with the appropriate network or systems administrator. This testing may not be required, as it may be a function of network or systems administration. Remote access control is under special consideration.
Performance (synchronization issue): TBD
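
The function/data security technique described above maps naturally onto a permission-matrix test. The sketch below is hypothetical: the user types, functions, and the login_as driver are illustrative assumptions, not the actual 'PRODUCT' security model.

import pytest

# Hypothetical permission matrix for Function / Data Security testing:
# each user type is tried against each function, and the observed outcome
# must match the expected permission. User types, functions, and the
# "login_as" driver are assumptions for illustration.

PERMISSIONS = {
    # (user_type, function): allowed?
    ("sales_rep", "create_account"): True,
    ("sales_rep", "delete_account"): False,
    ("manager",   "create_account"): True,
    ("manager",   "delete_account"): True,
}

@pytest.mark.parametrize("user_type,function,allowed",
                         [(u, f, a) for (u, f), a in PERMISSIONS.items()])
def test_function_security(user_type, function, allowed):
    from product_driver import login_as  # hypothetical test driver
    session = login_as(user_type)
    assert session.can_perform(function) == allowed
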
7. Business Areas for System Test
For test purposes the system will be divided into the following areas:
1. Sales Setup
2. Creating Databases
3. Getting Started - User
4. Managing Contacts and Opportunities
5. Managing the Database
6. Reporting
7. Views
8. Features
9. User Tips

8. Software Test Preparation

8.1 Software Test Case Development
Test cases will be developed based on the following:
- Online help
- Changes to the PST.doc
- Company Setup.doc
- Defining Views.doc
- Importing Contact.doc
- Installation Manual.doc
- Linking Activities.doc
- Quick Guide.doc
- Uninstalling.doc
Rather than developing detailed test cases to verify the appearance and mechanisms of the GUI during unit testing, we will develop a standard checklist to be used by developers. If the timeframe does not permit the development of detailed test scripts, with precise inputs and outputs, for testing a coming version of 'PRODUCT', then test cases (along with checklists) will be developed to a level of detail that allows a tester to understand the objectives of each test.

8.2 Test Data Setup
The test data setup and test data dependencies are described in Appendix B. (To consult with the DBA.) Test data setup may not be an issue; however, the data dependencies (what data, and from where) should be identified.

8.3 Test Environment
Tests will only be executed using known, controlled databases, in a secure testing environment. Stability testing for all new promotions will be executed in a separate environment in order not to destabilize the Test environment.

8.3.1 Database Restoration Strategies
The database will be backed up daily. Backups are to be kept for two weeks, so it should be possible to drop back to a clean version of the database if we encounter a database corruption problem during testing. (This will be more work if the database definition has changed in the interim. If the database is moved from MS Access to SQL Server, a data conversion could be run if the test database holds a lot of data.)
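
The daily backup and two-week retention described in 8.3.1 could be scripted along the following lines. The paths are placeholders, and the sketch assumes the test database is a single MS Access .mdb file that can be safely copied while no tests are running.

import shutil
import time
from pathlib import Path

# Sketch of the daily backup / two-week retention described in 8.3.1.
# Paths are placeholders; assumes the test database is a single MS Access
# .mdb file that can be copied while no tests are running.

DATABASE = Path(r"C:\TestEnv\product.mdb")
BACKUP_DIR = Path(r"C:\TestEnv\backups")
RETENTION_DAYS = 14

def backup_database():
    BACKUP_DIR.mkdir(exist_ok=True)
    stamp = time.strftime("%Y%m%d")
    shutil.copy2(DATABASE, BACKUP_DIR / f"product-{stamp}.mdb")
    # Drop backups older than the retention window.
    cutoff = time.time() - RETENTION_DAYS * 86400
    for old in BACKUP_DIR.glob("product-*.mdb"):
        if old.stat().st_mtime < cutoff:
            old.unlink()

def restore_database(stamp):
    """Drop back to a clean copy, e.g. restore_database('20001115')."""
    shutil.copy2(BACKUP_DIR / f"product-{stamp}.mdb", DATABASE)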

9. Software Test Execution

9.1 Software Test Execution Planning
The testing types (see section 6.5) will be scheduled as follows:
Stage 1 will include the Unit and Integration testing to be done by the development team. The GUI checklist will be supplied by the test team and reviewed by the appropriate stakeholders.
Stage 2 will include the Stability testing, to be done by the Test Team lead.
Stage 3 will include the System testing, to be done by the test team and supporting personnel. The System testing will mostly be approached from the business rules angle, because the unit testing will be functional.
Stage 4 will include the Installation, Configuration, Security, and other testing types described in section 6.5 of this document. These types of testing will be done by the testing team and supporting personnel.
Note: Usability testing will be done throughout the whole testing cycle, and it will concentrate on user-friendliness issues.

9.2 Software Test Execution Documentation
Testers will check off each successful step on the test sheets with the execution date, then sign and date completed sheets. Any printouts used to verify results will be annotated with the step number and attached. This documentation will be retained for inclusion in the package for handover to the UAT team at the end of the testing cycle.
For test steps that find problems, testers will note the test step number in the problem logs, and also annotate the test sheet with the problem log numbers. Once the problem has been fixed and successfully retested, the tester will update the problem log to reflect this.
The test case template is described in Appendix C.

9.3 Problem Reporting
Problem reporting will be processed using the automated bug tracking system (Name). Summary reports of outstanding problems will be produced daily and circulated as required. (Four) problem Priority and Severity levels will be used.
Screen prints, printouts of database queries, reports, tables, etc. demonstrating the problem will be attached to a hard copy of each problem log, as appropriate. The Test Lead will hand all problem logs over to the appropriate stakeholder.
A specific procedure will be developed for capturing, recording, fixing, closing, etc. the problems found during the testing process. This procedure will depend on the problem Priority and Severity levels. The appropriate actions will be designed based on the problem status at a given period of time. These are described in Appendix D.

10. Status Reporting

10.1 Software Test Execution Process
Each business area will be further subdivided into sub-business processes, down to the smallest business execution unit. The number of test cases will be calculated for each sub-business process, and the percentage of executed test cases will be tracked continuously.

10.2 Problem Status
The following metrics, in the form of graphs and reports, will be used to provide the required information on problem status:
- Weekly problem detection rates
- Weekly problem detection rates by week (diagram)
- Ratio of Priority 1-2 problems vs. total problems discovered
- Re-open / Fixed problem ratio
(TBD)
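
The problem status metrics listed in 10.2 reduce to a few simple counts and ratios over the bug tracking data. The sketch below computes them over a small illustrative extract; the field names and values are assumptions, since the actual export format of the bug tracking system is TBD.

from collections import Counter

# Sketch of the problem-status metrics from section 10.2, computed over an
# illustrative extract of problem logs. Field names and values are assumed;
# the real data would come from the bug tracking system.

problems = [
    {"week": "2000-W46", "priority": 1, "status": "Closed"},
    {"week": "2000-W46", "priority": 3, "status": "Fixed"},
    {"week": "2000-W47", "priority": 2, "status": "Re-open"},
    {"week": "2000-W47", "priority": 4, "status": "Closed"},
    {"week": "2000-W47", "priority": 5, "status": "Open"},
]

weekly_detection = Counter(p["week"] for p in problems)
high_priority = sum(1 for p in problems if p["priority"] <= 2)
reopened = sum(1 for p in problems if p["status"] == "Re-open")
fixed_or_closed = sum(1 for p in problems if p["status"] in ("Fixed", "Closed"))

print("Weekly detection rates:", dict(weekly_detection))
print(f"Priority 1-2 vs. total: {high_priority}/{len(problems)}")
print(f"Re-open / Fixed ratio: {reopened}/{fixed_or_closed}")
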
11. Handover for User Acceptance Test (UAT) Team
On System test completion, the Test Lead will hand over the tested system and all accompanying test documentation to the (stakeholder). Specific handover criteria will be developed and agreed upon.

12. Deliverables
The following documents, tools, and reports will be created during the testing process:

Deliverables        By Whom    To Whom    When
1. Test Strategy
2. Test Plan
3. Test Results

13. Approvals
The Test Strategy document must meet the approval of the appropriate stakeholders.

Title    Name    Signature    Date
1.
2.
3.

14. Appendixes

14.1 Appendix A (Business Process Risk Assessment)

14.2 Appendix B (Software Test Data Setup)

##    Function    Data Required    Data Source
1

14.3 Appendix C (Software Test Case Template)

Business Area: 01
Process name: 01.01
Test Case: 01.01.01
Test Case Prerequisites:
Tester:    Sign-off Date:    Version:

Step    Action    Date    Results    Expected Results    Pass/Log#    Retest
.01
1.1

14.4 Appendix D (Problem Tracking Process)
This document describes the bug tracking process for the 'PRODUCT' program. All problems found during testing will be logged in the XXX bug tracking system, using a single database for all participating systems. Everyone who will be testing, fixing bugs, communicating with clients, or managing teams doing either activity will be given "write" access to the database. Several different kinds of reports and graphs can be produced in XXX using its standard versions, or using the included report writer to create custom reports. Either of these can be combined with XXX's filtering capabilities to report on a selection of the data.
During testing activities, all promotions to the test environment must be associated with a problem log and agreed with the Testing Team. To avoid destabilizing the test environment unnecessarily, promotions may be scheduled to include several changes, except for problems classed as high priority (urgent), because those will hold up the testing.

The following Severity strategy will be used:
Severity 1 -
Severity 2 -
Severity 3 -
Severity 4 -
Severity 5 -

The following Priority strategy will be used:
Priority 1 - Complete crash of the system
Priority 2 - An important function does not work and there is no workaround
Priority 3 - The function does not work, but there is a workaround
Priority 4 - Partial function deficiency
Priority 5 - Nuisance
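
The status transitions used in the step-by-step process below can be captured as a small state machine, which an automated tracker could use to reject illegal status changes. The transition set here is inferred from steps 1 through 7 and is a sketch, not a definitive workflow.

# Sketch of the problem-log status transitions used in the process steps
# below. The transition set is inferred from steps 1-7; an automated tracker
# could use it to reject illegal status changes.

ALLOWED_TRANSITIONS = {
    "Open":     {"Assigned", "Pending", "Void", "Deferred"},
    "Assigned": {"Fixed"},
    "Fixed":    {"Promoted"},
    "Promoted": {"Closed", "Re-open"},
    "Re-open":  {"Assigned"},
    "Pending":  {"Assigned", "Closed"},
}

def change_status(log, new_status):
    """Apply a status change, enforcing the workflow above."""
    current = log["status"]
    if new_status not in ALLOWED_TRANSITIONS.get(current, set()):
        raise ValueError(f"illegal transition: {current} -> {new_status}")
    log["status"] = new_status
    return log

log = {"id": 42, "status": "Open"}
change_status(log, "Assigned")
change_status(log, "Fixed")
change_status(log, "Promoted")
change_status(log, "Closed")  # the normal happy path ends here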

Regular Bug Tracking Process

Step 1: Log the problem
Responsible: Problem Originator (Tester, Technical Support, Sales)
Actions:
- Open a new log in the XXX system.
- Try to reproduce the problem. If the problem is not reproducible, specify this in the problem log.
- Verify whether any duplicates of the same problem are already in the system.
- Enter a full description of the problem.
- If necessary, print the log and attach any supporting printouts.
- Assign the Priority and Severity of the problem.
- Assign the Owner as the person responsible for looking after the problem resolution (TBD).
Status: Open. Problem type: Bug.

Step 2: Evaluate the problem and initiate the problem resolution
Responsible: Development Leader
Actions:
- Review the problem.
- Review the problem Priority and Severity. If in disagreement, change the Priority and Severity and specify the reason why the Severity and/or Priority were changed.
Step 2.1: If this is a bug, assign the problem to a developer for correction. Status: Assigned. Problem type: Bug.
Step 2.2: If this is a bug, but it will not be corrected at this time due to a low Priority/Severity rating or time or resource limitations:
- Escalate for decision/agreement.
- Set the problem type as appropriate.
- Annotate the log with the recommended solution.
- Set the status to Pending.
The Development Leader remains the problem owner until the problem is re-assigned for resolution, corrected, sent to training, or closed by management decision with a Pending status assigned. Status: Pending. Problem type: Bug.
Step 2.3: If this is an environmental issue, initiate the environment correction process by assigning it to the appropriate person. Status: Assigned. Problem type: Environment Setup.
Step 2.4: If this is an Originator's error:
- Annotate the problem log with an explanation.
- Change the problem type to Non-problem, Duplicate, or Change Request.
- Get the Problem Originator's agreement.
- Set the status to Void.
Status: Void. Problem type: Non-problem, Duplicate, or Change Request.
Step 2.5: If the problem will not be corrected, but it was reported by Technical Support or Sales as a client complaint:
- Change the problem type to Training.
- Annotate the problem log with an explanation of the workaround.
- Get the Problem Originator's agreement.
- Notify sales and the technical writer about the new training issue (TBD with sales and the technical writer as to how they want to proceed from there).
- Set the status to Deferred.
- Consider the problem correction in the next release.
Status: Deferred. Problem type: Training.

Step 3: Fix the problem
Responsible: Developer
Actions:
- Fix and unit test the corrected problem.
- Update the problem log with the resolution information.
- Set the status to Fixed.
- Pass back to the problem Originator for verification.
Status: Fixed. Problem type: Bug.

Step 4: Fix the setup problem
Responsible: ? (could be a network administrator)
Actions:
- Fix and test the setup in the reported environment.
- If required, notify sales and the technical writer about the possible setup problem, with the setup correction solution.
- Update the problem log with the resolution information.
- Set the status to Fixed and redirect ownership to the problem Originator.
Status: Fixed. Problem type: Setup.

Step 5: Originator agrees with Non-problem or Duplicate
Responsible: Originator
Actions:
- Close the problem log: change the problem status to Closed.
- Update other log fields if required.
The Originator remains the problem owner after he/she closes the problem log.
Status: Closed. Problem type:

Non-problem, Duplicate, or Change Request.

Step 6: Promote the fix to the test environment
Responsible: Development Leader
Actions:
- Verify the fix.
- Promote any modified programs to the next release, and update the problem status to Promoted.
Status: Promoted. Problem type: Bug.

Step 7: Verify the fix
Responsible: Originator
Actions:
- Retest the fix.
- Update the problem log; change the status to Closed or Re-open.
- Annotate the test execution history with the retest information.
Step 7.1: If the problem is fixed, change the problem status to Closed. The Originator remains the problem owner after he/she closes the problem. Status: Closed. Problem type: Bug.
Step 7.2: If the problem is not fixed, or other problems were created as a result of the fix:
- Change the status of the problem to Re-open.
- Annotate the test execution history with the retest information.
- Redirect ownership to the Development Team Leader.
Status: Re-open. Problem type: Bug.

NOTE: Priority 1 problems will require immediate attention, because they will hold up the testing. If a Priority 1 problem is discovered, the problem resolution will follow a process structured as follows:
- The problem will be logged into the XXX system.
- A notification with a request for immediate problem resolution will be sent to the Development Team Leader.
- The problem fix will be done in the Test environment and, if necessary, promoted into the Development environment; after the fix, the retest will be done in the Test environment.

'PRODUCT' program Test Strategy, Version 1.1
Software Test Strategy Document Example
END.
Extreme Software Testing Main Page, 2000 Alex Samurin, geocities.com/xtremetesting/
