
MANUAL TESTING

SDLC phases: requirements, planning, design, coding, testing, delivery, maintenance.

Testing: the process of verifying that we are developing the right system and validating that the developed system works as expected (software testing = verification + validation).

BRS: Business Requirement Specification / SRS: System Requirement Specification.

Software: a set of computer programs and the minimal data required to run a system.

Project: a software application developed for a specific customer's requirements.

Product: a software application developed for the requirements of multiple customers.

Early testing: conducting testing as early as possible in the development life cycle to find problems at early stages.

Verification: the process of verifying that we are developing the right system; also called static testing.

Validation: the process of validating that the implemented code and application work as expected; also called dynamic testing.

Error: an incorrect human action that introduces a problem into the software.

Defect/bug/fault: a deviation from expected behavior to actual behavior, identified during testing.

Failure: a deviation from expected behavior to actual behavior, identified by the end user in operation.

Why do software applications have defects? Incorrect requirements, wrong design, poor coding, complex business logic, complex technology, work pressure.

Most common defects: incorrect functionality, incorrect data edits, poor performance, poor security, incompatibility, poor user interface (UI), poor usability.

Testing techniques: static testing, white box testing, black box testing, gray box testing.

Exhaustive testing: testing a functionality with all possible combinations of valid and invalid inputs.
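The note on exhaustive testing can be made concrete with a short sketch. The three-field login form and its sample values below are hypothetical, chosen only to show how the number of combinations multiplies with every field:

```python
from itertools import product

# Hypothetical form with three fields, each reduced to a handful of
# representative valid and invalid values. Exhaustive testing means trying
# every combination, which grows multiplicatively with each field.
usernames = ["alice", "", "a" * 256]   # valid, empty, too long
passwords = ["S3cret!", "", "short"]   # valid, empty, too short
remember = [True, False]

combinations = list(product(usernames, passwords, remember))
print(len(combinations))  # 3 * 3 * 2 = 18 combinations even for this tiny form
```

With realistic value ranges the count explodes, which is why exhaustive testing is impractical and techniques like ECP and BVA are used to pick a minimal representative subset.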
RISK-BASED TESTING: identifying the functionalities most critical to the end user's business, prioritizing the tests (deciding what to test first, what next, and what last), and executing the tests in that order.

Defect clustering: defect clustering says a small number of modules may contain most of the defects, so identify such risky modules and concentrate more testing on them.

TQM: Total Quality Management.

Software Development Life Cycle (SDLC): an SDLC model explains how the various implementation activities are carried out while developing a software system. Based on the size of the project, the time available (e.g. only 3 months), cost, and resources, management (the project manager) decides the appropriate model for implementing the project.

SDLC models fall into two categories:
1. Sequential models: waterfall model, V model
2. Iterative models: RAD model, prototype model, spiral model, agile model

V model (verification and validation): the V model was introduced to eliminate the drawback of the waterfall model; testing is associated with every stage of software development so that defects are found as soon as possible.

RAD model: a big project is divided into modules. Every module is treated as a separate project, and dedicated teams implement all the modules in parallel. Once all modules are constructed, they are combined and delivered to the customer.

Prototype model: preferred when the customer requirements are not clear. A prototype is developed and demonstrated to the customer to get early feedback.

Spiral model: recommended for maintenance projects with dynamic, frequently changing requirements. As the customer's business demands, the project is implemented requirement by requirement during maintenance.

Agile model: a big project is divided into modules, every module is divided into sub-modules, and the sub-modules are implemented and released to the customer one by one for acceptance testing, to get early feedback.

Static testing: verifying that we are developing the right system, with the help of reviews and walkthroughs.

Review: examining a project-related or process-related work item, e.g. cross-checking requirements, design, code, and test cases.

Walkthrough: a knowledge-transfer session conducted by domain experts to give the team a common understanding of a subject.
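The risk-based prioritization described above can be sketched as ordering tests by a risk score. The score formula (likelihood x impact) and the sample test names and numbers below are illustrative assumptions, not part of the original notes:

```python
# Illustrative sketch: order test cases by risk = likelihood * impact,
# then execute highest-risk tests first. All names/scores are hypothetical.
tests = [
    {"name": "payment processing", "likelihood": 4, "impact": 5},
    {"name": "report footer text", "likelihood": 2, "impact": 1},
    {"name": "user login", "likelihood": 3, "impact": 5},
]

# What to test first, next, and last, in decreasing risk order.
ordered = sorted(tests, key=lambda t: t["likelihood"] * t["impact"], reverse=True)
for t in ordered:
    print(t["name"])
```

If time runs out, the lowest-risk tests at the tail of the list are the ones skipped, which is the whole point of testing in risk order.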
Types of reviews: management review, technical/code review, formal review, informal review.

White box testing: testing conducted on the source code by developers to check whether the developed source code works as expected. White box testing = unit testing + integration testing.

STUB: a simulated program that replaces a called program.

DRIVER: a simulated program that replaces a calling program.

Black box testing: testing conducted on the application by testers, from the end user's perspective, to check whether the application works as per the customer's business requirements. System testing and user acceptance testing are collectively called black box testing.

Need for BBT:
- White box testing is conducted from a technical perspective; black box testing is required to check the application from the end user's perspective.
- Non-functional requirements are not validated in WBT, but they are also very important to the end user's business.
- WBT is conducted by developers with a positive approach, so there is a chance of bugs being left behind; BBT is required to find those defects. In WBT, developers may hesitate to find defects in their own code, whereas in BBT testers work with the intention of finding defects, resulting in a quality system.
- WBT is conducted to ensure code coverage, whereas BBT is conducted to ensure requirements coverage.

System testing: validating the functional and non-functional requirements of the system. System testing = functional testing + non-functional testing.

Functional testing = positive testing + negative testing.

Positive testing: testing a functionality with a positive perspective, to check what the system is supposed to do. Example: entering a valid username and password and clicking Submit to check how login works for a valid operation.

Negative testing: entering an invalid username and/or an invalid password and clicking Submit to check how login behaves for an invalid operation.

Types of testing in functional system testing:

Smoke testing / sanity testing: smoke testing and sanity testing are almost one and the same: a quick test carried out on the given application to determine whether the build is stable enough for detailed testing. If this check is conducted in the testing environment before system testing, it is called smoke testing; if the same check is conducted in the production (live) environment before acceptance testing, it is called sanity testing.

Formal testing: testing a software application by following all pre-planned procedures and proper documentation.

Informal / ad-hoc testing: testing an application without following any systematic procedure, i.e. as we wish.

Exploratory testing (ET): exploring the application, understanding its functionalities, and then testing it.

Monkey / gorilla / rattle / zigzag testing: testing an application unevenly, with the intention of finding tricky defects.

Retesting: testing a functionality repetitively, i.e. again and again.

Regression testing (IMP): re-running or re-executing selected test cases for the dependent functionalities on the modified build. Regression testing is helpful for identifying side effects.

End-to-end testing: testing conducted on the completed application to check the overall functionality of the system, including the data flow among all modules.

TYPES OF TESTING IN NON-FUNCTIONAL SYSTEM TESTING:

User interface (UI) / graphical user interface (GUI) testing: validating whether the user interfaces are professionally designed.

Usability testing: validating whether the application is easily understandable and operable by all types of users.

Performance testing: analyzing various efficiency characteristics of a system such as load, response time, data transfer rate, transactions per minute, transactions per second.

Load testing: validating how the application behaves under various load conditions.
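The stub and positive/negative testing ideas above can be combined in one minimal sketch, assuming Python's built-in `unittest`. The `login` function and the stubbed user store are hypothetical, invented only for illustration:

```python
import unittest

# Hypothetical unit under test: login() normally calls a real user store;
# here the store is passed in, so a stub can replace the called component.
def login(username, password, user_store):
    return user_store.get(username) == password and password != ""

# STUB: a simulated stand-in for the real (called) user database.
stub_user_store = {"alice": "S3cret!"}

class LoginTests(unittest.TestCase):
    def test_positive_valid_credentials(self):
        # Positive testing: check what the system is supposed to do.
        self.assertTrue(login("alice", "S3cret!", stub_user_store))

    def test_negative_invalid_password(self):
        # Negative testing: check behavior for an invalid operation.
        self.assertFalse(login("alice", "wrong", stub_user_store))

    def test_negative_unknown_user(self):
        self.assertFalse(login("mallory", "S3cret!", stub_user_store))

# Build and run the suite explicitly, acting here as a simple DRIVER
# (calling program) for the unit under test.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(LoginTests)
result = unittest.TextTestRunner().run(suite)
```

The test class is at the same time a driver in miniature: it is the calling program exercising `login`, while `stub_user_store` stands in for the called dependency.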

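The selective re-execution that defines regression testing can be sketched as picking only the test cases mapped to functionalities that depend on a modified module. The module names and mapping below are hypothetical:

```python
# Hypothetical mapping from modules to the test cases covering the
# functionalities that depend on them. Regression testing re-runs only
# the tests for modules affected by the change, not the whole suite.
dependent_tests = {
    "login": ["TC_01", "TC_02"],
    "payments": ["TC_03", "TC_04", "TC_05"],
    "reports": ["TC_06"],
}

def select_regression_suite(modified_modules):
    """Select the test cases whose functionality depends on a change."""
    suite = []
    for module in modified_modules:
        suite.extend(dependent_tests.get(module, []))
    return suite

print(select_regression_suite(["payments"]))  # ['TC_03', 'TC_04', 'TC_05']
```

Running this selected subset on the modified build is what surfaces side effects without the cost of re-executing everything.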
Security testing: validating that all security conditions are properly implemented in the system.

AUTHORIZATION TESTING: checking whether the system provides for creating login accounts and setting and changing permissions.

Other non-functional types: recovery testing, globalization testing, localization testing, installation testing, uninstallation testing, compatibility testing (with a checklist for compatibility testing).

User acceptance testing: testing conducted on the completed application by domain experts or end users, to determine whether the application is fit for live production. Includes alpha testing and beta testing.

BBT test design techniques: equivalence class partitioning (ECP), boundary value analysis (BVA), decision table testing, state transition testing, use case testing. These techniques help define the minimum, best set of test cases that still validate the system with 100% requirement coverage.

GRAY BOX TESTING: a combination of white box testing and black box testing. If you interact with both the structural and non-structural components of a system to validate specific scenarios, it is called gray box testing.

SOFTWARE TESTING LIFE CYCLE: test planning, test analysis, test design, test execution, test closure.

Test planning: the first phase of testing, where the project manager / test manager defines the high-level plan and approach of testing, and the test lead prepares a detailed work plan.

Test strategy: a high-level management plan and approach that provides sufficient confidence in the project being tested.
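Boundary value analysis (BVA), listed above among the BBT techniques, can be sketched for a hypothetical age field that accepts 18 to 60: instead of all values, the test inputs are the boundaries and the values just inside and just outside them:

```python
# Hypothetical validator for an age field accepting 18..60 inclusive.
def is_valid_age(age):
    return 18 <= age <= 60

MIN, MAX = 18, 60

# BVA picks values at and around each boundary, where off-by-one bugs live,
# instead of exhaustively testing every valid age.
boundary_cases = {
    MIN - 1: False,  # just below the lower boundary
    MIN: True,       # lower boundary itself
    MIN + 1: True,   # just above the lower boundary
    MAX - 1: True,   # just below the upper boundary
    MAX: True,       # upper boundary itself
    MAX + 1: False,  # just above the upper boundary
}

for age, expected in boundary_cases.items():
    assert is_valid_age(age) == expected, age
print("all boundary cases passed")
```

Equivalence class partitioning is the complementary step: 18..60 forms one valid class, and anything below 18 or above 60 forms the invalid classes, each represented by a single value.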
It is defined by the PM or the TM.

Test plan: a detailed work plan document derived from the test strategy by the test lead; it explains the detailed activities to be carried out. Objectives of a test plan: scope of testing, approach to be followed, resources, schedules, and deliverables.

Test analysis: in this phase test engineers study the various requirement specifications, such as the BRS and SRS, to understand the customer's business requirements. If any questions or doubts arise while analyzing the requirements, they are recorded in a requirement clarification note.

Test design: in this phase test engineers document test scenarios, test cases, test data, and the requirement traceability matrix, which help in testing the application easily and effectively.

Test scenario: an item, functionality, or feature to be tested.

Test case: a set of preconditions, steps to be followed, input data, and expected behavior to validate a functionality in the system. In simple words, a test case is a brief description of what to test and how to test it.

Traceability: the ability to identify the batch of test cases that belongs to a requirement.

Traceability matrix: the mapping between test cases and requirements.

Test case execution process: we arrange the test cases in priority order and execute them in that order (priority-based testing). One test case may have multiple test steps, and we execute the steps in order. If all steps pass, the final test case result is pass; if any one step fails, the final test case result is fail.

Reproducible: if a defect occurs every time, it is a reproducible defect; if a bug does not occur every time, it is a non-reproducible defect.

Classification of defect priority: very high, high, medium, low.

Software configuration management: Common repository (storage): a centralized computer system (server) where you store and manage all project resources such as the BRS, SRS, test plan, test scenarios, test cases, builds, and defect reports. Use of a common repository: since software testing is a team activity, it is used to share resources among the team and to track project status. Software configuration management includes project resource management (BRS/SRS) and change control management / version control (ver1, ver2, ...).

BRS: Business Requirement Specification / SRS: System Requirement Specification
RCN: Requirement Clarification Note / RTM: Requirement Traceability Matrix
ISO 9001, 9002: quality/product standards / ISO 14001: environmental standards
HLD: High-Level Design / LLD: Low-Level Design / FRS: Functional Requirement Specification
ROI: Return On Investment / AUT: Application Under Test

Testing-related web sites: stickyminds.com, onestoptesting.com
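The requirement traceability matrix (RTM) described above is just a mapping from requirements to the test cases that cover them. A minimal sketch, with hypothetical requirement and test-case IDs:

```python
# Hypothetical RTM: each requirement maps to the test cases covering it.
rtm = {
    "REQ_001": ["TC_01", "TC_02"],
    "REQ_002": ["TC_03"],
    "REQ_003": [],  # no coverage yet -- a gap the RTM makes visible
}

# Traceability: find the batch of test cases belonging to a requirement.
print(rtm["REQ_001"])

# Coverage check: list requirements with no mapped test cases.
uncovered = [req for req, test_cases in rtm.items() if not test_cases]
print(uncovered)  # ['REQ_003']
```

In practice the RTM is usually a spreadsheet or a test-management-tool report, but the underlying structure is this same mapping, and the uncovered-requirements query is how 100% requirement coverage is checked.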

AUTOMATION

Automation testing: making the manual testing process automatic. In other words, automation testing is the process of identifying the functionalities that need to be retested frequently, developing automation tests, and executing those tests against the modified versions.

Benefits of automation testing: faster, accurate (reliable), repeatable, reusable, comprehensive.

Automation testing approach: automation testing can be carried out in the following 3 ways:
1. Programming languages: C, C++, Java, .NET, etc.
2. Scripting languages: VBScript, JavaScript, Python, Ruby, Perl
3. Testing tools: QTP, WinRunner, RFT, SilkTest, LoadRunner, Selenium, etc.

Classification of automation testing tools:
1. Functional testing: QTP, WinRunner, SilkTest, Selenium (open source), RFT, TestPartner, TestComplete
2. Performance testing: LoadRunner, Silk Performer, RPT, WebLOAD, QALoad, OpenSTA, JMeter (open source)
3. Test management: Quality Center, Test Manager, SilkCentral
4. Bug reporting tools: Quality Center, Bugzilla, BugIt, PR-Tracker, Mantis, Issue Manager

Tool selection criteria: cost, features offered by the tool, technology supported by the tool, environment supported by the tool, user friendliness, performance, portability, maintainability, vendor support.

Automation testing life cycle: test planning, analyzing the application under test, setting up the test environment, preparing basic tests, enhancing the tests, debugging the tests, running the tests, analyzing test results, bug reporting.

Working with the Add-In Manager: an add-in acts as an interface between QTP and your application. An add-in is a library that contains a set of predefined objects, each able to recognize a particular type of control in your application. If the appropriate add-in is not loaded, QTP will not identify the objects properly, so we may not be able to automate all operations on the controls. By default QTP comes with the following add-ins free:
1. ActiveX
2. Visual Basic
3. Web (HTML)

Unit Testing: This initial stage in testing is normally carried out by the developer who wrote the code, and sometimes by a peer, using the white box testing technique.

Integration Testing: This stage is carried out in two modes: as a complete package or as an increment to the earlier package. Most of the time the black box testing technique is used; however, sometimes a combination of black and white box testing is also used in this stage.

System Testing: In this stage the software is tested from all possible dimensions, for all intended purposes and platforms. In this stage the black box testing technique is normally used.

User Acceptance Testing: This testing stage is carried out in order to get customer sign-off on the finished product. A 'pass' in this stage also ensures that the customer has accepted the software and is ready to use it.
