SS ZG515
BITS Pilani
Pilani Campus
PC Reddy
Guest Faculty, WILP, BITS Pilani
DW Testing
Challenges of Data Warehouse Testing
Testing Goal
Testing Methodology
Testing Types
Test Stop Criteria
Challenges of Data Warehouse Testing:
Data selection from multiple source systems, and the analysis that follows, pose a great challenge.
Setting the testing goal.
Choosing the testing methodology.
Provisioning appropriate tools to speed up the process of test execution and evaluation.
Regression testing.
Testing Types:
The following types of testing are performed for data warehousing projects:
Unit Testing
Integration Testing
System Testing
Regression Testing
Integration Testing
Its major objective is to verify the data produced and to validate the design.
Prerequisites:
Implementation checklist for the move from development to test.
All unit testing completed and summarized.
Migration from the development environment to the test environment completed.
Data available in the test environment.
Objectives:
Validate the business requirements and functional requirements.
Validate the data against the business rules, confirm that the correct number of rows is transferred, and verify load volumes (see the row-count sketch after this list).
Ensure the mapping order is correct and dependencies among workflows are in place.
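As an illustration, a minimal row-count reconciliation could be run with a data access tool; STG_CUSTOMER and DIM_CUSTOMER are hypothetical source and target table names, not names from this course.

-- Compare row counts between a source staging table and its target.
-- Table names are placeholders for illustration only.
SELECT s.src_count,
       t.tgt_count,
       s.src_count - t.tgt_count AS diff
FROM   (SELECT COUNT(*) AS src_count FROM stg_customer) s,
       (SELECT COUNT(*) AS tgt_count FROM dim_customer) t;
-- A non-zero diff means rows were dropped or duplicated during the load.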
Testing covers the individual mappings, to verify the transformations, and also the workflow level; a mapping-level check is sketched below.
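A sketch of one such mapping-level check, assuming a hypothetical rule from the STM that FULL_NAME in the target is derived by concatenating FIRST_NAME and LAST_NAME in the source:

-- List rows where the target column does not match the transformation
-- rule defined in the Source to Target Matrix (names are illustrative).
SELECT s.customer_id
FROM   stg_customer s
JOIN   dim_customer d ON d.customer_id = s.customer_id
WHERE  d.full_name <> s.first_name || ' ' || s.last_name;
-- Any rows returned indicate a defect in this mapping.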
Inputs:
Project Plan, Business requirements document
Test cases and steps
Access to personal files on the network
Executed and approved unit test cases or peer review reports
Source to Target Matrices (STM)
Extract and Load Order document
Note: The project manager is responsible for ensuring all the input criteria are completed by the appropriate project team members, as defined in the project Deliverables Matrix, prior to each phase of testing.
Environment:
Integration testing is performed in the test environment.
Tools:
Data access tools (e.g., TOAD, PL/SQL Developer) are used to analyze the content of tables and the results of loads.
ETL tools (e.g., Informatica, DataStage).
A test management tool (e.g., Test Director, QC) that maintains and tracks the requirements, test cases, defects, and traceability matrix.
Deliverables:
Executed Integration Test Case documents, i.e., documented actual results against each
test, signed and dated by the tester(s).
Signed and approved Test Case Index & Results document, which contains the results of executed integration test scripts.
Updated Requirements Traceability Matrix
Test Case Index and Results:
The DW&BI team should use the Test Case Index and Results document to report the results of testing. The document tracks the following fields:
Test Case #: Enter a test case number in sequential outline format (e.g., 1, 1.1, 2, 2.1, 3).
Description: Provide a brief description that covers each test case instance as fully as
possible.
Requirement # and Description: List each requirement number that corresponds to the listed
test case number and briefly describe.
Criticality: Provide a relative criticality ranking for each test case instance (Low, Medium,
High).
Result: Indicate each test case result: Pass (test case meets acceptance criteria), Fail (test case does not meet acceptance criteria), or Hold (test case requires additional data for the result to be determined).
Fail Description Reference SPR #: For each failed test case, list the assigned Software Process Report (SPR) # and briefly describe what caused the failure.
Robot / SQL Script Name: Indicate the assigned SQL script name, as applicable; a sample script is sketched below.
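For illustration, a script recorded under "Robot / SQL Script Name" might be as simple as a uniqueness check on a surrogate key; DIM_CUSTOMER and CUSTOMER_KEY are assumed names:

-- The test case passes only when this query returns no rows, i.e. the
-- surrogate key is unique in the target dimension table.
SELECT customer_key, COUNT(*) AS occurrences
FROM   dim_customer
GROUP  BY customer_key
HAVING COUNT(*) > 1;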
System Testing
System Testing is performed to prove that the system meets the Functional Specifications from an end-to-end perspective.
The testing team will verify that the data in the source system databases and the data in the target are consistent throughout the process.
Prerequisite:
Finalized Implementation Checklist
Input:
Project Plan, Business requirements document
System Test Cases and steps
Updated Operations Manual
Signed and approved Integration Test Case Index, Test Case documents, and scripts
Objectives:
Verify the QA environment is an exact replica of production prior to running the system test.
Run an end-to-end system test starting from the source databases through to the target, and verify the data output.
Record initialization and incremental load statistics.
Verify the functionality of the system meets the business specifications.
Verify error handling and reconciliation processes are functioning properly (a reconciliation sketch follows this list).
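A reconciliation sketch, assuming an Oracle environment (the deck mentions TOAD and PL/SQL) and hypothetical staging, fact, and error tables:

-- Every extracted row should be either loaded or rejected:
-- extracted = loaded + rejected.
SELECT (SELECT COUNT(*) FROM stg_sales)  AS extracted,
       (SELECT COUNT(*) FROM fact_sales) AS loaded,
       (SELECT COUNT(*) FROM err_sales)  AS rejected
FROM   dual;
-- If the equality does not hold, the error-handling path is not
-- accounting for every source row.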
Environment: System testing is performed in the QA environment.
Tools:
Data access tools (e.g., TOAD, PL/SQL Developer) are used to analyze the content of tables and the results of loads.
ETL tools (e.g., Informatica, DataStage).
A test management tool (e.g., Test Director, QC) that maintains and tracks the requirements, test cases, defects, and traceability matrix.
Data:
Production replicated data
Deliverables:
Executed System Test Cases, i.e., documented actual results against each test, signed
and dated by the tester(s)
Signed and approved Test Case Index & Results document, which contains the results of executed system test scripts.
Requirements Traceability Matrix
A summary report
User Acceptance Testing (UAT):
The objective of this testing is to ensure that the system meets the expectations of the business users.
It aims to prove that the entire system operates effectively in a production environment
and that the system successfully supports the business processes from a user's
perspective.
The tests will also include functions that involve source system connectivity, job scheduling, and business report functionality.
Deployment Test
Tests the deployment of the solution.
Tests the overall technical deployment checklist and timeframes.
Tests the security aspects of the system, including user authentication, authorization, and user-access levels (an access-level check is sketched below).
Tests the operability of the system, including job control and scheduling.
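One way to test user-access levels, assuming an Oracle QA database and a hypothetical reporting role (querying DBA_TAB_PRIVS requires DBA-level access):

-- List object privileges granted to the reporting role so they can be
-- compared against the approved access matrix.
SELECT grantee, table_name, privilege
FROM   dba_tab_privs
WHERE  grantee = 'BI_REPORT_ROLE'
ORDER  BY table_name, privilege;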
Regression Testing:
Performed after a reported defect is fixed by the developer.
Performed when a Change Request is implemented on an existing production system; a baseline comparison is sketched below.
Test stop criterion: testing stops when it becomes unproductive, i.e., the number of errors found per person per day keeps reducing.
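A minimal regression sketch, assuming a baseline snapshot of a target table was taken before the fix or Change Request was applied (table names are hypothetical; MINUS is Oracle's set-difference operator, EXCEPT in ANSI SQL):

-- Rows present now but not in the baseline; run the reverse direction
-- as well to catch rows that disappeared.
SELECT * FROM dim_product
MINUS
SELECT * FROM dim_product_baseline;
-- Only the changes expected from the fix should appear.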
DB Testing vs DW Testing
DB Testing
DW Testing
Questions?