
Data Warehousing

SS ZG515
BITS Pilani
Pilani Campus

PC Reddy
Guest Faculty WILP, BITS Pilani


Data Warehousing Lecture 13


DW Project Management
Testing - A practical approach

DW Testing
Challenges of Data Warehouse Testing
Testing Goal
Testing Methodology
Testing Types
Test Stop Criteria


Challenges of Data Warehouse Testing

Data selection from multiple source systems, and the analysis that follows, poses a great challenge.

Volume and complexity of the data.

Inconsistent and redundant data in the data warehouse.

Loss of data during the ETL process.

Non-availability of a comprehensive test bed.

The data is critical for business decisions.


Testing Goal

The main aim is to check the quality of the data loaded into the warehouse (a sample check is sketched after this list).

Data completeness. Ensures that all expected data is loaded.

Data transformation. Ensures that all data is transformed correctly according to business rules and/or design specifications.

Data quality. Ensures that the ETL application correctly rejects, substitutes default values, corrects or ignores, and reports invalid data.
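A minimal sketch of such checks in SQL, assuming hypothetical table names (SRC_ORDERS as the source, DW_ORDERS as the warehouse target) and a hypothetical 'UNKNOWN' default for missing customer names; the real tables, columns and default values come from the Source to Target Matrix and the design specifications.

-- Data completeness: compare row counts between the source table and its target.
SELECT 'SRC_ORDERS' AS table_name, COUNT(*) AS row_count FROM src_orders
UNION ALL
SELECT 'DW_ORDERS' AS table_name, COUNT(*) AS row_count FROM dw_orders;

-- Data quality: count rows where the ETL substituted the assumed default
-- value ('UNKNOWN') for a missing customer name.
SELECT COUNT(*) AS defaulted_rows
FROM dw_orders
WHERE customer_name = 'UNKNOWN';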


Testing Methodology

Use of a traceability matrix to enable full test coverage of business requirements.

In-depth review of test cases.

Manipulation of test data to ensure full test coverage.

Provision of appropriate tools to speed up test execution and evaluation.

Regression testing.


Testing Types:
The following types of testing are performed for data warehousing projects.

Unit Testing

Integration Testing

Technical Shakedown Testing

System Testing

Operational Readiness Testing

User Acceptance Testing

Regression Testing


Integration testing

Its major objective is to verify the data produced and to validate the design.


Integration testing
Prerequisite:
Implementation checklist for the move from development to test.
All unit testing completed and summarized.
Migration to the test environment from the development environment.
Data available in the test environment.

Objectives:
Validate the business requirements and functional requirements.
Validate that the data follows the correct business rules, that the correct number of rows is transferred, and verify the load volumes.
Ensure mapping order is correct and dependencies among workflows are in place.


Integration testing

Validate that target tables are populated with the correct number of records.

Check for error log messages in the appropriate file.

Check that jobs restart correctly in case of failures.

Validate the execution of workflows and the data at the following stages (see the sketch after this list):

Source to Staging
Staging to ODS
ODS to Data Mart

Verify integration of new mappings with existing mappings.

Validate proper functionality of mapping variables and parameter files.

Test the individual mappings to verify the transformations, and also test at the workflow level.
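As a sketch of the stage-by-stage validation above, the queries below compare record counts across hypothetical staging, ODS and data mart tables and inspect a hypothetical reject log; the actual table names come from the Extract and Load Order document and the Source to Target Matrices.

-- Record counts at each stage of the load path (table names are hypothetical).
SELECT 'STAGING' AS stage, COUNT(*) AS row_count FROM stg_sales
UNION ALL
SELECT 'ODS' AS stage, COUNT(*) AS row_count FROM ods_sales
UNION ALL
SELECT 'DATA MART' AS stage, COUNT(*) AS row_count FROM dm_sales_fact;

-- Rows rejected during the load should be accounted for in the error/reject log.
SELECT load_date, COUNT(*) AS rejected_rows
FROM etl_reject_log
GROUP BY load_date
ORDER BY load_date;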


Integration testing
Inputs:
Project plan, business requirements document
Test cases and steps
Access to personal files on the network
Executed and approved unit test cases or peer review reports
Source to Target Matrices (STM)
Extract and Load Order document
Note: The project manager is responsible for ensuring that all the input criteria are completed by the appropriate project team member, as defined in the project Deliverables Matrix, prior to each phase of testing.


Integration testing
Environment:
Integration testing is performed in the test environment.

Tools:
Data access tools (e.g., TOAD, PL/SQL) are used to analyze the content of tables and the results of loads.
ETL tools (e.g., Informatica, DataStage).
Test management tools (e.g., Test Director, QC) that maintain and track the requirements, test cases, defects and the traceability matrix.


Integration testing
Deliverables:
Executed Integration Test Case documents, i.e., documented actual results against each
test, signed and dated by the tester(s).
Signed and approved Test Case Index & Results document which contains results of
executed Integration test scripts.
Updated Requirements Traceability Matrix


Integration testing
Test Case Index and Results:
The DW&BI team should use the Test Case Index and Results document to report the results of testing. The document tracks the following:
Test Case #: Enter a test case number in sequential outline format (e.g., 1, 1.1, 2, 2.1, 3).
Description: Provide a brief description that covers each test case instance as fully as possible.
Requirement # and Description: List each requirement number that corresponds to the listed test case number and briefly describe it.
Criticality: Provide a relative criticality ranking for each test case instance (Low, Medium, High).
Result: Indicate each test case result (Pass [test case meets acceptable criteria], Fail [test case does not meet acceptable criteria], Hold [test case requires additional data for the result to be determined]).
Fail Description Reference SPR #: For each failed test case, list the assigned Software Process Report (SPR) # and briefly describe what caused the failure.
Robot / SQL Script Name: Indicate the assigned SQL script name, as applicable.


Technical Shakedown Test


A Technical Shakedown Test will be conducted prior to System Testing.
Objective: verify that
Software has been configured correctly (including the Informatica architecture, source system connectivity and Business Objects).
All the code has been migrated to the QA environment correctly.
All required connectivity between systems is in place.


System Testing

System Testing is performed to prove that the system meets the Functional Specifications from an end-to-end perspective.
The testing team verifies that the data in the source system databases and the data in the target are consistent throughout the process.


System Testing
Prerequisite:
Finalized Implementation Checklist

All integration testing should be completed


Migration from the Test environment to the QA environment, as applicable
Production configuration and data available

Input:
Project plan, business requirements document
System Test Cases and steps
Updated Operations Manual
Signed and approved integration Test Case Index, Test Case documents, and scripts


System Testing
Objectives:
Verify the QA environment is an exact replica of Production prior to running the system
test
Run end-to-end system test starting from the source databases to target and verify the
data output.
Record initialization and incremental load statistics
Verify functionality of the system meets the business specifications
Verify error handling and reconciliation processes are functioning properly
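A possible reconciliation sketch for the last objective, assuming a hypothetical source transaction table and target fact table that share a sale_amount measure; the real measures, grain and date filters come from the functional specifications.

-- Reconciliation: compare row counts and an aggregate measure between source and target.
SELECT 'SOURCE' AS side, COUNT(*) AS row_count, SUM(sale_amount) AS total_amount
FROM src_transactions
WHERE transaction_date >= DATE '2024-01-01'
UNION ALL
SELECT 'TARGET' AS side, COUNT(*) AS row_count, SUM(sale_amount) AS total_amount
FROM dw_sales_fact
WHERE transaction_date >= DATE '2024-01-01';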


System Testing
Environment: System testing is performed in the QA environment.

Tools:
Data access tools (e.g., TOAD, PL/SQL) are used to analyze the content of tables and the results of loads.
ETL tools (e.g., Informatica, DataStage).
Test management tools (e.g., Test Director, QC) that maintain and track the requirements, test cases, defects and the traceability matrix.

Data:
Production replicated data


System Testing
Deliverables:
Executed System Test Cases, i.e., documented actual results against each test, signed
and dated by the tester(s)
Signed and approved Test Case Index & Results document which contains results of
executed system test scripts
Requirements Traceability Matrix
A summary report


UAT
User Acceptance Testing:
The objective of this testing is to ensure that the system meets the expectations of the business users.
It aims to prove that the entire system operates effectively in a production environment and that the system successfully supports the business processes from a user's perspective.
The tests also cover functions that involve source system connectivity, job scheduling and business report functionality.


ORT and Deployment test


Operational Readiness Testing (ORT):
This is the final phase of testing, which focuses on verifying the deployment of the software and the operational readiness of the application.

Deployment Test
Tests the deployment of the solution.
Tests the overall technical deployment checklist and timeframes.
Tests the security aspects of the system, including user authentication and authorization, and user-access levels.
Tests the operability of the system, including job control and scheduling.


Regression Test
Regression Testing:
Performed after a reported defect is fixed by the developer.
Performed when a Change Request is implemented on an existing production system.

Inputs:
Impact analysis workbook prepared by the developer.
Test Result Report of the System Integration Test, if the Change Request is implemented on an existing production system.


Test Stop Criteria


Reaching deadlines, e.g., release deadlines or testing deadlines.
Test cases completed with a certain percentage passed.
The test budget has been depleted.
Coverage of code or requirements reaches a specified point.
The defect rate falls below a certain level.

Testing stops when further testing becomes unproductive (the number of errors found per person per day drops).


DW testing vs OLTP testing


User-triggered vs. system-triggered.

Back-end testing (systems team) vs. front-end testing (users).

Batch vs. online gratification.

Challenge to maintain user interest.

Volume of test data.

Possible scenarios/test cases.

You can never fully test a DW!

Special scripts are needed to validate test results: pre-transformation to post-transformation comparison scripts and data quality validation scripts (see the sketch after this list).
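A minimal pre-transformation to post-transformation comparison sketch, assuming a hypothetical mapping rule in which country_code is trimmed and upper-cased on its way into the warehouse; the compared columns, join keys and rules must be taken from the actual mapping specification.

-- Flag target rows whose value does not match the transformation rule
-- applied to the corresponding source value.
SELECT s.order_id,
       s.country_code AS source_value,
       t.country_code AS target_value
FROM src_orders s
JOIN dw_orders t ON t.order_id = s.order_id
WHERE t.country_code <> UPPER(TRIM(s.country_code));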


DB testing vs DW testing

DB Testing:
Smaller scale of data.
Data is consistently injected from uniform sources.
Focus on create, read, update, delete (CRUD) operations.
A normalized DB is used in typical DB testing.

DW Testing:
A large volume of data is involved in testing.
Data comes from different sources.
Most of the testing is focused on Read, with limited testing of Update/Delete.
A denormalized DB is used.

ETL Testing techniques


Verify that data is transformed correctly according to the various business requirements and rules.

Make sure that all projected data is loaded into the data warehouse without any data loss or truncation.

Make sure that the ETL application appropriately rejects, replaces with default values, and reports invalid data.

Make sure that data is loaded into the data warehouse within the prescribed and expected time frames, to confirm improved performance and scalability (see the sketch after this list).
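As an illustration of the invalid-data and load-window checks above, the queries below use hypothetical example rules (a non-null customer key, a non-negative amount, a four-hour load window) against hypothetical fact and audit tables; substitute the project's actual business rules and agreed time frames.

-- Rows that violate the example business rules and should have been
-- rejected or replaced with defaults by the ETL application.
SELECT order_id, customer_key, sale_amount
FROM dw_sales_fact
WHERE customer_key IS NULL
   OR sale_amount < 0;

-- Loads that exceeded the example four-hour window, based on a hypothetical
-- ETL load-audit table populated by the workflows.
SELECT load_name, start_time, end_time
FROM etl_load_audit
WHERE end_time > start_time + INTERVAL '4' HOUR;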


ETL Testing Challenges


- Incompatible and duplicate data.
- Loss of data during the ETL process.
- Unavailability of an inclusive test bed.
- Testers have no privileges to execute ETL jobs on their own.
- The volume and complexity of the data are very large.
- Faults in business processes and procedures.
- Trouble acquiring and building test data.
- Missing business flow information.

Data is important for businesses to make critical business decisions. ETL testing plays a significant role in validating and ensuring that the business information is accurate, consistent and reliable. It also minimizes the risk of data loss in production.

Questions?

