Informatica PowerCenter
Applies to:
Informatica PowerCenter
Summary
This article describes different kinds of testing approaches using Informatica PowerCenter.
Author Bio
Author(s): Sukumar Balasubramanian
Company: CIBC
Created on: March 23, 2010
Sukumar Balasubramanian is an experienced Informatica ETL consultant working with CIBC, Canada. He
has good exposure to data integration and data warehousing projects. He is also a key contributor to the
Informatica-l group on ITtoolbox.
Table of Contents
Introduction
Unit Testing
  Quantitative Testing
  Qualitative Testing
Integration Testing
  Count Validation
  Dimensional Analysis
  Statistical Analysis
  Data Quality Validation
  Granularity
  Other validations
UAT (User Acceptance Test)
Testing Architecture
  Unsecured
  Secured
  Informatica Data Subset
Testing Processes
Informatica PowerCenter Testing
Disclaimer and Liability notice
Introduction
This article covers the following:
• Different testing types: unit testing, integration testing, and UAT
• Testing architecture and processes
• Testing tools available with Informatica
• Testing facilities available with Informatica PowerCenter
Unit Testing
Unit testing can be broadly classified into two categories: quantitative testing and qualitative testing.
Quantitative Testing
a) Write customized SQL queries to check the sources and targets; here we perform record
count verification.
b) Analyze the rejections and build a process to handle them. This requires a clear
requirement from the business on how to handle data rejections: do we need to reload, or
reject and inform? Appropriate discussions are required, and a suitable process must be
developed.
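The record-count verification in (a) can be sketched as a small script. The connection (an in-memory SQLite database) and the table names `src_orders`/`tgt_orders` are placeholder assumptions standing in for your real source and target databases:

```python
import sqlite3  # stand-in for your actual source/target database drivers

def record_count(conn, table):
    """Run a customized SQL query to return the row count for a table."""
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

# Demo: an in-memory database standing in for source and target.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_orders (id INTEGER)")
conn.execute("CREATE TABLE tgt_orders (id INTEGER)")
conn.executemany("INSERT INTO src_orders VALUES (?)", [(i,) for i in range(5)])
conn.executemany("INSERT INTO tgt_orders VALUES (?)", [(i,) for i in range(5)])

src, tgt = record_count(conn, "src_orders"), record_count(conn, "tgt_orders")
assert src == tgt, f"Count mismatch: source={src}, target={tgt}"
```

In practice the source and target counts would come from two different connections, and any mismatch would feed into the rejection-handling process described in (b).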
Performance Improvement
a) Network Performance
b) Session Performance
c) Database Performance
d) Analyze and if required define the Informatica and DB partitioning requirements.
Qualitative Testing
Analyze and validate your transformation business rules. This is more of a functional test.
a) Review field by field from source to target and ensure that the required
transformation logic is applied.
b) If you are making changes to existing mappings, make use of the data lineage feature
available with Informatica PowerCenter. This will help you find the consequences of
altering or deleting a port from an existing mapping.
c) Ensure that appropriate dimension lookups have been used and your development is in
sync with your business requirements.
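The field-by-field review described above can be automated as a comparison harness. Everything below is an illustrative assumption: the transformation rule (trim and upper-case a name), the field names, and the sample rows are not from the article:

```python
def apply_rules(src_row):
    """Illustrative transformation rule: trim and upper-case the name."""
    return {"cust_id": src_row["cust_id"],
            "cust_name": src_row["cust_name"].strip().upper()}

source = [{"cust_id": 1, "cust_name": " alice "}]
target = [{"cust_id": 1, "cust_name": "ALICE"}]

# Apply the documented rule to each source row, then compare the result
# field by field against the row actually loaded into the target.
mismatches = []
for s, t in zip(source, target):
    expected = apply_rules(s)
    for field in expected:
        if expected[field] != t[field]:
            mismatches.append((s["cust_id"], field, expected[field], t[field]))

assert not mismatches, f"Transformation logic not applied: {mismatches}"
```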
Integration Testing
After unit testing is complete, it forms the basis for starting integration testing. Integration testing should
cover both the initial and the incremental loading of the data warehouse.
Integration testing covers end-to-end testing for the DWH. The tests include the following:
Count Validation
Record Count Verification: DWH backend/Reporting queries against source and target as an
initial check.
Control totals: To ensure accuracy in data entry and processing, the system can compare
control totals with manually entered or otherwise calculated totals, using data fields such as
quantities, line items, documents, or dollars, or simple record counts.
Hash totals: This is a technique for improving data accuracy, whereby totals are obtained on
identifier fields (i.e., fields for which it would logically be meaningless to construct a total), such
as account number, social security number, part number, or employee number. These totals
have no significance other than for internal system control purposes.
Limit checks: The program tests specified data fields against defined high or low value limits
(e.g., quantities or dollars) for acceptability before further processing.
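The control total, hash total, and limit check above can be sketched in a few lines. The field names (`acct_no`, `qty`, `amount`) and the limit values are assumptions for illustration:

```python
records = [
    {"acct_no": 1001, "qty": 10, "amount": 250.0},
    {"acct_no": 1002, "qty": 5,  "amount": 75.5},
]

# Control total: sum a meaningful field (quantities) plus a record count.
control_total = sum(r["qty"] for r in records)
record_count = len(records)

# Hash total: sum an identifier field for which a total is otherwise
# meaningless (account numbers), purely as an internal control value.
hash_total = sum(r["acct_no"] for r in records)

# Limit check: flag rows whose amount falls outside defined bounds.
LOW, HIGH = 0.0, 10000.0
out_of_limits = [r for r in records if not (LOW <= r["amount"] <= HIGH)]

assert control_total == 15 and record_count == 2
assert hash_total == 2003
assert out_of_limits == []
```

Recomputing the same totals on the target side after the load, and comparing, gives a cheap end-to-end accuracy check.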
Dimensional Analysis
Statistical Analysis
o When you validate calculations, you do not need to load all the rows into the target to
validate them.
o Instead, use the Enable Test Load feature available in Informatica PowerCenter.
Property: Enable Test Load
Description: You can configure the Integration Service to perform a test load.
With a test load, the Integration Service reads and transforms data without writing to targets.
The Integration Service generates all session files, and performs all pre- and post-session
functions, as if running the full session.
The Integration Service writes data to relational targets, but rolls back the data when the
session completes. For all other target types, such as flat file and SAP BW, the Integration
Service does not write data to the targets.
Enter the number of source rows you want to test in the Number of Rows to Test field.
You cannot perform a test load on sessions using XML sources.
You can perform a test load for relational targets when you configure a session for normal
mode. If you configure the session for bulk mode, the session fails.

Property: Number of Rows to Test
Description: Enter the number of source rows you want the Integration Service to test load.
The Integration Service reads the number you configure for the test load.
Overflow checks: This is a limit check based on the capacity of a data field or data file area to
accept data. This programming technique can be used to detect the truncation of a financial or
quantity data field value after computation (e.g., addition, multiplication, and division). Usually,
the first digit is the one lost.
Format checks: These are used to determine that data are entered in the proper mode, as
numeric or alphabetical characters, within designated fields of information. The proper mode in
each case depends on the data field definition.
Sign test: This is a test for a numeric data field containing a designation of an algebraic sign, +
or - , which can be used to denote, for example, debits or credits for financial data fields.
Size test: This test checks the full size of a data field. For example, a social
security number in the United States should have nine digits.
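The overflow, format, sign, and size tests above can be expressed as small validation functions. The nine-digit SSN rule comes from the article; the remaining field layouts and limits are assumptions:

```python
def overflow_check(value, max_digits):
    """Detect a value too large for its field after computation."""
    return len(str(abs(int(value)))) <= max_digits

def format_check(value, mode):
    """Verify the field is in the proper mode: numeric or alphabetic."""
    return value.isdigit() if mode == "numeric" else value.isalpha()

def sign_test(value):
    """Require an explicit algebraic sign, + or -, e.g. debit/credit."""
    return value[0] in "+-"

def size_test(ssn):
    """A U.S. social security number should have nine digits."""
    return ssn.isdigit() and len(ssn) == 9

assert overflow_check(99999, 5) and not overflow_check(100000, 5)
assert format_check("12345", "numeric") and format_check("abc", "alpha")
assert sign_test("-250.00") and not sign_test("250.00")
assert size_test("123456789") and not size_test("12345")
```

In an ETL context, rows failing any of these checks would typically be routed to a reject file or error table rather than loaded.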
Granularity
Other validations
Audit Trails, Transaction Logs, Error Logs and Validity checks.
Note: Based on your project and business needs you might have additional testing requirements.
UAT (User Acceptance Test)
In this phase you involve the users to test the end results and ensure that the business is satisfied
with the quality of the data.
Any changes to the business requirements will follow the change management process, and
eventually those changes have to follow the SDLC process.
Testing Architecture
1. Unsecured
2. Secured
Unsecured
Even now, many organizations opt for an unsecured testing architecture because it requires a small budget
and little maintenance.
Assume you have a sales data warehouse and must implement a change request for which you need one
year's worth of data from production.
In this case you would develop a mapping to read the data from the production warehouse, load it into
development, and proceed with development.
In other words, a developer can see the production data as it is. Some organizations perform data masking
before bringing the data from production into the UAT or development environment.
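The data masking mentioned above can be sketched as below. The column names and the hash-based masking scheme are assumptions for illustration, not a prescribed Informatica feature (Informatica's own products provide masking capabilities):

```python
import hashlib

def mask_value(value, salt="dev-refresh"):
    """Replace a sensitive value with a deterministic, irreversible token."""
    digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
    return digest[:12]

SENSITIVE = {"cust_name", "ssn"}  # assumed sensitive columns

def mask_row(row):
    """Mask only the sensitive columns; pass everything else through."""
    return {k: (mask_value(v) if k in SENSITIVE else v) for k, v in row.items()}

prod_row = {"cust_id": 42, "cust_name": "Alice Smith", "ssn": "123456789"}
dev_row = mask_row(prod_row)

assert dev_row["cust_id"] == 42            # non-sensitive fields pass through
assert dev_row["ssn"] != prod_row["ssn"]   # sensitive fields are masked
```

Deterministic masking (the same input always yields the same token) preserves join keys across tables, which keeps the masked development data referentially intact.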
Secured
In this case, production data is always masked before it is made available in the DEV environment.
Informatica Data Subset
Informatica Data Subset helps IT organizations untangle complex transactional systems, separating out only
functionally related data.
• Dramatically accelerate development and test cycles and reduce storage costs by creating fully
functional, smaller targeted data subsets for development, testing, and training systems, while
maintaining full data integrity
• Quickly build and update nonproduction systems with a small subset of production data and replicate
current subsets of nonproduction copies faster
• Simplify test data management and shrink the footprint of nonproduction systems to significantly
reduce IT infrastructure and maintenance costs
• Reduce application and upgrade deployment risks by properly testing configuration updates with up-
to-date, realistic data before introducing them into production
• Easily customize provisioning rules to meet each organization’s changing business requirements
• Lower training costs by standardizing on one approach and one infrastructure
• Train employees effectively using reliable, production-like data in training systems
• Untangle complex operational systems and separate data along business lines to quickly build the
divested organization’s system
• Accelerate the provisioning of new systems by using only data that’s relevant to the divested
organization
• Decrease the cost and time of data divestiture with no reimplementation costs
• Dramatically increase an IT team’s productivity by reusing a comprehensive list of data objects for
data selection and updating processes across multiple projects, instead of coding by hand—which is
expensive, resource intensive, and time consuming
• Accelerate application delivery by decreasing R&D cycle time and streamlining test data
management
• Improve the reliability of application delivery by ensuring IT teams have ready access to updated
quality production data
• Lower administration costs by centrally managing data growth solutions across all packaged and
custom applications
• Substantially accelerate time to value for subsets of packaged applications
• Decrease maintenance costs by eliminating custom code and scripting
Testing Processes
Concentrate on the following for any testing requirements that you have:
In any organization, parallel activities will be going on. For example, BAs want to test functionality in UAT,
for which they need data from production, while a developer wants to perform unit testing, for which he or she
also needs data from production.
For such requests, create a data load matrix and prioritize the needs.
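One way to keep such parallel refresh requests ordered is a simple data load matrix sorted by priority. The request fields and priority values below are assumptions for illustration:

```python
# Each entry: (requestor, environment, data needed, priority; lower = sooner)
requests = [
    ("BA team",   "UAT", "1 year of sales data",  2),
    ("Developer", "DEV", "sample sales data",     3),
    ("UAT lead",  "UAT", "current month of data", 1),
]

# The data load matrix: schedule production extracts by agreed priority.
schedule = sorted(requests, key=lambda r: r[3])
for requestor, env, data, prio in schedule:
    print(f"{prio}: load {data} into {env} for {requestor}")
```

The point is not the code but the discipline: every production-data request gets a row and a priority before any extract runs.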
Debugger: A very useful tool for debugging a valid mapping to gain troubleshooting information about
data and error conditions. Refer to the Informatica documentation to learn more about the Debugger tool.
o Test a development environment. Run the Integration Service in safe mode to test a
development environment before migrating to production
o Troubleshoot the Integration Service. Configure the Integration Service to fail over in safe
mode and troubleshoot errors when you migrate or test a production environment configured
for high availability. After the Integration Service fails over in safe mode, you can correct the
error that caused the Integration Service to fail over.
Syntax Testing: Test your customized queries using your source qualifier before executing the
session.
Performance Testing for identifying the following bottlenecks:
o Target
o Source
o Mapping
o Session
o System
Use the following methods to identify performance bottlenecks:
Run test sessions. You can configure a test session to read from a flat file source or to write to
a flat file target to identify source and target bottlenecks.
Analyze performance details. Analyze performance details, such as performance counters, to
determine where session performance decreases.
Analyze thread statistics. Analyze thread statistics to determine the optimal number of
partition points.
Monitor system performance. You can use system monitoring tools to view the percentage of
CPU use, I/O waits, and paging to identify system bottlenecks. You can also use the Workflow
Monitor to view system resource usage.
Use the PowerCenter conditional filter in the Source Qualifier to improve performance.
Share metadata. You can share metadata with a third party. For example, you want to send a
mapping to someone else for testing or analysis, but you do not want to disclose repository
connection information for security reasons. You can export the mapping to an XML file and edit
the repository connection information before sending the XML file. The third party can import the
mapping from the XML file and analyze the metadata.
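Editing the repository connection information out of an exported mapping XML before sharing it can be sketched with the standard library. The element and attribute names below (`CONNECTION`, `USERNAME`, `HOSTNAME`) are assumptions about the export format, not the documented PowerCenter schema:

```python
import xml.etree.ElementTree as ET

# A miniature stand-in for an exported mapping file; real PowerCenter
# exports are far larger, and the tag names here are illustrative.
EXPORT = """<REPOSITORY NAME="REP_PROD">
  <CONNECTION NAME="ORA_PROD" USERNAME="etl_user" HOSTNAME="prod-db01"/>
  <MAPPING NAME="m_load_sales"/>
</REPOSITORY>"""

root = ET.fromstring(EXPORT)

# Blank out connection credentials before sending the file to a third party.
for conn in root.iter("CONNECTION"):
    for attr in ("USERNAME", "HOSTNAME"):
        if attr in conn.attrib:
            conn.set(attr, "REDACTED")

scrubbed = ET.tostring(root, encoding="unicode")
assert "etl_user" not in scrubbed and "prod-db01" not in scrubbed
assert "m_load_sales" in scrubbed  # the mapping itself is untouched
```

The third party can then import the scrubbed XML and analyze the metadata without ever seeing your repository connection details.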