
8/2/2017

Testing Process and Tools for Data Migration/Integration and DWH ETL Projects
Assuring Data Content, Structure, and Quality

Wayne Yaddow
ETL Tester Training
Datagaps.com
wyaddow@gmail.com

About the Speaker
20+ years with IBM, z/OS development and test
Since 2006, focused on DW data quality testing as a consultant
Co-author of the book Testing the Data Warehouse and author of ETL testing articles for several industry publications
ETL coach/mentor for testers


Course/Tutorial Structure
Topic 1: ETL Concepts and Terminology
Topic 2: Challenges of Data Integration & ETL Testing
Topic 3: Planning for Tests
Topic 4: Requirements-Based Test Scenarios, Test Cases, Test Data
Topic 5: ETL Test Tools, Methods, and Automation
Topic 6: Recommended Tester Skills
Topic 7: QA Best Practices

Background on this Course
Researched practices of ETL testing in worldwide organizations
Identified challenges faced by practitioners during ETL testing
Documented processes and tools commonly used for DW/ETL testing


Class Objectives
Attendees will further realize:
Testing challenges unique to ETLs and data integration
The importance of project requirements, data models, and data mapping documents
The significance of an ETL QA process
Effective test scenarios
QA strategies, test plans, and test cases
Estimating process for QA resources
ETL tester skill requirements

Take-aways from this ETL Testing Course
You will have the ability to:
Define and pursue your source data quality entry criteria early
Discuss whether your design & development teams / supplier provide adequate design / development documentation for test planning
Define project data quality and when it gets addressed in the SDLC
Develop a test strategy (a real one that you can follow)
Acquire the skilled resources early
Prepare carefully for end-to-end data reconciliation practices

Guidance on Test Automation
How best to choose test scenarios for automation
Understand a pathway to test automation using ETL Validator
End-to-end testing guidance to include automation
Test automation best practices
How automation supports ETL regression testing
Understanding that some tests may not be possible to automate
How to avoid test automation failures
How to convert manual test scenarios to automated scripts

Best Practice Framework for ETL QA
Data Analysis & Requirements: QA reviews and feedback
Data Flow and ETL Design: QA reviews and feedback; test planning
ETL Development: QA design reviews and feedback; test scenario and test case development
DW ETL QA: ETL testing, defect management, regression testing
QA support and participation throughout the SDLC, whether or not an agile SDLC

Course Audience and Prerequisites
Audience
Project managers
QA managers and test leads
ETL test (QA) engineers
ETL developers
ETL and data integration architects
Data analysts and business analysts
Recommended Prerequisite Knowledge/Experience
Basic data and database technologies
Principles of data warehousing or data integration
Experience with Excel data quality functions
Familiarity with principles of data profiling

Customizing the ETL Testing Course
1. Class instructor works with the client team to assess learning needs in detail
2. Conduct pre-course discussion to finalize topics and content
3. Schedule the course
4. Conduct the course
5. Provide several days of post-training support


Questions?

Data Quality Problems?


Foremost Reason for DW/BI Failures
Lack of data quality on DW/BI projects.
"Every data management system needs a data quality testing program."
-- Joe Caserta, co-author, The Data Warehouse ETL Toolkit

Considerations for QA Support
Project Mgr.: project defect mgt. and overall quality
Data Analysts, Business Analysts: requirements reviews & test support
Database Architect: data model, data mapping reviews / inspections
DBAs: set up and verify schemas, environment, performance
ETL/BI Report Developers: unit and component test and workflow
Business Sponsor: acceptance test strategy and planning
End users: acceptance test cases and execution
Test Mgr./Lead: test plan, scenario design and execution
Testers: component, integration, and system test planning & execution



DW Common Architecture: Sources and DW Targets

Ultimate Goals for ETL Testing
There is an exponentially increasing cost to businesses associated with finding
defects late in the development lifecycle. Considering the importance of early
detection, we list our primary goals for testing the ETLs.
Data completeness: Make certain that all expected data is loaded.
Data transformation: Ensuring that all data is transformed correctly according
to business rules and/or design specifications.
Data quality: Ensuring that the ETL tool (DataStage) correctly rejects,
substitutes default values, corrects or ignores and reports invalid data.
Performance and scalability tests: Making sure that data loads and queries
perform within expected time frames and that the technical architecture is
scalable.
Integration testing: Ensuring that the ETL process functions well with other
upstream and downstream processes.
Regression tests: Assuring that existing functionality remains intact each time a
new release of code is completed
User-acceptance testing: Ensuring the solution meets users' current
expectations and anticipates their future expectations.
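The data completeness goal above (all expected data is loaded) is commonly verified by reconciling row counts between source and target. A minimal sketch in Python using sqlite3; the customer / customer_dim tables and the helper name are illustrative assumptions, not part of the course material:

```python
import sqlite3

def completeness_check(conn, source, target):
    """Compare row counts between a source table and its DW target."""
    src = conn.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
    tgt = conn.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
    return {"source_rows": src, "target_rows": tgt, "match": src == tgt}

# Tiny in-memory example: one row fails to reach the target
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (id INTEGER, name TEXT);
    CREATE TABLE customer_dim (id INTEGER, name TEXT);
    INSERT INTO customer VALUES (1, 'Ann'), (2, 'Bob'), (3, 'Cy');
    INSERT INTO customer_dim VALUES (1, 'Ann'), (2, 'Bob');
""")
result = completeness_check(conn, "customer", "customer_dim")
print(result)  # {'source_rows': 3, 'target_rows': 2, 'match': False}
```

In practice the same pattern extends to per-partition counts and column-level aggregates, which catch partial loads that a single total row count can miss.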

Common ETL DW Development Tools
Informatica PowerCenter
IBM DataStage
Oracle Data Integrator, Warehouse Builder
Microsoft SQL Server Integration Services (SSIS)
Talend Open Studio for Data Integration
SAS Data Integration, ETL Studio
SAP BusinessObjects Data Integrator
CloverETL
Pentaho Data Integration

Common BI Tools
IBM Cognos Business Intelligence
SAP BusinessObjects
Oracle Hyperion
SAS Enterprise BI Server
Microsoft BI, SSRS
WebFOCUS


What ETL Development Tools Can Do
Extract data
  MQ, web services
  Semi-structured data (email, web logs, wiki pages)
  Flat files, txt, DBMS, XML, XLS
  Unstructured data (blogs, documents)
Clean data
  Using lookups, validations, filters, translations, inserting defaults
Transform and load data
  Change data structures, aggregate, rollup, sort, partition, de-duplicate
  Call to external tools

Data Movement Projects Using ETLs
Data warehouse (DW or EDW): a database system used for reporting and data analysis. It's a central repository for data created by integrating data from one or more disparate sources, often over a long period of time (months, years).
Data integration: combining data from different sources and providing users with a unified view of that data.
Data migration: the process of transferring data between storage types, formats, or computer systems, often a one-time project. It is a key consideration for any new system implementation, upgrade, or consolidation.
(Source: DataVersity, 2015)


Test Planning Using Source to Target Mapping
Data mapping, source to target, is used as a first step for a wide variety of data integration tasks, including data transformation or data mediation between a data source and a destination.

DW Variety of Source System Formats
RDBMS: Oracle, SQL Server, DB2, Teradata
Flat files: .TSV, .XLS, .TXT, .CSV
Cloud: Salesforce, Windows Azure
MS Access: .MDB
XML
Temporary storage
IoT (Internet of Things) and real-time data


Levels of Data Mapping

Test Planning Using Source to Target Mapping


QA in All DW Development Phases

Highlights of Source to Target Mapping Document Contents
Mapping documents are the foundation for ETL testing. Data modelers, DAs, ETL developers, BAs, and QAs have a keen interest in:
DB connections for source and target tables
The source and target data descriptions (what each table represents)
The source and target field descriptions (what each field represents)
Examples of field contents
Source and target data types, dates and times (metadata)
Null, not null, default indicators
Transformation, aggregation, enrichment description rules for each field
Error handling conditions and logic for each record, each field
Columns participating in referential integrity
Primary / foreign key columns that assure source records are unique
How tables are joined (the type of SQL join)
Change data capture (CDC) characteristics
Change and version log describing, in detail, additions and changes to the mappings
(From Guru99 website)
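Because each mapping-document row carries metadata (types, nullability, a transformation rule), a QA team can lint entries programmatically before test design begins. A sketch; the field names and the check are illustrative assumptions, not a standard mapping-document schema:

```python
# One source-to-target mapping entry, mirroring the contents listed above
mapping_entry = {
    "source_table": "customer",
    "source_field": "fst_name",
    "target_table": "customer_dim",
    "target_field": "first_name",
    "source_type": "VARCHAR(30)",
    "target_type": "VARCHAR(30)",
    "nullable": False,
    "transformation": "TRIM then initial-capitalize",
}

def check_entry(entry):
    """Return the metadata fields a mapping entry is missing."""
    required = ("source_table", "source_field", "target_table",
                "target_field", "source_type", "target_type")
    return [k for k in required if not entry.get(k)]

print(check_entry(mapping_entry))  # [] -> entry has the minimum metadata
```

Running such a check over the whole mapping spreadsheet flags incomplete rows before they reach a tester.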

Where Data Integration/ETL Testing is Needed

ETL Development Tools and Process


DW Common Architecture: Sources and DW Targets
Agenda
DW Concepts and Terms
Challenges of DW Testing
Planning for DW Tests
DW Test Scenarios
Test Data Planning
QA Risk Management
DW Test Tools and Automation
Recommended Tester Skills
DW QA Best Practices

Typical DW Testing Challenges: Test Team Challenges
Sharing knowledge among team and project members
Planning backwards (dates are committed before test planning)
Being brought onto the project late
Continually adding and clarifying requirements
Dealing with the approach of the development team
Unwillingness to continuously improve the processes
Being taken seriously by management and the project


Why ETL Data Errors Occur
Data is copied between different systems, at different locations, and updated at different times.
Data flows:
at high volumes
in different formats
from multiple sources (internal, external)
through multiple platforms (SQL, NoSQL, Hadoop)
via on-premise or cloud
None of the above play well with each other. Such complexity of data movement makes it hard to validate every piece of data.
(Compliments of FirstEigen, DataBuck big data testing tool, FirstEigen.com)

Challenges of Finding Defects Late in Process
(Tricentis.com: testing tools)

What ETL Processes Can Do

What Should Be Known About Each DW Source?
(And therefore, what we should prepare to test)

Is source spec documentation available?


What are options for extracting data (update
notification, incremental extracts, full extracts)?
What is required / available frequency of the extracts?
What is the historical and current quality of the data?
Are required data fields populated properly,
consistently?
What is the data volume to be processed?
Who is the source owner? Do they offer SME support?


What Can Happen to Data During ETLs
Actual loss of data quality may happen due to the following reasons:
Inconsistent data from the source
Logical flaws in the data exception/transformation rules
Unhandled exceptions
Constraint violations
Loss of data in the data exception repositories
Hardware-related issues, such as loss of files, data connectivity, or system failures
Manual data intervention
Are all data transformation rules correct? It must be verified that the transformation rules include all possible cases and treat them correctly.

Handling Data Exceptions
The most important requirements for an exception handling design that supports the data reconciliation process are:
All data exceptions, managed and unmanaged, must be captured, preserved, and easily accessed for analysis
Exception handling should be organized into logical modules, e.g., interface exceptions, data load exceptions, data validation exceptions, data integration exceptions
Exceptions should be clearly classified and standardized (exceptions due to attribute validation, referential integrity exceptions, data integration exceptions, duplicates, etc.)
Easy association between captured exceptions and the processes/data transformations that created them
A clear strategy for managing non-integrated data from multiple sources; requirements may range from presenting all data, including duplicates, to presenting only integrated data and excluding everything else
Thorough consideration of how to continue (or interrupt) the process in case of an unmanaged exception
Clear business rules for managing exceptions after they are captured
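The capture-and-classify requirements above can be sketched as a small exception log. The category names follow the slide's examples; the class and field names are illustrative assumptions:

```python
from dataclasses import dataclass, field
from typing import List

# Standardized categories, per the classification requirement above
CATEGORIES = {"attribute_validation", "referential_integrity",
              "data_integration", "duplicate"}

@dataclass
class DataException:
    category: str     # must be one of the standardized categories
    process: str      # the ETL process/transformation that raised it
    record_key: str   # lets analysts trace back to the offending record
    detail: str

@dataclass
class ExceptionLog:
    entries: List[DataException] = field(default_factory=list)

    def capture(self, exc: DataException):
        if exc.category not in CATEGORIES:
            raise ValueError(f"unclassified exception: {exc.category}")
        self.entries.append(exc)  # preserved for later analysis

    def by_category(self, category):
        return [e for e in self.entries if e.category == category]

log = ExceptionLog()
log.capture(DataException("duplicate", "load_customer_dim", "cust:42",
                          "same natural key seen twice"))
print(len(log.by_category("duplicate")))  # 1
```

Rejecting unclassified categories at capture time is one way to enforce the "clearly classified and standardized" requirement.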

What Can Happen to Data During ETLs (cont.)
Presumed loss of data quality will occur when:
Source data fails to meet data mapping specifications
End users do not have visibility into the modified and transformed data
End users are not aware of all the data transformation/exception rules
Data excluded during validation is not preserved and is not visible
Data exception/validation rules are understood differently by different users
Modifications of data in the source systems are not captured by the data warehouse due to batched data extracts (e.g., data in the source system was modified a number of times between batched data extractions)
Subtle modifications of data in the source systems make data warehouse transformation rules inconsistent

Common Challenges for DW QA: Test Planning & Design
Data requirements are not clearly defined: data rules for integrity, controls, security, availability, and recoverability are often ill-defined
Source to target data model and mappings: a) not reviewed / approved by project stakeholders, b) not consistently maintained through the development lifecycle
Requirements and data model changes throughout the SDLC
Poor implementation of the slowly changing dimensions (SCD) strategy for the ETL phase
No data dictionary is available

Common Challenges for DW QA (cont.) The Variety of Source Data in Todays DW


Challenges Related to Source Data
Huge, complex, heterogeneous, source data inputs and types
with high rates of updating
Same field names in multiple tables have different meanings
Source data profiling not completed early in project
Source data is often not in compliance with requirements:
dirty
Assumptions that source data is OK because the
operational systems seem to work just fine
Decreased test coverage due to complex organization of data


Need to Fix Bad Source Data
Huge data volumes
Large varieties of data types and formats
Need to cleanse data at rest and in motion
Validating external data
Hyper-hybrid environments

The Challenge of DW Documentation
Most important documents for QA planners and testers:
1. Source to Target Data Mappings
2. Data Models
3. Data Dictionaries
4. DB Schemas
5. DW ETL Architecture Description
6. Error and exception messages
7. Business Requirements for DW & BI Reports


Defects in Source & Target Data: Common ETL Defects and Causes

Issue: Missing Data
Description: Data that does not make it into the target database
Possible causes: Invalid or incorrect transformation logic; bad data from the source database (needs cleansing); invalid joins; invalid field lengths on the target database
Example: The lookup table should contain a field value of "High" which maps to "Critical". However, the source data field contains "Hig" (missing the "h") and fails the lookup, resulting in the target data field containing null. If this occurs on a key field, a possible join would be missed and the entire row could fall out.

Issue: Truncation of Data
Description: Data being lost through truncation of the data field
Possible causes: Transformation logic not taking into account field lengths from the source
Example: The source field value "New Mexico City" is truncated to "New Mexico C" because the target data field did not have the correct length to capture the entire field.

Issue: Data Type Mismatch
Description: Data types not set up correctly on the target database
Possible causes: Source data field not configured correctly
Example: A source data field was required to be a date; however, when initially configured, it was set up as a VarChar.

(Source: RTTS, QuerySurge)


Common ETL Defects and Causes (cont.)

Issue: Null Translation
Description: Null source values not being transformed to correct target values
Possible causes: Development team did not include the null translation in the transformation logic
Example: A source data field of null was supposed to be transformed to "None" in the target data field. However, the logic was not implemented, resulting in the target data field containing null values.

Issue: Wrong Translation
Description: Opposite of the Null Translation error. The field should be null but is populated with a non-null value, or the field should be populated but with the wrong value
Possible causes: Development team incorrectly translated the source field for certain values
Example: 1) Target field should only be populated when the source field contains certain values, otherwise it should be set to null. 2) Target field should be "Odd" if the source value is an odd number, but the target field is "Even" (a very basic example).

Issue: Misplaced Data
Description: Source data fields not being transformed to the correct target data field
Possible causes: Development team inadvertently mapped the source data field to the wrong target data field
Example: A source data field was supposed to be transformed to the target data field 'Last_Name'. However, the development team inadvertently mapped the source data field to 'First_Name'.

Issue: Extra Records
Description: Records which should not be in the ETL are included in the ETL
Possible causes: Development team did not include a filter in their code
Example: If a case has the deleted field populated, the case and any data related to the case should not be in any ETL.

Issue: Not Enough Records
Description: Records which should be in the ETL are not included in the ETL
Possible causes: Development team had a filter in their code which should not have been there
Example: If a case was in a certain state, it should be ETL'd over to the data warehouse but not the data mart.

Issue: Transformation Logic Errors/Holes
Description: Testing can sometimes lead to finding holes in the transformation logic, or to realizing the logic is unclear
Possible causes: Development team did not take into account special cases; for example, international cities that contain special language-specific characters might need to be dealt with in the ETL code
Example: 1) Most cases may fall into a certain branch of logic for a transformation, but a small subset of cases (sometimes with unusual data) may not fall into any branches. How the tester's code and the developer's code handle these cases could be different (and possibly both end up being wrong), and the logic is changed to accommodate the cases. 2) Tester and developer have different interpretations of the transformation logic, which results in different values. This will lead to the logic being rewritten to become more clear.

(Source: RTTS, QuerySurge)

Common ETL Defects and Causes (cont.)

Issue: Simple/Small Errors
Description: Capitalization, spacing, and other small errors
Possible causes: Development team did not add an additional space after a comma when populating the target field
Example: Product names on a case should be separated by a comma and then a space, but the target field only has them separated by a comma.

Issue: Sequence Generator
Description: Ensuring that the sequence numbers of reports are in the correct order is very important when processing follow-up reports or answering to an audit
Possible causes: Development team did not configure the sequence generator correctly, resulting in records with a duplicate sequence number
Example: Duplicate records in the sales report were doubling up several sales transactions, which skewed the report significantly.

Issue: Undocumented Requirements
Description: Requirements that are understood but are not actually documented anywhere
Possible causes: Several members of the development team did not understand the "understood" requirements
Example: There was a restriction in the "where" clause that limited how certain reports were brought over. It was used in mappings that were understood to be necessary, but were not actually in the requirements. Occasionally it turns out that the understood requirements are not what the business wanted.

Issue: Duplicate Records
Description: Two or more records that contain the same data
Possible causes: Development team did not add the appropriate code to filter out duplicate records
Example: Duplicate records in the sales report were doubling up several sales transactions, which skewed the report significantly.

Issue: Numeric Field Precision
Description: Numbers that are not formatted to the correct decimal point or not rounded per specifications
Possible causes: Development team rounded the numbers to the wrong decimal point
Example: The sales data did not contain the correct precision and all sales were being rounded to the whole dollar.

Issue: Rejected Rows
Description: Data rows that get rejected due to data issues
Possible causes: Development team did not take into account data conditions that could break the ETL for a particular row
Example: Missing data rows on the sales table caused major issues with the end-of-year sales report.

(Source: RTTS, QuerySurge)


Questions?

Agenda
DW Concepts and Terms
Challenges of DW Testing
Test Planning & Management
DW Test Scenarios
Test Data Planning
QA Risk Management
DW Test Tools and Automation
Recommended Tester Skills
DW QA Best Practices

Highlights of DW Test Mgt. Responsibilities
1. Develop a Master Test Plan to highlight all project QA activities
2. Create and execute a repeatable system integration test (SIT) plan that details both the full progression of testing (through iterations) and ongoing regression testing
3. Coach project designers and developers on reviews plus unit and component testing
4. Continually confirm to stakeholders that data in the warehouse is complete and correct
5. Certify to stakeholders and operations that the DW is worthy and production ready
(Agile Data Warehousing for the Enterprise, Ralph Hughes, MK Press, 2015)

Phases in DW Testing
1. Understand the business needs
   Review business requirements
   Attend business specification and technical specification walkthroughs (data model, mappings)
2. Test plan creation, reviews, and walkthroughs
3. Test scenarios, test case creation, review, and walkthrough
4. Test bed and environment planning & setup
5. Collect and create test data files, tables & objects
6. Execute test cases and regression tests
7. Deploy: validating business rules in the production environment

Most Common DW Issues Discovered by QA
Key issues affecting QA on three DW projects:
1. Source to target mappings: 1) often not sufficiently reviewed, therefore in error, and 2) not consistently maintained through the IT lifecycle
2. Field values are null when specified as Not Null
3. Excessive ETL errors discovered after entry to formal QA
4. Duplicate records and duplicate field values
5. ETL SQL / transformation errors leading to missing rows and invalid field values in target tables

QA Strategy / Master Planning Starting Point
Testers carefully analyze:
Business requirements documentation
Data models for source and target schemas
Source to target mappings
Data dictionary(s)
ETL design and logic documents
QA DB deployment tasks / steps
QA tool needs for DB testing


Plan Your Test Strategy, Methods & Tools

DW Documentation for Test Planning
(From Data Warehouse Documentation Roadmap, David Walker)



Planning for DW QA Process: Sogeti's BI Testing Approach
Data Analysis & Requirements -> Data Flow and ETL Design -> ETL Development -> DW ETL QA

QA tasks throughout the SDLC:
-> Develop QA approach and plan
-> Develop test scenario ideas
-> Develop test cases
-> Conduct reviews of test cases
-> Plan QA environment
-> Record test cases in defect mgt. tool
-> Conduct ETL tests
-> Test ETL and data fixes
-> Run regression tests
-> Develop test reports


Planning the QA Strategy
Set objectives for testing / validation:
DW tables adhere to the data model and mappings
Data completeness in all DW tables
Data transformation correctness from source to target
Data quality throughout all tables (profiling)
Performance and scalability meet requirements
Integration testing: all data loaded
User-acceptance testing: DW meets requirements

Agenda
DW Concepts and Terms
Challenges of DW Testing
Planning for DW Tests
DW Test Scenarios
Test Data Planning
QA Risk Management
DW Test Tools and Automation
Recommended Tester Skills
DW QA Best Practices

QA in All DW Project Phases
System testing
End-to-end testing
Regression testing
Load testing
Security testing

Data Transformation and Validation Flow
(Design Essential for Reconcilable Data Warehouse, 2011 Formation Data Pty Ltd)


Example of Cleaning, Enriching, Transforming & Loading Data
(Utopiainc.com)


ETL Testing Videos on the Web
Basic ETL test scripts: https://www.youtube.com/watch?v=tqrtbKizLPc
More ETL test scripts, how to set up test cases:
https://www.youtube.com/watch?v=c1lm8L1LOoM
https://www.youtube.com/watch?v=cLpFqZehS9o
https://www.youtube.com/watch?v=s_7iPt4qaNM&t=2667s
ETL testing with flat source files: https://www.youtube.com/watch?v=XkdPOier_SI&t=45s
How to develop test cases for each column, whether transformed or not (good): https://www.youtube.com/watch?v=zMSURD41AqM
Using Informatica mapping objects to test ETLs:
https://www.youtube.com/watch?v=hz6FGxrVdto
https://www.youtube.com/watch?v=TP1XoiyYCb8
ETL testing with lookups: https://www.youtube.com/watch?v=ZcKyOS0lQnc
Lookups plus use of MINUS to check for duplicates, missing records, nulls, referential integrity, and the data model (column names, data types, etc.), plus the SQL RANK function (a valuable video): https://www.youtube.com/watch?v=xGl8jXR-2cI
Testing SCDs:
https://www.youtube.com/watch?v=_WgUSB2KwVM
https://www.youtube.com/watch?v=Q_6ESKZwOBg&list=PLwcz5ft4sFokiNaa51WMv6eqSAUsvekij

Types of Data Transformations to be Tested
Applying business rules (derivations, e.g., calculating new measures, aggregating data)
Cleaning (mapping NULL to 0 or "Male" to "M")
Filtering (selecting only certain columns to load)
Splitting individual columns into multiple columns and vice versa
Joining data from multiple sources (lookup, merge)
Transposing rows and columns
Applying data validation (e.g., if the first 3 columns in a row are empty, then reject the row from processing)
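A cleaning transformation such as mapping NULL to 0 or "Male" to "M" (listed above) is among the easiest to unit-test, because the rule can be stated as a pure function. A sketch; the row layout and the "U" default for unknown values are illustrative assumptions:

```python
GENDER_MAP = {"Male": "M", "Female": "F"}

def clean_row(row):
    """Apply the cleaning rules: NULL numeric -> 0, gender word -> code."""
    amount = row["amount"] if row["amount"] is not None else 0
    gender = GENDER_MAP.get(row["gender"], "U")  # U = unknown
    return {"amount": amount, "gender": gender}

# Expected-vs-actual checks a tester would also run against target rows
assert clean_row({"amount": None, "gender": "Male"}) == {"amount": 0, "gender": "M"}
assert clean_row({"amount": 7, "gender": "Female"}) == {"amount": 7, "gender": "F"}
assert clean_row({"amount": 3, "gender": "?"}) == {"amount": 3, "gender": "U"}
print("cleaning rules verified")
```

The same assertions, pointed at rows pulled from the target table, become a regression test for the cleaning step.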


Transformations
The transformation phase is where the data manipulation and branching take place:
Data is reshaped to fit the destination
Data type changes are performed (such as converting text to integer)
Duplicates are removed from the data
The data is joined to lookup tables to validate against a trusted list of values
Extra spaces are trimmed
Case and capitalization are standardized
Bad rows of data are flagged and sent down a separate path in the data pipeline
The granularity of the data is changed, where one input row becomes several output rows or vice versa

Types of Data Transformations to be Tested
Conversions are used to change data types and formats or to calculate and derive new data from existing data; all should be tested. Among the types of transformations:
Data type conversions: convert data from one data type to another
Arithmetic conversions: perform arithmetic operations (add, multiply, etc.)
Format conversions: convert a value (currency, date, length, etc.) from one format into another
String conversions: transform strings (upper & lower-case, concatenate, replace, substring)
Split conversions: break a value into different elements; for example, break a name ("John Doe") into first name ("John") and surname ("Doe")
Standardization conversions: standardize attributes to contain identical values for equivalent data elements, using a set of rules or look-up tables to search for valid values; for example, substitute "Jan." or "1" with "January"
Default value: when a value is missing (null, empty string, etc.), it is possible to define a default value conversion
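The split, standardization, and default-value conversions above translate directly into small testable functions. A sketch; the month look-up table is deliberately abbreviated:

```python
# Look-up table of valid values for standardization (abbreviated)
MONTHS = {"Jan.": "January", "1": "January", "Feb.": "February", "2": "February"}

def split_name(full_name):
    """Split conversion: break 'John Doe' into first name and surname."""
    first, _, last = full_name.partition(" ")
    return first, last

def standardize_month(value):
    """Standardization conversion via a look-up table of valid values."""
    return MONTHS.get(value, value)

def default_value(value, default="N/A"):
    """Default-value conversion for missing data (null, empty string)."""
    return default if value in (None, "") else value

print(split_name("John Doe"))     # ('John', 'Doe')
print(standardize_month("Jan."))  # 'January'
print(default_value(""))          # 'N/A'
```

A tester can drive each function with boundary inputs (single-word names, unmapped abbreviations, empty strings) before checking the same rules inside the ETL tool.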

Understanding Data Transformation Requirements
Raw data from disparate sources usually needs to be cleaned and normalized in order to make sense in your data warehouse. Data from different systems typically doesn't play together very well and requires work to get it to cooperate. Some common examples:
Establishing key relationships across data sources, many of which might not exist in the raw data
Updating new values on existing records without sacrificing performance
Developing time zone consistency
Lookup value normalization (US = United States = USA)
If you ignore the transformation step, the data in your warehouse will be impossibly difficult to work with, full of inconsistencies, and decision makers will lose faith in its reliability.

DW Transformation Functions to be Tested
Data format changes: change data from different sources to a standard set of formats for the DW
De-duplication: compare records from multiple sources to identify duplicates and merge them into a unified one
Splitting-up / integrating fields: split a data item from the source systems into one or more fields in the DW, or integrate two or more fields from the operational systems into one DW field
Derive new values: compute derived values using agreed formulas (e.g., averages, totals)
Aggregation: create aggregate records based on the atomic DW data
Other transformations: filtering, sorting, joining data from multiple sources, transposing or pivoting, etc.
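The de-duplication, derive-new-values, and aggregation functions above interact: aggregating before de-duplicating overstates totals. A sketch with illustrative record shapes; the first-record-wins merge strategy is an assumption:

```python
from collections import defaultdict

def deduplicate(records, key):
    """Keep the first record seen for each key (merge strategy assumed)."""
    seen, unique = set(), []
    for rec in records:
        if rec[key] not in seen:
            seen.add(rec[key])
            unique.append(rec)
    return unique

def aggregate_sales(records):
    """Derive a per-customer total from atomic rows (an agreed formula)."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["cust_id"]] += rec["amount"]
    return dict(totals)

rows = [
    {"txn_id": 1, "cust_id": "A", "amount": 10.0},
    {"txn_id": 1, "cust_id": "A", "amount": 10.0},  # same txn from a second feed
    {"txn_id": 2, "cust_id": "B", "amount": 5.0},
]
clean = deduplicate(rows, "txn_id")
print(aggregate_sales(clean))  # {'A': 10.0, 'B': 5.0}
```

Without the de-duplication step, customer A's total would be derived as 20.0, which is exactly the duplicate-records defect described earlier in this course.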


DW Loading Functions to be Tested (cont.)
The DW load subsystem copies data processed by the ETL extraction and transformation subsystems into the DW:
Generate surrogate keys: create standard keys for the DW, separate from the source systems' keys
Process slowly changing dimensions (SCDs)
Handle late-arriving data: apply special modifications to the standard processing procedures to deal with late-arriving fact and dimension data
Drop indexes on the DW when new records are inserted
Load dimension records
Load fact records
Compute aggregate records using base fact and dimension records
Rebuild or regenerate indexes once all loads are complete
Log violations of referential integrity during the load process

DW ETL Mgt. Functions to be Tested (cont.)
The ETL job scheduler
The ETL job recovery / restart system
The ETL workflow monitor: ensures that the ETL processes are operating efficiently and gathers statistics regarding ETL execution and infrastructure performance
The data lineage and dependency system: identifies the source of a data element and all intermediate locations and transformations for that element
The DW security system: security is an important consideration for the ETL system; the recommended method is role-based security on all data and metadata in the ETL system
The metadata repository management

Testing Source to Target Data Transformations

Verify that transformation scripts are transforming the data per the expected logic in the data mapping sheet
Verify that detailed and aggregated data sets are created and match
Verify transaction audit logging and time stamping
Validate the business rules and data integrity
Check cleansing of invalid records according to business rules
Check all table / entity integrity:
a. Referential integrity
b. Domain integrity
Check for data consistency: validity, accuracy, usability, and integrity of related data between applications
Check for data redundancy:
a. Duplicate records
b. Repeated data records
Check aggregate calculations
Verify that each transformation completes within its SLA

Examples of Transformation Testing

Transformation validations: usually the most difficult and time-consuming piece to develop scripts for, these tests confirm that your basic logic is sound. Factor in all of the following:
Column transformations, usually IF-THEN-ELSE logic, math formulas, etc.
Null and other value replacements
Lookups / referential integrity: make sure a) each lookup is correct and b) there are no nulls where there shouldn't be
Confirm SCD logic triggers properly, and effective-date fields are tagged cleanly
Aggregates match their base tables
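An IF-THEN-ELSE column transformation can be validated by re-deriving the expected value from the mapping rule and comparing it to what the ETL actually loaded. A minimal sketch with sqlite3; the rule (revenue over 1000 maps to tier 'GOLD', else 'STD') and the table names are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE src_customer (id INTEGER, revenue REAL)")
cur.execute("CREATE TABLE tgt_customer_dim (id INTEGER, revenue REAL, tier TEXT)")
cur.executemany("INSERT INTO src_customer VALUES (?, ?)",
                [(1, 1500.0), (2, 800.0), (3, 1000.0)])
# Target as loaded by the (hypothetical) ETL -- row 3 was transformed incorrectly,
# since 1000 is not greater than 1000.
cur.executemany("INSERT INTO tgt_customer_dim VALUES (?, ?, ?)",
                [(1, 1500.0, "GOLD"), (2, 800.0, "STD"), (3, 1000.0, "GOLD")])

# Re-derive the expected value from the mapping rule; report mismatches.
failures = cur.execute("""
    SELECT t.id, t.tier,
           CASE WHEN s.revenue > 1000 THEN 'GOLD' ELSE 'STD' END AS expected
    FROM tgt_customer_dim t
    JOIN src_customer s ON s.id = t.id
    WHERE t.tier <> CASE WHEN s.revenue > 1000 THEN 'GOLD' ELSE 'STD' END
""").fetchall()
```

The same pattern (re-apply the documented rule in SQL, select only mismatches) scales to any transformation the mapping sheet expresses as a CASE expression.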
A Primary Testing Skill: SQL

Testers need to:
Join data from multiple tables
Run queries that filter to narrow data results
Identify source-to-target column differences
Run aggregating queries to total / summarize values
Format SQL query results
Verify stored procedures and views
Convert data types in query results (dates, times)

Sample SQL Queries

Distinct values: for critical data columns, the list of distinct values in the source and target can be compared and analysed.

/* from source */
SELECT country, state, city, count(*)
FROM customer GROUP BY country, state, city

/* from target */
SELECT country_cd, state_cd, city_cd, count(*)
FROM customer_dim GROUP BY country_cd, state_cd, city_cd

Compare aggregate values such as count, avg, max, min between the source and target tables (fact or dimension):

/* from source */
SELECT count(row_id), count(fst_name), count(lst_name), avg(revenue)
FROM customer

/* from target */
SELECT count(row_id), count(first_name), count(last_name), avg(revenue)
FROM customer_dim
Sample SQL Queries (cont.)

Check for duplicates: verify that the unique key, primary key, and any other column or combination of columns that should be unique per the business requirements have no duplicate rows.

SELECT fst_name, lst_name, mid_name, date_of_birth, count(1)
FROM Customer
GROUP BY fst_name, lst_name, mid_name, date_of_birth
HAVING count(1) > 1

Compare differences between two tables using SQL EXCEPT:

/* from source */
SELECT ID, Col1, Col2
FROM [table1]
EXCEPT
/* from target */
SELECT ID, Col1, Col2
FROM [table2]

Check for values outside a range:

SELECT COUNT(DISTINCT [Status])
FROM [table1]
WHERE [Status] NOT IN ('val1', 'val2', 'val3');

Check for missing records in the target:

/* from source */
SELECT *
FROM FIRST
MINUS
/* from target */
SELECT *
FROM SECOND
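The duplicate check above can be exercised against live data with sqlite3; the Customer rows below are invented so that exactly one duplicate group exists:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("""CREATE TABLE Customer
               (fst_name TEXT, lst_name TEXT, mid_name TEXT, date_of_birth TEXT)""")
cur.executemany("INSERT INTO Customer VALUES (?, ?, ?, ?)", [
    ("Ann", "Lee", "M", "1980-01-01"),
    ("Ann", "Lee", "M", "1980-01-01"),   # exact duplicate of the row above
    ("Bob", "Ray", "J", "1975-06-30"),
])

# Group by every column that should be unique; any group with count > 1
# is a duplicate that the ETL should have removed or rejected.
duplicates = cur.execute("""
    SELECT fst_name, lst_name, mid_name, date_of_birth, count(1)
    FROM Customer
    GROUP BY fst_name, lst_name, mid_name, date_of_birth
    HAVING count(1) > 1""").fetchall()
```

An empty result list means the uniqueness rule holds; a non-empty list identifies exactly which key combinations are duplicated and how often.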
A Primary Testing Skill: Data Profiling

Column / attribute / field profiling provides statistical measurements associated with:
Frequency distribution of data values
Number of records, source & target
Number of null (i.e., blank) values
Data types (e.g., integers, characters)
Column lengths
Unique and min/max values in each column
Patterns in the data
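Most of the column-profiling measurements listed above can be gathered with one query per column. A minimal sketch with sqlite3; the customer table and its data are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE customer (name TEXT, country TEXT)")
cur.executemany("INSERT INTO customer VALUES (?, ?)",
                [("Ann", "US"), ("Bob", None), ("Cal", "US"), ("Dee", "UK")])

# Profile each column: row count, null count, distinct values,
# min/max boundary values, and the longest value.
profile = {}
for col in ("name", "country"):
    row = cur.execute(f"""
        SELECT count(*),                                        -- number of records
               sum(CASE WHEN {col} IS NULL THEN 1 ELSE 0 END),  -- null count
               count(DISTINCT {col}),                           -- unique values
               min({col}), max({col}),                          -- boundary values
               max(length({col}))                               -- longest value
        FROM customer""").fetchone()
    profile[col] = dict(zip(
        ("rows", "nulls", "distinct", "min", "max", "max_len"), row))
```

Running the same profile on the source before the ETL and on the target after it, then diffing the two dictionaries, turns profiling into a repeatable regression check.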
Basic ETL Test Scenarios

Verify source-to-target loads using data mapping specifications
Verify that all tables, records, and columns were loaded from source to staging
Verify that primary & foreign keys were properly generated using a sequence generator; no orphan foreign keys
Verify that not-null columns were populated
Assure that extraction scripts are granted security access to the source systems

Basic ETL Test Scenarios (cont.)

Verify no data truncation in each column of each table
Verify data types and formats are as specified in design
Verify no duplicate records in target tables
Verify data transformations based on business rules
Check for string columns that are incorrectly left- or right-trimmed
Verify transaction audit log recording is occurring
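The truncation and not-null scenarios above reduce to simple comparison queries. A minimal sqlite3 sketch; the src / tgt tables and the injected truncation defect are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE src (id INTEGER, city TEXT)")
cur.execute("CREATE TABLE tgt (id INTEGER, city TEXT)")
cur.executemany("INSERT INTO src VALUES (?, ?)",
                [(1, "Springfield"), (2, "Rye")])
# Hypothetical ETL defect: the value for id 1 was truncated to 10 characters.
cur.executemany("INSERT INTO tgt VALUES (?, ?)",
                [(1, "Springfiel"), (2, "Rye")])

# Truncation check: any target value shorter than its source counterpart.
truncated = cur.execute("""
    SELECT s.id FROM src s JOIN tgt t ON t.id = s.id
    WHERE length(t.city) < length(s.city)""").fetchall()

# Not-null check: rows where a required target column was left unpopulated.
unpopulated = cur.execute(
    "SELECT id FROM tgt WHERE city IS NULL OR city = ''").fetchall()
```

Both queries should return empty lists on a clean load; any returned ids point directly at the defective rows.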
Basic ETL Test Scenarios (cont.)

Verify numeric columns are populated with correct precision
Verify that ETL sessions completed with only planned exceptions
Verify all cleansing, transformation, error, and exception handling
Verify ETL calculations, aggregations, and data mapping correctness
Verify relations between columns (e.g., if column X contains some value, then column Y should contain a correlated value)

Basic ETL Test Scenarios (cont.)

Verify that null source values are translated to the correct target value
Verify correct lookup translation to the target
Verify that there are no extra records in the target: records that the ETL should exclude must not be loaded
Check logs for data loading status, rejected records, and error messages after ETLs (extracts, transformations, loads)
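The null-translation scenario above can be checked by joining source to target on the key and flagging rows where a null source value did not receive the documented default. A minimal sqlite3 sketch; the rule (NULL maps to 'UNKNOWN') and the tables are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE src (id INTEGER, gender TEXT)")
cur.execute("CREATE TABLE tgt (id INTEGER, gender TEXT)")
cur.executemany("INSERT INTO src VALUES (?, ?)",
                [(1, "F"), (2, None), (3, None)])
# Target after a (hypothetical) ETL that should map NULL -> 'UNKNOWN';
# the row for id 3 was missed.
cur.executemany("INSERT INTO tgt VALUES (?, ?)",
                [(1, "F"), (2, "UNKNOWN"), (3, None)])

# Flag source NULLs that were not translated to the required default.
bad_rows = cur.execute("""
    SELECT t.id FROM src s JOIN tgt t ON t.id = s.id
    WHERE s.gender IS NULL AND (t.gender IS NULL OR t.gender <> 'UNKNOWN')
""").fetchall()
```

The same join-and-flag pattern verifies any lookup translation: substitute the expected target value (or a join to the lookup table) for the literal default.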
Basic ETL Test Scenarios (cont.)

Test scenarios and test cases:

Structure Validation
Validate the source and the target table structure as per the mapping document
Verify data types in the source and the target systems based on the mapping document
Use the mapping document to verify the length and types of data in the source and target schemas; they could be different
Validate column names in the target system

Validating the Mapping Document
Validate the mapping document to ensure all the information has been provided; the mapping document should have a change log, data types, lengths, transformation rules, etc.

Validate Constraints
Validate all specified constraints to assure they are applied to the target tables

Basic ETL Test Scenarios (cont.)

Test scenarios and test cases:

Data Consistency Check
Check for misuse of integrity constraints such as foreign keys
Verify cases where the length and data type of an attribute vary in different tables

Data Completeness Validation
Verify that all the data is loaded (where it should be loaded) to the target system from the source system
Verify by counting the number of records in the source and the target systems
Verify that column boundary values (e.g., min/max) are correct
Validate the unique values of primary keys

Data Correctness Validation
Validate the values of all data in the target system
Search for misspelled or inaccurate data in target tables
Basic ETL Test Scenarios (cont.)

Test scenarios and test cases:

Duplicate Validation
Verify that duplicate values do not exist in the target system when data is coming from multiple columns in source systems
Validate primary keys and other columns for any duplicate values, per the business requirement

Date Validation Checks
Validate date fields for the various actions performed in the ETL process
From_Date should not be greater than To_Date
The format of date values should be as specified
Date values should not have junk values or unexpected null values

Full Data Validation Using a Minus Query
Validate the full data set in the source and the target tables using a minus query
Perform both source-minus-target and target-minus-source
When a minus query returns rows, they represent mismatching rows
The count returned by INTERSECT should match the individual counts of the source and target tables
If the minus query returns no rows and the intersect count is less than the source count or the target table count, then the table holds duplicate rows

Basic ETL Test Scenarios (cont.)

Tester query identifies errors after ETL
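The minus-query procedure above can be run in both directions and combined with the intersect count. A minimal sqlite3 sketch (SQLite spells MINUS as EXCEPT); the src / tgt tables and the injected mismatch on id 2 are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE src (id INTEGER, val TEXT)")
cur.execute("CREATE TABLE tgt (id INTEGER, val TEXT)")
cur.executemany("INSERT INTO src VALUES (?, ?)", [(1, "a"), (2, "b"), (3, "c")])
cur.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, "a"), (2, "x"), (3, "c")])

# Run the minus query in both directions; any returned rows are mismatches.
src_minus_tgt = cur.execute(
    "SELECT * FROM src EXCEPT SELECT * FROM tgt").fetchall()
tgt_minus_src = cur.execute(
    "SELECT * FROM tgt EXCEPT SELECT * FROM src").fetchall()

# The INTERSECT count should equal both table counts when the data sets match;
# here it is 2 against table counts of 3, confirming a mismatch.
intersect_count = cur.execute(
    "SELECT count(*) FROM (SELECT * FROM src INTERSECT SELECT * FROM tgt)"
).fetchone()[0]
```

Running both directions matters: source-minus-target finds dropped or altered rows, while target-minus-source finds extra or altered rows that only exist in the warehouse.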
Basic ETL Test Scenarios (cont.)

Physical duplicates: rows that are identical in every column
Logical duplicates: rows that represent the same business entity but differ in one or more columns
Basic ETL Test Scenarios (cont.)

What is change data capture (CDC)?
Detects all changes: inserts, updates, deletes
Reads the log to find all changes
Applies all changes

Tests must be conducted to assure that all changes are applied to the DW
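A CDC process itself reads the database log, but a tester without log access can approximate the expected change set by comparing before/after snapshots of the source: new keys are inserts, vanished keys are deletes, and shared keys with differing values are updates. A minimal sqlite3 sketch with hypothetical snapshot tables:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE snap_old (id INTEGER PRIMARY KEY, val TEXT)")
cur.execute("CREATE TABLE snap_new (id INTEGER PRIMARY KEY, val TEXT)")
cur.executemany("INSERT INTO snap_old VALUES (?, ?)", [(1, "a"), (2, "b")])
# New snapshot: id 1 was deleted, id 2 was updated, id 3 was inserted.
cur.executemany("INSERT INTO snap_new VALUES (?, ?)", [(2, "B"), (3, "c")])

# Keys present only in the new snapshot are inserts.
inserts = cur.execute("""
    SELECT n.id FROM snap_new n
    LEFT JOIN snap_old o ON o.id = n.id WHERE o.id IS NULL""").fetchall()
# Keys present only in the old snapshot are deletes.
deletes = cur.execute("""
    SELECT o.id FROM snap_old o
    LEFT JOIN snap_new n ON n.id = o.id WHERE n.id IS NULL""").fetchall()
# Shared keys with differing values are updates.
updates = cur.execute("""
    SELECT n.id FROM snap_new n
    JOIN snap_old o ON o.id = n.id WHERE o.val <> n.val""").fetchall()
```

The derived insert/update/delete sets can then be reconciled against what the CDC process actually applied to the DW.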
Tree-top Sync Testing

Non-Functional Testing

Non-functional testing involves performing load testing and stress testing.

Load Testing
The primary target of load testing is to check whether concurrently running transactions have a performance impact on the database.

Testers check:
The response time for executing transactions for multiple remote users
The time taken by the database to fetch specific records

Examples of load testing:
Running the most-used transaction repeatedly to view the performance of the database system
Downloading a series of large files from the internet
Running multiple applications on a computer or server simultaneously
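Running the most-used transaction repeatedly and recording response times can be sketched with the stdlib alone. The orders table, the query, and the 1-second SLA below are all hypothetical; a real load test would also drive multiple concurrent sessions:

```python
import sqlite3
import time

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
cur.executemany("INSERT INTO orders VALUES (?, ?)",
                [(i, float(i)) for i in range(1000)])

# Run the most-used transaction repeatedly and record each response time.
timings = []
for _ in range(50):
    start = time.perf_counter()
    cur.execute("SELECT count(*), avg(amount) FROM orders").fetchone()
    timings.append(time.perf_counter() - start)

avg_response = sum(timings) / len(timings)
max_response = max(timings)
within_sla = max_response < 1.0   # hypothetical 1-second SLA
```

Recording max as well as average response time matters: SLAs are usually violated by the worst case under load, not the mean.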
Agenda

DW Concepts and Terms
Challenges of DW Testing
Planning for DW Tests
DW Test Scenarios
Test Data Planning
QA Risk Management
DW Test Tools and Automation
Recommended Tester Skills
DW QA Best Practices

DQ Tools / Techniques for QA Team

DB editors (e.g., Toad, SSMS, SQL Developer)
Data profiling for value range & boundary analysis
SQL queries
Null field analysis
Row counting
Data type analysis
Referential integrity analysis
Distinct value analysis by field
Duplicate data analysis
Cardinality analysis
Schema tests: stored procedures, tables, views, package validation

MS Excel
Data filtering for profile analysis
Data value sampling
Data type analysis

MS Access
Table and data analysis across schemas, including most functionality of Excel plus much more

DW Testing Automation
Datagaps: ETL Validator
Torana: iCEDQ
Informatica: Data Validation Option (DVO)
RTTS: QuerySurge
What is Automated ETL Testing?

Data warehouse test automation is often described as the use of programmable tools to control:
1) the execution of tests
2) the comparison of actual outcomes to predicted outcomes

Commonly, test automation involves automating a manual process already in place that uses a formalized testing process.

From "ROI of Test Automation: Benefits and Costs", DorothyGraham.co.uk

Manual vs. DVO Automated Testing

Courtesy of Informatica DVO
Datagaps ETL Validator

ETL Testing Automation
Business and data analysts, developers, and testers leverage ETL Validator's intuitive interfaces to create automation test cases.
Test for data completeness & consistency by comparing data between sources and targets
Test ETL data column transformations by leveraging the power of SQL within ETL Validator
Ensure referential integrity by verifying foreign-key relationships

Query Compare Test Cases (source to target)
Create query test cases that compare and verify data copied directly (no transformations) from source to target and that no records or columns were dropped.
Develop tests to verify that data values were not truncated or otherwise changed during ETL from source to target.

Data Profile Test Cases (source or target)
Test to compare aggregate values, distinct values, and min/max values between source and target.
Verify that data from each source field was completely and correctly loaded to targets.
Verify that not-null fields were populated as expected and that no nulls exist.
Datagaps ETL Validator (2)

Component Test Cases
Create tests that check for duplicate records and duplicate values in target data.
Verify that surrogate keys uniquely identify rows of data.
Test that cleansing and transformations of data from source to target meet business specifications.
Write tests that use ETL Validator Lookup to verify that source-to-target lookups were processed correctly.
Develop tests to assure that numeric value precisions are as required in target data.
Verify that values for each defined column's datatype in the target DB are as defined in the data model or mapping document.
Write tests to assure that target attribute constraints have been applied correctly during ETLs (e.g., defaults applied, trimming of data values, de-duplicating field data and rows, adding/removing blanks in fields, etc.).

Datagaps ETL Validator (3)

Data Compare Test Plan
Develop tests within ETL Validator Test Plans to verify that all tables and specified columns were loaded from source to target correctly.

Foreign Key Compare Test Plan
Develop tests to verify referential integrity of joins between tables (e.g., primary and foreign keys). This test represents the ability to detect orphan records based on primary-foreign key relationships.

Metadata Compare Test Tool
Generate tests that verify table and index data values are propagated correctly across all environments (e.g., development, QA, UAT).

Data Migration Wizard
Generate tests to compare sources and targets when data has been directly copied during ETLs (no cleansing, no transformations).
Datagaps ETL Validator Features

Flat Files Testing
Users write a set of rules for ETL Validator to test each column in the incoming file; at run time, ETL Validator ensures that the data in the target flat files or tables is in accordance with the rules.

Big Data Testing
Compare data from big data sources such as Hadoop Hive with data residing in traditional data sources such as databases. Also, access flat files in HDFS and perform flat file testing.

Data Integration Testing
Users can write automation test cases to ensure that master data has the same definition across CRM, ERP, and enterprise data systems.

DevOps
Recent advances in continuous delivery and integration are supported by ETL Validator through automated testing.

Datagaps ETL Validator Features (cont.)

Wizard-Based Test Creation: ETL Validator provides wizard-driven interfaces with drag-and-drop capabilities to create and execute test cases. The intuitive UI enables testing teams to be more productive and perform comprehensive testing with ease.

No Custom Programming: SQL is the only language required to automate test cases using ETL Validator. This reduces the total cost of ownership, as SQL knowledge is a common skill in engineering, testing, and business analysis teams.

Scheduling Capabilities: ETL Validator provides multiple options to assemble and schedule test plans. At the scheduled time, ETL Validator's server automatically executes the tests and emails summary reports to stakeholders.

Enterprise Collaboration: All test cases are stored in ETL Validator's enterprise repository. This enables users to reuse test cases across the organization. Integration with HP's ALM and web-based reporting enables application life-cycle management.
Datagaps BI Validator Features

Migration Testing -- compare reports across BI tools
Compare report data while migrating from or to OBIEE, Cognos, and BO.

Upgrade Testing -- compare reports across environments
Compare pre- and post-upgrade reports, or reports across environments, for data (CSV) or PDF differences.

Regression Testing -- regression test reports & dashboards
Baseline & compare report/dashboard PDF snapshots to identify data & layout differences.

Catalog Testing -- compare catalogs between environments
Baseline and compare web catalog XML to identify changes, or compare catalogs across environments. Monitor dashboard and report performance periodically in the production environment.

Functional Testing -- compare report data with a SQL query
Compare data in a report with data from a SQL query on the source or target database.

Selecting Test Tools

Best practice: avoid letting tools drive the test method, process, and techniques rather than assessing needs and available tools.

Symptoms to watch for:
Tool vendors are overly optimistic (e.g., proclaiming 100% source-to-target validation)
Testers inexperienced in testing may not recognize what the tool can and cannot do
Testers are recommending tools simply for the sake of automation
Challenges of Automated DW Testing

Some limitations of data conversion test automation:
Automating immature testing processes and methods
Technical limitations of most tools
The need for development/training of test developers
Difficulty in measuring the return on investment of test automation

Sogeti's and Gartner's BI QA Maturity Model
QA Tools Save Time and Costs

Source Files Quality Checker Testing Toolbox: saves 30% of QA effort
Data Quality Testing Toolbox: saves 30% of QA effort
Source Target Comparer Testing Toolbox: saves 30% of QA effort
Test Case Generator: saves 35% of QA effort
Report Testing Toolbox: saves 40% of QA effort
Test Data Management Toolbox: saves 30% of QA effort

Agenda

DW Concepts and Terms
Challenges of DW Testing
Planning for DW Tests
DW Test Scenarios
Test Data Planning
QA Risk Management
DW Test Tools and Automation
Recommended Tester Skills
DW QA Best Practices
DW Testers Should Generally Understand

For their local project:
Business terminology, data flows, and processes
The source data extract processes
The data staging tasks
ETL design & construction
DW source & target schemas
Business rules & terminology, to write effective defect reports
Strong knowledge of data sources & targets (XML, txt, RDBMS)
How the target DW is incrementally updated
Profiling methods and tools for source & target data
How ETL data exceptions are handled
(And much more)

DW Integration Technologies
Common DW Tester Tasks

Testers typically exercise the following tasks to test ETLs:
Write SQL expressions that join relevant tables and that select columns for analysis
Verify each test query is correct and includes everything needed for source / target analysis
Execute each query and load the result to a file
Import to Excel / Access to analyze results and identify differences
Run test automation tools

Testers Need Broad DW QA Expertise

Classification of DW testing activities:
1. Business requirement testing: testing whether requirements expressed by business users have been met
2. ETL testing: testing the accuracy of data movement from source systems to the DW per specifications
3. DW database testing: testing database performance at normal and stressed workloads; this category also includes tests to verify data quality in the DW
4. System integration testing: testing the entire integrated data warehouse system to evaluate its ability to meet functional as well as nonfunctional requirements
5. Nonfunctional requirement testing: testing quality of service of the DW to meet business expectations (e.g., security, stress, performance)
6. Reporting / front-end testing: testing whether reports / OLAP systems provide functionally correct data access to end users
Important Tester Skills

Firm knowledge of DW, BI, analytics, and DB concepts
Advanced expertise with:
SQL queries
Stored procedures
DB and SQL editors (e.g., Toad, SSMS)
Expert data profiling methods & tools skills
MS Excel / Access skills for data analysis
Skills to develop DW / BI test plans

Important Tester Skills (cont.)

Understanding of data models, data mapping documents, ETL design, and ETL coding; participate in reviews
Experience with multiple DB systems: Oracle, SQL Server, Sybase, DB2, etc.
Ability to troubleshoot ETL tool and stored procedure sessions and workflows
Skills to understand and validate business data transformations
Ability to perform adequate testing with huge volumes of data; selection of data samples
Important Tester Skills (cont.)

Experience with data-centric testing
Deployment of DB code to databases
Unix scripting, DB deployment tools (e.g., Autosys, Anthill)
Use of automated ETL testing tools such as a scripting language, Informatica DVO, etc.
Understanding of project data used by the business: data sources, data tables, data dictionaries, business terminology, business rules for each table field

A Primary Tester Skill: SQL Queries

Testers need to:
Join data from multiple tables
Run queries that filter to narrow data results
Identify source-to-target column differences
Run aggregating queries to total / summarize data
Format SQL query results
Verify stored procedures and views
Convert data types in query results (dates, times)
A Primary Tester Skill: Data Profiling

Column / attribute / field profiling provides statistical measurements associated with:
Frequency distribution of data values
Number of records, source & target
Number of null (i.e., blank) and missing values
Data types (e.g., integers, characters)
Column lengths
Unique and min/max values in each column
Patterns in the data

A Primary Tester Skill: DB Editors

DB editors are used to perform tasks like the following:
Control DB code access
Create, browse, or alter objects (e.g., tables)
Create and execute SQL queries
Copy data between schemas
Validate mappings between source and target
Edit and execute stored procedures & views
Create negative DB conditions for testing
Graphically build, execute, and tune queries
Manage common database tasks from one central window
View Oracle, SQL Server, etc. DB schemas and dictionaries

E.g., Dell's Toad, Oracle SQL Developer, Microsoft SQL Server Management Studio (SSMS)
A Primary Tester Skill: Verify Data Cleansing Methods

Remove un-needed source columns in targets
Check for inconsistent data formats
Verify the correct lookups were used to replace source column values
Verify data from multiple sources was combined correctly
Verify duplicate values or records were removed
Verify normalized spellings
Verify aggregated data
Verify that missing required data values from the source are applied to the target. E.g., a data field in a source is either optional, or mandatory but not enforced, hence intermittent data; however, the field value is required in the target system.
(And much more)

Sample Ad for ETL Test Lead

Minimum 4 years of experience in ETL & BI test planning and testing
Experience in analyzing ETL mapping documents
Strong in SQL queries: Oracle / SQL Server / DB2
Strong with SQL scripts based on ETL mapping documents to compare data
Strong in ETL data validation: Informatica / DataStage / SSIS
Extensive data warehouse testing background working with huge volumes of data
Exposure to end-to-end data validation for ETL & BI systems
Strong in BI report validation in Cognos / Business Objects / MicroStrategy / SSRS BI environments
Work with SMEs to resolve gaps/questions in requirements
Assist developers to recreate test failures, leading to problem resolutions to requirements, code, or test cases
Exposure to DB tools: Toad / PL/SQL Developer / SSMS
Nice to have: exposure to automating DW testing

CareerBuilder's website, July 20, 2016
Estimating Test Resources

Condition: approximate tester-to-programmer ratio
New team with testers performing all QA duties: 1:2
Teams with distributed QA duties: 1:3
Teams with no test automation: 1:4
Teams with test automation: 1:6
Test automation with use of parm-driven unit tests: 1:7

Source: Agile Data Warehousing for the Enterprise, 2016, Ralph Hughes
QA Support on DW Projects

From the project team:
Project Mgr.: project defect mgt. and overall quality
Data and Business Analyst: requirements testing and acceptance
Database Architect: data model / mapping reviews
DBA: set up and verify schemas in the test environment
ETL Developer: unit test planning and execution
Business Sponsor: acceptance test strategy
End users: acceptance test scenario development and execution
Test Mgr.: test plan, scenario design and execution
Testers: component, integration, and system test planning & execution

Benefits of Acquiring QA Support

Helps assure that stakeholders are confident in QA
Allows stakeholders to participate in testing what they may know best
Teaches developers new testing ideas and methods
Pairs testing among designers, developers, and testers
Learn how developers test code
QA Support on DW Agile Projects

Lifecycle phase: Requirements, Planning
Roles: Data Steward, Tester
Responsibilities: gather and verify data validation and functional requirements; define Service Level Agreements (SLAs); identify Key Performance Indicators (KPIs)

Lifecycle phase: Design and Strategy
Roles: Data Analyst, Tester
Responsibilities: document system requirements; define the data validation and testing strategy; develop test scenarios and test cases

Lifecycle phase: Development and Integration
Roles: Testers
Responsibilities: create test data and automation scripts; execute test automation scripts
Agenda

DW Concepts and Terms
Challenges of DW Testing
Planning for DW Tests
DW Test Scenarios
Test Data Planning
QA Risk Management
DW Test Tools and Automation
Recommended Tester Skills
DW QA Best Practices

Implement a DW/BI Center of Excellence

Source: Infosys Corp
QA Team Best Practices

ETL planning and testing:
1. Participate in ETL requirements & design reviews
2. Gain in-depth knowledge of each ETL: the order of execution, constraints, transformations
3. Participate in ETL unit test case reviews
4. After ETLs are run, use checklists for QA assessments of dropped records and session failures

QA Team Best Practices (cont.)

5. Analyze a) source data quality and b) data field profiles before input to Informatica or other ETL data-build services
6. Participate in data model and data mapping reviews
7. Review defects during ETL unit testing to target vulnerable process areas
QA Team Best Practices (cont.)

8. Assess ETL logs: session, workflow, errors
9. Review ETL workflow outputs and source-to-target counts
10. Verify source-to-target mapping docs against loaded tables using Toad and other tools
11. After ETL runs or manual data loads, assess data in every table with a focus on vital columns (dirty data, incorrect formats, duplicates, etc.)

QA Team Best Practices (cont.)

12. Encourage effective documentation for ETL stored procedures
13. Provide QA with a development or separate environment for early data testing; QA should have permissions to modify data in order to perform negative tests
14. Developers unit test target tables after each ETL load before entry to formal QA
QA Team Best Practices (cont.)

15. Maintain data models and source-to-target mapping / transformation rules documents
16. Invest in off-the-shelf data quality analysis tools for pre- and post-ETL analysis
17. Invest in automated DB regression test tools and training to support frequent data loads
18. Implement key DW test metrics for reporting

QA Team Best Practices (cont.)

19. Automate all that is possible:
Ensure tests are repeatable
Tests must have easy-to-interpret results (e.g., pass, fail)
Build every test within the same framework
20. Conduct data profiling on source and target data: before ETLs for sources; after ETLs for targets
21. Identify strategic test tools and integrate current tools to enable end-to-end automation; standardize the use of an automation framework
QA Team Best Practices (cont.)

22. Centralize a repository for all DW project templates, checklists, test artifacts, lessons learned, trackers, questionnaires, training materials
23. Constantly improve the DW and BI test competencies / skills of all QA staff
24. Risk-based approaches to testing and methodically optimized test case definition are the basic requirements for high test coverage
25. Develop test cases so that they are easy to understand from the business perspective
26. Profile and audit all source data before writing docs

Promoting Best Practice DW QA

1. Provide evidence from published analysts and industry research on the high failure rate due to a lack of best practices
2. Establish principles of DW testing from a QA handbook / guidebook
3. Perform an up-front DW impact assessment to identify hotspots
4. Focus on the impact of delayed DW loading on the wider corporate strategy
Promoting Best Practice DW QA (cont.)

5. Demonstrate the effects of poor DW development practices
6. Gain a thorough understanding of DW QA best practices so you can argue effectively for their adoption
7. 'Pitch' unique benefits that appeal to different sponsors

Valuable Books
Thank You!

Questions, comments?

Wayne Yaddow
wayne@datagaps.com
1-(914) 466-4066