How much more interactive can your business be with customers, partners,
employees and managers?
3
What is Business Intelligence (BI)?
An alternative way of describing BI is: the technology required to turn raw data
into information to support decision-making within corporations and business
processes.
4
Why BI?
BI technologies help bring decision-makers the data in a form they can quickly
digest and apply to their decision making.
BI turns data into information for managers, executives and, in general, anyone
making decisions in a company.
5
Benefits
The benefits of a well-planned BI implementation are going to be closely tied to
the business objectives driving the project.
Deliver actionable insight and information to the right place with less effort.
Identify and operate based on a single version of the truth, allowing all
analysis to be completed on a core foundation with confidence.
6
Business Intelligence Platform Requirements
OLAP
Data Mining
Interfaces
The business intelligence platform should provide good integration across these
technologies. It should be a coherent platform, not a set of diverse and
heterogeneous technologies.
7
Business Intelligence Components
[Diagram: Operational Data → Extract → Transform → Load → Data Warehouse → OLAP and Data Mining]
8
Business Intelligence Architecture
9
Business Intelligence Technologies
A layered view of BI technologies, with increasing potential to support business
decisions toward the top, and the typical user at each layer:
Decision Making (End User)
Data Presentation: visualization techniques (Business Analyst)
Data Mining: information discovery (Data Analyst)
Data Exploration: OLAP, DSS, EIS, querying and reporting (Data Analyst)
Data Warehouses / Data Marts (DB Admin)
Data Sources: paper, files, information providers, database systems, OLTP
10
Data Warehousing
What is a Data Warehouse?
A data warehouse is a relational database that is designed for query and analysis
rather than for transaction processing. It usually contains historical data derived
from transaction data.
It is a series of processes, procedures and tools (hardware and software) that
help the enterprise understand more about itself, its products, its customers
and the market it serves.
12
Facts!
A Data Warehouse is NOT a specific technology. It is NOT possible to purchase a
Data Warehouse, but it is possible to build one.
13
Why Data Warehousing?
14
Defining Data Warehouse
Subject-Oriented
Operational systems are designed around applications and functions, e.g. loans,
savings and credit cards in the case of a bank, and are organized by processes
or tasks. A Data Warehouse, in contrast, is designed around a subject, such as
Customer, Product or Vendor, and is organized by subject.
16
Time Variant
[Diagram: time is part of the key of every warehouse record.]
It helps in business trend analysis.
In contrast to the OLTP environment, a data warehouse focuses on change over
time; that is what we mean by time variant.
17
Integrated
Data is stored once in a single integrated location.
[Diagram: customer data stored in several databases (an Auto Policy Processing
System, a Fire Policy Processing System, and Facts, Life, Commercial and
Accounting applications) is consolidated into a single Data Warehouse Database
organized around the subject Customer.]
Non-Volatile
[Diagram: external sources and production databases and applications feed the
Data Warehouse Environment. The operational environment performs updates,
inserts and deletes; the warehouse environment only loads data and is otherwise
read-only.]
This is logical because the purpose of a data warehouse is to enable you to
analyze what has occurred.
19
So, what’s different between OLTP
and Data Warehouse?
20
OLTP vs. Data Warehouse
OLTP systems are tuned for known transactions and workloads, while the workload
of a data warehouse is not known in advance.
e.g., average amount spent on phone calls between 9AM and 5PM in Pune during
the month of December
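An ad-hoc analytical query like this can be sketched in SQL. The following minimal example runs the slide's example query against an in-memory SQLite database; the table, column names, and data are invented for illustration:

```python
import sqlite3

# Hypothetical call-detail fact table; names and rows are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE call_fact (
    city TEXT, call_date TEXT, call_hour INTEGER, amount REAL)""")
conn.executemany(
    "INSERT INTO call_fact VALUES (?, ?, ?, ?)",
    [("Pune",   "2023-12-05", 10, 12.50),
     ("Pune",   "2023-12-06", 14,  7.25),
     ("Pune",   "2023-12-07", 20,  9.00),   # outside 9AM-5PM, excluded
     ("Mumbai", "2023-12-05", 11,  5.00)])  # different city, excluded

# The ad-hoc warehouse query from the slide: average amount spent on
# phone calls between 9AM and 5PM in Pune during December.
(avg_amount,) = conn.execute("""
    SELECT AVG(amount) FROM call_fact
    WHERE city = 'Pune'
      AND strftime('%m', call_date) = '12'
      AND call_hour BETWEEN 9 AND 17
""").fetchone()
print(avg_amount)  # 9.875, the average over the two qualifying calls
```

The point is that no OLTP index or tuning anticipates this query; the warehouse must serve it anyway.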
21
OLTP vs. Data Warehouse
22
OLTP vs Data Warehouse
23
OLTP vs Data Warehouse
24
To summarize ...
25
Data Warehouse Architectures
Centralized
In a centralized architecture, there exists only one data warehouse which stores
all data necessary for business analysis. As already shown in the previous
section, the disadvantage is a loss of performance compared to distributed
approaches.
Central Architecture
26
Data Warehouse Architectures Contd…
Federated
Federated
Architecture
27
Data Warehouse Architectures Contd…
Tiered:
[Diagram: legacy data, operational data and external data sources are extracted,
transformed and loaded into an organizationally structured Enterprise Data
Warehouse (asset assembly and management), from which departmentally structured
Data Marts, e.g. Inventory, Post Purchase and VISA, are derived (asset
exploitation).]
29
Data Warehouse Architecture Components
Data Sources (disparate data sources and resources):
Legacy data
Operational data
External data
Data Management :
Metadata - At all levels of the data warehouse, information is required to support
the maintenance and use of the Data Warehouse.
Data Mart – A data mart is a subject oriented data warehouse.
30
Introduction To Data Marts
From the Data Warehouse, atomic data flows to various departments for their
customized needs. If this data is periodically extracted from the data warehouse
and loaded into a local database, it becomes a data mart. The data in a Data
Mart has a different level of granularity than that of the Data Warehouse. Since
the data in Data Marts is highly customized and lightly summarized, the
departments can do whatever they want without worrying about resource
utilization. The departments can also use whatever analytical software they find
convenient. The cost of processing becomes very low.
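The periodic extract-and-summarize step described above can be sketched in a few lines. This is only an illustration; the row layout, department names and monthly grain are assumptions, not from the text:

```python
# Hypothetical atomic warehouse rows; field names are illustrative.
warehouse_rows = [
    {"dept": "Sales", "product": "A", "date": "2024-01-03", "units": 5},
    {"dept": "Sales", "product": "A", "date": "2024-01-09", "units": 2},
    {"dept": "Sales", "product": "B", "date": "2024-01-04", "units": 1},
    {"dept": "HR",    "product": "-", "date": "2024-01-03", "units": 0},
]

def build_data_mart(rows, dept):
    """Periodically extract one department's slice of the warehouse and
    lightly summarize it (here: monthly units per product)."""
    mart = {}
    for r in rows:
        if r["dept"] != dept:          # only this department's data
            continue
        month = r["date"][:7]          # coarser granularity: day -> month
        key = (r["product"], month)
        mart[key] = mart.get(key, 0) + r["units"]
    return mart

sales_mart = build_data_mart(warehouse_rows, "Sales")
print(sales_mart)  # {('A', '2024-01'): 7, ('B', '2024-01'): 1}
```

The mart's coarser grain and smaller size are what let a department analyze freely without competing for warehouse resources.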
31
Data Mart Overview
[Diagram: the Data Warehouse feeds departmental Data Marts (Marketing, HR /
Human Resources, Sales, Finance). The data marts satisfy 80% of the local
end-users' requests, serving financial analysts, strategic planners, and
executives.]
32
From The Data Warehouse To Data Marts
[Diagram: detailed, normalized data in the organizationally structured Data
Warehouse is progressively summarized into departmentally structured and then
individually structured information, with less history retained at each step.]
33
Operational Data Store (ODS)
What is an ODS
An Operational Data Store (ODS) integrates data from multiple business operation
sources to address operational problems that span one or more business functions.
An ODS has the following features:
Detailed — ODS data is generally more detailed than data warehouse data.
Summary data is usually not stored in an ODS; the exact granularity depends on the
subject that is being supported.
34
Operational Data Store (ODS) Contd…
[Diagram: operational systems A, B and C feed the Operational Data Store, which
in turn feeds the Data Warehouse; the warehouse serves EIS, DSS, applications
and PC tools.]
Operational Data Store: current or near-current data; detailed data; updates allowed.
Data Warehouse: historical data; summary and detail; non-volatile snapshots only.
35
Benefits Of ODS
Provides a complete view of customer relationships, the data for which might be
stored in several operational databases -- this data can include data from an
organization’s internal systems, as well as external data from third-party vendors.
Operates as a store for detailed data, updated frequently and used for drill-downs
from the data warehouse which contains summary data.
Provides data that is more current than that in a data warehouse and more
integrated than that in an OLTP system
36
Definition Of Data Warehouse
37
Basic Design Approaches of Data Warehouse
38
The Top Down Approach
The Dependent Data Mart structure or Hub & Spoke: The Top-Down Approach
The data flow in the top-down OLAP environment begins with data extraction
from the operational data sources. This data is loaded into the staging area,
validated and consolidated to ensure a level of accuracy, and then transferred
to the Operational Data Store (ODS).
Detailed data is regularly extracted from the ODS and temporarily hosted in the
staging area for aggregation and summarization, and then loaded into the Data
Warehouse.
39
The Top Down Approach Contd…
Inmon Approach
The data marts are treated as subsets of the data warehouse. Each data mart is
built for an individual department and is optimized for the analysis needs of
the particular department for which it is created.
40
The Bottom-Up Approach
Ralph Kimball designed the data warehouse with the data marts connected
to it by a bus structure.
The bus structure contains all the common elements used by the data marts, such
as conformed dimensions and measures, defined for the enterprise as a whole.
This architecture makes the data warehouse more of a virtual reality than a
physical reality.
All data marts could be located on one server or on different servers across
the enterprise, while the data warehouse would be a virtual entity, nothing
more than the sum total of all the data marts.
In this context, even the cubes constructed using OLAP tools could be
considered data marts.
41
The Bottom-Up Approach Contd…
Kimball Approach
• The data flow in the bottom up approach starts with extraction of data
from operational databases into the staging area where it is processed
and consolidated and then loaded into the ODS.
42
The Bottom-Up Approach Contd…
The data in the ODS is appended to or replaced by the fresh data being
loaded. After the ODS is refreshed, the current data is once again
extracted into the staging area and processed to fit the Data Mart
structure. The data from the Data Marts is then extracted to the staging
area, aggregated, summarized and so on, loaded into the Data Warehouse, and
made available to the end user for analysis.
43
Modeling Fundamentals:
What is Data Model ?
44
Modeling Fundamentals:
What is Data Modeling ?
A technique aimed at optimizing the way that information is stored and used within an organization.
It begins with the identification of the main data groups, for example the invoice, and continues by
defining the detailed content of each of these groups. This results in structured definitions for all of
the information that is stored and used within a given system.
Is an essential precursor to analysis & design, maintenance & documentation and improving the
performance of an existing system.
Is the process of creating a data model by applying a data model theory to create a data model
instance.
45
Modeling Fundamentals:
Types OF Data Modeling
Logical Data Model (LDM) - A logical design is conceptual and abstract.
The process of logical design involves arranging data into a series of logical
relationships called entities and attributes.
Logical data model includes all required entities, attributes, key groups, and
relationships that represent business information and define business rules.
48
Modeling Fundamentals:
Types OF Data Modeling
Entity relationship diagram (ERD) – A data model utilizing several
notations to depict data in terms of the entities and relationships described by
that data.
Databases are used to store structured data. The structure of this data, together
with other constraints, can be designed using a variety of techniques, one of
which is called entity-relationship modeling or ERM.
[ERD Diagram]
49
Modeling Fundamentals:
Types OF Data Modeling
Important Terminologies –
Entity – The principal data objects about which information is to be collected:
a class of persons, places, objects, events, or concepts about which we need to
capture and store data.
• Persons: agency, contractor, customer, department, division, employee,
instructor, student, supplier.
• Places: sales region, building, room, branch office, campus.
• Objects: book, machine, part, product, raw material, software license,
software package, tool, vehicle model, vehicle.
• Events: application, award, cancellation, class, flight, invoice, order,
registration, renewal, requisition, reservation, sale, trip.
• Concepts: account, block of time, bond, course, fund, qualification, stock.
50
Modeling Fundamentals:
Types OF Data Modeling
[Diagram: STUDENT is enrolled in CURRICULUM; CURRICULUM is being studied by
STUDENT.]
51
Modeling Fundamentals:
Types OF Data Modeling
[Diagram: the relationship is bidirectional; Student is enrolled in Curriculum,
and Curriculum is being studied by Student.]
52
Modeling Fundamentals:
Types OF Data Modeling
Dimensional Data Modeling (DDM) - Dimensional modeling is the
design concept used by many data warehouse designers to build their data
warehouse.
Is a logical design technique that seeks to present the data in a standard, intuitive
framework that allows for high-performance access. It adheres to a discipline that
uses the relational model with some important restrictions.
Every dimensional model is composed of one table with a multi-part key, called
the fact table, and a set of smaller tables called dimension tables.
Components of a DM:
Fact Table
Dimension table
Attributes
A fact (measure) table contains measures (sales gross value, total units sold) and
dimension columns. These dimension columns are actually foreign keys from the
respective dimension tables.
53
Types OF Data Modeling
Why Dimensional Modeling?
Early Days
[Figure: raw operational data shown as rows of opaque numeric codes,
illustrating how unintelligible un-modeled data was for analysis.]
55
Types OF Data Modeling
Why Dimensional Modeling?
56
Transaction vs. Query Environments:
Design Goals
Transaction Environments:
• Get data in fast
• Organize data by transaction flow
• Expect high volumes of inserts, updates, and deletes
• Allow retrieval of specific information quickly
• Provide transaction-level access
• Stability over time

Query Environments:
• Get data out fast
• Organize data by business analysis
• Expect high volumes of complicated queries
• Provide reasonable performance for a variety of information requests
• Provide a multidimensional data view
• Ease of use

[Diagram: transaction databases feed a data refinery that serves the query
environment.]
57
What is a Star Schema ?
58
What is a Star Schema ?
The Star schema model is essentially a method to store data which is multi-
dimensional in nature in a relational database. It consists of a single "fact
table" with a compound primary key, with one segment for each "dimension", and
with additional columns of additive, numeric facts.
[Diagram: a central fact table surrounded by Channel, Customer and Product
dimension tables.]
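A minimal star schema along these lines can be sketched in SQL, here via Python's built-in sqlite3. All table and column names are illustrative assumptions, not a prescribed design:

```python
import sqlite3

# A minimal star schema sketch; names are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product  (product_id  INTEGER PRIMARY KEY, product_name  TEXT);
CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, customer_name TEXT);
CREATE TABLE dim_channel  (channel_id  INTEGER PRIMARY KEY, channel_name  TEXT);
-- Fact table: compound key of dimension foreign keys plus additive measures.
CREATE TABLE fact_sales (
    product_id    INTEGER REFERENCES dim_product(product_id),
    customer_id   INTEGER REFERENCES dim_customer(customer_id),
    channel_id    INTEGER REFERENCES dim_channel(channel_id),
    sales_dollars REAL,
    sales_units   INTEGER,
    PRIMARY KEY (product_id, customer_id, channel_id)
);
INSERT INTO dim_product  VALUES (1, 'Widget'), (2, 'Gadget');
INSERT INTO dim_customer VALUES (10, 'Acme');
INSERT INTO dim_channel  VALUES (100, 'Online');
INSERT INTO fact_sales VALUES (1, 10, 100, 250.0, 5), (2, 10, 100, 100.0, 2);
""")

# Typical star join: constrain on a dimension, sum the additive facts.
(total,) = conn.execute("""
    SELECT SUM(f.sales_dollars)
    FROM fact_sales f
    JOIN dim_customer c ON f.customer_id = c.customer_id
    WHERE c.customer_name = 'Acme'
""").fetchone()
print(total)  # 350.0
```

Note how the query only ever joins the fact table to small dimension tables, which is what makes star joins fast and predictable.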
59
Fact Tables
60
Fact Tables – Additivity of Measures
Characteristics of a Measure – Fully Additive, Non-
Additive, Semi-Additive.
Fully Additive – When it is meaningful to summarize it by adding values together
across any dimensions.
Example – Sales_Dollar; We can add Sales_Dollar values together across all dates in
a particular month.
Non-Additive – These measures cannot be added together across dimensions.
Example – Margin expressed as a percentage of sales; on a particular day, a
salesperson sells a customer 4 different products, each at a 25% margin rate.
Can we add all these and say that the margin rate for that customer for that
day is 100%?
Semi-Additive – These measures can be summarized across some dimensions, but
not all.
Example – Bank account balances at the end of the day are fully additive across
accounts. But if we add them across different days, will the result be meaningful?
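The three kinds of additivity can be demonstrated on a tiny fact set. A hedged sketch, with invented values:

```python
# Toy fact rows for one customer on one day; values are illustrative.
rows = [
    {"sales_dollars": 100.0, "margin_pct": 25.0, "balance": 500.0},
    {"sales_dollars":  60.0, "margin_pct": 25.0, "balance": 500.0},
]

# Fully additive: summing sales_dollars across any dimension is meaningful.
total_sales = sum(r["sales_dollars"] for r in rows)        # 160.0 -- valid

# Non-additive: summing margin percentages is meaningless (25% + 25% != 50%).
# The correct approach is to recompute the ratio from additive components.
naive_margin = sum(r["margin_pct"] for r in rows)          # 50.0 -- wrong
margin_dollars = sum(r["sales_dollars"] * r["margin_pct"] / 100 for r in rows)
true_margin = 100 * margin_dollars / total_sales           # 25.0 -- right

# Semi-additive: balances add across accounts but not across time;
# across time, take an average or a period-end snapshot instead.
avg_balance_over_days = sum(r["balance"] for r in rows) / len(rows)  # 500.0
```

Storing the additive components (sales dollars, margin dollars) and deriving ratios at query time is the usual way to keep a fact table safe to aggregate.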
61
Defining Fact Table Structure
[Diagram: the fact table Fact Item Day Store has the key columns ITEM_ID,
WEEK_ID and STORE_ID, referencing the Item, Week and Store dimensions, plus the
fact columns SALES_DOLLARS and SALES_UNITS.]
62
What is a Dimension?
A Data Warehouse is
• Subject-Oriented
• Integrated
• Time-Variant
• Non-volatile
[Diagram: Subject → Dimension]
63
Dimensional Hierarchy
A dimensional hierarchy expresses the one-to-many relationships between
attributes.
[Diagram: a Time dimension hierarchy of Year → Quarter → Month → Date, with
additional attributes such as Sequence, Current Flag, and Day of Week at the
Date level.]
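Deriving the hierarchy levels and extra attributes for a single Date row can be sketched as follows; the column names are assumptions for illustration:

```python
from datetime import date

def time_dimension_row(d):
    """Build one row of a Time dimension: each hierarchy level (year,
    quarter, month, date) plus descriptive attributes for the date."""
    quarter = (d.month - 1) // 3 + 1
    day_names = ["Monday", "Tuesday", "Wednesday", "Thursday",
                 "Friday", "Saturday", "Sunday"]
    return {
        "date": d.isoformat(),
        "month": f"{d.year}-{d.month:02d}",
        "quarter": f"{d.year}-Q{quarter}",
        "year": d.year,
        "day_of_week": day_names[d.weekday()],
    }

row = time_dimension_row(date(2024, 11, 15))
print(row["quarter"])      # 2024-Q4
print(row["day_of_week"])  # Friday
```

Because every date row carries its month, quarter and year, queries can roll facts up the one-to-many hierarchy with a simple GROUP BY on the desired level.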
64
Types of Dimensions
Conformed Dimensions
Degenerate Dimensions
Junk Dimensions
Conformed Dimensions:
Data marts may have several Fact Tables. In any two data marts in an
enterprise, there could be common dimensions between the Fact tables.
These common dimensions must be conformed; indicating that they are
either the same or one is strictly the rollup of the other. The advantage of
conformed dimensions is that the two data marts don't have to be on the
same machine and don't need to be created at the same time.
65
Conformed Dimensions … But
Why?
66
Types of Dimensions
Degenerate Dimensions:
Certain fields which cannot be grouped with any Dimension table are usually
stored in the Fact tables. But, they are not true Fact values. Common examples
include invoice numbers or order numbers.
A degenerate dimension is represented by a dimension key attribute with no
corresponding dimension table.
Junk Dimensions:
A junk dimension is a convenient grouping of flags and attributes
to get them out of a fact table into a useful dimensional framework.
67
What are Slowly Changing Dimensions?
68
Three Methods…
Type 1 approach: overwrite the old values in the dimension record.
Result: the ability to track the old history is lost.
69
Type one
Implementing Type 1:
Advantages: easy to implement; no keys affected.
Disadvantages: history is lost.
70
Type two
Implementing Type 2:
Advantages: automatically partitions history; no time constraints required.
Disadvantages: abrupt point-in-time constraints are not effective.
71
Type three
Implementing Type 3:
Add a new field holding the current value for the affected attributes, with an
effective date field as well.
Advantages: useful for tracking new and old values.
Disadvantages: intermediate values are lost; complex.
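Of the three methods, Type 2 is the one that fully preserves history by versioning rows. A minimal sketch of that approach; the field names, surrogate keys and dates are invented for illustration:

```python
# Minimal Type 2 SCD sketch: on a change, expire the current row and
# insert a new versioned row with its own surrogate key.
dimension = [
    {"key": 1, "cust_id": "C1", "city": "Pune",
     "current": True, "from": "2020-01-01", "to": None},
]
next_key = 2

def apply_type2_change(cust_id, new_city, change_date):
    """Expire the current row and append a new version, preserving history."""
    global next_key
    for row in dimension:
        if row["cust_id"] == cust_id and row["current"]:
            row["current"] = False
            row["to"] = change_date        # close out the old version
    dimension.append({"key": next_key, "cust_id": cust_id, "city": new_city,
                      "current": True, "from": change_date, "to": None})
    next_key += 1

apply_type2_change("C1", "Mumbai", "2024-06-01")
# History is partitioned automatically: two rows, one marked current.
print([(r["city"], r["current"]) for r in dimension])
# [('Pune', False), ('Mumbai', True)]
```

Old facts keep pointing at the expired row's surrogate key, so historical reports stay correct while new facts join to the current version.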
[Diagram: a Sales fact table keyed by Item, Store and Acct Week, with dimension
hierarchies: Time Dimension (Acct Year → Acct Period → Acct Week), Location
Dimension (Region → Store), Product Dimension (Department → Item).]
73
Dimension Tables
Dimension attributes are used as the source of most of the interesting constraints
in data warehouse queries, and they are virtually always the source of the row
headers in the SQL answer set.
It should be obvious that the power of the data warehouse is proportional to the
quality and depth of the dimension tables.
74
Attributes in a Dimension Table
75
Basic Dimensional Model
[Diagram: the fact table Fact Item Day Store (ITEM_ID, WEEK_ID, STORE_ID,
SALES_DOLLARS, SALES_UNITS) joins to three lookup tables: Lookup Item (ITEM_ID,
ITEM_DESC, DEPT_ID, DEPT_DESC), Lookup Store (STORE_ID, STORE_DESC, REGION_ID,
REGION_DESC) and Lookup Week (WEEK_ID, PERIOD_ID, YEAR_ID).]
76
ETL Concepts
ETL !!!
(Extract, Transform, Load) –
ETL refers to the methods involved in accessing and manipulating source
data and loading it into a target database. During the ETL process, data is
most often extracted from an OLTP database, transformed to match the data
warehouse schema, and loaded into the data warehouse database.
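The extract, transform, load sequence can be sketched end-to-end in a few functions. This is a toy illustration rather than a production pipeline; the source rows and field names are invented:

```python
# A toy end-to-end sketch of the three ETL steps.
def extract():
    """Pull raw rows from a (simulated) OLTP source."""
    return [("101", "2024-01-05", " 19.99"),
            ("102", "2024-01-05", "5.00 "),
            ("103", "bad-date",   "3.50")]

def transform(rows):
    """Validate rows and reshape them to match the warehouse schema."""
    out = []
    for order_id, order_date, amount in rows:
        # Crude validation: reject dates not in YYYY-MM-DD shape.
        parts = order_date.split("-")
        if len(parts) != 3 or not parts[0].isdigit():
            continue
        out.append({"order_id": int(order_id),
                    "order_date": order_date,
                    "amount": float(amount.strip())})  # type conversion
    return out

def load(rows, warehouse):
    """Append the transformed rows to the warehouse table."""
    warehouse.extend(rows)

warehouse_table = []
load(transform(extract()), warehouse_table)
print(len(warehouse_table))  # 2 (the malformed row was rejected)
```

Real ETL adds staging, scheduling and logging around exactly these three steps, as the following slides describe.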
78
WHAT IS ETL?
79
EXTRACTION (Data Capturing)
The ETL extraction element is responsible for extracting data from the source system.
During extraction, data may be removed from the source system or a copy made and the
original data retained in the source system.
80
EXTRACTION (Data Transmission)
Legacy systems may require too much effort to implement such offload processes, so
legacy data is often copied into the data warehouse, leaving the original data in place.
Extracted data is loaded into the data warehouse staging area (a relational database
usually separate from the data warehouse database), for manipulation by the
remaining ETL processes.
81
EXTRACTION (Cleansing Process)
Data extraction is generally performed within the source system itself.
82
TRANSFORMATION
The ETL transformation element is responsible for data validation, data accuracy, data
type conversion, and business rule application. An ETL system that applies
transformations inline during extraction is less robust and flexible than one
that confines transformations to the transformation element. Transformations
performed in the OLTP system impose a performance burden on the OLTP database.
83
TRANSFORMATION (contd.)
Data Validation
Check that all rows in the fact table match rows in dimension tables to enforce data integrity.
Data Accuracy
Ensure that fields contain appropriate values, such as only "off" or "on" in a status field.
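The two checks above, referential integrity against the dimensions and value accuracy for a status field, can be sketched together; the data and names are illustrative:

```python
# Sketch of the fact-vs-dimension validation described above.
dim_product_ids = {1, 2, 3}          # keys present in the dimension table
VALID_STATUS = {"on", "off"}         # the only accurate status values

fact_rows = [
    {"product_id": 1, "status": "on",   "units": 5},
    {"product_id": 9, "status": "on",   "units": 2},   # orphan dimension key
    {"product_id": 2, "status": "oops", "units": 1},   # bad status value
]

def validate(rows, dim_ids):
    """Split incoming fact rows into accepted rows and rejects."""
    good, rejects = [], []
    for r in rows:
        # Data integrity: every fact row must match a dimension row.
        # Data accuracy: the status field must hold an allowed value.
        if r["product_id"] in dim_ids and r["status"] in VALID_STATUS:
            good.append(r)
        else:
            rejects.append(r)
    return good, rejects

good, rejects = validate(fact_rows, dim_product_ids)
print(len(good), len(rejects))  # 1 2
```

Rejected rows are typically written to an error table for review rather than silently dropped.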
84
LOADING
The ETL loading element is responsible for loading transformed data into the data
warehouse database.
Data warehouses are usually updated periodically rather than continuously, and large
numbers of records are often loaded to multiple tables in a single data load.
The data warehouse is often taken offline during update operations so that data can be
loaded faster and SQL Server 2000 Analysis Services can update OLAP cubes to
incorporate the new data. BULK INSERT, bcp, and the Bulk Copy API are the best tools
for data loading operations.
The design of the loading element should focus on efficiency and performance to
minimize the data warehouse offline time.
85
ETL Tools
What are ETL Tools?
ETL Tools are meant to extract, transform and load data into the Data Warehouse
for decision making. Before the evolution of ETL tools, this process was done
manually using SQL code created by programmers. The task was tedious and
cumbersome in many cases, since it involved many resources, complex coding and
long work hours. On top of that, maintaining the code posed a great challenge
for programmers.
Selecting an appropriate ETL tool is the most important decision that has to be
made when choosing the components of a data warehousing application. The ETL
tool operates at the heart of the data warehouse, extracting data from multiple
data sources, transforming the data to make it accessible to business analysis,
and loading multiple target databases.
86
Features of ETL Tools
ETL tools have the ability to extract data from various sources, like RDBMSs,
DB2, COBOL data files and flat files, at scheduled intervals, perform the
required transformations and load the data into a Data Warehouse residing on an
RDBMS.
ETL tools can connect to an RDBMS and get the list of tables and their
attributes. The general steps for designing an ETL process are:
Define the structure of the source data
Define the structure of the destination data
Map elements of the source data to elements of the destination data
Define the transformations required, like changing values or summing
Schedule the execution of the process
The process, once executed, generates a log showing the status of the process,
the number of records inserted, etc. Various reports about processes are
available, which can form the Metadata.
87