Hybrid 2.0

Author:
Version: V3.0
Status: Final Draft
Date: 05/09/2014
258731137.doc
Version V3.0
Page 1 of 62
Document data

Document Change History

Version  Date        Author                         Nature of Change
V0.1     18-06-2014  Manoj Singh, Upal Chakraborty  Initial draft
V0.2     25-06-2014  Manoj Singh
V0.3     26-06-2014  Manoj Singh
V0.4     30-06-2014  Manoj Singh
V0.5     30-06-2014  Manoj Singh
V0.6     03-07-2014  Biju Das
V0.7     04-07-2014  Biju Das
V0.8     08-07-2014  Biju Das
V0.9     10-07-2014  Manoj Singh
V1.0     14-07-2014  Manoj Singh
V1.1
V1.2     16-07-2014  Biju Das
V1.3     16-07-2014  Manoj Singh
V1.4     22-07-2014  Manoj Singh
V1.5     31-07-2014  Manoj Singh
V2.0
V3.0     05-09-2014  Biju

Name           Department       Role
Rajib Roy      Detailed design  Reviewer
Frits Hermans                   Reviewer
Eric Janssen   BI Operations    Reviewer
                                Contributor
Document Approval

Version #   Approval Date   Role   Name   Department
Table of Contents

4.6.6  MDW Process.................................................................................................29
4.6.7  ATLAS lookup synchronization........................................................................30
4.6.8  Update Max Key Lookup in Legacy.................................................................30
4.6.9  Source Target Mapping...................................................................................30
4.6.9.1  Event Facet..............................................................................................30
4.6.10  Loading process for KPI files...........................................................................31
4.7  Impact of changes due to introduction of new field in the PRODUCT_ITEM_HIST table......32
4.7.1  Changes to be made in Legacy system:.........................................................32
4.7.2  Changes to be made in the existing MDW lookups:........................................33
4.8  ATLAS Housekeeping............................................................................................33
5  Process Flow Diagrams................................................................................................35
5.1  Feed File Process Flow........................................................................................35
5.2  MDW Process Flow..............................................................................................37
5.3  KPI File Process Flow............................................................................................38
6  TWS Scheduling.........................................................................................................40
7.  Teradata Design............................................................................................................41
7.1  Hybrid 2.0 data Load (New)........................................................................................41
7.1.1  Event.Base facet: LDM tables data load..................................................................41
7.1.1.1  One time load script for EVENT_CLASS Table..............................................41
7.1.1.2  One time load script for EVENT_TABLE_TYPE Table...................................42
7.1.1.3  New Table: INCENTIVE_EVENT (SET TABLE).............................................42
7.1.1.4  New Table: INCENTIVE_RESULT_TYPE (SET TABLE)................................43
7.1.1.5  Copy-to-Prod Configuration for Event.Base facet..........................................44
7.1.1.6  Data Backup Configure Backup facet Script................................................44
7.1.1.7  One-to-One view (New).................................................................................45
7.1.1.8  One-to-One PL view (New)............................................................................45
7.1.1.9  One time script for INCENTIVE_RESULT_TYPE TABLE...............................45
7.1.2  Offer.Product facet: LDM tables data load................................................................47
7.1.2.1  Existing Table: PRODUCT_ITEM_HIST (SET TABLE)...................................48
7.1.2.2  Data Backup Configure Backup facet Script................................................50
7.1.2.3  One time script for PRODUCT_ITEM TABLE.................................................51
7.1.2.4  One-to-One view (New).................................................................................51
7.1.2.5  One-to-One PL view (New)............................................................................51
7.1.3  Miscellaneous facet: LDM tables data load..............................................................51
7.1.3.1  Data Backup Configure Backup facet Script................................................52
7.1.3.2  One time script for CREATION_SOURCE_TYPE TABLE..............................53
7.1.4  Campaign facet: LDM tables data load....................................................................53
7.1.4.1  Data Backup Configure Backup facet Script................................................54
7.1.4.2  One time load script for CAMPAIGN_STRATEGY Table................................55
7.2  Data Backup...............................................................................................................55
7.3  Data Retention and Purging........................................................................................55
7.4  Statistics Gathering.....................................................................................................55
7.5  ADQM.........................................................................................................................56
7.5.1  Values to be calculated.........................................................................................56
7.5.2  Detailed KPI Calculations.....................................................................................58
7.5.3  Price Reference Feed..........................................................................................58
7.5.3.1  Number of records per Price plan, FBS.........................................................58
7.5.4  Campaign Discount Feed.....................................................................................59
7.5.4.1  Number of records per Result Code, Campaign............................................59
8.  Appendix.......................................................................................................................61
1 Introduction
1.1 Overview
Flex Hybrid 2.0 is an improvement over the originally deployed Hybrid 1.0, also known as Flex 1.0, which went live in January 2014. The Hybrid 2.0 project mainly focuses on the capability to introduce more than one Hybrid Price Plan and the capability to apply reduced monthly charges based on campaign data from Unica.

In Flex Hybrid 1.0 it was not possible to link data from the Vesta system with data from the Surepay system (which contains the product id) in order to reconcile and check the monthly charge. The payment information from Vesta does not carry any product id, and Vesta payments happen a few days after the Surepay balance event data arrives in ATLAS. Since the Surepay balance events do not contain the event amount, it is not possible to reconcile Surepay events against Vesta payments.

As part of Hybrid 2.0, reference tables are needed that allow reconciliation of the monthly charge against the monthly bundle for the Flex product, taking the price plan id into account.

The above improvements are brought in by introducing two feeds from MOSA. One feed contains campaign-related data targeting the customers eligible for discounts or a reduced price plan; the other contains the price plan and bundle data along with the associated prices, which allows the business to establish the association between the Flex products and their corresponding prices. Both data feeds are propagated through the DDL server and passed through the Ab Initio MDW process into the LDM in ATLAS.
1.2 Purpose
The purpose of this document is to describe in detail the changes needed in the ETL process to load MOSA data into the ATLAS LDM via DDL. Two new data files, along with 2 KPI files, will be loaded into the LDM through DDL.
Title / Document Name

Vodafone IT HLD
VFNL-Unify-DD-DDL.doc, Version 0.2, dated 30-01-2012
VFNL-Unify-DA-DDL.doc, Version 2.6, dated 14-09-2012
VFNL-Unify-IDD-DDL-ATLAS-11-1.doc, dated 06/12/2013
VFNL-DA-ETL Design ATLAS.doc, Version 1.2, dated 15/04/2013
Detailed Design-EBU_CVM_v1.2.doc
The Interface documents (IDD) from AMDOCS need to be confirmed and finalized. Details such as the datatypes of the fields are still pending on their side.

Sample data from the source was not received during the DLD phase. The datatypes, formats and details such as the nullability of the source fields and the exact attribute names were still not available during the detailed design phase.

OLA will need to be taken care of in later phases of the project.
1.6 Risks
If the source doesn't send a final version of the IDD with all file details before the development phase, there is a risk that the data from the source file will not be processed correctly, which may result in data inconsistency.
2 Scope
2.1 In Scope

Below are the requirements from a reporting perspective, with their feasibility status:

Two new feeds from MOSA will send data towards ATLAS. One feed will contain price reference data and the other will contain campaign-related data for the targeted customers.

Processing of the 2 data files and 2 metadata files from the new feed through DDL, and loading of the data into ATLAS LDM tables using the Ab Initio MDW.

Processing of the 2 KPI files and their corresponding metadata files, and loading them into ATLAS LDM tables.
2.2 Assumptions

MOSA should provide data as per the requirements of the Hybrid 2.0 project.
Any filtering required on the data should be applied in MOSA.
The data file should be sent before the metadata file.
The KPI file should be sent before the corresponding metadata file.

(i)
Data Files :- Two data files will be needed as part of the Hybrid 2.0 population: campaign_discount and price_reference.

The campaign_discount file contains the list of Hybrid customers identified by their ctn, the corresponding campaign identified by the campaigncode field, the associated discount, and the resultcode, which indicates whether a Hybrid customer is eligible for the campaign or not.

The Hybrid campaign customer list is initiated by Unica, the campaign management system, and sent to MOSA, which checks whether a customer is eligible for the discount to be applied and populates the value of the resultcode (1 or 0) accordingly.

The price_reference file is a snapshot file which contains the list of all possible combinations of Hybrid Price Plan and FBS, and the associated price for each combination.
(ii)
KPI Files :- There will be two KPI files, one associated with each of the above mentioned data files: one for price_reference and the other for campaign_discount.

The KPI file for campaign_discount provides the count of records per result_code and campaign_code.

The KPI file for price_reference provides the count of records per price_plan_id and FBS_id.
mosa_price_reference_YYYYMMDD_99999999.dat.gz
mosa_campaign_discount_YYYYMMDD_99999999.dat.gz
Attribute   Type     Description/Example
priceplan   string   Flex
FBS         string   Flex_Sma_DataS
Price       decimal  2750
mosa_campaign_discount_YYYYMMDD_99999999.dat.gz

Attribute      Type           Description/Example
CTN            decimal        610774354
CampaignCode   string         Campaign identification as received within the HCL input file. Not relevant for the MOSA discount calculation, but used to report back in the return-file towards UNICA.
               DATE           Effective start date of a campaign
               DATE           Effective end date of a campaign
               decimal        Referred to as Current bundle value in Euros
               decimal        The absolute amount that is to be deducted from the Monthly Charge
FBS ID         string         The FBS ID applicable at the moment of charge calculation by MOSA. This is the actual name of the FBS.
PricePlanId    string         The PricePlan ID applicable at the moment of charge calculation by MOSA. This is the actual name of the Price Plan (not the numeric code), e.g. flex 1
Price          decimal        The actual price that would have applied if no campaign was applicable.
               datetimestamp  Date/time stamp of the moment the discount was applied
Result         decimal        1 = Reduction Applied; 2 = Reduction Not applied, minimum value not reached
Version V3.0
Page 12 of 62
File Type: gz
File Transfer Method: SFTP push
File authorization: at least read/write for all (UNIX: rw-rw-rw-)
Column delimiter: |. Note that there must not be any pipe character inside any of the data attributes.
Each KPI file is accompanied by a corresponding metadata file.
KPI files should be delivered in Unix-zip format (compressed).
The metadata files should be delivered as plain text (not compressed).
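Because the delimiter is a pipe and the attributes themselves must never contain one, a receiving process can cheaply validate each record by counting fields. A minimal sketch (the function name and the per-feed field count are illustrative; the real validation is driven by the Ab Initio DML):

```shell
# Check that every record of a pipe-delimited file has the expected number
# of fields; a stray pipe inside an attribute shifts the count and is reported.
# Usage: check_field_count <file> <expected_field_count>
check_field_count() {
  awk -F'|' -v want="$2" '
    NF != want { printf "record %d has %d fields, expected %d\n", NR, NF, want; bad = 1 }
    END { exit bad }
  ' "$1"
}
```

A non-zero exit status marks the file for rejection by the calling wrapper.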
Below are the KPIs decided for each source file:
mosa_price_reference_kpi_YYYYMMDD_99999999.dat.gz
mosa_campaign_discount_kpi_YYYYMMDD_99999999.dat.gz
Note*:- The sequence number is governed by the source system; for every file delivery the sequence number is to be incremented by 1 by the source system.
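This delivery contract can be enforced at the receiving end by comparing the sequence number in the file name against the last processed one. A sketch (the function is illustrative, not part of the delivered interface):

```shell
# Verify that the 8-digit sequence number embedded in a MOSA file name is
# exactly one higher than the last processed sequence number.
# Usage: check_seq_no <filename> <last_seq>
check_seq_no() {
  seqno=$(basename "$1" | sed -n 's/.*_\([0-9]\{8\}\)\.dat\.gz$/\1/p')
  [ -n "$seqno" ] || { echo "no sequence number found in $1" >&2; return 1; }
  # strip leading zeros so the shell does not read the numbers as octal
  cur=$(echo "$seqno" | sed 's/^0*//'); cur=${cur:-0}
  prev=$(echo "$2" | sed 's/^0*//'); prev=${prev:-0}
  if [ "$cur" -ne $((prev + 1)) ]; then
    echo "sequence gap: expected $((prev + 1)), got $cur" >&2
    return 1
  fi
}
```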
mosa_price_reference_kpi_YYYYMMDD_99999999.dat.gz

Column name    Value
SYSTEM         MOSA
KPI_ID         5
GROUP_LEVEL_1  Priceplanid
GROUP_LEVEL_2  FBSid
KPI_DATE       Date
KPI_VALUE      Number of counted records per group level 1 and 2
UOM            Number
mosa_campaign_discount_kpi_YYYYMMDD_99999999.dat.gz

Column name    Value
SYSTEM         MOSA
KPI_ID         6
GROUP_LEVEL_1  Result_Code
GROUP_LEVEL_2  Campaign_code
KPI_DATE       Date
KPI_VALUE      Number of counted records per group level 1 and 2
UOM            Number
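Both KPI values are plain record counts per group-level pair, so an expected KPI value can be recomputed directly from the corresponding data file. A sketch (the field positions passed in are assumptions for illustration; the authoritative layout comes from the interface DML):

```shell
# Count records per pair of group-level fields in a pipe-delimited file,
# mirroring the KPI definitions above (e.g. priceplan and FBS for KPI_ID 5).
# Usage: kpi_counts <file> <field_pos_1> <field_pos_2>
kpi_counts() {
  awk -F'|' -v a="$2" -v b="$3" '
    { cnt[$a "|" $b]++ }
    END { for (k in cnt) print k "|" cnt[k] }
  ' "$1" | sort
}
```

The output can then be compared line by line against the delivered KPI file.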
Example(s):

For the data files, the names of the metadata files would be:
mosa_campaign_discount_YYYYMMDD_<SEQNO>.met
mosa_price_reference_YYYYMMDD_<SEQNO>.met

For the KPI files, the names of the metadata files would be:
mosa_campaign_discount_kpi_YYYYMMDD_<SEQNO>.met
mosa_price_reference_kpi_YYYYMMDD_<SEQNO>.met
The metadata file contains the following fields:

Field          Description                                                          Example
SOURCE_SYSTEM  Name of the source system                                            MOSA
FILE_TYPE      Name of the source file                                              DATA or KPI
NUM_REC        Number of records in the datafile. This number must be equal to
               the actual number of records in the transferred file.                1723
BYTES          Number of bytes in the datafile. This number must be equal to
               the actual number of bytes in the transferred file.                  332324
DELIMITER      Character used as delimiter between the attributes of the datafile   |
FULL_DELTA     Indicator if the datafile holds a full dump or deltas only           D/F
FIELD_NAMES    List of all the attributes separated by a |.
CRE_DATE       Same as date_of_content in the naming convention                     20090422
SEQ_NO         Same as file_seqno in the naming convention                          00000001
VERSION        Version of the interface definition                                  001

Example metadata file content:

SOURCE_SYSTEM=MOSA
FILE_TYPE=DATA
NUM_REC=292934
BYTES=3232324
DELIMITER=|
FULL_DELTA=D
FIELD_NAMES=<Incoming field names >
CRE_DATE=20130801
SEQ_NO=00000001
VERSION=001
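The NUM_REC and BYTES checks described above can be sketched as follows (the .met parsing assumes the KEY=VALUE layout of the example; the real validation runs inside the Ab Initio registration graph):

```shell
# Validate a received data file against its metadata (.met) file:
# NUM_REC must match the record count and BYTES the byte count of the
# (uncompressed) data file. Returns non-zero on any mismatch.
validate_against_met() {
  datafile=$1; metfile=$2
  want_rec=$(sed -n 's/^NUM_REC=//p' "$metfile")
  want_bytes=$(sed -n 's/^BYTES=//p' "$metfile")
  have_rec=$(wc -l < "$datafile")
  have_bytes=$(wc -c < "$datafile")
  [ "$have_rec" -eq "$want_rec" ] ||
    { echo "NUM_REC mismatch: file has $have_rec, metadata says $want_rec" >&2; return 1; }
  [ "$have_bytes" -eq "$want_bytes" ] ||
    { echo "BYTES mismatch: file has $have_bytes, metadata says $want_bytes" >&2; return 1; }
}
```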
4 ETL Design
4.1 Project Creations:
Since the MOSA interface will have two new daily feeds which need to go through DDL processing and the ATLAS load, a new source project needs to be introduced for MOSA at the DDL end. The new project structure would look like the below: a new private project named mosa has to be created in the path /Projects/ddl/source, and a public project named com_mosa has to be created at the same path /Projects/ddl/source, as depicted below. The com_mosa common project has to be included in the mosa private project.
Usage: generic_wait_for_file.ksh [-h] -s <sandbox root> -p <project name> [-m] -F <feed prefix> -S <feed suffix> [-M <metadata file suffix>] [-D <duration>] [-I <polling interval>] [-P directory parameter] [-H host]

Where:
  -h  displays this help message.
  -s  <sandbox root> the name of the folder where all AbInitio projects are rooted
  -p  <project name> the name of the project whose parameter is to be sourced for folder information
  -F  Feed File name prefix. Can take wild cards including unix commands in
  -S  Feed File Name suffix
  -M  Optional metadata file suffix. Needs to be provided if there is a metadata file to look for
  -D  Optional duration in minutes for which to wait for files. Defaults to 30 mins
  -I  Optional polling interval in seconds after which files are to be looked for again. Defaults to 30 secs
  -P  Parameter name (without $) in the sandbox which points to the directory where to look for files. Defaults to COM_<PROJECT_NAME>_INBOUND
  -H  Optional host name where to look for the file. ssh should be enabled for that. Defaults to localhost
Example:
/ai_serial/sand/ddl/abiprod/com_ddl/bin/generic_wait_for_file.ksh -s /ai_serial/sand/ddl/abiprod/ddl -p
mosa -F campaign_discount -users -S .dat.gz -M .met -D 300 -I 300
Note*:- The polling needs to be done for both the data files and their corresponding KPI files.
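For reference, the wait-and-poll behaviour of generic_wait_for_file.ksh can be approximated by a loop like the following. This is a simplified sketch, not the actual wrapper; the duration and interval parameters here are in seconds rather than the wrapper's minutes/seconds split:

```shell
# Poll a directory for a file matching a glob pattern, up to a maximum duration.
# Prints the first match and returns 0, or returns 1 on timeout.
# Usage: wait_for_file <dir> <glob_pattern> <duration_s> <interval_s>
wait_for_file() {
  dir=$1; pattern=$2; duration=$3; interval=$4
  elapsed=0
  while [ "$elapsed" -le "$duration" ]; do
    match=$(find "$dir" -maxdepth 1 -name "$pattern" 2>/dev/null | head -n 1)
    if [ -n "$match" ]; then
      echo "$match"
      return 0
    fi
    sleep "$interval"
    elapsed=$((elapsed + interval))
  done
  return 1
}
```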
The above mentioned feeds undergo the generic batch processing and create the respective delta and snapshot files. Since this processing goes through generic, reusable DDL graphs, new parameter values need to be supplied for each of the psets mentioned above. The details of the parameter values for each of the psets are provided in the attached spreadsheet below.

Mosa_interface_generic_batch_parameter_details.xls
Currently, for any new source, the corresponding KPI files land in the source-specific landing directory, from where they are picked up and processed by the respective private project process, in this case mosa.

The KPI files currently undergo a singleton post-processing; that is, a generic post-processing graph collects all KPI delta files from the $COM_KPI_PROCESSED directory. In order for the existing post-processing to be reused, or changed only minimally, the KPI files for MOSA should finally be created in $COM_KPI_PROCESSED rather than $COM_MOSA_PROCESSED. This can be achieved by including the common project com_kpi in the private project of mosa and passing the value $COM_KPI_PROCESSED to the parameter DELTA_DIRECTORY for the psets:

/Projects/ddl/source/mosa/pset/price_reference_kpi_generic_batch.pset
/Projects/ddl/source/mosa/pset/campaign_discount_kpi_generic_batch.pset
Option 2: Create a separate plan and graph for MOSA KPI processing.

Advantages:
No dependency upon the Unify stream; MOSA KPI can be processed independently.

Disadvantages:

Value
$COM_DDL_ATLAS_PENDING/atlas_kpi_mosa_last_extracted_details_without_status.dat
MOSA,price_reference_kpi|MOSA,campaign_discount_kpi
$COM_DDL_ATLAS_PENDING/atlas_kpi_mosa_last_extracted_details_with_status.dat

Care should be taken that the output file produced as a result of the post-processing is different from any other regular KPI post-processing files created, and is source specific. An example of the output file which might be created as a result of this process:

$COM_DDL_ATLAS_PENDING/ddl_kpi_mosa_<YYYYMMDD>_<seq_no>.dat

Two new plan and graph psets need to be created as below:
atlas_mosa_campaign_discount_generic_post_processing_plan.pset
atlas_mosa_campaign_discount_generic_post_processing.pset
atlas_price_reference_generic_post_processing_plan.pset
atlas_price_reference_generic_post_processing.pset
(ii)
Parameter values for each of the plan psets are to be provided as supplied in the document below:

MOSA_data_postprocessing_params.xls
The sub-plans price reference deltas and campaign discount deltas determine the purge date for each of the corresponding subscribers and purge them accordingly. The above sub-plans consist of the following:

(i) Determining the purge date:

The first graph task determines the purge date for the subscriber, and the second graph task purges the required data in the delta files.

Two new psets need to be created:
$AI_PSET/determine_ddl_data_store_purge_date_cmpgn_disc.pset
$AI_PSET/determine_ddl_data_store_purge_date_price_ref.pset

The above psets are passed in the graph parameter of the above mentioned determine purge date sub-plan. The details of the parameters of the above psets are as follows.

Graph called by the psets:
/Projects/ddl/com_housekeeping/mp/determine_ddl_data_store_purge_date.mp

SOURCE_SYSTEM   MOSA
FEED            campaign_discount/price_reference
LKP_LOCATION    $COM_DDL_SERIAL_LOOKUP

ii) Removing older delta files:

The second graph task removes the delta files which are older than the PURGE_DATE. This needs the creation of two new psets:
$AI_PSET/housekeep_cmpgn_disc_deltas.pset
$AI_PSET/housekeep_price_ref_deltas.pset

The details of the parameters of the psets are as follows.

Graph called by the psets:
/Projects/ddl/com_housekeeping/mp/data_store_copied_files_housekeeping.mp
SOURCE_SYSTEM              MOSA
FEED                       campaign_discount/price_reference
FEED_DIRECTORY             $COM_MOSA_PROCESSED
FILE_PATTERN               mosa_cmpgn_disc_delta/mosa_price_ref_delta
PURGE_DATE_FILE_LOCATION   $AI_SERIAL
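The removal step keys off the date carried by the delta files. A sketch under the assumption that each delta file name carries a YYYYMMDD token after the FILE_PATTERN prefix; the graph data_store_copied_files_housekeeping.mp is the authoritative implementation:

```shell
# Delete delta files whose embedded YYYYMMDD date is strictly older than
# the purge date. The name layout <prefix>_YYYYMMDD... is an assumption
# for illustration.
# Usage: purge_old_deltas <dir> <prefix> <purge_date_yyyymmdd>
purge_old_deltas() {
  dir=$1; prefix=$2; purge_date=$3
  for f in "$dir"/"$prefix"_*; do
    [ -e "$f" ] || continue
    fdate=$(basename "$f" | sed -n "s/^${prefix}_\([0-9]\{8\}\).*/\1/p")
    [ -n "$fdate" ] || continue
    if [ "$fdate" -lt "$purge_date" ]; then
      rm -f "$f"
    fi
  done
}
```

Comparing the 8-digit dates numerically works because YYYYMMDD sorts the same way numerically and chronologically.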
iii) Insertion of one-time records in control tables:

1. The object /Projects/ddl/com_ddl/sql/create_control_tables.sql would be modified to insert the SQL queries below.

INSERT INTO DDL_HOUSEKEEPING VALUES ('MOSA','Log',30);
INSERT INTO DDL_HOUSEKEEPING VALUES ('MOSA','Archive',30);
INSERT INTO DDL_HOUSEKEEPING VALUES ('MOSA','Reject',30);
INSERT INTO DDL_HOUSEKEEPING VALUES ('MOSA','Error',30);

INSERT INTO ddl_data_retention_period VALUES ('MOSA','campaign_discounts',7);
INSERT INTO ddl_data_retention_period VALUES ('MOSA','price_reference',7);
INSERT INTO ddl_data_retention_period VALUES ('MOSA','campaign_discounts_kpi',7);
INSERT INTO ddl_data_retention_period VALUES ('MOSA','price_reference_kpi',7);
iv) Creation of psets for housekeeping of the log, reject, error and archive files of MOSA:

The following psets need to be created and passed to the error/log/reject/archive sub-plans:
$AI_PSET/housekeep_mosa_error_files.pset
$AI_PSET/housekeep_mosa_log_files.pset
$AI_PSET/housekeep_mosa_reject_files.pset
$AI_PSET/housekeep_mosa_archive_files.pset

The parameters of the above psets are as below:

URL                       $AI_SERIAL_ERROR / $COM_MOSA_REJECT / $COM_MOSA_ARCHIVE / $AI_SERIAL_LOG
HOUSEKEEPING_FILE_TYPE    Error/Reject/Archive
The same details as for housekeeping the feed files also apply to the KPI files, briefly described below:

(i) Determining the purge date:

Two new psets need to be created:
$AI_PSET/determine_ddl_data_store_purge_date_cmpgn_disc_kpi.pset
$AI_PSET/determine_ddl_data_store_purge_date_price_ref_kpi.pset

Details of the parameter values:

SOURCE_SYSTEM   MOSA
FEED            campaign_discount_kpi/price_reference_kpi
LKP_LOCATION    $COM_DDL_SERIAL_LOOKUP

(ii)
4.6.2 Polling

A new TWS job needs to be defined during the development phase which will poll for the new feed files and the corresponding KPI files of MOSA; the existing wrapper /Projects/atlas_dwh/com_atlas/bin/generic_conduct_etl_flow.ksh needs to be used for the polling.
Name                  Example                                            Description
SOURCE_SYSTEM         MOSA                                               Source system name
FEED                  campaign_discount/price_reference                  Feed name as per the pset used
SOURCE_DIRECTORY      ${COM_ATLAS_SERIAL_PENDING}                        The directory the source files can be found in.
FILE_PATTERN                                                             The pattern to be used to locate the metadata
IS_SOURCE_COMPRESSED  True
SOURCE_DML_FILE       Target extract dml file name of each of the feeds
SOURCE_TRANSFORM      out :: reformat(in) = begin out.* :: in.*; end;
OUTPUT_DML_FILE       Same as source dml file                            Consolidated file record format
IS_VAL_REQD           True                                               If record validation is required
DBC_FILE              $COM_ATLAS_DB/atlas_ldm.dbc                        Teradata DBC file name
SCHEMA                $ATLAS_CTRL_OWNER                                  Schema name of atlas registration table
CREATE_LOOKUP         False                                              Whether to use the consolidated file also as a lookup file in the downstream process.
NO_OF_FILES_TO_PICK
(ii) KPI files registration process:

The corresponding KPI files of MOSA will undergo the file registration process along with metadata file validation against the revised data file. In order to run this process, a new pset per feed needs to be created, as mentioned below.

pset location: /Projects/atlas_dwh/source/mosa/pset/
pset names:
atlas_meta_validation_mv_mosa_price_reference_kpi.pset
atlas_meta_validation_mv_mosa_campaign_discount_kpi.pset

The same graph as mentioned above for processing the data feed files will be called from the above mentioned psets. The values of the parameters of the pset are as follows:
Name                  Example                                             Description
SOURCE_SYSTEM         MOSA                                                Source system name
FEED                  campaign_discount_kpi/price_reference_kpi           Feed name as per the pset used
SOURCE_DIRECTORY      ${COM_ATLAS_SERIAL_PENDING}                         The directory the source files can be found in.
FILE_PATTERN          ddl_kpi_mosa*.met                                   The pattern to be used to locate the metadata
IS_SOURCE_COMPRESSED  True
SOURCE_DML_FILE       ${COM_DDL_ATLAS_DML}/kpi/ddl_kpicommon_out.dml
SOURCE_TRANSFORM      out :: reformat(in) = begin out.* :: in.*; end;
OUTPUT_DML_FILE       ${COM_DDL_ATLAS_DML}/kpi/ddl_kpicommon_out.dml      Consolidated file record format
IS_VAL_REQD           True                                                If record validation is required
DBC_FILE              $COM_ATLAS_DB/atlas_ldm.dbc                         Teradata DBC file name
SCHEMA                $ATLAS_CTRL_OWNER                                   Schema name of atlas registration table
CREATE_LOOKUP         False                                               Whether to use the consolidated file also as a lookup file in the downstream process.
NO_OF_FILES_TO_PICK
Process flow:
3. Serially. The MDW process_key step creates all needed surrogate keys. It must run as a singleton to ensure key consistency and prevent duplicates.

4. In parallel. The MDW apply_keys step replaces natural keys with existing surrogate keys or newly created surrogate keys.

5. In parallel. The MDW model_to_physical step maps all the fields in the model to the physical table record format.

6. In parallel. The MDW database_load step loads all waiting input files (possibly from many feeds) targeted for the table and consolidates them. There will be one MDW database load pset per target table.
The details of the mapping are described in the embedded mapping spreadsheet below:

Detailed Design-Hybrid 2.0_v01.1.xlsx
4.6.10 Loading process for KPI files
The consolidated KPI files created by the registration process mentioned in section 4.6.2 above are picked up for loading. The KPI data doesn't go through the surrogate key generation process of the MDW framework; it is loaded directly into the SOURCE_KPI_RECON table.

The current KPI loading process/graph needs to undergo a change to allow the graph to pick up a list of KPI files created by different processes via a naming pattern for the KPI files. Currently the load process looks for a fixed file name; it should be changed to look for the list of KPI files created in the $COM_ATLAS_SERIAL_TEMP directory.

The benefit of the change is that a single KPI load process becomes responsible for loading all the KPI load files created in a day.
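The change from a fixed file name to a pattern-driven pickup can be illustrated as below. The directory and pattern names follow the text above; the actual change lives in the Ab Initio load graph:

```shell
# Pick up KPI load files by pattern instead of a single fixed file name,
# so one load run consolidates every KPI file created that day.
# Usage: list_kpi_files <dir> <pattern>   e.g. list_kpi_files "$d" 'ddl_kpi_*.dat'
list_kpi_files() {
  find "$1" -maxdepth 1 -type f -name "$2" | sort
}
```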
Similar changes are taking place in a separate project, EBU_CVM; the details of those changes can be referred to in the EBU_CVM detailed design document mentioned in the related documents section (Document # 7).

Note*:- During the development phase, the Hybrid 2.0 development team needs to coordinate with the EBU_CVM development team on this change so that a single change satisfies the needs of both projects.
4.7 Impact of changes due to introduction of new field in the PRODUCT_ITEM_HIST table

Note*:

(i)

(ii) This one-time change can be done by means of creating a one-time Ab Initio graph or by using any scripting code.

(iii) The above change need not be done if the PRODUCT_ITEM_HIST table is already extracted by the lookup creation job which runs at the start of the facet jobs in the legacy system. This needs to be verified during the development phase. Rigorous testing needs to be done during the development phase to establish the robustness of the code changes made.
Logical_Data_Model_20140522.xls
Parameter               Value                                 Description
DIR                     $AI_SERIAL_ARCHIVE/$AI_SERIAL_ERROR   Dir to be purged. Use the default value.
SYSTEM_NAME             MOSA                                  Source system name
HOUSEKEEPING_FILE_TYPE  Archive/Error                         Archive for the archive housekeeping pset and Error for the error housekeeping pset
LKP_LOCATION            $COM_ATLAS_SERIAL_LOOKUP              Keep the default value
6 TWS Scheduling

New TWS jobs need to be introduced to execute the above illustrated process flows. The TWS jobs can be broadly classified into DDL jobs and jobs at the ATLAS end. The possible list of TWS jobs to be created and their corresponding dependencies are listed in the attached documents below:

(i) Hybrid_DDL_job_schedules.xls

(ii) Hybrid_ATLAS_job_schedules.xls

Note*: The jobs are named as per the testing naming conventions; the names need to be realigned during development as per the environment where the jobs will be scheduled to run.

The job dependencies with the existing jobs need to be revalidated during the development phase against the actuals running in prod, in case there are changes to the job stream during the development phase.
7. Teradata Design

As part of the Hybrid 2.0 project, below are the designs covered in this document related to Teradata.
TABLE NAME             NEW/EXISTING  REFERENCE TABLE (Y/N)  ETL LOAD (Y/N)  COMMENTS
EVENT                  EXISTING      N                      Y
EVENT_TABLE_TYPE       EXISTING      Y                      N               One Time Manual Load
EVENT_CLASS            EXISTING      Y                      N               One Time Manual Load
INCENTIVE_EVENT        NEW           N                      Y
INCENTIVE_RESULT_TYPE  NEW           Y                      N               One Time Manual Load
One-time load row for EVENT_CLASS:

EVENT_CLASS_CD  EVENT_CLASS_NAME  EVENT_CLASS_DESC  ICID  LUID
IE              Incentive event   Incentive event   -1    -1

One-time load row for EVENT_TABLE_TYPE:

EVENT_TABLE_TYPE_CD  EVENT_TABLE_TYPE_NAME  EVENT_TABLE_TYPE_DESC  ICID  LUID
INEV                 Incentive event        Incentive event        -1    -1
Column                   Data Type     Notes
INCENTIVE_EVENT_ID       INTEGER       Foreign Key to Event.event_id
ACCS_METH_ID             INTEGER
CAMPAIGN_ID              INTEGER
PRODUCT_ID               INTEGER
SECONDARY_PRODUCT_ID     INTEGER
ACTUAL_CHARGE            DECIMAL(8,2)
INCENTIVE_RESULT_ID      INTEGER
CREATION_SOURCE_TYPE_CD  VARCHAR(3)
MINIMUM_VALUE            DECIMAL(8,2)  Mapped to the field minimum value from the source feed; populated through the ETL process. This field could have datatype integer or decimal, but as the source datatype and sample data were not available, it is advised to revisit this during detailed design.
REDUCTION_AMOUNT         DECIMAL(8,2)  Mapped to the field discount value from the source feed; populated through the ETL process. The same datatype caveat as for MINIMUM_VALUE applies.
ORG_KEY                  VARCHAR(255)
ICID                     SMALLINT
LUID                     SMALLINT

Indexes: Unique Primary Index on INCENTIVE_EVENT_ID.
A script will be developed to create this table in the STG and PROD environments, with the definition as mentioned below.
Column                | Data Type    | Description                  | NULL(Y/N) | Sample                                                                      | Population Method
INCENTIVE_RESULT_ID   | INTEGER      |                              | N         | 1 = Reduction Applied; 2 = Reduction Not applied, minimum value not reached | Generated ID
INCENTIVE_RESULT_DESC | VARCHAR(255) | Incentive result description | N         | Reduction Applied                                                           | Straight move
ORG_KEY               | VARCHAR(255) |                              |           |                                                                             |
ICID                  | SMALLINT     |                              |           |                                                                             |
LUID                  | SMALLINT     |                              |           |                                                                             |

Indexes: Unique Primary Index on INCENTIVE_RESULT_ID
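A minimal Teradata DDL sketch for this definition follows; the MULTISET option and the staging database name are assumptions, while the UPI matches the index noted in the definition:

```sql
-- Sketch only: the database name and MULTISET are assumptions.
CREATE MULTISET TABLE STG_LDM.INCENTIVE_RESULT_TYPE (
    INCENTIVE_RESULT_ID   INTEGER      NOT NULL,
    INCENTIVE_RESULT_DESC VARCHAR(255) NOT NULL,
    ORG_KEY               VARCHAR(255),
    ICID                  SMALLINT,
    LUID                  SMALLINT
)
UNIQUE PRIMARY INDEX (INCENTIVE_RESULT_ID);
```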
NOTE: INCENTIVE_RESULT_CD (as per the HLD) has been changed to INCENTIVE_RESULT_ID; as per the existing standard it should be an ID, not a code. Also, the column DESC (as per the HLD) above has been changed to INCENTIVE_RESULT_DESC.
Table-name            | Delete/Truncate from PROD_LDM
INCENTIVE_EVENT       | Yes
INCENTIVE_RESULT_TYPE | Yes
No change will be required in the COPY_TO_PROD.ksh shell script; however, entries for the new tables listed above will be added to the control table PROD_DBA.COPY_TO_PROD.
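The new control-table entries might look like the following sketch; the column names (DATABASENAME, TABLENAME) are hypothetical, since the COPY_TO_PROD layout is not shown in this document:

```sql
-- Hypothetical column names: the COPY_TO_PROD layout is not defined here.
INSERT INTO PROD_DBA.COPY_TO_PROD (DATABASENAME, TABLENAME)
VALUES ('PROD_LDM', 'INCENTIVE_EVENT');

INSERT INTO PROD_DBA.COPY_TO_PROD (DATABASENAME, TABLENAME)
VALUES ('PROD_LDM', 'INCENTIVE_RESULT_TYPE');
```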
impacted table in the control table STG_DBA.BACKUP_FACET. Sample data from BACKUP_FACET
table is given below for your reference.
DATABASENAME | DATABASENAME_PRE | DATABASENAME_POST | TABLENAME             | FACET              | BACKUPNAME                | EXC  | DATA_IND | CREATE_LKP_IND
STG_LDM      | STG              | LDM               | EVENT                 | EVENT_HB           | H_B_EVENT                 | NULL | MAIN     |
STG_LDM      | STG              | LDM               | EVENT_CLASS           | EVENT_HB           | H_B_EVENT_CLASS           | NULL | REF      |
STG_LDM      | STG              | LDM               | EVENT_TABLE_TYPE      | EVENT_HB           | H_B_EVENT_TABLE_TYPE      | NULL | REF      |
STG_LDM      | STG              | LDM               | INCENTIVE_EVENT       | EVENT_INCENTIVE_HB | H_B_INCENTIVE_EVENT       | NULL | MAIN     |
STG_LDM      | STG              | LDM               | INCENTIVE_RESULT_TYPE | EVENT_INCENTIVE_HB | H_B_INCENTIVE_RESULT_TYPE | NULL | REF      |
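Registering one of the new tables in the backup control table could be sketched as below, using the columns shown in the sample data; columns with no sample values (such as CREATE_LKP_IND) are left out:

```sql
-- Sketch: registers the new main table in the backup control table.
INSERT INTO STG_DBA.BACKUP_FACET
    (DATABASENAME, DATABASENAME_PRE, DATABASENAME_POST,
     TABLENAME, FACET, BACKUPNAME, EXC, DATA_IND)
VALUES ('STG_LDM', 'STG', 'LDM',
        'INCENTIVE_EVENT', 'EVENT_INCENTIVE_HB', 'H_B_INCENTIVE_EVENT', NULL, 'MAIN');
```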
Sno. | Source Database | Table Name            | Target View Database | View Name
1    | PROD_LDM        | INCENTIVE_EVENT       | PROD_VIEW            | INCENTIVE_EVENT
2    | PROD_LDM        | INCENTIVE_RESULT_TYPE | PROD_VIEW            | INCENTIVE_RESULT_TYPE

Sno. | Source Database | Table Name            | Target View Database | View Name
1    | STG_LDM         | INCENTIVE_EVENT       | STG_VIEW             | INCENTIVE_EVENT
2    | STG_LDM         | INCENTIVE_RESULT_TYPE | STG_VIEW             | INCENTIVE_RESULT_TYPE

Source Database | View Name             | Target View Database | View Name
PROD_VIEW       | INCENTIVE_EVENT       | PROD_PL_VIEW         | INCENTIVE_EVENT
PROD_VIEW       | INCENTIVE_RESULT_TYPE | PROD_PL_VIEW         | INCENTIVE_RESULT_TYPE
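The view layer above can be sketched with standard one-to-one Teradata views; the LOCKING modifier is an assumption (a common access-view pattern), not something stated in this document:

```sql
-- Sketch: 1:1 access views over the new tables.
REPLACE VIEW PROD_VIEW.INCENTIVE_EVENT AS
LOCKING ROW FOR ACCESS          -- assumption: typical access-view pattern
SELECT * FROM PROD_LDM.INCENTIVE_EVENT;

REPLACE VIEW PROD_PL_VIEW.INCENTIVE_EVENT AS
LOCKING ROW FOR ACCESS
SELECT * FROM PROD_VIEW.INCENTIVE_EVENT;
```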
INCENTIVE_RESULT_ID | INCENTIVE_RESULT_DESC                            | ORG_KEY | ICID | LUID
1                   | Reduction Applied                                |         | -1   | -1
2                   | Reduction Not applied, minimum value not reached |         | -1   | -1
This is an existing table; one column will be added as part of Hybrid 2.0. The new column is SECONDARY_PRODUCT_ID, and its row (highlighted in yellow) shows the new column addition.
Column                      | Data Type     | NULL(Y/N) | Sample     | Description / Population Method
PRODUCT_ID                  | INTEGER       | N         | 383047     | Lookup on PRODUCT.PRODUCT_NAME for source field priceplanid.
PRODUCT_ITEM_ID             | INTEGER       | N         |            | Reference to PRODUCT_ITEM.PRODUCT_ITEM_ID.
PRODUCT_ITEM_START_DT       | DATE          | N         | 2013-12-01 | For a new combination of PriceplanId and FBSId, insert a new record with sysdate. If the price for an existing <priceplanid, fbsid> combination changes and a new price value comes in, a new record is populated with sysdate. Date format 'yyyy-mm-dd'.
PERIOD_CD                   | CHAR(1)       | N         | 0          |
DISCOUNT_METHOD_CD          | CHAR(1)       | N         | X          |
TIER_LEVEL_CD               | CHAR(1)       | N         | X          |
ACTION_CD                   | CHAR(1)       | N         | X          |
PRODUCT_ITEM_START_TM       | TIME(0)       | N         |            | For a new combination of PriceplanId and FBSId, insert a new record with systime. If the price for an existing <priceplanid, fbsid> combination changes and a new price value comes in, a new record is populated with systime.
PRODUCT_ITEM_END_DT         | DATE          | Y         | 12/31/1899 | For a new combination of PriceplanId and FBSId, insert a new record with NULL. If the price for an existing <priceplanid, fbsid> combination changes and a new price value comes in, a new record is populated with NULL and the older record is closed with sysdate.
PRODUCT_ITEM_END_TM         | TIME(0)       | Y         | NULL       | For a new combination of PriceplanId and FBSId, insert a new record with NULL. If the price for an existing <priceplanid, fbsid> combination changes and a new price value comes in, a new record is populated with NULL and the older record is closed with sysdate.
PRODUCT_ITEM_CHARGE_TYPE_CD | CHAR(1)       |           | X          |
PRODUCT_ITEM_MONETARY_AMT   | DECIMAL(12,5) |           |            |
MONETARY_UNIT_OF_MEASURE_CD | CHAR(4)       |           | EUR*       |
PRELIMINARY_CHARGE          | DECIMAL(18,6) |           |            |
UNIT_DURATION               | INTEGER       |           |            |
DURATION_UNIT_OF_MEASURE_CD | CHAR(4)       |           |            |
ICID                        | SMALLINT      |           |            | Through ETL Process.
LUID                        | SMALLINT      |           |            | Through ETL Process.
SECONDARY_PRODUCT_ID        | INTEGER       |           |            | Lookup on PRODUCT.PRODUCT_NAME for source field fbs.
Note: The changes need to be made in both the staging and target databases.
* Verify from sample data whether the value will be EUR or EURC.
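The insert-new/close-old population logic described above can be sketched in Teradata SQL; the :new_* placeholders and the staging database name are assumptions for illustration, not part of this design:

```sql
-- Sketch of the history logic: close the open record for a changed
-- <priceplanid, fbsid> price, then insert the new version.
-- :new_product_id / :new_secondary_product_id / :new_amt are placeholders.
UPDATE STG_LDM.PRODUCT_ITEM_HIST
SET PRODUCT_ITEM_END_DT = CURRENT_DATE        -- "closed with sysdate"
WHERE PRODUCT_ID           = :new_product_id
  AND SECONDARY_PRODUCT_ID = :new_secondary_product_id
  AND PRODUCT_ITEM_END_DT IS NULL
  AND PRODUCT_ITEM_MONETARY_AMT <> :new_amt;

INSERT INTO STG_LDM.PRODUCT_ITEM_HIST
    (PRODUCT_ID, SECONDARY_PRODUCT_ID, PRODUCT_ITEM_START_DT,
     PRODUCT_ITEM_START_TM, PRODUCT_ITEM_END_DT, PRODUCT_ITEM_MONETARY_AMT)
VALUES (:new_product_id, :new_secondary_product_id, CURRENT_DATE,
        CURRENT_TIME(0), NULL, :new_amt);
```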
TABLE NAME        | NEW/EXISTING | REFERENCE TABLE(Y/N) | ETL LOAD(Y/N) | COMMENTS
PRODUCT_ITEM_HIST | EXISTING     |                      |               | There is a structural change in the table; one new column to be added.
PRODUCT_ITEM      | EXISTING     |                      |               | One Time ETL Load
Column                      | Data Type     | NULL(Y/N) | Sample     | Description / Population Method
PRODUCT_ID                  | INTEGER       | N         | 383047     | Lookup on PRODUCT.PRODUCT_NAME for source field priceplanid.
PRODUCT_ITEM_ID             | INTEGER       | N         |            | Reference to PRODUCT_ITEM.PRODUCT_ITEM_ID.
PRODUCT_ITEM_START_DT       | DATE          | N         | 2013-12-01 | For a new combination of PriceplanId and FBSId, insert a new record with sysdate. If the price for an existing <priceplanid, fbsid> combination changes and a new price value comes in, a new record is populated with sysdate. Date format 'yyyy-mm-dd'.
PERIOD_CD                   | CHAR(1)       | N         | 0          |
DISCOUNT_METHOD_CD          | CHAR(1)       | N         | X          |
TIER_LEVEL_CD               | CHAR(1)       | N         | X          |
ACTION_CD                   | CHAR(1)       | N         | X          |
PRODUCT_ITEM_START_TM       | TIME(0)       | N         |            | For a new combination of PriceplanId and FBSId, insert a new record with systime. If the price for an existing <priceplanid, fbsid> combination changes and a new price value comes in, a new record is populated with systime.
PRODUCT_ITEM_END_DT         | DATE          | Y         | 12/31/1899 | For a new combination of PriceplanId and FBSId, insert a new record with NULL. If the price for an existing <priceplanid, fbsid> combination changes and a new price value comes in, a new record is populated with NULL and the older record is closed with sysdate.
PRODUCT_ITEM_END_TM         | TIME(0)       | Y         | NULL       | For a new combination of PriceplanId and FBSId, insert a new record with NULL. If the price for an existing <priceplanid, fbsid> combination changes and a new price value comes in, a new record is populated with NULL and the older record is closed with sysdate.
PRODUCT_ITEM_CHARGE_TYPE_CD | CHAR(1)       |           | X          |
PRODUCT_ITEM_MONETARY_AMT   | DECIMAL(12,5) |           |            |
MONETARY_UNIT_OF_MEASURE_CD | CHAR(4)       |           | EUR*       |
PRELIMINARY_CHARGE          | DECIMAL(18,6) |           |            |
UNIT_DURATION               | INTEGER       |           |            |
DURATION_UNIT_OF_MEASURE_CD | CHAR(4)       |           |            |
ICID                        | SMALLINT      |           |            | Through ETL Process.
LUID                        | SMALLINT      |           |            | Through ETL Process.
SECONDARY_PRODUCT_ID        | INTEGER       |           |            | Lookup on PRODUCT.PRODUCT_NAME for source field fbs.
Note: The changes need to be made in both the staging and target databases.
* Verify from sample data whether the value will be EUR or EURC.
DATABASENAME | DATABASENAME_PRE | DATABASENAME_POST | TABLENAME         | FACET    | BACKUPNAME            | EXC  | DATA_IND | CREATE_LKP_IND | MAX_VALUE_IND   | MAX_KEY              | VR_IND | PARTITION_KEY | PRIMARY_KEY
STG_LDM      | STG              | LDM               | PRODUCT_ITEM_HIST | OFFER_HB | H_B_PRODUCT_ITEM_HIST | NULL | HIST     |                |                 |                      |        |               | PRODUCT_ID; PRODUCT_ITEM_ID; PRODUCT_ITEM_START_DT; PERIOD_CD; DISCOUNT_METHOD_CD; TIER_LEVEL_CD; ACTION_CD
STG_LDM      | STG              | LDM               | PRODUCT_ITEM      | OFFER_HB | H_B_PRODUCT_ITEM      | NULL | MAIN     |                | PRODUCT_ITEM_ID | MAX(PRODUCT_ITEM_ID)+1 |        |               | PRODUCT_ITEM_ID
PRODUCT_ITEM_GROUP_CD | PRODUCT_ITEM_SALE_IND | PRODUCT_ITEM_NAME | PRODUCT_ITEM_DESC | PRODUCT_ITEM_PRICING_NAME | PRODUCT_ITEM_TYPE_CD | CREATION_SOURCE_TYPE_CD | ICID | LUID
UNK                   | P                     | FLEXPRICE         | FLEXPRICE         | NULL                      | UNK                  | LEG                     | -1   | -1
TABLE NAME           | NEW/EXISTING | REFERENCE TABLE(Y/N) | ETL LOAD(Y/N) | COMMENTS
CREATION_SOURCE_TYPE | EXISTING     | Y                    | N             | One Time Manual Load
DATABASENAME | DATABASENAME_PRE | DATABASENAME_POST | TABLENAME            | FACET   | BACKUPNAME               | EXC  | DATA_IND | CREATE_LKP_IND | MAX_VALUE_IND | MAX_KEY | VR_IND | PARTITION_KEY | PRIMARY_KEY
STG_LDM      | STG              | LDM               | CREATION_SOURCE_TYPE | MISC_HB | H_B_CREATION_SOURCE_TYPE | NULL | REF      |                |               |         |        |               |
CREATION_SOURCE_TYPE_CD | CREATION_SOURCE_TYPE_NAME | CREATION_SOURCE_TYPE_DESC | ICID | LUID
MOS                     | MOSA                      | MOSA                      | -1   | -1
TABLE NAME        | NEW/EXISTING | REFERENCE TABLE(Y/N) | ETL LOAD(Y/N) | COMMENTS
CAMPAIGN          | EXISTING     | N                    | Y             |
CAMPAIGN_STRATEGY | EXISTING     | Y                    | N             | One Time Manual Load
DATABASENAME | DATABASENAME_PRE | DATABASENAME_POST | TABLENAME         | FACET       | BACKUPNAME            | EXC  | DATA_IND | CREATE_LKP_IND | MAX_VALUE_IND | MAX_KEY | VR_IND | PARTITION_KEY | PRIMARY_KEY
STG_LDM      | STG              | LDM               | CAMPAIGN          | CAMPAIGN_HB | H_B_CAMPAIGN          | NULL | MAIN     |                |               |         |        | CAMPAIGN_ID   | CAMPAIGN_ID
STG_LDM      | STG              | LDM               | CAMPAIGN_STRATEGY | CAMPAIGN_HB | H_B_CAMPAIGN_STRATEGY | NULL | DATA     |                |               |         |        |               | CAMPAIGN_STRATEGY_CD
CAMPAIGN_STRATEGY_CD | CAMPAIGN_STRATEGY_NAME | CAMPAIGN_STRATEGY_DESC | ICID | LUID
HCL                  | Hybrid Campaign List   |                        | -1   | -1
Data Purging
Database level: Purging is not happening at the database level.
Table level: Not in scope; there is no history requirement.
DATABASENAME   | TABLENAME             | COLUMNTYPE | COLUMNNAME LIST     | SAMPLE_STATS | INTERVAL | JOBNAME | ACTIVE
${DB_ENV1}_LDM | INCENTIVE_EVENT       | IDX        | INCENTIVE_EVENT_ID  | NULL         |          | REST    |
${DB_ENV1}_LDM | INCENTIVE_RESULT_TYPE | IDX        | INCENTIVE_RESULT_ID | NULL         |          | REST    |

DATABASENAME   | TABLENAME         | COLUMNTYPE | COLUMNNAME LIST                  | SAMPLE_STATS | INTERVAL | JOBNAME | ACTIVE
${DB_ENV1}_LDM | PRODUCT_ITEM_HIST | COL        | PRODUCT_ID, SECONDARY_PRODUCT_ID | NULL         | D        | REST    | Y
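The control-table rows above correspond to statements like the following sketch; the exact syntax the statistics job generates is an assumption:

```sql
-- Sketch of the statistics the COL_STAT_DATA rows would drive.
COLLECT STATISTICS INDEX (INCENTIVE_EVENT_ID)
    ON ${DB_ENV1}_LDM.INCENTIVE_EVENT;

COLLECT STATISTICS COLUMN (PRODUCT_ID, SECONDARY_PRODUCT_ID)
    ON ${DB_ENV1}_LDM.PRODUCT_ITEM_HIST;
```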
The table structure below shows the data and column mapping for the COL_STAT_DATA table used to gather statistics in the target database.

DATABASENAME   | TABLENAME             | COLUMNTYPE | COLUMNNAME LIST     | SAMPLE_STATS | INTERVAL | JOBNAME | ACTIVE
${DB_ENV1}_LDM | INCENTIVE_EVENT       | IDX        | INCENTIVE_EVENT_ID  | NULL         |          | DELTA   |
${DB_ENV1}_LDM | INCENTIVE_RESULT_TYPE | IDX        | INCENTIVE_RESULT_ID | NULL         |          | DELTA   |

Note:
7.5 ADQM
The KPI file is used for reconciliation in ATLAS. The ADQM sub-system within Atlas provides a reconciliation mechanism between the supplying source systems and the Atlas LDM. Source systems calculate values for specific KPIs, and these values are then re-calculated within the Atlas LDM.
Each KPI file should be associated with a corresponding metadata file. As part of the Hybrid 2.0 project, there will be two KPI files:
Priceplan Feed
Discount Feed (HCL feed)
The existing load processes shall be re-used, and the KPIs generated from these queries loaded to the PROD_RECON.ATLAS_KPI_RECON table. The KPIs delivered by the source systems shall be loaded to PROD_RECON.SOURCES_KPI_RECON.
The ADQM sub-system consists of a number of scripts that execute SQL against the LDM and load equivalent data, calculated within Atlas rather than within the source system, into the LDM.
No. | Name          | Type        | Description
0   | SYSTEM        | String(20)  |
1   | KPI_ID        | Number      |
2   | GROUP_LEVEL_1 | String(100) | Grouping level 1
3   | GROUP_LEVEL_2 | String(100) | Grouping level 2
4   | KPI_DATE      | Date        | Date for the measurement
5   | KPI_VALUE     | Number      | Value of the measurement for the specific date
6   | UOM           | String      | Unit of Measurement
KPI_ID | KPI_DESCRIPTION | TARGET_DIFFERENCE | TARGET_DIFF_DATE | DEVIATION
       | MOSA            | 1,00              | CURRENT DATE     | 15,00
       | MOSA            | 1,00              | CURRENT DATE     | 15,00
Source view: STG_VIEW.PRODUCT_ITEM_HIST [IE]

Column Name   | Value
SYSTEM        | MOSA
KPI_ID        | 5
GROUP_LEVEL_1 | PRICEPLAN_ID
GROUP_LEVEL_2 | FBS_ID
KPI_DATE      | Date (XLS.BATCH_DATE)
KPI_VALUE     | Count(*)
UOM           | Number
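Under the mapping above, the Atlas-side KPI query might be sketched as follows. Two assumptions are flagged: PRICEPLAN_ID and FBS_ID are taken to map to PRODUCT_ID and SECONDARY_PRODUCT_ID per the lookup descriptions earlier in this section, and the XL2_LOAD_STATUS reference for the batch date is inferred from the XLS alias in the KPI_DATE mapping:

```sql
-- Sketch of KPI 5: row counts per priceplan/FBS combination.
-- Column mappings below are assumptions, as noted in the lead-in.
SELECT 'MOSA'                  AS SYSTEM_NAME,
       5                       AS KPI_ID,
       IE.PRODUCT_ID           AS GROUP_LEVEL_1,   -- priceplan lookup result
       IE.SECONDARY_PRODUCT_ID AS GROUP_LEVEL_2,   -- fbs lookup result
       XLS.BATCH_DATE          AS KPI_DATE,
       COUNT(*)                AS KPI_VALUE,
       'Number'                AS UOM
FROM STG_VIEW.PRODUCT_ITEM_HIST IE
CROSS JOIN STG_VIEW.XL2_LOAD_STATUS XLS
GROUP BY IE.PRODUCT_ID, IE.SECONDARY_PRODUCT_ID, XLS.BATCH_DATE;
```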
Source views:
STG_VIEW.INCENTIVE_EVENT [IE]
STG_VIEW.XL2_LOAD_STATUS [XLS]
STG_VIEW.XL2_LOAD_STATUS_HIST [XLSH]

From XL2_LOAD_STATUS_HIST, take the latest batch data related to the Call centre interface {XLSH.SOURCE = MOSA}.

Column Name   | Value
SYSTEM        | MOSA
KPI_ID        | 6
GROUP_LEVEL_1 | INCENTIVE_RESULT_ID
GROUP_LEVEL_2 | CAMPAIGN_ID
KPI_DATE      | Date (XLS.BATCH_DATE)
KPI_VALUE     | Count(*)
UOM           | Number
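A corresponding sketch for this KPI follows; the BATCH_DATE column on XL2_LOAD_STATUS_HIST and the MAX() interpretation of "latest batch" are assumptions:

```sql
-- Sketch of KPI 6: incentive-event counts per result and campaign,
-- against the latest MOSA batch (latest-batch logic is an assumption).
SELECT 'MOSA'                 AS SYSTEM_NAME,
       6                      AS KPI_ID,
       IE.INCENTIVE_RESULT_ID AS GROUP_LEVEL_1,
       IE.CAMPAIGN_ID         AS GROUP_LEVEL_2,
       XLSH.BATCH_DATE        AS KPI_DATE,
       COUNT(*)               AS KPI_VALUE,
       'Number'               AS UOM
FROM STG_VIEW.INCENTIVE_EVENT IE
CROSS JOIN (
    SELECT MAX(BATCH_DATE) AS BATCH_DATE
    FROM STG_VIEW.XL2_LOAD_STATUS_HIST
    WHERE SOURCE = 'MOSA'
) XLSH
GROUP BY IE.INCENTIVE_RESULT_ID, IE.CAMPAIGN_ID, XLSH.BATCH_DATE;
```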
8. Appendix