
Daily Backup, Recovery and Restoration Process Design

Table of Contents

  • 1 Introduction

    • 1.1 Background

    • 1.2 Overview

  • 2 Backup and archive process flow

    • 2.1 Process flow diagram

    • 2.2 System flow chart

      • 2.2.1 Backup process

      • 2.2.2 Archive Process

    • 2.3 Pre-job requirements

      • 2.3.1 Backup process prerequisites

      • 2.3.2 Archive process prerequisites

  • 3 Restoration process

    • 3.1 Checkpoints for Restoration process

      • 3.1.1 HUB/HUB_REM/IFAI/HMI: restoring data in Reprocessing tables

      • 3.1.2 OHC/CARM/IFAI/HMI/BASEL/External/Parameter/HUB_REM/XXXXX.Net: Restoring process

  • 4 Recovery process

    • 4.1 HUB

    • 4.2 CARM

    • 4.3 OHC

    • 4.4 BASEL

    • 4.5 External

    • 4.6 XXXXXnet

    • 4.7 Parameter Files

    • 4.8 HMI, HUB_REM, XXXXXnet

    Table of Figures

    Figure 1: Process flow diagram for backup and archive process
    Figure 2: System flow chart for backup process

    Table of Tables

    Table 1: Jobs not to be run during Recovery process

    • 1 Introduction

    This document provides an overview of the design related to Change Data Capture (CDC) Daily Data Backup, Storage and Restoration for all the lookup and reprocessing tables under One XXXXX Business Intelligence (OCBB) Integrated Customer Relationship Analysis (ICRA).

    • 1.1 Background

    As per the functional requirement FR5.4, a process to back up and restore the reprocessing and/or lookup tables needs to be designed and implemented so that the OCBB environment can be restored to normal processing easily whenever data issues arise.

    The objective of the design is to back up all the source tables used in the Daily Cycle into a backup folder and to restore them on business request.

    Daily cycle data is stored for 32 days.

    This document is specific to daily backup and restoration process.

    • 1.2 Overview

    The scope of this design document is to meet the functional requirement FR5.4 from the Global Analytics Detailed Requirements document. The following are the details of the requirement:

    1) Retain copies of the Source System and CDC extracts as follows: Daily Load data should be retained for 32 days (if the retention parameter is raised, the period can be defined by the user).

    2) The OCBB environment and all current data should be available within 24 hours of a system or process failure.

    3) The system should also support purging of data based on the aforementioned data retention parameters, or additional parameters established at the local or regional level.

    This backup and restore process exists mainly to re-run the cycles from the required date. The resulting data files are intended for the Support Team, not for business users.

    This activity is not applicable to restoring the initial cycle data. If there is any issue with the initial cycle data, the cycle has to be restarted from the beginning.

    HUB data must also be restored whenever any other Source System needs to be recovered, because customer and product data are derived from the HUB system.

    • 2 Backup and archive process flow

    This topic illustrates a high-level process flow diagram for the job design of the CDC daily data backup and restore process.

    • 2.1 Process flow diagram

    The following figure illustrates the process flow diagram for backup and archive process:


    Figure 1: Process flow diagram for backup and archive process

    The Backup and Archive process is implemented with the help of Unix scripts.

    • 2.2 System flow chart

    The following figure illustrates the system flow chart diagram for backup and archive process:


    Figure 2: System flow chart for backup process

    • 2.2.1 Backup process

    HUB: The data feed to the HUB process comes through Attunity. In this process, the source data is loaded from Attunity into DB2 Staging tables for daily processing into the corresponding tables. Thereafter, the script takes a backup of the corresponding DB2 tables' data into flat files and stores them with the 'Process date' for an easy restore process.

    OHC (One XXXXX Cards): The data feed to the OHC process comes through the Mainframe. A daily backup of the corresponding input source file (*.txt) is taken and stored with the 'Process date' for an easy restore process.

    CARM (Credit Approval and Risk Management System): The data feed for CARM comes in XML files through the Message Queue (MQ) process. For the Daily Cycle, the script archives all the XML files with the 'Process date'.

    GPS: The script takes a backup of the corresponding DB2 table's data into flat files and stores it with the 'Process date' for an easy restore process.

    HMI (XXXXX Management Information System): The data feed for HMI comes through files used for the CDC process. In this process, the source data is loaded from DataStage CDC into DB2 Staging tables for daily processing into the corresponding tables. Thereafter, the script takes a backup of the corresponding input files and stores them with the 'Process date' for an easy restore process.

    HUB REM (HUB REM Journal): The data feed to the HUB REM process comes through DataStage CDC. In this process, the source data is loaded from DataStage CDC into DB2 Staging tables for daily processing into the corresponding tables. Thereafter, the script takes a backup of the corresponding DB2 table's data into flat files and stores it with the 'Process date' for an easy restore process.

    XXXXX Net: The data feed to the XXXXX.Net process comes through the Mainframe. Therefore, for the Daily Cycle, the script archives the corresponding input source files with the 'Process date' for an easy restore process.

    IFAI (Invoice Finance Application International): The data feed for IFAI comes through tables based on Manual CDC, which was newly introduced as part of the CMB versions. Therefore, the script takes a backup of the corresponding DB2 table's data into flat files and stores it with the 'Process date' for an easy restore process.

    BASEL: The data feed for BASEL comes through files, and it is processed through the BASEL source jobs. All the source file (.txt) data is loaded into the interface table for further processing. The script archives the corresponding input source files with the 'Process date' for an easy restore process.

    External System: The data feed to this process comes through the External Source System. Therefore, for the Daily Cycle, the script archives the corresponding input source files with the 'Process date' for an easy restore process.

    Parameter Files: The data feed for Parameter Files comes through files (.csv), and it is processed through the Parameter Source jobs. Dimension definitions are fed by the user through parameter files into the dimension tables. A simple example is Customer Segment, which is a grouping of the Customer Classification Code from the Source System. These definitions are essential components of fact view key columns and of report drilling (up/down). A script archives the corresponding input source files with the 'Process date' for an easy restore process.

    Process date: The Process date is the load date (Parm Date) of the cycle.
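
    The following is a minimal sketch of the per-table backup step described above, assuming a hypothetical table name (HUB_CUSTOMER) and environment variables (DBNAME, STAGING_SCHEMA, PRM_DATE); the actual scripts drive this from their table lists and parameter files:

    #!/bin/sh
    # Export one staging table into a delimited flat file stamped with the
    # process date; HUB_CUSTOMER and the variables below are assumptions.
    db2 connect to "$DBNAME" >/dev/null || exit 1
    db2 "EXPORT TO ${prmDirLandingBkp}/HUB_CUSTOMER.${PRM_DATE}.del OF DEL
         MESSAGES ${prmDirLandingBkp}/HUB_CUSTOMER.${PRM_DATE}.msgs
         SELECT * FROM ${STAGING_SCHEMA}.HUB_CUSTOMER" || exit 1
    db2 terminate >/dev/null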

    • 2.2.2 Archive Process

    This topic describes the various types of archiving processes.

    • 2.2.2.1 HUB REM Data Archiving

    Refer to the following points related to HUB REM Data Archiving process:

    1) Retrieves all the files with the HUB_REM*.del extension at the backup landing path ($prmDirLandingBkp)

    2) Combines all the HUB REM files into a single file named OCBB_Daily_HUB_REM_del.<Process date>.tar using the Unix tar command, saves it at the path $prmDirdatabkp (parameter from parameter file), and compresses the tar file using the Unix gzip command (OCBB_Daily_HUB_REM_del.<Process date>.tar.gz)

    3) Removes all the HUB REM files from the original location

    4) If the Archiving process is not successful, the script writes an error message to the log file, OCBB_Archive_script_log_Daily.<Process date>.log, at the path <script Directory>/<country code>.
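
    The steps above follow a common pattern, which is sketched below for illustration. Only $prmDirLandingBkp and $prmDirdatabkp come from the parameter file; PRM_DATE, SCRIPT_DIR, and CTY are assumed names:

    #!/bin/sh
    # Archive the HUB REM export files for one process date, then remove
    # the originals only if the archive succeeded.
    cd "$prmDirLandingBkp" || exit 1
    TARFILE="OCBB_Daily_HUB_REM_del.${PRM_DATE}.tar"
    if tar -cf "${prmDirdatabkp}/${TARFILE}" HUB_REM*.del &&
       gzip "${prmDirdatabkp}/${TARFILE}"
    then
        rm -f HUB_REM*.del
    else
        echo "HUB_REM archiving failed for ${PRM_DATE}" >> \
            "${SCRIPT_DIR}/${CTY}/OCBB_Archive_script_log_Daily.${PRM_DATE}.log"
        exit 1
    fi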

    • 2.2.2.2 HUB Reprocessing Tables (Export data files) Archiving

    Refer to the following points for HUB Reprocessing Tables Archiving process:

    1) Retrieves all the files with the HUB*.del extension at the backup landing path ($prmDirLandingBkp)

    2) Combines all the HUB*.del files into a single file named OCBB_Daily_HUB_del.<Process date>.tar using the Unix tar command and saves it at the path $prmDirdatabkp (parameter from parameter file)

    3) Compresses the tar file using the Unix gzip command (OCBB_Daily_HUB_del.<Process date>.tar.gz)

    4) Removes all the HUB*.del files from the original location

    5) If the Archiving process is not successful, the script writes an error message to the log file, OCBB_Archive_script_log_Daily.<Process date>.log, at the path <script Directory>/<country code>.

    • 2.2.2.3 IFAI Reprocessing Tables (Export data files) Archiving

    Refer to the following points related to IFAI Reprocessing Tables Archiving process:

    1) Retrieves all the files with the IFAI*.del extension at the backup landing path ($prmDirLandingBkp)

    2) Combines all the IFAI*.del files into a single file named OCBB_Daily_IFAI_del.<Process date>.tar using the Unix tar command and saves it at the path $prmDirdatabkp (parameter from parameter file)

    3) Compresses the tar file using the Unix gzip command (OCBB_Daily_IFAI_del.<Process date>.tar.gz)

    4) Removes all the IFAI*.del files from the original location

    5) If the Archiving process is not successful, the script writes an error message to the log file, OCBB_Archive_script_log_Daily.<Process date>.log, at the path <script Directory>/<country code>.

    • 2.2.2.4 OHC Data Archiving

    Refer to the following points related to OHC Data Archiving process:

    1) Retrieves all the OHC files (from OHC_filelist) from the landing directory to the backup landing directory

    2) Combines all the OHC files into a single file named OCBB_Daily_OHC.<Process date>.tar using the Unix tar command, saves it at the path $prmDirdatabkp (parameter from parameter file), and compresses the tar file using the Unix gzip command (OCBB_Daily_OHC.<Process date>.tar.gz)

    3) Validates the Process date from the tar file name against the parameter file

    4) Removes all the OHC files from the original location

    5) If the Archiving process is not successful, the script writes an error message to the log file, OCBB_Archive_script_log_Daily.<Process date>.log, at the path <script Directory>/<country code>.

    • 2.2.2.5 CARM Files Archiving

    Refer to the following points related to CARM Files Archiving process:

    1) Retrieves all the CARM files (from CARM_filelist) from $prmDirXml (parameter from parameter file) to the backup landing directory path

    2) Combines all the CARM files into a single file named OCBB_Daily_CARM.<Process date>.tar using the Unix tar command and saves it at the path $prmDirdatabkp (parameter from parameter file)

    3) Compresses the tar file using the Unix gzip command (OCBB_Daily_CARM.<Process date>.tar.gz)

    4) Removes all the CARM files from the original location

    5) If the Archiving process is not successful, the script writes an error message to the log file, OCBB_Archive_script_log_Daily.<Process date>.log, at the path <script Directory>/<country code>.

    • 2.2.2.6 HMI Data Files Archiving

    Refer to the following points related to HMI Data Files Archiving process:

    1) Retrieves all the HMI files (from HMI_filelist) from the landing directory to the backup landing directory path

    2) Combines all the HMI files into a single file named OCBB_Daily_HMI.<Process date>.tar using the Unix tar command and saves it at the path $prmDirdatabkp (parameter from parameter file)

    3) Compresses the tar file using the Unix gzip command (OCBB_Daily_HMI.<Process date>.tar.gz)

    4) Removes all the HMI files from the original location

    5) If the Archiving process is not successful, the script writes an error message to the log file, OCBB_Archive_script_log_Daily.<Process date>.log, at the path <script Directory>/<country code>.

    • 2.2.2.7 XXXXX Net Data Archiving

    Refer to the following points related to XXXXX Net Data Archiving process:

    1) Retrieves all the XXXXX Net files (from XXXXX_Net_filelist) from the landing directory to the backup landing directory

    2) Combines all the XXXXX Net files into a single file named OCBB_Daily_XXXXX_Net.<Process date>.tar using the Unix tar command and saves it at the path $prmDirdatabkp (parameter from parameter file)

    3) Compresses the tar file using the Unix gzip command (OCBB_Daily_XXXXX_Net.<Process date>.tar.gz)

    4) Removes all the XXXXX Net files from the original location

    5) If the Archiving process is not successful, the script writes an error message to the log file, OCBB_Archive_script_log_Daily.<Process date>.log, at the path <script Directory>/<country code>.

    • 2.2.2.8 IFAI Data Archiving

    Refer to the following points related to IFAI Data Archiving process:

    1) Retrieves all the IFAI files (from IFAI_filelist) from the landing directory to the landing backup directory

    2) Combines all the IFAI files into a single file named OCBB_Daily_IFAI.<Process date>.tar using the Unix tar command and saves it at the path $prmDirdatabkp (parameter from parameter file)

    3) Compresses the tar file using the Unix gzip command (OCBB_Daily_IFAI.<Process date>.tar.gz)

    4) Removes all the IFAI files from the original location

    5) If the Archiving process is not successful, the script writes an error message to the log file, OCBB_Archive_script_log_Daily.<Process date>.log, at the path <script Directory>/<country code>.

    • 2.2.2.9 BASEL Data Archiving

    Refer to the following points related to BASEL Data Archiving process:

    1) Retrieves all the BASEL files (from Basel_filelist) from the landing directory to the landing backup directory

    2) Combines all the BASEL files into a single file named OCBB_Daily_BASEL.<Process date>.tar using the Unix tar command and saves it at the path $prmDirdatabkp (parameter from parameter file)

    3) Compresses the tar file using the Unix gzip command (OCBB_Daily_BASEL.<Process date>.tar.gz)

    4) Removes all the BASEL files from the original location

    5) If the Archiving process is not successful, the script writes an error message to the log file, OCBB_Archive_script_log_Daily.<Process date>.log, at the path <script Directory>/<country code>.

    • 2.2.2.10 External Systems Data Archiving

    Refer to the following points related to External Systems Data Archiving process:

    1) Retrieves all the External files (from External_filelist) from the landing directory to the landing backup directory

    2) Combines all the External System files into a single file named OCBB_Daily_External.<Process date>.tar using the Unix tar command and saves it at the path $prmDirdatabkp (parameter from parameter file)

    3) Compresses the tar file using the Unix gzip command (OCBB_Daily_External.<Process date>.tar.gz)

    4) Removes all the External System files from the original location

    5) If the Archiving process is not successful, the script aborts the process and writes an error message to the log file, OCBB_Archive_script_log_Daily.<Process date>.log, at the path <script Directory>/<country code>.

    • 2.2.2.11 Parameter Data Archiving

    Refer to the following points related to Parameter Data Archiving process:

    1) Retrieves all the Parameter files (from Parameter_filelist) from the landing directory to the landing backup directory

    2) Combines all the Parameter files into a single file named OCBB_Daily_Parameter.<Process date>.tar using the Unix tar command, saves it at the path $prmDirdatabkp (parameter from parameter file), and compresses the tar file using the Unix gzip command (OCBB_Daily_Parameter.<Process date>.tar.gz)

    3) Removes all the Parameter files from the original location

    4) If the Archiving process is not successful, the script aborts the process and writes an error message to the log file, OCBB_Archive_script_log_Daily.<Process date>.log, at the path <script Directory>/<country code>.

    • 2.2.2.12 *.msgs Files (Reprocessing table backup data message files) Archiving

    Refer to the following points related to *.msgs Files Archiving process:

    1) Retrieves all the files with the *.msgs extension at the backup landing path ($prmDirLandingBkp)

    2) Combines all the *.msgs files into a single file named OCBB_Daily_del_msgs.<Process date>.tar using the Unix tar command and saves it at the path $prmDirdatabkp (parameter from parameter file)

    3) Compresses the tar file using the Unix gzip command (OCBB_Daily_del_msgs.<Process date>.tar.gz)

    4) Removes all the *.msgs files from the original location.

    • 2.2.2.13 Log Files Archiving

    Refer to the following points related to Log Files Archiving process:

    1) Retrieves all the log files with the .log extension present in the following paths:

    • a) $prmDirlogavl (parameter from parameter file), for example, /data/ds/dpr_gacpt_dev/joblogs/bn

    • b) <Script Directory>/<country code>

    2) Combines all the log files into a single file named OCBB_Daily_log.<Process date>.log.tar using the Unix tar command and saves it at the path $prmDirlogbkp (parameter from parameter file), for example, /data/ds/dpr_gacpt_dev/scripts/bn/bkp/logs

    3) Compresses the tar file using the Unix gzip command (OCBB_Daily_log.<Process date>.log.tar.gz)

    4) Removes all the log files from the original location

    5) If the Archiving process is not successful, the script aborts the process and writes an error message to the log file, OCBB_Archive_script_log_Daily.<Process date>.log, at the path <script Directory>/<country code>.

    • 2.2.2.14 List of Archive files

    At the end of backup and archive process, a script generates the following list of Archive files:

    $prmDirdatabkp/OCBB_Daily_HUB_REM_del.<Process date>.tar.gz

    $prmDirdatabkp/OCBB_Daily_HUB_del.<Process date>.tar.gz

    $prmDirdatabkp/OCBB_Daily_IFAI_del.<Process date>.tar.gz

    $prmDirdatabkp/OCBB_Daily_CARM.<Process date>.tar.gz

    $prmDirdatabkp/OCBB_Daily_OHC.<Process date>.tar.gz

    $prmDirdatabkp/OCBB_Daily_External.<Process date>.tar.gz

    $prmDirdatabkp/OCBB_Daily_HMI.<Process date>.tar.gz

    $prmDirdatabkp/OCBB_Daily_HUB_REM.<Process date>.tar.gz

    $prmDirdatabkp/OCBB_Daily_XXXXX_Net.<Process date>.tar.gz

    $prmDirdatabkp/OCBB_Daily_IFAI.<Process date>.tar.gz

    $prmDirdatabkp/OCBB_Daily_BASEL.<Process date>.tar.gz

    $prmDirdatabkp/OCBB_Daily_Parameter.<Process date>.tar.gz

    $prmDirdatabkp/OCBB_Daily_del_msgs.<Process date>.tar.gz

    $prmDirlogbkp/OCBB_Daily_log.<Process date>.log.tar.gz

    The script then performs the following functions:

    1) Checks whether any of the aforementioned archived gzip files are older than the Retention Period

    2) Removes all the gzip files that are older than the Retention Period (32 days)

    3) If the Archiving process is not successful, the script aborts the process and writes an error message to the log file, OCBB_Archive_script_log_Daily.<Process date>.log, at the path <landing path>/backup/<country code>/logs.

    The Process date is the load date of the cycle.
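
    A minimal sketch of that retention sweep, assuming the 32-day default and the two backup directories named above ($prmDirdatabkp and $prmDirlogbkp):

    #!/bin/sh
    # Remove archived gzip files older than the retention period.
    RETENTION_DAYS=${RETENTION_DAYS:-32}
    find "$prmDirdatabkp" "$prmDirlogbkp" -name 'OCBB_Daily_*.gz' \
         -mtime +"$RETENTION_DAYS" -exec rm -f {} \;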

    • 2.3 Pre-job requirements

    Refer to the following prerequisites before the backup and archive scripts run.

    • 2.3.1 Backup process prerequisites

    The CDC jobs that populate the daily data should have completed successfully so that the backup of the Staging (pSeries) tables from the previous cycles can be taken.

    • 2.3.2 Archive process prerequisites

    Refer to the following prerequisites for the archive process:

    1) djpHewr2XFMCARM_MQ_XML, which generates the XML files, should be completed so that the XML files can be backed up

    2) All the scripts should be completed in order to archive the log files

    3) OCBB_CDC_BKP_Process.sh should be completed and the .del files should be generated successfully

    4) OHC data and file (OHC_filelist) should be available

    5) HMI file (HMI_filelist) should be available for Daily target table population

    6) XXXXX Net file (XXXXX_Net_filelist) should be available

    7) BASEL file (Basel_filelist) should be available

    8) External data file (External_filelist) should be available

    9) HUB_REM file (HUB_REM_filelist) should be available

    10) IFAI file (IFAI_filelist) should be available

    11) Parameter file (Parameter_filelist) should be available

    12) CARM file (CARM_filelist) should be available

    13) In case any of the files are not available, please create dummy files before the script execution (see the sketch after this list).
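
    A sketch of the dummy-file safeguard from prerequisite 13, assuming the filelists live in the landing directory and list one file name per line:

    #!/bin/sh
    # For every name in a filelist that is missing from the landing
    # directory, create an empty placeholder so the archive script can run.
    while read -r f; do
        [ -f "${prmDirLanding}/${f}" ] || touch "${prmDirLanding}/${f}"
    done < "${prmDirLanding}/OHC_filelist"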

    • 3 Restoration process

    Please refer to the following steps for implementing the restoration process:

    1) Execute the script OCBB_GA_SP_EXPORT.sh before the Restore process for a particular day

    Script name: <Script_path>/<country_cde>/OCBB_GA_SP_EXPORT.sh

    Command line:

    sh OCBB_GA_SP_EXPORT.sh <Script_path> <parameter file name>

     

    2) Run the script OCBB_Archive_Process.sh to take a backup of all the source files that landed on Cycle Hold Day 1

    Script name: <Script_path>/<country_cde>/OCBB_Archive_Process.sh

    Command line:

    sh OCBB_Archive_Process.sh <Script_path> <parameter file name>

    For example:

    sh OCBB_Archive_Process.sh /data/ds/dpr_bir2_dev/scripts bn bir2_wh.param

    This script should be run every day to take a backup of the source files that fall under the Cycle Hold Days, and it should be run with the proper prmDate and landing directories.

    To update prmDate and the landing directories, the following script should be run before the aforementioned script:

    Script name: <Script_path>/<country_cde>/OCBB_Recovery_PrmFileUpdate.sh

    Command line:

    sh OCBB_Recovery_PrmFileUpdate.sh <Script path> <parameter file path> bir2_wh.param InputFile_OCBB_Recovery_PrmFileUpdate o

    You should use the letter 'O' at the end of the command line. 'O' indicates the Original (Regular Daily Cycle) process.

    To run the cycles that are on hold, the parameter files should be updated with the proper prmDate and the recovery directories, as all cycles that are on hold should be run from the recovery process directories.

    For example:

    • a) Assume data from 1st May 2010 to 5th May 2010 was corrupted or bad data was loaded; the data therefore needs to be recovered through the recovery process.

    • b) Assume 3 man-days were consumed to recover the 1st May 2010 to 5th May 2010 data, so the cycle was on hold for 3 days, that is, up to 8th May 2010. Therefore, the 3 days of regular Daily cycle that are on hold should be run from the Recovery process directory (/data/ds/dpr_bir2_dev/landing/recovery). The regular Daily Control-M draft can be used to run the three days of data mentioned here.

    • c) Assume one and a half days were consumed to run those 3 days of data (point b), that is, the cycle is held up to 10th May 2010 noon.

    • d) Hence, one more cycle of data should be run from the Recovery process directories (/data/ds/dpr_bir2_dev/landing/recovery). The regular Daily Control-M draft can be used to run the one day of data mentioned here.

    • e) From 10th May 2010 noon onwards, the actual Regular Daily Cycle can be run from the original landing directory (/data/ds/dpr_bir2_dev/landing).

    3) The OCBB_Recovery_nickname.ksh script should be run before the recovery process. DBA privileges are required to run this script.

    Script name: <Script_path>/<country_cde>/OCBB_Recovery_nickname.ksh

    Command line:

    sh OCBB_Recovery_nickname.ksh

    (<DBNAME>, <STAGING_COUNTRY_SCHEMA>, <server_name>, <reg>, and <cty> should be changed before running the script)

    This script should be run only on Recovery process Day 1.

    For example: Assuming we need to recover 5 days of data, this script needs to be executed on Day 1 only and not on the remaining days.

    4) The OCBB_Recovery_stg_load.ksh script should be run before the recovery process. DBA privileges are required to run this script, and OCBB_Recovery_nickname.ksh should be completed before running it.

    Script name: <Script_path>/<country_cde>/OCBB_Recovery_stg_load.ksh

    Command line:

    sh OCBB_Recovery_stg_load.ksh

    (<dbname>, <staging_country_schema>, and '<country_code>' should be changed before running the script)

    The OCBB_Recovery_stg_load.ksh script populates the iSeries data into the pSeries Staging tables (Lookup tables). This script should be run only on Recovery process Day 1.

    For example: Assuming we need to recover five days of data, this script needs to be executed on Day 1 only and not on the remaining days.

    5) Update the following parameters and add the parameter file names to InputFile_OCBB_Recovery_PrmFileUpdate. This file is available at $prmDirLandingBkp.

    • a) Initial_load_dt.param
    • b) bir2_cmb_int.param
    • c) bir2_code.param
    • d) bir2_conversion.param
    • e) bir2_indv.param
    • f) bir2_initial_src_int.param
    • g) bir2_initial_src_int_sitehubfp.param
    • h) bir2_mq.param
    • i) bir2_src_app.param
    • j) bir2_src_inter.param
    • k) bir2_wh.param
    • l) src_hub_date_control.param

    Where:

    prmDirLanding = /data/ds/dpr_gacpt_dev/landing/bkp
    prmDirXml     = /data/ds/dpr_gacpt_dev/landing/bkp
    prmDate       = YYYY-MM-DD
    startdt       = YYYY-MM-DD
    enddt         = YYYY-MM-DD

    After updating the inputs, please validate the aforementioned file.

    Script name: <Script_path>/<country_cde>/OCBB_Recovery_PrmFileUpdate.sh

    The OCBB_Recovery_PrmFileUpdate.sh script updates all the parameter files that are listed in InputFile_OCBB_Recovery_PrmFileUpdate. It should be run before the Recovery process to update the parameter files:

    Command line:

    sh OCBB_Recovery_PrmFileUpdate.sh <Script path> <parameter file path> bir2_wh.param InputFile_OCBB_Recovery_PrmFileUpdate r

    You should use the letter 'R' at the end of the command line. 'R' indicates the Recovery process.

    The following script should be run at the end of the Recovery process to delete all the files (Source files, .del files [Source table backup data], and Temporary files) that were restored from the backup files:

    Command line:

    sh OCBB_Recovery_PrmFileUpdate.sh <Script path> <parameter file path> bir2_wh.param InputFile_OCBB_Recovery_PrmFileUpdate d

    You should use the letter 'D' at the end of the command line to delete the files that were recovered on a particular day. It is recommended to delete the files every day at the end of the recovery cycle. 'D' indicates Delete.

    The following script should be run at the end of the Recovery process to update the parameter files to resume the Regular Daily Run:

    Command line:

    sh OCBB_Recovery_PrmFileUpdate.sh <Script path> <parameter file path> bir2_wh.param InputFile_OCBB_Recovery_PrmFileUpdate o

    You should use the letter 'O' at the end of the command line. 'O' indicates the Original (Regular Daily Cycle) process.
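
    The internals of OCBB_Recovery_PrmFileUpdate.sh are not shown in this document. As a rough illustration only, the 'R' mode amounts to rewriting entries such as prmDate and the landing directories in each listed parameter file; all values below are examples:

    #!/bin/sh
    # Illustrative guess at the per-file update; names and values assumed.
    PARAM=bir2_wh.param
    NEW_DATE=2010-05-01
    sed -e "s|^prmDate *=.*|prmDate = ${NEW_DATE}|" \
        -e "s|^prmDirLanding *=.*|prmDirLanding = /data/ds/dpr_bir2_dev/landing/recovery|" \
        "$PARAM" > "${PARAM}.tmp" && mv "${PARAM}.tmp" "$PARAM"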

    6) Untar all or only the required files to get the source files into the backup directory, based on the requirement. This can be done with the following script:

    Script name: <Script_path>/<country_cde>/OCBB_CDC_Untar_Process.sh

    For example:

    sh OCBB_CDC_Untar_Process.sh /data/ds/dpr_gacpt_dev/scripts bn bir2_wh.param ALL/HUB/OHC/CARM/XXXXX_NET/IFAI/HMI/HUB_REM/GPS/EXTERNAL/PARAMETER/BASEL

    7) Manual delete process required:

    • a) Please make sure there is no data file from the previous cycle; it may give wrong results in the target table.

    • b) The following list of Warehouse tables has the write method INSERT/APPEND only. These tables require a manual deletion process to avoid any Primary Key violation:

      • i) HEW_CUST_HIST
      • ii) HEW_ACCT_ARR_HIST
      • iii) HEW_CRED_CARD_ARR_HIST
      • iv) HEW_CRED_CARD_ACCT_DLQ_PRD_HIST
      • v) HEW_CRED_CARD_ACCT_COLL_ACTV_HIST
      • vi) HEW_LOAN_ARR_HIST
      • vii) HEW_SEC_ARR_HIST
      • viii) HEW_DEPST_ARR_HIST
      • ix) HEW_MEMO_ARR_HIST
      • x) HEW_RELN_MGR_HIST

    This can be achieved with the help of the following script; please verify the list of tables before executing it:

    Script name: <Script_path>/<country_cde>/OCBB_CDC_InsertMode_tables.sh

    For example:

    sh OCBB_CDC_InsertMode_tables.sh /data/ds/dpr_gacpt_dev/scripts bn bir2_wh.param InputFile_OCBB_CDC_InsertMode_tables

    This script should be run only on Recovery process Day 1.

    For example: Assuming we need to recover 5 days of data, this script needs to be executed on Day 1 only and not on the remaining days.
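
    As a hedged sketch of the manual-delete idea behind this step: for each INSERT/APPEND-only history table, the recovery date range is purged so that re-running the cycles cannot violate primary keys. The schema name, the date variables, and the presence of a LOAD_DT column on the history tables are assumptions here:

    #!/bin/sh
    # Purge the recovery date range from one history table (assumed columns).
    db2 connect to "$DBNAME" >/dev/null || exit 1
    db2 "DELETE FROM ${WH_SCHEMA}.HEW_CUST_HIST
         WHERE LOAD_DT BETWEEN '${START_DT}' AND '${END_DT}'" || exit 1
    db2 terminate >/dev/null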

    8) Clean up all the Reprocessing Source tables where REJECT_IND is set to N and whose LOAD_DT falls within the recovery date range. This should be done for all the Source Systems (HUB/HUB_REM/HMI/IFAI) or only for the required Source System, based on the system(s) to be recovered. This can be done with the help of the following script:

    Script name: <script_path>/<country_cde>/OCBB_CDC_Rep_Table_RejectRecord_Del.sh

    For example:

    sh OCBB_CDC_Rep_Table_RejectRecord_Del.sh /data/ds/dpr_gacpt_dev/scripts bn bir2_wh.param InputFile_OCBB_CDC_Rep_Table_RejectRecord_Del ALL/HUB/HUB_REM/HMI/IFAI

    This script should be run only on Recovery process Day 1.

    For example: Assuming we need to recover 5 days of data, this script needs to be executed on Day 1 only and not on the remaining days.

    9) Once the aforementioned points are taken care of, start importing the *.del files into the respective tables based on the requirement. This can be done with the help of the following script:

    Script name: <script_path>/<country_cde>/OCBB_CDC_Recovery_Process.sh

    For example:

    sh OCBB_CDC_Recovery_Process.sh /data/ds/dpr_gacpt_dev/scripts bn bir2_wh.param InputFile_OCBB_CDC_BKP_Process ALL/HUB/HUB_REM/IFAI
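
    For one restored .del file, the import could look like the following sketch; the real OCBB_CDC_Recovery_Process.sh loops over the table list in InputFile_OCBB_CDC_BKP_Process, and the table name here is an assumption:

    #!/bin/sh
    # Import one delimited backup file back into its reprocessing table.
    db2 connect to "$DBNAME" >/dev/null || exit 1
    db2 "IMPORT FROM ${prmDirLandingBkp}/HUB_CUSTOMER.${PRM_DATE}.del OF DEL
         MESSAGES ${prmDirLandingBkp}/HUB_CUSTOMER.${PRM_DATE}.import.msgs
         INSERT INTO ${STAGING_SCHEMA}.HUB_CUSTOMER" || exit 1
    db2 terminate >/dev/null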

    10) Make sure you have copied all the required data from the Source System Day 1 cycle and placed it into the recovery directory, $prmDirdatabkp (/data/ds/dpr_gacpt_dev/landing/bkp).

    • a) The Restore process depends on the impact; validate and process based on the current situation.

    • b) Further, based on the affected tables and the Source System to be recovered, first restore the corresponding Source System files from the backup directory and run all the required source-to-interface jobs to populate Day 1 worth of data, with the proper LOAD_DT, into the corresponding Interface tables.

    11) Validate the required Source System data: the OHC, CARM, IFAI, HMI and XXXXX.net Source System Day 1 input files and their locations.

    12) You can run multiple days of data, but the source-to-Interface-table jobs must be executed with one day of data at a time to get the correct LOAD_DT for all the records.

    13) After successful execution of the Source to Interface job with a single day of data, you need to run OCBB_Recovery_PrmFileUpdate.sh.

    This script should be run at the end of the Recovery process to delete all the files (Source files, .del files [Source table backup data], and Temporary files) that were restored from the backup files.

    Command line:

    sh OCBB_Recovery_PrmFileUpdate.sh <Script path> <parameter file path> bir2_wh.param InputFile_OCBB_Recovery_PrmFileUpdate d

    You should use the letter 'D' at the end of the command line to delete the files that were recovered on a particular day. It is recommended to delete the files every day at the end of the recovery cycle. 'D' indicates Delete.

    14) After successful execution of the source to interface job with a single day of data, run the CDC Backup/Archive script process manually and take that particular day's data backup

    15) Repeat the same process for the remaining days

    The following scripts should be run only on Day 1; please do not run these scripts on the remaining days:

    • a) OCBB_Recovery_nickname.ksh
    • b) OCBB_Recovery_stg_load.ksh
    • c) OCBB_CDC_InsertMode_tables.sh
    • d) OCBB_CDC_Rep_Table_RejectRecord_Del.sh

    16) Please do not run the following jobs while the Recovery process is in progress:

    Job names

    Group No. 1

    djpHewr2XFM_CDC_PRODUCT_UPS

    djpHewr2XFM_CDC_INVOLVED_PARTY_UPS_1

    djpHewr2XFM_CDC_ACCOUNTING_UNIT_UPS_1

    djpHewr2XFM_CDC_CHG_TABLES

    djpHewr2XFM_CDC_LOCATION_UPS

    djpHewr2XFM_CDC_ARR_CHG_TABLES_1

    djpHewr2XFM_CDC_ARRANGEMENT_UPS_1

    djpHewr2XFM_CDC_ARRANGEMENT_UPS_2

    djpHewr2XFM_CDC_ARRANGEMENT_UPS3

    djpHewr2XFM_CDC_EVENT_UPS

    djpHewr2XFM_CDC_ARR_CHG_TABLES_2

    djpHewr2XFM_CDC_ARR_CHG_TABLES_3

    djpHewr2XFM_CDC_ARR_CHG_TABLES_4

    djpHewr2XFM_CDC_INVOLVED_PARTY_UPS_2

    djpHewr2XFM_CDC_INVOLVED_PARTY_UPS_4


    djpHewr2XFM_CDC_ACCOUNTING_UNIT_UPS_2

    djpHewr2XFM_CDC_ARRANGEMENT_UPS_3

    djpHewr2XFM_CDC_ARRANGEMENT_UPS_4

    djpHewr2XFM_CDC_ARRANGEMENT_UPS_5

    djpHewr2XFM_CDC_ARRANGEMENT_UPS_6

    djpHewr2XFM_CDC_ARRANGEMENT_UPS_7

    djpHewr2XFM_CDC_ARRANGEMENT_UPS_8

    djpHewr2XFM_CDC_ARRANGEMENT_UPS_9

    djpHewr2XFM_CDC_ARRANGEMENT_UPS_10

    djpHewr2XFM_CDC_ARRANGEMENT_UPS_11

    djpHewr2XFM_CDC_ARRANGEMENT_UPS_16

    djpHewr2XFM_CDC_ARRANGEMENT_UPS_17

    djpHewr2XFM_CDC_ARR_CHG_TABLES_7

    djpHewr2XFM_CDC_ARR_CHG_TABLES_8

    djpHewr2XFM_CDC_EVENT_ODBC_HUBODS

    djpHewr2XFM_CDC_IP_ODBC_HUBODS

    djpHewr2XFM_CDC_EVENT_UPS_SITEHUBFP

    djpHewr2XFM_CDC_ARRANGEMENT_UPS_10_SITEHUBFP

    djpHewr2XFM_CDC_ARRANGEMENT_UPS3_ATMFP

    djpHewr2XFM_CDC_IFAI_ARRANGEMENT_UPS

    djpHewr2XFM_CDC_ARRANGEMENT_UPS_18

    djpHewr2XFM_CDC_INVOLVED_PARTY_UPS_3

    djpHewr2XFM_CDC_PCM_CHG

    djpHewr2XFM_CDC_PCM_UPS

    Group No. 3

    djpHewr2XFM_VIEW_IP

    djpHewr2XFM_VIEW_EVENT

    djpHewr2XFM_VIEW_ARR

    djpHewr2XFM_CHG_HIE

    djpHewr2XFM_CDC_REMIT_ODBC

    djpHewr2XFM_CDC_PRODUCT_ODBC


    djpHewr2XFM_CDC_LOCATION_ODBC

    djpHewr2XFM_CDC_IP_ODBC

    djpHewr2XFM_CDC_IFAI_ARRANGEMENT_CHG

    djpHewr2XFM_CDC_EVENT_ODBC_SITEHUBFP

    djpHewr2XFM_CDC_EVENT_ODBC

    djpHewr2XFM_CDC_AU_ODBC

    djpHewr2XFM_CDC_ARR_ODBC_SITEHUBFP

    djpHewr2XFM_CDC_ARR_ODBC_ATMFP

    djpHewr2XFM_CDC_ARR_ODBC

    Group No. 5

    djpHewr2XFM_CDC_EVENT_DEL

    djpHewr2XFM_CDC_ARRANGEMENT_DEL3

    djpHewr2XFM_CDC_ARRANGEMENT_DEL_17

    djpHewr2XFM_CDC_ARRANGEMENT_DEL_16

    djpHewr2XFM_CDC_ARRANGEMENT_DEL_15

    djpHewr2XFM_CDC_ARRANGEMENT_DEL_14

    djpHewr2XFM_CDC_ARRANGEMENT_DEL_13

    djpHewr2XFM_CDC_ARRANGEMENT_DEL_5

    djpHewr2XFM_CDC_ARRANGEMENT_DEL_4

    djpHewr2XFM_CDC_ARRANGEMENT_DEL_12

    djpHewr2XFM_CDC_ARRANGEMENT_DEL_11

    djpHewr2XFM_CDC_ARRANGEMENT_DEL_3

    djpHewr2XFM_CDC_ARRANGEMENT_DEL_2

    djpHewr2XFM_CDC_ARRANGEMENT_DEL_7

    djpHewr2XFM_CDC_ARRANGEMENT_DEL_6

    djpHewr2XFM_CDC_ACCOUNTING_UNIT_DEL

    djpHewr2XFM_CDC_ARRANGEMENT_DEL_1

    djpHewr2XFM_CDC_LOCATION_DEL

    djpHewr2XFM_CDC_INVOLVED_PARTY_DEL_2

    djpHewr2XFM_CDC_PRODUCT_DEL

    djpHewr2XFM_CDC_EVENT_DEL_SITEHUBFP


    djpHewr2XFM_CDC_ARRANGEMENT_DEL_10_SITEHUBFP

    djpHewr2XFM_CDC_IFAI_ARRANGEMENT_DEL

    djpHewr2XFM_CDC_ARRANGEMENT_DEL_18

    djpHewr2XFM_CDC_INVOLVED_PARTY_DEL

    djpHewr2XFM_CDC_PCM_DEL

    Table 1: Jobs not to be run during Recovery process

    17) Once all the recovery days of data are available in the interface tables, run all the WH jobs and populate the data into the WH tables

    18) Validate the records in the target tables and continue the cycle

    19) The same approach cannot be implemented for the IFAI Source System, which populates CDC data through a manual process instead of through Attunity. For those tables, only the latest records are populated instead of all the records.

    20) After successful completion of the restore and recovery process, the following script updates the parameter files. It should be run at the end of the Recovery process to update the parameter files so that the Regular Daily Run can resume.

    Command line:

    sh OCBB_Recovery_PrmFileUpdate.sh <Script path> <parameter file path> bir2_wh.param InputFile_OCBB_Recovery_PrmFileUpdate o

    You should use the letter 'O' at the end of the command line. 'O' indicates the Original (Regular Daily Cycle) process.

    21) After successful completion of the restore and recovery process, make sure to validate the corrected data and continue the ongoing Daily Cycles.

    • 3.1 Checkpoints for Restoration process

    Refer to the following checkpoints for the restoration process:

    1) Clean the data from the corresponding reprocessing source tables before starting the restoration process

    2) Restore the OHC/CARM/HUB/BASEL/Parameter/HMI/GPS/HUB REM/XXXXX Net/IFAI and other data files properly to the backup directory from the archive directory

    3) Validate the file size and check the presence of each file in the backup directory from where the job reads these input files

    4) Ensure the restored files have the required permissions for executing the jobs

    5) Please do not run any other process along with this recovery process; otherwise it may lead to confusion and create unknown issues

    6) Execute the corresponding jobs based on the predefined dependencies after successful restoration of the required files (you can use the Control-M Dependency document for this)

    7) Validate the data population into the Staging tables after completion of the source jobs, and monitor the ETL job logs for warnings and record drops

    8) While running the Recovery Day 1 cycle, please make sure all the refresh/lookup tables have valid data; follow the same approach for all Recovery days (which fall within the Recovery date range)

    9) After successful validation of the source jobs (for all days that fall within the Recovery date range), execute the load jobs and monitor the job logs

    10) All the backup data files must be present in the backup directory $prmDirdatabkp (parameter from parameter file); a presence check is sketched after the following list.

    • a) OCBB_Daily_HUB_REM_del.<Process date>.tar.gz
    • b) OCBB_Daily_HUB_del.<Process date>.tar.gz
    • c) OCBB_Daily_IFAI_del.<Process date>.tar.gz
    • d) OCBB_Daily_CARM.<Process date>.tar.gz (CARM Source Data)
    • e) OCBB_Daily_OHC.<Process date>.tar.gz (WHIRL Source Data)
    • f) OCBB_Daily_External.<Process date>.tar.gz (Insurance Data)
    • g) OCBB_Daily_HMI.<Process date>.tar.gz
    • h) OCBB_Daily_HUB_REM.<Process date>.tar.gz
    • i) OCBB_Daily_XXXXX_Net.<Process date>.tar.gz
    • j) OCBB_Daily_IFAI.<Process date>.tar.gz
    • k) OCBB_Daily_BASEL.<Process date>.tar.gz
    • l) OCBB_Daily_Parameter.<Process date>.tar.gz
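
    As referenced in checkpoint 10, the following sketch verifies that every expected daily archive is present for the process date before restoration starts; $prmDirdatabkp comes from the parameter file, and PRM_DATE is an assumed variable:

    #!/bin/sh
    # Fail fast if any expected archive for the process date is missing.
    missing=0
    for f in OCBB_Daily_HUB_REM_del OCBB_Daily_HUB_del OCBB_Daily_IFAI_del \
             OCBB_Daily_CARM OCBB_Daily_OHC OCBB_Daily_External \
             OCBB_Daily_HMI OCBB_Daily_HUB_REM OCBB_Daily_XXXXX_Net \
             OCBB_Daily_IFAI OCBB_Daily_BASEL OCBB_Daily_Parameter
    do
        if [ ! -f "${prmDirdatabkp}/${f}.${PRM_DATE}.tar.gz" ]; then
            echo "Missing ${f}.${PRM_DATE}.tar.gz"
            missing=1
        fi
    done
    [ "$missing" -eq 0 ] || exit 1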

    • 3.1.1 HUB/HUB_REM/IFAI/HMI: restoring data in Reprocessing tables

    Refer to the following points for restoring data in the Reprocessing tables:

    1) Take a backup of the same tables with the existing data before importing the restored data into the interface tables, and delete all the rejected records from the previous cycles from the Reprocessing and Interface tables. This helps restore the latest image if the recovery process does not complete successfully, as those rejects may happen again with the backup data.

    2) Unzip and untar the OCBB_Daily_<SRCE>_del.<process_date>.tar.gz files that were archived, based on the Source System to be recovered. This could be for a single system or for multiple systems. The following script is used for this process:

    Script name: <Script_path>/<country_cde>/OCBB_CDC_Untar_Process.sh

    For example:

    sh OCBB_CDC_Untar_Process.sh /data/ds/dpr_gacpt_dev/scripts bn OCBB_initial_src_int.param ALL/HUB/OHC/CARM/XXXXX_NET/IFAI/HMI/HUB_REM/GPS/EXTERNAL/PARAMETER/BASEL
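
    A minimal sketch of what the unzip/untar step does for one archive, assuming the naming shown above; the real script also removes the tar file afterwards (see step 3):

    #!/bin/sh
    # Extract one daily archive into the backup landing directory.
    gzip -dc "${prmDirdatabkp}/OCBB_Daily_HUB_del.${PRM_DATE}.tar.gz" |
        ( cd "$prmDirLandingBkp" && tar -xf - )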

     

    3) Once the Unzip and Untar process is done, the files are available in the backup directory; the aforementioned script also removes the tar files from the landing directory

    4) The *.del files that have been untarred by the Untar script are used for importing data into the reprocessing tables, based on which Source Systems are to be restored. This is done with the help of the following script:

    Script name: <Script_path>/<country_cde>/OCBB_CDC_Recovery_Process.sh

    For example:

    sh OCBB_CDC_Recovery_Process.sh /data/ds/dpr_gacpt_dev/scripts bn OCBB_initial_src_int.param InputFile_OCBB_CDC_BKP_Process ALL/HUB/HUB_REM/HMI/IFAI

    5) Once the data is imported into the required Source System tables, execute the required jobs: run the corresponding Source Jobs and/or Groups and Force OK the remaining Source Jobs and Groups

    6) If there is data from the External Source System data files or any other files, continue the corresponding Combine Jobs and/or Groups; Force OK the remaining Source System Combine Jobs and Groups if there is no data file

    7) After validating the recovery data in the (required) Interface tables, continue the corresponding WH Jobs/Groups and Force OK the remaining Source System WH Jobs/Groups

    8) Validate the recovery data and repeat this process until the recovery is complete and the latest data is loaded up to the target date.

    • 3.1.1.1 Validation Process

    Note the number of records that are updated/inserted/rejected in Reprocessing, Interface and WH tables. This activity is mainly for verification only.

    • 3.1.2 OHC/CARM/IFAI/HMI/BASEL/External/Parameter/HUB_REM/XXXXX.Net: Restoring process

    Refer to the following points for restoring data in the Reprocessing tables:

    1) Take a backup of the same tables with the existing data before importing the restored data into the interface tables, and delete all the rejected records from the previous cycles from the Reprocessing and Interface tables. This helps restore the latest image if the recovery process does not complete successfully, as these rejects may occur again with the backup data.

    2) Unzip and untar the OCBB_Daily_<SRCE>.<process_date>.tar.gz files that were archived, based on the Source System to be recovered. This could be for a single system or for multiple systems. The following script is used for this process:

    Script name: <Script_path>/<country_cde>/OCBB_CDC_Untar_Process.sh

    For example:

    sh OCBB_CDC_Untar_Process.sh /data/ds/dpr_gacpt_dev/scripts bn OCBB_initial_src_int.param ALL/HUB/OHC/CARM/XXXXX_NET/IFAI/HMI/HUB_REM/GPS/EXTERNAL/PARAMETER/BASEL

     

    3) Once the Unzip and Untar process is done, the files are available in the backup directory; the aforementioned script also removes the tar files from the landing directory

    • a) Calculate the number of days for the recovery process based on the impact analysis

    • b) List the number of days/date files for the Recovery process

    • c) Identify the required files in the backup folder; during the backup process they were processed daily with the Process Date (LOAD_DT) and stored back in the backup folder in tar/zip format

    • d) Start running the recovery process. Each day of the data recovery process is an independent process and cycle from the other days.

    • e) One execution should have one day of data only, and the LOAD_DT must be taken from the backup tar file name itself; this way we can ensure each particular day of data carries the corresponding process date. As BASEL has no other facility to identify a particular day's data, this step is very important for further activity. (A sketch of extracting the date from the file name follows this list.)

    For example:

    If the recovery process is to be done for the 5th to the 10th day of data of a particular month, take the 5th day's backup tar file from the backup folder, place it in the required BASEL landing directory, and run all the corresponding Combine jobs. Ensure the LOAD_DT is correct: the prm_date (LOAD_DT) value should be updated from the tar file name itself, and the Combine job parameter file must be updated so that the correct LOAD_DT lands in the corresponding Interface table for each run.

    • f) After successful completion of the 5th day's data up to the interface level, the 6th day's data is processed with the 6th day's process date (LOAD_DT) in the same way as the 5th day, and this is repeated up to the 10th day of the month.

    • g) Ensure the interface-table-level process is repeated up to the 10th day, then continue with the remaining job execution to populate the data into the warehouse tables in a single execution

    • h) The source file to Interface table process is run multiple times, based on the number of days. However, the Interface table to WH process is a single run and processes the latest records at the end.

    • i) Ensure the process date matches the required LOAD_DT (prm_date) during recovery

    • j) Remove all the source files from the original location after successful completion of each execution

    • k) Validate the records in the Interface tables and WH tables based on the impact analysis

    • l) Execute all the required jobs: run the corresponding Source Jobs/Groups and the remaining Jobs and Groups

    • m) After validation, recover the data into the (required) Interface tables and continue the corresponding WH jobs based on the dependency on HUB data (since Customer and some Product details are derived from the HUB Source System), and Force OK the remaining Source System WH Jobs/Groups.

    • n) Validate the data recovery; repeat this process until the recovery is complete and the latest data is loaded up to the target date.
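
    The date extraction referenced in point e) can be done directly from the archive name; a small sketch, assuming the file naming listed in section 2.2.2.14 and a YYYY-MM-DD date format:

    #!/bin/sh
    # Pull the process date (LOAD_DT) out of a backup tar file name.
    f=OCBB_Daily_BASEL.2010-05-05.tar.gz
    PRM_DATE=`echo "$f" | sed 's/^OCBB_Daily_BASEL\.\(.*\)\.tar\.gz$/\1/'`
    echo "$PRM_DATE"    # prints 2010-05-05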

    • 3.1.2.1 Validation Process

    Make a note of the number of records updated/inserted/rejected in Reprocessing, Interface and WH tables. This activity is for verification only.

    • 4 Recovery process

    This topic covers the Recovery process, which follows a successful Restoration process.

    During the original run, all the external files are read from the following directory:

    Server Path: \<project>\landing\<CountryCode>\

    However, during the recovery process all the files are available in the recovery folder. Therefore, all the required parameters must be updated so that data is read from the following folder, and these parameter changes must be reverted after the recovery process:

    Server Path: \<project>\landing\<CountryCode>\<Recovery>

    Before starting this process, you should read point 2 of the 'Restoration process' section.

    As per the prerequisites, all the Daily cycles should be on hold during the recovery and restoration process, until the recovery process completes. After the completion of the recovery cycle, the held data is processed as follows. The following scripts perform the Attunity export and import, which helps in processing the backlog data:

    <Script_path>/<Country_cde>/OCBB_GA_SP_EXPORT.sh

    <Script_path>/<Country_cde>/OCBB_GA_SP_IMPORT.sh

    For example:

    sh OCBB_GA_SP_EXPORT.sh /data/ds/dpr_gacpt_dev/scripts <Attunity parameterfilename>

    sh OCBB_GA_SP_IMPORT.sh /data/ds/dpr_gacpt_dev/scripts <Attunity parameterfilename>

    Before running the aforementioned scripts, run the following script to update the prmDate and some directory paths:

    Command line:

    sh OCBB_Recovery_PrmFileUpdate.sh <Script path> <parameter file path> bir2_wh.param InputFile_OCBB_Recovery_PrmFileUpdate o

    You should use the letter ‘O’ at the end of the command line. ‘O’ indicates Original (Regular Daily Cycle) process. Please validate all the parameter files for prmDate and directory paths.

    • 4.1 HUB

    Refer to the following steps pertaining to the recovery process for HUB:

    1) Reload the Attunity Stream Position table from the first day of the backup files with the help of OCBB_GA_SP_EXPORT.sh

    2) OCBB_GA_SP_EXPORT.sh takes the backup from Attunity; OCBB_GA_SP_IMPORT.sh reloads the data from the backup file into Attunity. Validate the Attunity STREAM_POSTION tables after loading from the backup files.

    3) Retrieve the PARM_DATE from the STREAM_POSTION table backup filename

    4) Do not run the HUB_DATE_CTRL jobs during the backlog

    5) Update the parm_dt in the parameter file to get the correct LOAD_DT for all the records. This can be done with the help of the OCBB_Recovery_PrmFileUpdate.sh script. This step can be skipped if it has already been done for the other Source Systems as well.

    6) Run the CDC jobs to load the Day 1 data into the Staging tables

    7) Repeat the same process for more days of hold.

    • 4.2 CARM

    Refer to the following steps pertaining to the recovery process for CARM:

    1) Restore the Day 1 file from the archive directory using the restore process

    2) Retrieve the parmDate from the backup filename

    3) Update the parm_dt in the parameter file to get the correct LOAD_DT for all the records

    4) Run the CARM jobs to process the Day 1 data

    5) Repeat the same process for more days of hold.

    • 4.3 OHC

    Refer to the following steps pertaining to the recovery process for OHC:

    1) Restore the Day 1 file from the archive directory using the restore process

    2) Retrieve the parm_date from the backup filename

    3) Update the parm_dt in the parameter file to get the correct LOAD_DT for all the records

    4) Run the OHC combine jobs to process the Day 1 data

    5) Repeat the same process for more days of hold.

    • 4.4 BASEL

    Refer to the following steps pertaining to the recovery process for BASEL:

    1) Restore the Day 1 file from the archive directory using the restore process

    2) Retrieve the parm_date from the backup filename

    3) Update the parm_dt in the parameter file to get the correct LOAD_DT for all the records

    4) Run the BASEL combine jobs to process the Day 1 data

    5) Repeat the same process for more days of hold.

    • 4.5 External

    Refer to the following steps pertaining to the recovery process for External files data:

    1) Restore the Day 1 file from the archive directory using the restore process

    2) Retrieve the parm_date from the backup filename

    3) Update the parm_dt in the parameter file to get the correct LOAD_DT for all the records

    4) Run the External jobs to process the Day 1 data

    5) Repeat the same process for more days of hold.

    • 4.6 XXXXXnet

    Refer to the following steps pertaining to the recovery process for XXXXXnet:

    1) Restore the Day 1 file from the archive directory using the restore process

    2) Retrieve the parm_date from the backup filename

    3) Update the parm_dt in the parameter file to get the correct LOAD_DT for all the records

    4) Run the XXXXXnet combine jobs to process the Day 1 data

    5) Repeat the same process for more days of hold.

    • 4.7 Parameter Files

    Refer to the following steps pertaining to the recovery process for Parameter files:

    1) Restore the Day 1 file from the archive directory using the restore process

    2) Retrieve the parm_date from the backup filename

    3) Update the parm_dt in the parameter file to get the correct LOAD_DT for all the records

    4) Run the Parameter jobs to process the Day 1 data

    5) Repeat the same process for any more days of hold.

    • 4.8 HMI, HUB_REM, XXXXXnet

    For all the aforementioned systems, the data comes through the DataStage CDC or Manual CDC process. Through the CDC process we receive the latest data, cumulative over all the days on hold; therefore, you need to run the required Daily Cycle jobs to obtain one day of CDC data for each day that was on hold during the recovery process.

    Refer to the following steps for recovery of HMI, HUB_REM and XXXXXnet data:

    1) Clean the existing data files or previous-day input files from all the folders before re-running the cycle with one day of data

    2) Copy all the required Source System Day 1 cycle files and place them into the respective folders

    3) It is not required to complete the end-to-end full cycle with a one-day image. Multiple days of data can be run into the Interface table, followed by one Load Job cycle. However, the Source to Interface groups should be completed with the required LOAD_DT. Whether the Daily Cycle can be run end to end with one day's worth of data depends on the situation.

    4) Once all the days of data are available in the interface tables, run all the WH jobs and populate the data into the WH tables.