BW 7
Use
• Loading data from PSA to InfoProvider(s).
• Transfer of data from one InfoProvider to another within BI.
• Data distribution to a target outside the BI system, e.g. Open Hub destinations.
When data is transferred within BI, the transformation defines the mapping and the logic for updating data to the data targets, whereas the extraction mode and update mode are determined by the DTP.
NOTE: A DTP is used to load data within the BI system only, except in Virtual InfoProvider scenarios, where a DTP can be used to fetch data directly from the source system at run time.
Extraction
There are two types of Extraction modes for a DTP – Full and Delta.
Full: All data available in the source is requested, comparable to a full InfoPackage.
Delta:
Unlike an InfoPackage, a delta transfer using a DTP does not require an explicit initialization. When a
DTP is executed with extraction mode Delta for the first time, all requests that exist in the source up
to that point are retrieved, and the delta is initialized automatically.
The following three options are available for a DTP with extraction mode Delta:
• Only Get Delta Once.
• Get All New Data Request By Request.
• Retrieve Until No More New Data.
‘Only Get Delta Once’ is useful when the data target should contain only the most recent snapshot:
it loads only the latest request (the delta) from the PSA to the data target.
1. Delete the previous request from the data target.
2. Load data up to PSA using a Full InfoPackage.
3. Execute DTP in Extraction Mode: Delta with ‘Only Get Delta Once’ checked.
The above 3 steps can be incorporated in a Process Chain which avoids any manual intervention.
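The three steps above can be sketched as a small simulation. This is not ABAP or a BW API — just an illustrative Python model in which requests are dicts and the DTP keeps a pointer to the last request it has already transferred, so each delta run picks up only the requests that arrived since the previous run.

```python
# Illustrative sketch of the 'Only Get Delta Once' snapshot pattern.
# Names (psa, data_target, request ids) are hypothetical.

psa = []                 # requests staged in the PSA
data_target = []         # requests currently in the data target
last_transferred = 0     # delta pointer kept by the DTP

def full_load_to_psa(request_id, records):
    """Step 2: a full InfoPackage writes a new request into the PSA."""
    psa.append({"request": request_id, "records": records})

def delta_dtp_only_once():
    """Step 3: the delta DTP transfers only requests newer than the pointer."""
    global last_transferred
    new = [r for r in psa if r["request"] > last_transferred]
    data_target.extend(new)
    if new:
        last_transferred = max(r["request"] for r in new)
    return new

def delete_previous_request():
    """Step 1: drop the old snapshot from the data target."""
    data_target.clear()

# First cycle: request 1 is loaded and transferred.
full_load_to_psa(1, ["rec_a", "rec_b"])
delta_dtp_only_once()

# Second cycle runs the three process-chain steps.
delete_previous_request()
full_load_to_psa(2, ["rec_c"])
delta_dtp_only_once()

print(data_target)   # only request 2 remains in the target
```

Because the delta pointer survives the deletion in step 1, the delta run in step 3 never re-transfers request 1, even though it is still present in the PSA.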
NOTE: If ‘Retrieve Until No More New Data’ is unchecked, the option above automatically changes to
‘Get One Request Only’, which retrieves only one request from the source.
Also, once the DTP is activated, the option ‘Retrieve Until No More New Data’ no longer appears in the DTP
maintenance.
Package Size
The number of data records contained in one data package is set here.
The default value is 50,000.
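The effect of the package size can be illustrated with a short sketch (plain Python, not a BW API; the record count is made up): the source records are simply cut into consecutive slices of at most the package size.

```python
# Illustrative sketch: splitting source records into DTP-style data packages.

def split_into_packages(records, package_size=50_000):
    """Yield consecutive slices of at most package_size records."""
    for start in range(0, len(records), package_size):
        yield records[start:start + package_size]

records = list(range(120_000))           # hypothetical source records
packages = list(split_into_packages(records))
print([len(p) for p in packages])        # [50000, 50000, 20000]
```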
Filter
The selection criteria for fetching data from the source are restricted by the filter.
Filters can be defined using:
• Multiple selections
• OLAP variables
• ABAP routines
An icon to the right of the Filter button indicates that filter selections exist for the DTP.
Semantic Groups
Choose Semantic Groups to specify how you want to build the data packages that are read from the source
(DataSource or InfoProvider). To do this, define key fields. Data records that have the same key are combined
in a single data package.
This setting is only relevant for DataStore objects with data fields that are overwritten. This setting also defines
the key fields for the error stack. By defining the key for the error stack, you ensure that the data can be
updated in the target in the correct order once the incorrect data records have been corrected.
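The grouping behaviour can be sketched in Python (an illustrative model, not a BW API; the key field `doc` is hypothetical): all records sharing the same semantic key end up in the same data package, so an overwrite DSO processes a document's records together.

```python
# Illustrative sketch: semantic grouping builds one data package per key value.

from itertools import groupby

records = [
    {"doc": "4711", "item": 2},
    {"doc": "4712", "item": 1},
    {"doc": "4711", "item": 1},
]

key = lambda r: r["doc"]
# groupby requires sorted input; one group (package) per semantic key.
packages = [list(g) for _, g in groupby(sorted(records, key=key), key=key)]
print([[r["item"] for r in p] for p in packages])   # [[2, 1], [1]]
```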
An icon to the right of the ‘Semantic Groups’ button indicates that semantic keys exist for the
DTP.
Update
Error Handling
• Deactivated:
If an error occurs, the error is reported at the package level and not at the data record level.
The incorrect records are not written to the error stack since the request is terminated and has to be updated
again in its entirety.
This results in faster processing.
• No Update, No Reporting:
If errors occur, the system terminates the update of the entire data package. The request is not released for
reporting. The incorrect record is highlighted so that the error can be assigned to the data record.
The incorrect records are not written to the error stack since the request is terminated and has to be updated
again in its entirety.
• Valid Records Update, No Reporting (Request Red):
This option allows you to update valid data. This data is only released for reporting after the administrator
checks the incorrect records that are not updated and manually releases the request (by a QM action, that is,
setting the overall status on the Status tab page in the monitor).
The incorrect records are written to a separate error stack in which the records are edited and can be updated
manually using an error DTP.
• Valid Records Update, Reporting Possible (Request Green):
Valid records can be reported immediately. Automatic follow-up actions, such as adjusting the aggregates, are
also carried out.
The incorrect records are written to a separate error stack in which the records are edited and can be updated
manually using an error DTP.
Error DTP
Erroneous records from a DTP load are written to a stack called the error stack.
The error stack is a request-based table (a PSA table) into which erroneous data records from a data transfer
process (DTP) are written. The error stack is based on the source of the DTP (PSA, DSO or InfoCube); that is,
records from the source are written to the error stack.
To update the data to the data target, correct the data records in the error stack and then run the error
DTP manually.
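The error-stack round trip can be sketched as follows. This is an illustrative Python model, not BW code: the validity check and field names are made up, but the flow — bad records diverted to the error stack, corrected, then re-transferred by an error DTP — mirrors the description above.

```python
# Illustrative sketch of the error stack and error DTP.

error_stack, target = [], []

def dtp_load(records):
    """Regular DTP: valid records go to the target, invalid to the error stack."""
    for rec in records:
        (target if rec.get("amount", 0) >= 0 else error_stack).append(rec)

def error_dtp():
    """Error DTP: re-transfers the (now corrected) error-stack records."""
    target.extend(error_stack)
    error_stack.clear()

dtp_load([{"doc": "A", "amount": 10}, {"doc": "B", "amount": -5}])
# Manual correction of the faulty record in the error stack...
error_stack[0]["amount"] = 5
error_dtp()
print([r["doc"] for r in target])   # ['A', 'B']
```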
Execute
Processing Mode
‘Serially in the Dialog Process (for Debugging)’ is the mode used to simulate the DTP execution in
debugging mode. When this mode is selected, you can activate or deactivate session breakpoints at various
stages, such as extraction, data filtering, error handling, transformation and data target updating.
You cannot start requests for real-time data acquisition in debug mode.
Debugging Tip:
When you want to debug the DTP, you cannot set a session breakpoint in the editor where you write the ABAP
code (e.g. the DTP filter). Instead, set session breakpoints in the generated program of the DTP.
When the source of a DTP is a DataStore object, special extraction options are available:
• Active Table (with Archive)
The data is read from the active table of the DSO and from the archived data.
• Change Log
The data is read from the change log and not from the active table of the DSO.