
Hi All...

Here is a list of SAP BI production support issues along with their resolutions. I hope it will be helpful for people who are working in a support environment.

1. DTP Failure

Select the failed step -> right-click and choose Display Messages -> this shows the message that gives the reason for the abend.

A DTP can fail for the following reasons; in such cases we can restart the job:
System exception error
Request locked
ABAP runtime error
Duplicate records
Erroneous records from PSA

Duplicate records:

In case of duplicate records, the error message identifies them along with the InfoProvider name. Before restarting the job after deleting the bad DTP request, we have to handle the duplicate records: go to the InfoProvider -> DTP -> Update tab -> check Handle Duplicate Record Keys -> activate -> execute the DTP. After successful completion of the job, uncheck the Handle Duplicate Record Keys option and activate again.

DTP Long Run:

If a DTP is taking longer than its regular run time and has no active background job, turn the status of the DTP to red, delete the bad DTP request (if any), and repeat the step or restart the job. Before restarting the job or repeating the DTP step, make sure of the reason for the failure.
If the failure is due to a space issue in the F fact table, engage the DBA team and the BASIS team and explain the issue to them. The table size needs to be increased before performing any action in BW; this will be done by the DBA team. After the space in the F fact table has been increased, we can restart the job.

Erroneous Records from PSA:

Whenever a DTP fails because of erroneous records while fetching data from the PSA to the data target, the data needs to be corrected in ECC. If that is not possible, then after getting approval from the business we can edit the erroneous records in the PSA and then run the DTP.

Go to PSA -> select request -> select error records -> edit the records
and save.

Then run the DTP.

2. INFO PACKAGE FAILURE:

The following are the common reasons for Info Package failure:

Source system connection failure
tRFC/IDoc failure
Communication issues

A related resolution, processing the IDoc manually in BI, is covered at the end of this section.

Check the source system connection with the help of SAP BASIS; if it is not fine, ask them to rebuild the connection. After that, restart the job (Info Package).

Go to RSA1 -> select the source system -> System -> Connection Check.

In case of any failed tRFCs/IDocs, the error message will read something like "Error in writing the partition number DP2" or "Caller 01, 02" errors. In such cases, reprocess the tRFC/IDoc with the help of SAP BASIS, and the job will then finish successfully.
If the data is loading from the source system to the DSO directly, delete the bad request in the PSA table and then restart the job.

Info Package Long Run: If an Info Package is running long, check whether the job has finished in the source system. If it has finished, check with the help of SAP BASIS whether any tRFC/IDoc is stuck or has failed. If the job is still in yellow status even after reprocessing the tRFC, turn the status to red, then restart/repeat the step. After completion of the job, force-complete it.

Before turning the status to red/green, make sure whether the load is full or delta, and verify the time stamp properly.

Time Stamp Verification:

Select the Info Package -> Process Monitor -> Header -> select the request -> go to the source system (Header -> Source System) -> SM37 -> enter the request and check its status in the source system -> if it is active, check whether any tRFCs/IDocs are stuck or have failed.

If the request is in Cancelled status in the source system -> check the Info Package status in the BW system -> if the IP status is also in a failed/cancelled state -> check the data load type (full or delta) -> if the load is full, we can turn the Info Package status to red and then repeat/restart the Info Package/job. -> If the load is delta, go to RSA7 in the source system and check the time stamp (compare it against the last updated time of the SM37 background job in the source system) -> if the time stamp in RSA7 matches, turn the Info Package status to red -> restart the job. It'll fetch the data in the next iteration.

If the time stamp is not updated in RSA7 -> turn the status to green -> restart the job. It'll fetch the data in the next iteration.

Request status in source system (SM37) | I/P status in BW | RSA7 time stamp vs. SM37 last updated time | Action
Red (Cancelled)                        | Active           | Matching                                   | Turn the I/P status to Red and restart the job
Red (Cancelled)                        | Cancelled        | Matching                                   | Turn the I/P status to Red and restart the job
Red (Cancelled)                        | Active           | Not matching                               | Turn the I/P status to Green and restart the job
Red (Cancelled)                        | Cancelled        | Not matching                               | Turn the I/P status to Green and restart the job

Processing the IDOC Manually in BI:


When an IDoc is stuck in BW although the background job has completed successfully in the source system, we can process the IDoc manually in BW.

Go to the Info Package -> Process Monitor -> Details -> select the IDoc that is in yellow status (stuck) -> right-click -> Process the IDoc manually -> it'll take some time to get processed.

****** Make sure we process the IDoc in BW only when the background job has completed in the source system and the IDoc is stuck in BW only.

3. DSO Activation Failure:

When there is a failure in the DSO activation step, check whether the data is loading into the DSO from the PSA or directly from the source system. If the data is loading into the DSO from the PSA, activate the DSO manually as follows:

Right-click the DSO activation step -> Target Administration -> select the latest request in the DSO -> select Activate -> after the request turns to green status, restart the job.

If the data is loading directly from the source system into the DSO, delete the bad request in the PSA table and then restart the job.

4. Failure in Drop Index/ Compression step:

When there is a failure in the Drop Index/Compression step, check the error message. If it failed due to a lock issue, the job failed because of a parallel process or action performed on that particular cube or object. Before restarting the job, make sure the object has been unlocked.

There is a chance of failure in the index step in case of TREX server issues. In such cases, engage the BASIS team, get the information regarding the TREX server, and repeat/restart the job once the server is fixed.

The compression job may fail when another job is trying to load data into, or is accessing, the cube. In such cases the job fails with an error message such as "Locked by ...". Before restarting the job, make sure the object has been unlocked.

5. Roll Up failure:

Roll-up fails due to contention issues. When a master data load is in progress, there is a chance of roll-up failure due to resource contention. In such cases, before restarting the job/step, make sure the master data load has completed. Once the master data load finishes, restart the job.

6. Change Run Job finishes with error RSM 756

When there is a failure in the attribute change run due to contention, we have to wait for the other attribute change run (ACR) job to complete. Only one ACR can run in BW at a time. Once the other ACR job has completed, we can restart/repeat the job.

We can also run the ACR manually in case of any failures.

Go to RSA1 -> Tools -> Apply Hierarchy/Attribute Change Run -> select the appropriate request in the list for which we have to run the ACR -> Execute.

7. Transformation Inactive:

If changes are moved to production without being saved/activated properly, or if a transformation is modified without being reactivated, there is a possibility of load failure with an error message such as "Failure due to Transformation Inactive".

In such cases, we have to activate the transformation that is inactive.

Go to RSA1 -> select the transformation -> Activate

If we have no authorization to activate the transformation in the production system, we can do it using the function module RSDG_TRFN_ACTIVATE.

Run the function module "RSDG_TRFN_ACTIVATE" (for example from SE37); you will need to enter the following details:

Transformation ID: the transformation's technical name (ID)

Object Status: ACT

Type of Source: the source object type

Source Name: the source's technical name

Type of Target: the target object type

Target Name: the target's technical name

Execute. The transformation status will be turned to Active. Then we can restart the job, and it will complete successfully. A minimal call sketch follows below.
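For reference, a minimal ABAP sketch of this call (run as a quick report from SE38, or test the function module directly in SE37) could look like the following. The parameter names are assumptions derived from the field list above, so verify them against the actual function module interface before using it:

* Sketch only: activate an inactive transformation via RSDG_TRFN_ACTIVATE.
* The parameter names below are assumptions based on the field list above -
* confirm the real interface in SE37 before running this in production.
REPORT z_trfn_activate_sketch.

PARAMETERS p_tranid TYPE c LENGTH 25 OBLIGATORY.   "Transformation technical name (ID)

CALL FUNCTION 'RSDG_TRFN_ACTIVATE'
  EXPORTING
    i_tranid  = p_tranid   "Transformation ID (assumed parameter name)
    i_objvers = 'A'        "Object status ACT / active version (assumed)
  EXCEPTIONS
    OTHERS    = 1.
* Source type/name and target type/name go into the remaining importing
* parameters exactly as SE37 shows them for this function module.

IF sy-subrc <> 0.
  WRITE / 'Activation failed - check the messages in the SE37 test run.'.
ELSE.
  WRITE: / 'Transformation', p_tranid, 'activated.'.
ENDIF.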


8. Process Chain Started from Yesterday's Failed Step:

In a few instances, the process chain starts from the step that failed in the previous iteration instead of starting from the Start step.

In such cases, we have to delete the previous day's process chain log so that the chain starts from the beginning (from the Start variant).

Go to ST13 -> select the process chain -> Log -> Delete.

Alternatively, we can use the function module RSPROCESS_LOG_DELETE for process chain log deletion.

Give it the log ID of the process chain, which we can get from the ST13 screen; a call sketch follows below.

Then we can restart the chain.
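If you prefer to do this from SE37/SE38 instead of ST13, a minimal sketch of the call could look as follows. The import parameter name is an assumption (the text above only says the log ID is required), so confirm it in the function module interface first:

* Sketch only: delete a process chain log so the chain restarts from the
* Start variant. The parameter name I_LOGID is an assumption - verify it
* in SE37 before use.
DATA lv_logid TYPE c LENGTH 25 VALUE 'LOG_ID_FROM_ST13'.   "log ID copied from ST13

CALL FUNCTION 'RSPROCESS_LOG_DELETE'
  EXPORTING
    i_logid = lv_logid
  EXCEPTIONS
    OTHERS  = 1.

IF sy-subrc = 0.
  WRITE / 'Process chain log deleted - the chain can now be restarted.'.
ENDIF.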

Turning the Process Chain Status using Function Module:

At times, when a process chain has been running for a long time without any progress, we have to set the status of the entire chain or of a particular step by using a function module.

Function Module: RSPC_PROCESS_FINISH

The function module "RSPC_PROCESS_FINISH" sets the status of a particular process to finished. For example, to finish a DTP load that has been running long, use "RSPC_PROCESS_FINISH" with the following details (a call sketch follows the list of states below):

LOG ID: the log ID of the parent chain.

CHAIN: the name of the chain that contains the failed process.

TYPE: the type of the failed step, which can be found by checking the table "RSPCPROCESSLOG" via "SE16" or "ZSE16" using the variant and instance of the failed step. The table "RSPCPROCESSLOG" can be used to find various details about a particular process.

INSTANCE & VARIANT: the instance and variant names can be found by right-clicking the failed step, opening "Display Messages", and checking the Chain tab.

STATE: identifies the overall state of the process. The various states for a step are given below:

R - Ended with errors

G - Successfully completed

F - Completed

A - Active

X - Canceled

P - Planned

S - Skipped at restart

Q - Released

Y - Ready

(blank) - Undefined

J - Framework Error upon Completion (e.g. follow-on job missing)
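Putting the pieces above together, a minimal ABAP sketch could first read the type of the failed step from RSPCPROCESSLOG and then call RSPC_PROCESS_FINISH to set its state to green. The table field names, the function module parameter names, and the chain/log values below are assumptions or placeholders for illustration; verify them in SE11 and SE37 before running anything like this:

* Sketch only: force-finish a hanging/failed process chain step with
* RSPC_PROCESS_FINISH. Variant and instance come from the GUI as described
* above; field and parameter names are assumptions - confirm in SE11/SE37.
DATA: lv_logid    TYPE c LENGTH 25 VALUE 'LOG_ID_OF_PARENT_CHAIN', "from ST13 / chain log
      lv_chain    TYPE c LENGTH 25 VALUE 'ZPC_DAILY_LOAD',         "hypothetical chain name
      lv_variant  TYPE c LENGTH 60 VALUE 'VARIANT_OF_FAILED_STEP',
      lv_instance TYPE c LENGTH 30 VALUE 'INSTANCE_OF_FAILED_STEP',
      lv_type     TYPE c LENGTH 10.

* Look up the process type of the failed step in RSPCPROCESSLOG
* (the SE16/ZSE16 lookup described above, done in code).
SELECT SINGLE type
  FROM rspcprocesslog
  INTO lv_type
  WHERE variante = lv_variant
    AND instance = lv_instance.

CALL FUNCTION 'RSPC_PROCESS_FINISH'
  EXPORTING
    i_logid    = lv_logid      "log ID of the parent chain
    i_chain    = lv_chain      "chain that contains the failed process
    i_type     = lv_type       "type of the failed step
    i_variant  = lv_variant    "variant of the failed step
    i_instance = lv_instance   "instance of the failed step
    i_state    = 'G'.          "G = successfully completed (see list above)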

9. Hierarchy Save Failure:

When there is a failure in the Hierarchy Save step, we have to follow the process below.

If there is an issue with the hierarchy save, we have to schedule the Info Packages associated with the hierarchies manually. Then we have to run an Attribute Change Run to push the changes to the associated targets. The step-by-step process is as follows:

ST13 -> select the failed process chain -> select the Hierarchy Save step -> right-click -> Display Variant -> note the Info Package in the hierarchy -> go to RSA1 -> run the Info Package manually -> Tools -> Apply Hierarchy/Attribute Change Run -> select the hierarchy list (here you can find the list of hierarchies) -> Execute.

Roles and responsibilities in support project


Monitoring data load activities and recovering failures in the Dev and Quality systems as part of unit testing.
Preparing process documents for each and every enhancement.
Involved in solving high-priority tickets regarding extractions, performance issues and data load failures.
Working on process chains to automate deletion of indexes in InfoCubes, processing of Info Packages, creation of indexes, ODS activation, further update, PSA request deletion, Attribute Change Run and so on.
Using Open Hub to export data from BI to external systems (flat file, table, 3rd-party tool).
Involved in unit testing, integration testing, regression testing and user acceptance tests.
Enhancing the existing reports as per new requirements.
Monitoring the daily loads to various data targets in the system.
Monitoring the process chains on a daily, weekly and monthly basis.
Manually loading and rolling up the data into data targets.
Maintaining logs for each manual load.
Creating aggregates to improve query response.
Performance tuning of queries using aggregates and indexing of InfoCubes.

SAP BW Data Extraction Consultant: The BW Data Extraction Consultant is responsible for identifying and obtaining the data required to satisfy the requirements of the BW project. This data may include:

SAP R/3 data

New Dimension product data


Data external to SAP within the organization (legacy data)

Data external to SAP from outside the organization (provider data, e.g. D&B, Nielsen)

The BW Data Extraction Consultant role has a broad range of responsibilities and may require multiple individuals to
satisfy the role depending on the scope of the BW project and the complexity and quality of the data.

If SAP R/3 and New Dimension data only is required to satisfy requirements and if this data is included in the
standard Business Content of BW, this role may be combined with the BW Application Consultant role. This standard
Business Content allows for extraction of R/3 and New Dimension data in a straightforward and rapid manner.

If non-SAP data is required, if standard Business Content must be enhanced significantly, if BAPI interfaces are being used, and/or if the data quality from the source system is insufficient, this role can be quite complex and can require significant resources. This complexity and quality of data is a primary contributor to the size and scope of the BW project.

If legacy data is being extracted, a close relationship with the legacy extraction expert is required. In some cases, the legacy extraction expert may assume this responsibility.

Specifically, the BW Data Extraction Consultant is responsible for:

Designing the data solution to satisfy defined business requirements

Identifying the data in the source environment

Mapping the data to the BW environment

Identifying data quality gaps

Developing a plan to close data quality gaps

Developing the required extraction programs, if necessary


Developing the associated interface programs, if necessary

Testing of all developed programs

Ensuring integration testing of data from various sources

Developing a production support plan

SAP BW Data Access Consultant: The BW Data Access Consultant is responsible for assessing the business requirements and for designing and developing a data access solution for the BW project. This solution may include use of:

BW's Business Explorer

Non-SAP Data Access tools (e.g., Business Objects, Cognos, Crystal Reports, and other certified
data access tools)

Visual Basic development

Web development

WAP (wireless) development

R/3 drill-through

The BW Data Access Consultant role has a broad range of responsibilities and may require multiple individuals to
satisfy the role depending on the scope of the BW project and the requirements associated with data access.

The BW Data Access Consultant should work closely with the individuals responsible for business requirements
gathering and analysis and have a thorough understanding of the way the data will be used to make business
decisions.

Often significant change management issues are generated as a result of modifications required by end users to the
data access design and implementation. As a result the BW Data Access Consultant is in a key position to provide
valuable information to the change agent or change management process.
Specifically, the BW Data Access Consultant is responsible for designing the data access solution to include:

Understanding the data that will be available in BW in business terms

Identifying the way end users want to analyze the data in BW

Designing the data access solution to satisfy defined business requirements

The BW Data Access Consultant is also responsible for developing the data access solution to include:

Developing options for data access (i.e. web solution, R/3 drill through, ODS reporting, master data
reporting, 3rd party tools)

Developing prototypes of data access for review with end users

Developing the required data access solutions

Developing the associated interface programs and/or customized web enhancements, if necessary

Configuring the Reporting Agent, if necessary

Configuring the GIS

Testing of all developed solutions

Ensuring integration testing of data access solution

Developing a production support plan

Working with training development to include data access solution in BW course materials

SAP BW Data Architect: The BW Data Architect is responsible for the overall data design of the BW project. This
includes the design of the:

BW InfoCubes (Basic Cubes, Multi-cubes, Remote cubes, and Aggregates)

BW ODS Objects
BW Datamarts

Logical Models

BW Process Models

BW Enterprise Models

The BW Data Architect plays a critical role in the BW project and is the link between the end users' business requirements and the data architecture solution that will satisfy these requirements. All other activities in the BW project are contingent upon the data design being sound and flexible enough to satisfy evolving business requirements.

The BW Data Architect is responsible for capturing the business requirements for the BW project. This effort includes:

Planning the business requirements gathering sessions and process

Coordinating all business requirements gathering efforts with the BW Project Manager

Facilitating the business requirements gathering sessions

Capturing the information and producing the deliverables from the business requirements gathering
sessions

Understanding and documenting business definitions of data

Developing the data model

Ensuring integration of data from both SAP and non-SAP sources

Fielding questions concerning the data content, definition and structure

This role should also address other critical data design issues such as:

Granularity of data and the potential for multiple levels of granularity

Use of degenerate dimensions


InfoCube partitioning

Need for aggregation at multiple levels

Need for storing derived BW data

Ensuring overall integrity of all BW Models

Providing Data Administration development standards for business requirements analysis and BW
enterprise modeling

Providing strategic planning for data management

Impact analysis of data change requirements

As stated above, the BW Data Architect is responsible for the overall data design of the BW project. This includes the
design of the:

BW InfoCubes (Basic Cubes, Multi-cubes, Remote cubes, and Aggregates)

BW ODS Objects

BW Datamarts

Logical Models

BW Process Models

BW Enterprise Models

SAP BW Application Consultant: The BW Application Consultant is responsible for utilizing BW to satisfy the business
requirements identified for the project. As provided in the other roles, if the scope of the BW project is tightly
controlled and can use standard BW Business Content, InfoCubes, and Queries, the BW Application Consultant may
assume the responsibility to perform several roles concurrently to include:

BW Data Architect

BW Data Access Consultant


BW Data Extraction Consultant

SAP Project Manager

Business Process Team Lead

Authorization Administrator

If this occurs, the BW Application Consultant must have a broad range of skills and this position will be under
significant pressure during the course of the BW project. In this situation, the BW Application Consultant inherently
must be responsible for the overall integrated design and realization of the BW solution.

If the project scope is broad and must extend Business Content, InfoCubes and/or Queries, then the project warrants
resources being assigned to the roles identified above. In this case, the BW Application Consultant is responsible for
the overall integrated design and coordinated realization of the BW solution.

If this role is assumed by an SAP Consultant, often the expectations are that they are familiar with all components
and functionality of Business Information Warehouse. This role often naturally becomes a focal point for all design
consideration related to BW.

The BW Application Consultant (or one of the resources identified above) uses the BW Administrator Workbench to
perform the functions provided by BW:

Establish connections to the BW sources

Activate the standard Business Content

Enable the standard InfoCubes and Queries

Enhance the InfoCubes as required by the BW Data Architect

Enhance the Queries as required by the BW Data Access Consultant

Define authorization profiles and access


Evaluate statistical performance and make recommendations to Basis support for optimization
where possible

Manage the CTS layer

SAP BW Basis Consultant: The BW Basis person must be able to advise on BW landscape issues, the transport environment, authorisation, performance issues of the database and BW, and installation of the BW server, plug-ins and frontend (for all layers there are patches/support packages that should be installed regularly).

This role can be assumed by the Basis Consultant (however, additional BW skills are absolutely necessary).
