
How To Run SAP BW Faster

We Build Robust Reporting Solutions


01

Introduction
Three Tools I Use On Every SAP BW Project

Want to build credibility and deliver results that will amaze your
customers?

Here are three tools that I use on every project to quickly and easily
provide value to my clients while winning more work.

Reporting: How To Find Long Running Queries
Modeling: How To Find Bad Data Models
Loading: How To Find Sub-Optimal Code
02

Reporting
How To Find Long Running Queries

But there's a big difference between, say, reporting
on a story and simply making up a story.
Errol Morris
03

Reporting
How To Find Long Running Queries

The first step in quickly troubleshooting your customer's pain-points is to look at the reports.

Luckily, the metrics related to query runtimes are captured in two predefined views: RSDDSTAT_OLAP and RSDDSTAT_DM. Use transaction SE16 to view the contents.

Simply sort by RUNTIME (measured in seconds) and find the trouble query in OBJNAME.

In the screen-capture, there are two different runtimes for two different sessions (SESSIONUID). This difference may be due to the users' selection parameters.
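If you prefer to work with the exported table contents outside the GUI, the sort step takes only a few lines. A minimal sketch in Python, assuming you have dumped RSDDSTAT_OLAP rows to a list of records; the field names come from the text above, but the sample values are invented for illustration:

```python
# Hypothetical rows as exported from RSDDSTAT_OLAP via SE16.
# RUNTIME is in seconds; the values here are made up.
rows = [
    {"OBJNAME": "ZSALES_Q001", "SESSIONUID": "A1", "RUNTIME": 4.2},
    {"OBJNAME": "ZFIN_Q010",   "SESSIONUID": "B7", "RUNTIME": 312.5},
    {"OBJNAME": "ZSALES_Q001", "SESSIONUID": "C3", "RUNTIME": 9.8},
]

# Sort descending by RUNTIME so the trouble queries surface first.
worst = sorted(rows, key=lambda r: r["RUNTIME"], reverse=True)
for r in worst:
    print(f'{r["OBJNAME"]:12s} {r["SESSIONUID"]}  {r["RUNTIME"]:>7.1f}s')
```

The query at the top of the list is your first candidate for tuning.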
04

Reporting
Design Considerations for Long Running Queries

True Story: During my early days with SAP BW (version 3.0), a major motion-picture studio was experiencing issues with their reporting performance.

Because reporting impacts analysts across business lines, the client and consulting partner assembled a SWAT team to identify and resolve these highly visible issues.

A dedicated room was procured for the team and there was a high level of activity for a couple of weeks. Team members were constantly on the phone with analysts to identify the long running reports, understand the specific reporting criteria and actions, and investigate the root cause of the issues.

Once the activity settled down, I approached a close colleague and member of the SWAT team to understand the commotion. So what happened? Were the reports poorly written? Were the data models badly designed? What did you find?

Contrary to my initial perception, the majority of the issues were not related to anything that I had learned in my undergraduate courses on programming, algorithms, or design.

Many of the issues were caused by users who ran their reports without filtering for a value. That is, the users were selecting all data in a model (which may contain 20-40 million records) and wondering why it took so long to complete.

A majority of my colleagues' time was spent training the users how to properly run a report by entering values on the selection prompt.

The lesson is to first understand the issue, not from a technical standpoint, but from that of the user's experience. Once you have identified the issue, then choosing an item on the next page may be an option.





05

Reporting
How To Resolve Long Running Queries

Prefill the OLAP Cache

Using the BEx Broadcaster, you can store the results of a long running query in the system. This task can be scheduled to run in the morning so that subsequent calls to the query will leverage the stored results.

Pre-calculate the Calculated Key-Figures (CKF)

A CKF formula is evaluated at runtime, and many instances can slow the query performance. To remedy this issue, consider calculating these values during the data loading process and storing the value in the infoprovider.

Declare Global Restricted Key-Figures (RKF)

Similar to a CKF, an RKF selection is performed at runtime. If needed, declare the RKF at the global infoprovider level instead of the local query level. Selections declared at the infoprovider level are processed once at runtime and stored in cache, while selections declared at the query level are processed during each navigational step.

There are many techniques that can be applied at the reporting layer to improve performance. I have found the above 3 items to be most effective in creating a better user-experience for the report analyst. If you have applied my recommendations and need even more performance gains, then let me know. hau@summerlinanalytics.com
06

Modeling
How To Find Bad Data Models

This modeling thing, it's pretty easy,
but actually it's also really tough.
Cara Delevingne
07

Modeling
How To Find Bad Data Models

The second step in quickly troubleshooting your customer's pain-points is to look at the data models.

Poorly designed models hinder the front-end reporting performance and lengthen the back-end data loading times. Here's how you can easily find which infocubes are imbalanced.

In transaction SE38, run the program SAP_INFOCUBE_DESIGNS. Look for lines in red font for imbalanced dimensions, ones that contain record counts that exceed SAP's guideline for dimension-to-fact ratios.

As a rule of thumb, you should aim for a dim-to-fact ratio below 10%.
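To make the rule of thumb concrete, here is a small Python sketch of the 10% check; the dimension names and row counts are invented for illustration (SAP_INFOCUBE_DESIGNS reports the real figures):

```python
# Invented example: a fact table with 40 million rows and three dimensions.
FACT_ROWS = 40_000_000

dimension_rows = {
    "Time":         365,
    "Organization": 1_200,
    "Document":     38_500_000,  # nearly one entry per fact row
}

def imbalanced(dim_rows: int, fact_rows: int, threshold: float = 0.10) -> bool:
    """True when the dim-to-fact ratio exceeds the 10% guideline."""
    return dim_rows / fact_rows > threshold

# These are the "red font" lines, in spirit.
flagged = [name for name, rows in dimension_rows.items()
           if imbalanced(rows, FACT_ROWS)]
print(flagged)
```

In this invented example only the Document dimension is flagged; with a ratio near 96%, it is exactly the kind of high-cardinality object that belongs in a line-item dimension, as discussed on the next pages.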

08

Modeling
How To Resolve Bad Data Models

Now what? So you found a bad data model, but there are too many tips on the developer forums to wade through. Where do you start?

Do you build aggregates? What about database-level optimizations? Should you partition the info-provider and provide multi-provider hints based on a common reporting metric such as Fiscal Period?

Before you dive too deeply into the various optimization techniques, I would like to share with you the three most-effective (and simple) steps that you can take to ensure a quick reporting and loading experience.

The approach is centered around the idea that searching a small area is generally faster than searching a large area.

Imagine reading a 100-page book.

What if you were interested in only a section of the book? For most books, the table of contents spans only a handful of pages (perhaps 5 pages), but it will contain the reference (or page number) of your desired section.

Now imagine that the table of contents were poorly designed and spanned 25 pages in addition to the body of the 100-page book (for a total of 125 pages). Finding your desired section would be more difficult.

In our scenario, we are looking to reduce the size of the dimension tables (the table of contents) in our star-schema diagram. We can group similar objects into their own dimensions.

In some cases, we can find significant performance gains by eliminating the dimension table altogether. I'll cover that concept on the next page.

The lesson is to shrink our model as much as we can before diving into more time-consuming techniques.


09

Modeling
How To Resolve Bad Data Models

Create Line-Item Dimensions

For detailed information, like a document number, move the info-object into its own dimension. Declare this new dimension as a line-item dimension. Use sparingly.

Rebalance Dimensions

More art than science: group like items into a dimension. For example, group time-related objects into one dimension and group organizational objects into another dimension.

Remove Unnecessary Objects

Compare the objects in the info-provider to the objects used on the reporting layer. Remove objects that are not used.

Line-item dimensions store the value in the fact-table and eliminate table joins. Rebalanced dimensions will give you smaller tables. Both approaches will reduce unnecessary reads and result in better performance. Still have questions? Let me know. hau@summerlinanalytics.com
10

Loading
How To Find Sub-Optimal Code

Every increased possession


loads us with a new weariness.
John Ruskin
11

Loading
How To Find Sub-Optimal Code

The third step in quickly troubleshooting your customer's pain-points is to look at the transformations.

Long-running data loads impact your ability to deliver relevant data to your users. Here is how you can use SAP's statistics tables to find the long running jobs that need tuning.

In transaction SE16, view the table RSDDSTATDTP for entries where TLOGO = TRFN. TSTMP_START and TSTMP_FINISH are the respective start and end times in long format.

Export this data into an Excel sheet and calculate the difference.
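Instead of Excel, the difference can also be computed with a short script. A sketch in Python, assuming the exported timestamps follow the common YYYYMMDDHHMMSS layout (any decimal fraction is dropped); the sample row is invented:

```python
from datetime import datetime

def load_seconds(tstmp_start, tstmp_finish):
    """Duration of a data load in seconds, assuming YYYYMMDDHHMMSS
    timestamps; a trailing decimal fraction, if present, is ignored."""
    fmt = "%Y%m%d%H%M%S"
    start = datetime.strptime(str(tstmp_start).split(".")[0], fmt)
    finish = datetime.strptime(str(tstmp_finish).split(".")[0], fmt)
    return (finish - start).total_seconds()

# Invented sample row: started 06:00:00, finished 06:42:30 the same day.
print(load_seconds("20240115060000", "20240115064230"))  # 2550.0
```

Sort the computed durations descending and the longest-running loads surface first, just as with the query runtimes earlier.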


12

Loading
How To Resolve Sub-Optimal Code

Long-Running Data Loads. This final section is highly customer-specific.

While some environments are more customized than others, I would like to give you a few general guidelines when approaching a transformation or extraction program optimization project.

Let's say that we have identified a long-running program that is an ideal candidate for some performance tuning. Where should we begin? How do we effectively improve the program in a risk-free manner? What about the before and after improvement benchmarks?

Here's my Quick & Dirty approach to performance optimization. In practice, this is also the most-efficient use of time and least risk-prone method of remodeling business-critical solutions. It just doesn't have the same ring.

The Quick & Dirty
1 Copy the Data Target
2 If needed, copy the Data Source
3 Copy the Transformation
4 Revise the logic in the Copy Transformation (Step 3)

The guiding principle is to optimize the program at a technical level without changing any of the business logic. In the end, your results should match but the program will run much faster.
13

Loading
How To Resolve Sub-Optimal Code

Remove Unnecessary Database Access

Database reads are the most time-intensive activities on a system. For better performance, use Start or End Routines to populate internal tables with look-up data.

SORT & READ

The cost burden of LOOP AT ... WHERE is much higher than a SORT & READ ... WITH KEY or READ ... BINARY SEARCH. Consider using an initial READ followed by a LOOP AT ... FROM INDEX.

Use Field-Symbols

There is a cost to moving the contents of a table row in a LOOP ... INTO statement (depending on the table line). Use field-symbols to access internal table data and omit the movement of content.

Copying the transformation (including the source and target) is a very quick approach to testing your coding optimizations. The benefits are two-fold: 1) you have distinct objects for BEFORE/AFTER testing, and 2) the original objects remain unaffected. Have questions? Let me know. hau@summerlinanalytics.com
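The SORT & READ tip above is an ABAP technique, but the cost contrast behind it is language-independent: a linear scan is O(n) per lookup, while sort-once-then-binary-search is O(log n). Here is the same idea sketched in Python, with the bisect module standing in for READ ... BINARY SEARCH; the look-up data is invented for illustration:

```python
import bisect

# Invented look-up data: (material, price) pairs, like an internal table.
table = [("MAT-%04d" % i, float(i)) for i in range(10_000)]

# LOOP AT ... WHERE equivalent: linear scan, O(n) per lookup.
def lookup_linear(key):
    for material, price in table:
        if material == key:
            return price
    return None

# SORT + READ ... BINARY SEARCH equivalent: sort once, O(log n) per lookup.
table.sort(key=lambda row: row[0])
keys = [row[0] for row in table]

def lookup_binary(key):
    i = bisect.bisect_left(keys, key)
    if i < len(keys) and keys[i] == key:
        return table[i][1]
    return None

# Both return the same value; the binary version does far fewer comparisons.
assert lookup_linear("MAT-0042") == lookup_binary("MAT-0042") == 42.0
```

Inside a transformation that processes millions of rows, that per-lookup difference is exactly where the long runtimes come from.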
14

Conclusion
Three Tools I Use On Every SAP BW Project

That's it! These are the three tools that I use on every SAP BW
project to quickly identify issues and start the conversation for next
steps.

Try it out and let me know how it has helped you.

For additional insights on how you can build faster and deliver more
successful business intelligence projects consistently, join my
exclusive newsletter: http://bit.ly/successfulsapbi
15

Meet Hau Ngo


Sharing his expertise in Business Intelligence

About Hau

Experienced BI Solution Architect with strong interpersonal communication skills and the ability to work through complex problems.

Hau has been integrally involved in numerous full life cycle implementations on domestic and international projects.

Hau Ngo
Principal Consultant
SAP Business Intelligence Solution Architect with 15+ years of project experience.

Skills

Data Modeling: 98%
Requirements & Analysis: 98%
Functional Knowledge: 90%
Communication: 90%

Thanks for reading
Have a great day!
