Friday, July 30, 2010
The Argument for Max(Forecast,Actual)
There is a long-running debate among forecasting professionals over whether to use Forecast or
Actual in the denominator of your percentage error calculations. The Winter 2009 issue of
Foresight had an article by Kesten Green and Len Tashman, reporting on a survey (of the
International Institute of Forecasters discussion list and Foresight subscribers) asking:

What should the denominator be when calculating percentage error?

This non-scientific survey's responses were 56% for using Actual, 15% for using Forecast, and
29% for something other, such as the average of Forecast and Actual, or an average of the
Actuals in the series, or the absolute average of the period-over-period differences in the data
(yielding a Mean Absolute Scaled Error or MASE). One lone respondent favored using the
Maximum (Forecast, Actuals).

Fast forward to the new Summer 2010 issue of Foresight (p.46):

Letter to the Editor
I have just read with interest Foresight’s article "Percentage Error: What Denominator" (Winter
2009, p.36). I thought I’d send you a note regarding the response you received to that survey from
one person who preferred to use in the denominator the larger value of forecast and actuals.

I also have a preference for this metric in my environment, even though I realize it may not be
academically correct. We have managed to gain an understanding at senior executive level that
forecast accuracy improvement will drive significant competitive advantage.

I have found over many years in different companies that there is a very different executive
reaction to a reported result of 60% forecast accuracy vs. a 40% forecast error, even though
they are equivalent! Reporting the perceived high error has a tendency to generate knee-jerk
reactions and drive the creation of unrealistic goals. Reporting the equivalent accuracy metric
tends to cause executives to ask the question “What can we do to improve this?” I know that this
is not logical, but it is something I have observed time and again, and so I now always recommend
reporting forecast accuracy to a wider audience.
But if you are going to use forecast accuracy as a metric then, if you have specified the
denominator to be either actuals or forecast, you will always have some errors that are greater
than 100%. When converting these large errors to accuracy (accuracy being 1 – error), you
end up with a negative accuracy result; this is the type of result that always seems to cause
misunderstanding with management teams. A forecast accuracy result of minus 156% just does
not seem to be intuitively understandable.

When you use the maximum of forecast or actuals as the denominator, the forecast accuracy
metric is constrained between 0 and 100%, making it conceptually easier for a wider audience,
including the executive team, to understand.

If the purpose of the metric is to identify areas of opportunity and drive improvement actions, TAGS
using the larger value as the denominator and reporting accuracy as opposed to error enables the
proper diagnostic activities to take place and reduces disruption caused by misinterpretation of
the “correct” error metric.


To summarize, I use the larger value methodology for ease of communication to key personnel
who are not familiar with the intricacies of the forecasting process and measurement.
David Hawitt
SIOP Development Manager for a global technology company
davidhawitt@hotmail.co.uk

I have long favored "Forecast Accuracy" as a metric for management reporting, defining it as:

FA = {1 – [ ∑ |F – A| / ∑ Max (F,A) ] } x 100

where the summation is over n observations of forecasts and actuals. FA is defined to be 100%
when both forecast and actual are zero. Here is a sample of the calculation over 6 weeks for two
products, X and Y:
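The table from the original post is not reproduced here, but as an illustrative sketch of the same
calculation, the following Python snippet computes FA for two hypothetical products over six
weeks. The weekly values are made-up placeholders, not the figures from the original example.

# Forecast Accuracy with Max(Forecast, Actual) in the denominator:
# FA = {1 - [ sum|F - A| / sum Max(F, A) ]} x 100, summed over n periods.
# The weekly numbers below are illustrative placeholders only.

def forecast_accuracy(forecasts, actuals):
    abs_error_total = sum(abs(f - a) for f, a in zip(forecasts, actuals))
    denominator = sum(max(f, a) for f, a in zip(forecasts, actuals))
    if denominator == 0:
        return 100.0  # defined as 100% when all forecasts and actuals are zero
    return (1 - abs_error_total / denominator) * 100

# Six weeks of hypothetical forecasts and actuals for products X and Y
weeks_x = {"forecast": [100, 110, 120, 100, 90, 105],
           "actual":   [ 90, 130, 115,  80, 95, 100]}
weeks_y = {"forecast": [ 50,  40,  60,  55, 45,  50],
           "actual":   [ 20,  70,  30,  90, 10,  65]}

for name, w in [("X", weeks_x), ("Y", weeks_y)]:
    fa = forecast_accuracy(w["forecast"], w["actual"])
    print(f"Product {name}: Forecast Accuracy = {fa:.1f}%")

Because each term in the denominator is at least as large as the corresponding absolute error, the
resulting FA always falls between 0% and 100%.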

Like all forecasting performance metrics, Forecast Accuracy calculated with Max(F,A) in the
denominator has its flaws -- and it certainly has an army of detractors. Yet the detractors miss the
point that David so nicely makes. We recognize that Max(F,A) is not “academically correct.” It
lacks properties that would make it useful in other calculations. There is virtually nothing a self-
respecting mathematician would find of value in it, except that it forces the Forecast Accuracy
metric to always be scaled between 0 and 100%, thereby making it an excellent choice for reporting
performance to management! If nothing else, it helps you avoid wasting time explaining the weird
and non-intuitive values you can get with the usual calculation of performance metrics.
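
To make the contrast concrete with a simple hypothetical case: suppose the actual is 50 and the
forecast is 178. With Actual in the denominator, the error is 128/50 = 256%, so the reported
"accuracy" is minus 156% -- exactly the sort of number David describes. With Max(F,A) = 178 in
the denominator, the same miss is a 128/178 ≈ 72% error, or roughly 28% accuracy, safely inside
the 0-100% range.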

Posted by Mike Gilliland at 07:00 | Comments (0) | Trackbacks (0)
