
A Call Center Uses Simulation to Drive Strategic Change

INTERFACES 31:3, Part 1 of 2, May–June 2001 (pp. 87–101)

Robert M. Saltzman
saltzman@sfsu.edu
Information Systems and Business Analysis Department
San Francisco State University
1600 Holloway Avenue
San Francisco, California 94132

Vijay Mehrotra
vijay@onwardinc.com
Onward, Inc.
888 Villa Street, Suite 300
Mountain View, California 94041

A large, customer-focused software company relied on simulation modeling of its call center operations in launching a new fee-based technical-support program. Prior to launching this rapid program, call center managers were concerned about the difficulty of meeting a proposed guarantee to paying customers that they would wait less than one minute on hold. Managers also wanted to know how the new program would affect the service provided to their existing base of regular, nonpaying customers. We quickly developed an animated simulation model that addressed these concerns and gave the managers a good understanding of the impact on system performance of changes in the number of customers purchasing the rapid program and in the number of agents. The one-minute guarantee would be fairly easy to achieve, even if the percentage of callers in the rapid program became quite high. Managers also gained confidence that, with appropriate staffing levels, they could successfully implement the new program, which they soon did.

Call centers are locations "where calls are placed, or received, in high volume for the purpose of sales, marketing, customer service, telemarketing, technical support, or other specialized business activity" [Dawson 1996, p. 35].



Inbound call centers, which predominantly receive calls rather than initiate them, face the classical planning problems of forecasting and scheduling under uncertainty, with a number of industry-specific complications [Mehrotra 1997]. Well-known cases include L. L. Bean's phone-order business [Andrews and Parsons 1989], the IRS's toll-free taxpayer information system [Harris, Hoffman, and Saunders 1987], and AT&T's operational design studies [Brigandi et al. 1994].

In their planning, most call centers target a specific service level, defined as the percentage of callers who wait on hold for less than a particular period of time. For example, a sales call center may aim to have 80 percent of callers wait for less than 20 seconds. A related measure often used to assess call center performance is the average time customers wait on hold, or average speed to answer (ASA).

Another key performance measure is the abandonment rate, defined as the percentage of callers who hang up while on hold before talking to an agent. It is both intuitive and well known throughout the call-center industry that customer abandonment rates and customer waiting times are highly correlated. High abandonment rates are bad for several reasons, most notably because a customer who hangs up is typically an unsatisfied customer and is much less likely to view the company favorably [Anton 1996]. Many analysts (for example, Andrews and Parsons [1993] and Grassmann [1988]) have directly modeled an economic value associated with customer waiting time and abandonment. Furthermore, such callers will with some probability call back and thereby help sustain the load on the system [Hoffman and Harris 1986]. Thus, call center managers must balance service level (a chief driver of customer satisfaction) with the number of agents deployed to answer the phone (comprising 60 to 80 percent of the cost of operating a call center).
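To make these measures concrete, the sketch below (ours, not part of the original study) computes the service level, the ASA, and the abandonment rate from a list of per-call records; the call data and the 20-second target are hypothetical.

```python
# Hypothetical illustration of the three performance measures defined above.
# Each call record is (wait_on_hold_minutes, hung_up_before_reaching_agent).

def call_center_metrics(calls, target_wait_minutes):
    served_waits = [w for w, hung_up in calls if not hung_up]
    num_abandoned = sum(1 for _, hung_up in calls if hung_up)

    # Abandonment rate: percentage of callers who hung up while on hold.
    abandonment_rate = 100.0 * num_abandoned / len(calls)

    # Average speed to answer (ASA): mean hold time of callers who reached an agent.
    asa = sum(served_waits) / len(served_waits)

    # Service level: percentage of answered calls that waited less than the target.
    service_level = 100.0 * sum(1 for w in served_waits if w < target_wait_minutes) / len(served_waits)

    return service_level, asa, abandonment_rate


# Example: five made-up calls and the 20-second target mentioned above.
calls = [(0.1, False), (0.2, False), (0.5, False), (2.0, False), (6.0, True)]
print(call_center_metrics(calls, target_wait_minutes=20 / 60))
```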


A Specific Call Center Problem

We conducted our analysis for the managers of a technical support call center of a major software company. At the time, September 1995, it was the only company in its industry that provided free technical support over the phone to its software customers. Agents often provided a combination of software solutions and business advice to customers, and the customers perceived the service provided to be valuable. The software company's customer base and sales volume were growing rapidly.

Unfortunately, its call center had very poor service levels, with 80 percent of calls waiting five to 10 minutes, well above the firm's three-minute target. Predictably, abandonment rates were high, approaching 40 percent on some days. To senior management, this situation was intolerable.

At first glance, the solution seemed obvious: simply add more agents on the phones, reducing waiting times and agent utilization. However, because this would increase the percentage of total revenues dedicated to free telephone technical support, it would adversely affect the bottom line.

Another obvious solution, supported in different parts of the organization, was to charge all customers for technical support, as the competition did, to cover the cost of more agents. However, there were two huge problems with this approach. First, the company had aggressively marketed the software as having free technical support. Second, to abruptly begin charging all customers for support would create a very negative impression, particularly in light of the currently poor level of service being provided.

To address what was widely perceived as a crisis, the company conducted many high-level meetings involving various parts of the organization, including the call center, product marketing, finance, and information systems. Within a few days, a proposed compromise solution emerged: the rapid program.

Company managers conceived of the rapid program as an optional service to be offered to customers who needed quick telephone support. For a fee, customers would receive priority in the telephone queues. The company would guarantee that their wait to speak to an agent would be less than one minute; if they waited longer, their calls would be free. The remainder of the customer population, which we refer to as regular customers, would continue to receive free technical support over the telephone.

The marketing group drafted a customer mailing describing the program as an added benefit. The information-systems group put a team together to examine how to modify the call-center agents' desktop systems to include billing capabilities. The finance group let out a sigh of relief.

The call center managers, however, had to determine very quickly whether to implement the rapid program. Specifically, they had to determine how many agents they would need to meet service goals for both rapid and regular customers. They were particularly concerned with the impact of priority queuing for rapid customers on the already-abysmal waiting times for the regular customers.

They commissioned us to conduct an analysis to help them understand the impact of the new rapid program on the call center's overall performance. The analysis had to be done in less than a week because the decision to launch the program was imminent. Our clients had very little knowledge of quantitative modeling, and they made it very clear to us that they were not interested in any particular methodology. Rather, they needed specific results that would help them make a sound decision on time.


Solution Approach

Based on our understanding of the proposed rapid program, we thought that developing a small, animated simulation model would be the best way to tackle the problem. We had several reasons for this. First, the model would have to allow for two priority classes of customers. Second, a simulation model could represent one of the most important dynamic features of this system, call abandonment, while gathering output on a variety of performance measures of interest to management, including abandonment rates and service levels. Finally, the transient phase of the system accounts for a significant portion of the day and would have to be included in the overall measures of performance. Pilot simulation runs showed that, under most scenarios, the system did not reach steady state for at least an hour after the opening of business. Most analytical queuing models, on the other hand, assume the system is in steady state.

We were unaware of any analytic methods that could accommodate multiple priority classes, call abandonment, and both transient and steady-state phases. While the model results that we ultimately presented to management did not rely heavily on the transient phase, the animated simulation approach allowed us to demonstrate how the call center traffic within one time period (for example, 8:00 am to 8:30 am) would influence abandonment and waiting times for each class of customers in other intervals (for example, 8:30 am to 9:00 am). We thought that managers seeing system dynamics like this would be more likely to understand our model and thus be more likely to adopt its results. This advantage of animated simulation has been noted by many researchers, for example, Brigandi et al. [1994].

In hindsight, we realize it may be possible to approximate this call center reasonably well with an analytical queuing model. Green and Kolesar [1991], for example, explored how to estimate long-run average performance measures for a multiserver queuing system with nonstationary arrival patterns over a range of parameter values. Whitt [1999] demonstrated that a multiserver queuing model with certain assumptions about reneging can be developed to predict the distribution of customer waiting time. These results, however, do not apply to the transient phase, nor do they address the issue of customer priority classes.

Moreover, at the time we were much more familiar with the latest advances in simulation modeling than with those in queuing theory. Consequently, we were confident we could conduct the necessary analysis in a week with simulation but were (and still are) unsure that it could be done with a queuing model.
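For reference only, and not something we did in the study, the simplest steady-state check suggested by that literature is the M/M/s (Erlang C) model, which ignores abandonment, priority classes, and the transient phase. The sketch below uses hypothetical rates; a check of this kind requires the arrival rate to be below the total service capacity.

```python
import math

def erlang_c(num_agents, arrival_rate, service_rate):
    """Probability that an arriving call must wait (M/M/s, no abandonment)."""
    a = arrival_rate / service_rate          # offered load in erlangs
    if a >= num_agents:
        raise ValueError("requires arrival_rate < num_agents * service_rate")
    wait_term = (a ** num_agents / math.factorial(num_agents)) * (num_agents / (num_agents - a))
    norm = sum(a ** k / math.factorial(k) for k in range(num_agents)) + wait_term
    return wait_term / norm

def steady_state_asa_and_service_level(num_agents, arrival_rate, service_rate, target_minutes):
    p_wait = erlang_c(num_agents, arrival_rate, service_rate)
    asa = p_wait / (num_agents * service_rate - arrival_rate)
    service_level = 1.0 - p_wait * math.exp(-(num_agents * service_rate - arrival_rate) * target_minutes)
    return asa, service_level

# Hypothetical example: 5 calls per minute, 15-minute mean handle time, 84 agents.
print(steady_state_asa_and_service_level(84, arrival_rate=5.0, service_rate=1 / 15, target_minutes=8.0))
```

One reason such a formula could not simply stand in for the simulation here: with the 15-minute average handle time reported below and the 0.15-minute mean interarrival time used in the model, the offered load is roughly 100 erlangs, which exceeds the 80 to 90 agents studied, so the real system is kept stable only by abandonment.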


Key Inputs and Data Sources

In modeling call center operations, data collection is a multifaceted challenge. One primary data source is the automatic call distributor (the ACD or phone switch). While the typical ACD stores a huge amount of raw data, our client had access to this information only through a handful of prepackaged reports, which provide aggregated results and little distributional information. Other model inputs were derived from (1) the results of management decisions, (2) raw data captured through other systems, and (3) business-planning assumptions. For the rapid simulation model, we used five key types of input data.

The service-level targets we used in this model were provided by call center management. The target service level for rapid callers (80 percent of calls answered within one minute) was driven by marketing's desire to provide an attractive guarantee to entice customers to purchase the rapid-service option. The regular-caller target was far longer (eight minutes) and was viewed by management as an unfortunate consequence of the current budget constraints.

There was a great deal of uncertainty about what proportion (P1) of callers would purchase the rapid-service option. Market research showed that many customers claimed to be willing to pay for faster service, but the many variables (pricing, seasonality in call volume and call content, new-product sales, and so forth) prevented it from generating anything more than a broad range of values.

Call center management forecast call volumes. Although we recognized their importance, we omitted time-of-day and day-of-week arrival patterns from our initial study for two reasons. First, we had to conduct the initial analysis quickly. Second, the client's primary emphasis was to understand the dynamics and trade-offs between key variables in the system.

We took the value of the average call-handling time (15 minutes) from the call center's ACD system; this value includes time spent talking to the caller and time spent afterwards on after-call work, or wrap-up time. Unfortunately, we could not obtain detailed distribution data from the ACD's reporting system, which provided only average handling times for different periods. However, we assumed that the call-handling-time distribution was exponential, an assumption we validated by plotting and analyzing call-by-call data from another database within the call center.

Determining a distribution to represent customer abandonment behavior was difficult. This is a standard problem in call center modeling, because observed abandonment behavior is an output of specific conditions (waiting times, customer tolerance) but must be represented as an input for planning models like ours.

We started with two basic assumptions. First, no matter how long the waiting time, no more than 40 percent of customers would ever abandon the queue. This was based on historical data for time periods in which customer hold times were extremely long, when abandonment rates had peaked at about 40 percent. Management's feeling was that the rest of the customers simply had no choice but to wait to get technical support because of the importance of the company's software. Second, some customers (five percent) would abandon the queue as soon as they were put on hold. These can be viewed as customers whose problems are not very urgent or important. We made this assumption because historical data showed this level of abandonment even when customer waiting times were very short.

Using these data, we modeled the likelihood of a customer's abandoning the queue as a linear function of the time spent on hold. Specifically, for every two additional minutes spent on hold, our model assumed that the probability of abandoning the queue would increase by 3.5 percent, up to a maximum of 40 percent for a wait of 20 minutes or longer (Figure 1).

Figure 1: The hollow bars show the cumulative probability that a caller will abandon the queue after being on hold for various lengths of time. We assumed that five percent of callers would abandon as soon as they were put on hold. The solid bars show that for every two additional minutes spent on hold, the probability of abandoning increases by 3.5 percent, up to a maximum of 40 percent for a wait of 20 minutes or longer. This implies that 60 percent of callers would never abandon the queue, no matter how long their waiting time.
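This abandonment model is simple enough to write down directly. The sketch below (ours, not the Arena implementation) evaluates the cumulative abandonment probability as a function of hold time and draws a caller's patience by inverting that piecewise-linear distribution; callers in the 60 percent who never abandon are given unlimited patience.

```python
import random

IMMEDIATE_PROB = 0.05            # abandon the moment they are put on hold
MAX_ABANDON_PROB = 0.40          # ceiling reached at a 20-minute wait
SLOPE_PER_MINUTE = 0.035 / 2.0   # +3.5 percent per two additional minutes

def prob_abandoned_by(minutes_on_hold):
    """Cumulative probability that a caller has hung up by this hold time."""
    return min(MAX_ABANDON_PROB, IMMEDIATE_PROB + SLOPE_PER_MINUTE * minutes_on_hold)

def sample_patience(rng=random):
    """Inverse-CDF draw of a caller's patience in minutes."""
    u = rng.random()
    if u < IMMEDIATE_PROB:
        return 0.0
    if u < MAX_ABANDON_PROB:
        return (u - IMMEDIATE_PROB) / SLOPE_PER_MINUTE   # between 0 and 20 minutes
    return float("inf")                                   # never abandons

print(prob_abandoned_by(10))   # 0.225, halfway up the ramp
```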


We would have liked to have had more detailed historical data about customer abandonment from which to generate a more sophisticated distributional model. However, we discussed our approach with the call center managers, and they found it to be a reasonably good representation of actual customer behavior.

Model Description

Simulation modelers are generally advised to avoid building models that make a one-to-one correspondence with the elements of the real system under study [Pegden, Shannon, and Sadowski 1995, p. 32]. Our model, however, does represent a few key aspects of the call center essentially at full scale, that is, we used S = 80 to 90 agents and a large volume of callers typical of the actual system. We did this for two reasons: first, queuing systems are known to exhibit nonlinear behavior (for example, in terms of mean customer waiting time) as the number of servers increases, and second, the model would have greater credibility with management. In other respects, we kept the structure of the model simple because we had less than a week for design, programming, debugging, and analysis.

We built the simulation model using Arena, a module-based graphical simulation software package [Kelton, Sadowski, and Sadowski 1998]. Figure 2 shows the entire model as it appears on the screen in the Arena environment prior to execution. When executed, the model generates an animated picture of the call center's key operations.

We defined several essential run characteristics, such as the number of replications and the length of each replication, in the Simulate module. The Variables module contains model parameters, which were held constant for a given scenario, such as the percentage of rapid callers (P1) defining the call mix, the service-level-target answer times (SLTarget) by call type, and the mean interarrival time (MeanIATime). In the Expressions module, we specified random variables, for example, the time to serve a call (HandleTime) and the amount of time a caller waits on hold before abandoning (AbandonTime). The Statistics module defines and writes out performance measures from each replication to data files that can be examined later using Arena's output analyzer. Finally, related objects, such as counters for the number of calls served by customer class, are grouped in the Sets module; this allows specific set elements to be referenced during execution by using the customer-class attribute as an index (1 for rapid customers, 2 for regular customers).


Figure 2: We analyzed the call center using an Arena simulation model. The figure shows the entire model as it appears on the screen in the Arena environment. Five data modules at the top define essential run characteristics of the model, variables, expressions, and performance measures about which statistics will be gathered, such as the number of callers of each type who abandon the queue and who reach an agent. The 3 × 4 table continually updates the values of these performance measures. Calls are animated in the Call Center area where they can be seen waiting in line, occasionally abandoning, and being served. The modules at the bottom of the figure frequently check each call on hold to see if it is ready to abandon the queue.

During execution, the model tracks a number of system performance measures and continually updates their values in the 3 × 4 table: the number of callers who hung up without being served (Abandoned), the number of callers who reached an agent (Served), the average number of minutes served callers spent on hold (Ave. Q Time), and the percentage of served customers who spent less than the target time on hold (Service Level). The model also plots the average agent utilization, expressed as a percentage of the total number of servers.


The main flow and animation of calls occurs in the call center area of the model. Calls enter the Arrive module, with the time between arrivals being exponentially distributed, and they are immediately assigned random values for three key attributes: handle time, abandonment time, and customer class. During execution, rapid calls appear on screen as small green telephones moving through the center while regular calls appear as blue telephones.

After arrival, a call is either put through to an available agent or put on hold to wait in queue until an agent is assigned to handle it. Rapid customers have priority over regular customers and, if they have to wait at all, appear at the head of the queue.

The call center's phone system had essentially unlimited capacity to keep calls on hold, so we specified no queue length limit in the model. The client's telecommunications department stressed that the call center had an exceptionally large trunk line capacity because of the firm's existing long-term contracts with its service provider.

During their wait in queue, some calls may abandon the system while the rest eventually reach the Server module. There, each call is allocated to one of the S agents, its waiting time on hold is recorded (Assign and Tally), a counter for the number served is increased by one (Count), and the call is delayed for the duration of its handle time. After service is completed, the agent is released and the call departs from the system.

Once per six simulated seconds, the model creates a logical entity to examine all of the calls on hold at that particular instant. If the Search module finds a call that has waited longer than its abandonment time, the Remove module pulls the call from the call center queue and moves it to the Depart module labeled AbandStats. Checking the queue for abandonment more often than 10 times per minute would probably be more accurate but lead to considerably longer run times (which we had to avoid).

The two modules in the mean-interarrival-time-schedule part of the model allow the arrival rate to vary by time of day. Periodically, the model creates a logical entity (Arrive) to update the value of the mean-interarrival-time variable MeanIATime based on the values contained in the MeanIAT vector. However, because we held MeanIATime constant throughout the day at 0.15 minutes to simplify the analysis, we did not take advantage of this feature.
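The authors built this model in Arena; purely as an illustration of the same flow, here is a rough sketch in SimPy (an open-source Python simulation library) under the assumptions stated above: exponential interarrival times with a 0.15-minute mean, exponential 15-minute handle times, S agents shared by both classes, priority for rapid callers, and a patience drawn from the abandonment model. Unlike the Arena model, it checks for abandonment continuously rather than every six seconds, and all names below are ours.

```python
import random
import simpy

MEAN_IAT, MEAN_HANDLE = 0.15, 15.0    # minutes, as reported in the study
DAY_LENGTH = 12 * 60                  # one 12-hour replication

def patience():
    # Piecewise-linear abandonment model: 5% at once, +3.5% per two minutes, 40% cap.
    u = random.random()
    if u < 0.05:
        return 0.0
    if u < 0.40:
        return (u - 0.05) / 0.0175
    return 10 * DAY_LENGTH            # effectively never abandons within one day

def caller(env, agents, prio, stats):
    arrived = env.now
    with agents.request(priority=prio) as req:
        outcome = yield req | env.timeout(patience())
        if req not in outcome:        # hung up before reaching an agent
            stats["abandoned"][prio] += 1
            return
        stats["served"][prio] += 1
        stats["waits"][prio].append(env.now - arrived)
        yield env.timeout(random.expovariate(1.0 / MEAN_HANDLE))

def arrivals(env, agents, p1, stats):
    while True:
        yield env.timeout(random.expovariate(1.0 / MEAN_IAT))
        prio = 0 if random.random() < p1 else 1   # 0 = rapid (served first), 1 = regular
        env.process(caller(env, agents, prio, stats))

def replicate(num_agents, p1):
    env = simpy.Environment()
    agents = simpy.PriorityResource(env, capacity=num_agents)
    stats = {"served": [0, 0], "abandoned": [0, 0], "waits": [[], []]}
    env.process(arrivals(env, agents, p1, stats))
    env.run(until=DAY_LENGTH)
    return stats

stats = replicate(num_agents=84, p1=0.20)
print("served:", stats["served"], "abandoned:", stats["abandoned"])
```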


Verification and Validation

Because we had so little time, we verified and validated the model quickly and informally. To verify that the model was working as intended, we relied on our experience with building and testing simulation models [Mehrotra, Profozich, and Bapat 1997; Saltzman 1997], and on Arena's many helpful features. For example, Arena has a completely graphical user interface, many automated bookkeeping features that greatly reduce the likelihood of programming error, and interactive debugging capabilities that allow the user to stop execution and examine the values of any variable or caller attribute.

To validate that the model's operation and its output represented the real system reasonably well, we relied on the second author's experience as a call center consultant and intimate knowledge of the client's operations. Based on the model's animation and its average output over many scenarios, he made a preliminary judgement that the model was valid.

Subsequently, the model passed the most important test: the call center managers embraced it. They compared the queue lengths, waiting-time statistics, and abandonment rates in the base case to those in the ACD reports for specific time periods and found the simulated values largely consistent with what they saw in the call center. Once they were comfortable with the base case, they were eager to understand the impact of different adoption levels for the rapid program with various staffing configurations.

Experimentation and Results

The call center is a terminating system that begins each morning empty of calls and ends hours later when agents go home after serving their last calls. For simplicity, we took each replication of the model to be exactly 12 hours, so even though the calls in the system at the end of the day were not served to completion, we counted them as served.

We defined scenarios as specific combinations of S, the number of agents, and P1, the percentage of rapid callers. Testing six values of S and six values of P1 led to a total of 36 scenarios. Since we had to run and analyze many scenarios in just a few days, only 10 replications were executed for each scenario. (We made later runs with 50 replications for several scenarios and obtained results that differed from those for 10 replications by less than five percent.)

Arena's output analyzer calculated summary statistics across the 10 independent replications [Kelton, Sadowski, and Sadowski 1998]. We report here only the means across these replications for each scenario. Although Arena generates confidence intervals for the mean, we did not report these to the client for fear of complicating the presentation of results (Table 1).
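For readers without Arena's output analyzer, the across-replication summary is straightforward to reproduce; the sketch below computes the mean and a t-based 95 percent confidence interval for ten replication values, which are made up for illustration.

```python
from math import sqrt
from statistics import mean, stdev

def summarize(replication_values, t_critical=2.262):   # t for 9 degrees of freedom, 95 percent
    m = mean(replication_values)
    half_width = t_critical * stdev(replication_values) / sqrt(len(replication_values))
    return m, half_width

# Hypothetical average regular-caller waits (minutes) from 10 replications.
reps = [10.3, 9.6, 10.8, 9.9, 10.1, 10.6, 9.4, 10.2, 10.0, 9.8]
m, hw = summarize(reps)
print(f"mean = {m:.2f} minutes, 95% confidence interval = ({m - hw:.2f}, {m + hw:.2f})")
```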


Under the circumstances we tested, rapid customers waited very little (less than half a minute on average among those who did not abandon) because of their priority over regular customers. Consequently, few rapid callers (3.3 to 4.2 percent) abandoned the system. The service level provided to rapid callers was above 95 percent in all but a few cases, achieved at the expense of a much lower level of service for regular callers.

Regular customers waited from 3.35 minutes in the best case (P1 = 0%, S = 90) to 18.5 minutes in the worst case (P1 = 50%, S = 80). When rapid callers made up 20 percent of the callers (P1 = 20%), the average waiting time for regular calls could be improved by 57.5 percent, from 10 to 4.25 minutes, by increasing the number of agents from 80 to 90. This would also reduce the abandonment rate of regular customers by 51 percent, from 21.5 to 10.5 percent.

                                          Number of agents, S
P1     Performance measure                80      82      84      86      88      90

50%    Ave. queue time (min.), rapid      0.43    0.40    0.38    0.36    0.33    0.30
       Ave. queue time (min.), regular   18.50   14.30   11.50    9.47    7.68    6.12
       Abandonment rate (%), rapid        4.2     4.1     4.1     3.8     3.8     3.5
       Abandonment rate (%), regular     32.0    28.4    24.6    21.0    17.6    14.5
       Service level (%), rapid          89.5    91.2    92.3    92.9    94.0    95.1
       Service level (%), regular        16.8    22.7    30.8    40.1    52.2    64.1

40%    Ave. queue time (min.), rapid      0.33    0.32    0.31    0.29    0.27    0.25
       Ave. queue time (min.), regular   13.10   11.30    9.55    7.99    6.62    5.32
       Abandonment rate (%), rapid        4.1     4.0     3.9     3.8     3.7     3.5
       Abandonment rate (%), regular     27.3    24.2    21.0    18.0    15.3    12.6
       Service level (%), rapid          94.6    94.9    95.5    95.9    96.6    97.2
       Service level (%), regular        21.0    28.5    38.0    48.3    60.7    72.1

30%    Ave. queue time (min.), rapid      0.27    0.27    0.25    0.24    0.22    0.21
       Ave. queue time (min.), regular   11.20    9.69    8.35    7.06    5.87    4.66
       Abandonment rate (%), rapid        4.0     4.0     3.8     3.8     3.7     3.5
       Abandonment rate (%), regular     23.8    21.0    18.4    16.0    13.7    11.2
       Service level (%), rapid          96.9    97.1    97.9    98.1    98.5    98.4
       Service level (%), regular        24.3    33.5    44.4    56.6    68.5    80.9

20%    Ave. queue time (min.), rapid      0.23    0.23    0.21    0.21    0.20    0.18
       Ave. queue time (min.), regular   10.00    8.78    7.61    6.50    5.30    4.25
       Abandonment rate (%), rapid        3.9     4.0     3.9     3.7     3.8     3.3
       Abandonment rate (%), regular     21.5    19.0    16.8    14.7    12.5    10.5
       Service level (%), rapid          98.3    98.7    98.9    99.0    98.9    99.2
       Service level (%), regular        28.9    40.1    51.1    62.9    76.0    86.3

10%    Ave. queue time (min.), rapid      0.21    0.19    0.19    0.18    0.17    0.16
       Ave. queue time (min.), regular    8.97    7.86    6.82    5.87    4.70    3.74
       Abandonment rate (%), rapid        3.9     4.1     4.0     3.8     3.8     3.6
       Abandonment rate (%), regular     19.3    17.2    15.3    13.4    11.4     9.5
       Service level (%), rapid          99.1    99.3    99.4    99.3    99.6    99.5
       Service level (%), regular        35.4    46.8    60.0    72.3    84.3    92.4

0%     Ave. queue time (min.), regular    8.19    7.26    6.31    5.28    4.25    3.35
       Abandonment rate (%), regular     17.8    16.0    14.2    12.4    10.5     8.8
       Service level (%), regular        42.4    54.1    68.2    80.4    90.7    96.4

Table 1: We ran the model for 36 scenarios. Entries in the table represent average values across the 10 replications run per scenario. An abandonment rate for each class of callers was derived from the number of abandoned and served calls, expressed as a percentage, that is, Abandonment Rate = 100 × Abandoned/(Served + Abandoned). Service level for rapid callers is the percentage of calls answered within one minute; for regular callers it is the percentage of calls answered within eight minutes.


Another key strategic question was what level of agent staffing would keep average waiting times reasonable for regular calls, that is, at or below eight minutes, while providing superior service for rapid callers (Figure 3). The model showed, for example, that if rapid callers made up 20 percent of the callers, 84 agents would be needed. In this scenario, rapid callers would have a very high service level of about 99 percent. Alternatively, if just 10 percent of callers were rapid customers, only 82 agents would be needed to achieve the eight-minute target average for regular customers.

Figure 3: The lines represent (from top to bottom) 50, 40, 30, 20, 10, and zero percent rapid customer calls. For each percentage, we can see how the average wait in queue for regular callers decreases as the number of agents increases. The figure can be used to determine the number of agents required to keep average waiting times for regular calls at or below the eight-minute target, while providing superior service for rapid callers. For example, if rapid callers made up 20 percent of the callers, 84 agents would be needed.

Call center managers could use the graph to determine staffing levels if they changed the target average wait to another value, such as six minutes, or if more customers participated in the rapid program (Figure 3).
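The staffing lookup the managers did with Figure 3 can also be expressed directly against the scenario results; in the sketch below, the dictionary holds the P1 = 20% average regular-caller waits from Table 1, and the function returns the fewest tested agents that meet a target.

```python
# Average regular-caller wait (minutes) by (rapid share, number of agents), from Table 1.
avg_regular_wait = {
    (0.20, 80): 10.00, (0.20, 82): 8.78, (0.20, 84): 7.61,
    (0.20, 86): 6.50,  (0.20, 88): 5.30, (0.20, 90): 4.25,
}

def agents_needed(rapid_share, target_minutes, results):
    """Smallest tested staffing level whose average regular wait meets the target."""
    for s in sorted(s for (share, s) in results if share == rapid_share):
        if results[(rapid_share, s)] <= target_minutes:
            return s
    return None   # no tested staffing level meets the target

print(agents_needed(0.20, 8.0, avg_regular_wait))   # 84, matching the text above
print(agents_needed(0.20, 6.0, avg_regular_wait))   # 88 for a tighter six-minute target
```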


Implementation

Once we had built an initial version of the model, we presented our preliminary results to a management team that had no previous experience with simulation. We explained the underlying concept of a simulation model as a laboratory for looking at different call center configurations and examining the impact of design changes on key performance measures.

Based on both the animation and the preliminary results, the managers were excited about the answers our model could provide. They asked us to quickly run some additional scenarios, which we analyzed and then reviewed with them a few days later.

The results of the simulation analysis gave the managers a good understanding of the impact on system performance of changes in the number of customers purchasing the rapid program (which they could not control) and in the number of agents (which they could control). In particular, the average queue time results made it clear that the one-minute guarantee to rapid customers would be fairly easy to achieve, even if the percentage of callers in the rapid plan became quite high, which helped drive the revenue projections for the program.

They were surprised at the dramatic impact adding agents would have in cutting the waiting time of regular customers. In our presentation, we gave the managers some general guidelines about how to make adjustments to staffing based on the proportion of customers who purchased the program. Most important, the results of our analysis gave managers confidence that they could successfully implement the rapid program.

We conducted our analysis and presented the results during one week in September 1995. Within a few days, the managers decided to introduce the rapid program. The firm did a lot of internal work very quickly to prepare for the start of the program. The company trained agents to handle rapid calls and changed the call center desktop information systems (changing the user interface, modifying the back-end database, and integrating it with the credit-checking and billing systems). Finally, it altered the call center's ACD logic to establish different call routing and priorities for rapid customers.

By November, the company had finished its preparations. Launched with a major direct-mail campaign, the program was eventually adopted by more than 10 percent of customers, and generated nearly $2 million of incremental revenues within nine months. Because of the success of this program, the company initiated a much more comprehensive fee-based technical-support service at the start of its next fiscal year.

Our simulation model made a major contribution to the launch of this program and the generation of this revenue stream. The results of our analysis played a key role in the company's decision to launch the program; in particular, they validated the feasibility of the "one minute or free" guarantee to rapid customers, which was an important part of marketing the program.

Our analysis had another more subtle effect on the rapid program. The decision to launch the rapid program was inherently difficult and contentious. Debate about the program's merits and risks could have caused weeks of delay. This delay would have been very expensive in terms of rapid-program revenues, because the call center's busiest months were December, January, and February. Our simulation analysis, however, provided a diverse group of decision makers (from the call center, finance, and marketing) with a common empirical understanding of the potential impact of the rapid program on customer service, which in turn facilitated a much faster decision.


Lessons for Simulation Practitioners

This case study highlights several themes from which simulation practitioners can benefit (see Chapter 3 of Profozich [1998] for further discussion). This application of simulation is an excellent example of embedding a mathematical analysis in the midst of a larger decision-making process. In our simulation model, we did not try to determine which service program the call center should offer to customers because many business factors (including marketing, financial, and technical issues) influenced the definition of the rapid program. However, the results of our analysis enabled managers to evaluate the proposed solution empirically under different conditions.

The managers saw our simulation analysis as a vehicle for mitigating risk. They did not have to hold their breaths while they experimented on live customers with real revenues on the line. Instead, they relied on the simulation for insights into what would happen when they changed the configuration of the call center. In many industries, managers increasingly see call centers as the customers' windows into their firms; poor strategic decisions can be very costly. By helping to prevent such errors, simulation can be very valuable.

Finally, the managers found our simulation analysis valuable because it helped them to launch the rapid program quickly. In today's competitive business climate, a decision delayed can be worse than a bad decision. For our client, failing to offer its suffering customers an alternative for technical support during the December-to-February busy season would have been extremely damaging. The simulation results helped the decision makers to quickly give the rapid program their blessing.

Retrospective Assessment

Looking back on our study, we can see ways in which we could have improved our modeling effort. First, we could (and probably should) have incorporated a nonstationary arrival pattern based on the historical call patterns. Although the model had the capability to generate a nonstationary arrival pattern, we did not use it.

With a few additional days for the study, we would have run more replications of the model. With more time and resources, we might have allowed for the specification of time-varying agent schedules (showing a variety of starting times, break times, and lunch times). Agent groups are another dimension we might have added; that is, some agents might be able to handle rapid calls only, others regular calls only, and the rest capable of handling both.

Including these additional dimensions in the representation of the call center would have caused the number of scenarios to explode. The business question of how the adoption rate of the rapid program would influence service levels for regular customers would very likely have been obscured by a host of lower-level issues. While decisions about schedules and skills are tactically important, they were not particularly relevant to deciding whether to launch the rapid program at all. (Mehrotra [1999] discusses different call center planning horizons.)

Conclusions

This project was successful in influencing a major business decision for several reasons. It benefited from (1) a clear focus on specific business decisions and direct access to decision makers; (2) intimate understanding of key performance measures and data sources; and (3) quick delivery of the model and timeliness of the results. Also important was our experience in developing simulation models, a capability that most call centers do not have.

In conducting our analysis we faced two serious constraints: a limited amount of time and a powerful but general-purpose simulation package. Indeed, these constraints limit the use of simulation in the call center industry. The dynamic and frenetic nature of call center operations means that most decisions are made either in a reactive manner or with very limited lead time at best. Consequently, creating simulation models from scratch, even with user-friendly packages such as Arena, is usually not feasible.


Since we completed this project, several simulation packages designed specifically for call centers have appeared on the market, including the Call$im package [Systems Modeling Corporation 1997] that is built on top of Arena. They enable the rapid development of call center simulation models. Call$im, for example, is used by over 70 call centers around the world for planning and analysis studies.

Released initially in 1997, Call$im has a user interface built around call center terminology and includes such features as out-of-the-box integration with Visual Basic, animation, and customized output to spreadsheets. All of this shields users from the underlying programming, greatly simplifying the process of building call center models, importing data, conducting experiments, and analyzing results.

Would we have used Call$im for this analysis if it had been available then? Yes. Call$im's industry-specific objects, with such module names as Calls, Agents, and Schedules, contain detailed logic that mirrors the way call centers operate and default values that call center personnel find intuitive and reasonable. In addition, each module includes a variety of parameters that provide flexibility in model building and enable realistic representation of nearly all call centers. Automatically generated output statistics are also defined in call center terms, such as service level, abandonment rates, and agent utilization. A knowledgeable Call$im user could create and test the model shown in Figure 2 in less than one hour, using just six or seven modules for input data, model logic, and output creation. This package would have allowed us to add more model detail, to spend more time on analysis, and to test different scenarios for adoption of the rapid program.

Call$im is a far better tool for creating call center models than Arena and its peers, but they in turn are far better tools for building simulation models than general-purpose programming languages (which were once the state of the art). Arena's availability and flexibility were crucial to our successful analysis.

As call centers proliferate (there are an estimated 60,000 in the United States alone) and with simulation's potential to help managers configure and plan their operations, we expect the use of simulation will grow substantially in this industry.


References

Andrews, B. H. and Parsons, H. L. 1989, "L. L. Bean chooses a telephone agent scheduling system," Interfaces, Vol. 19, No. 6, pp. 1–9.
Andrews, B. H. and Parsons, H. L. 1993, "Establishing telephone-agent staffing levels through economic optimization," Interfaces, Vol. 23, No. 2, pp. 15–20.
Anton, J. 1996, Customer Relationship Management, Prentice Hall, Upper Saddle River, New Jersey.
Brigandi, A. J.; Dargon, D. R.; Sheehan, M. J.; and Spencer, T. 1994, "AT&T's call processing simulator (CAPS) operational design for inbound call centers," Interfaces, Vol. 24, No. 1, pp. 6–28.
Dawson, K. 1996, The Call Center Handbook, Flatiron Publishing, New York.
Grassmann, W. K. 1988, "Finding the right number of servers in real-world queuing systems," Interfaces, Vol. 18, No. 2, pp. 94–104.
Green, L. and Kolesar, P. 1991, "The pointwise stationary approximation for queues with nonstationary arrivals," Management Science, Vol. 37, No. 1, pp. 84–97.
Harris, C. M.; Hoffman, K. L.; and Saunders, P. B. 1987, "Modeling the IRS taxpayer information system," Operations Research, Vol. 35, No. 4, pp. 504–523.
Hoffman, K. L. and Harris, C. M. 1986, "Estimation of a caller retrial rate for a telephone information system," European Journal of Operational Research, Vol. 27, No. 2, pp. 39–50.
Kelton, W. D.; Sadowski, R. P.; and Sadowski, D. A. 1998, Simulation with Arena, McGraw-Hill, New York.
Mehrotra, V. 1997, "Ringing up big business," OR/MS Today, Vol. 24, No. 4, pp. 18–24.
Mehrotra, V.; Profozich, D.; and Bapat, V. 1997, "Simulation: The best way to design your call center," Telemarketing and Call Center Solutions, Vol. 16, No. 5, pp. 28–29, 128–129.
Mehrotra, V. 1999, "The call center workforce management cycle," Proceedings of the 1999 Call Center Campus, Purdue University Center for Customer-Driven Quality, Vol. 27, pp. 1–21.
Pegden, C. D.; Shannon, R. E.; and Sadowski, R. P. 1995, Introduction to Simulation Using SIMAN, second edition, McGraw-Hill, New York.
Profozich, D. 1998, Managing Change with Business Process Simulation, Prentice Hall, Upper Saddle River, New Jersey.
Saltzman, R. M. 1997, "An animated simulation model for analyzing on-street parking issues," Simulation, Vol. 69, No. 2, pp. 79–90.
Systems Modeling Corporation 1997, Call$im Template User's Guide, Systems Modeling Corp., Sewickley, Pennsylvania. (www.sm.com)
Whitt, W. 1999, "Predicting queueing delays," Management Science, Vol. 45, No. 6, pp. 870–888.

An officer of the client firm wrote as follows: "At the time that the authors conducted the study, our technical support operations were in reasonably poor condition. While considering the decision that is described in this paper, we had a lot of concerns about its impact on all of our customers. Frankly, given the lack of data on hand about customer adoption of the new 'Rapid' program, the major computer systems changes required to implement Rapid, and re-education of call center staff to handle these changes, we did not have a good understanding of what would happen if and when we launched this new initiative.

"The simulation analysis conducted by the Onward-SFSU team helped us get a much better understanding of the contingencies that we were likely to face. In particular, by quantifying the impact of Rapid customers and staffing/skilling decisions on service levels and costs, the simulation analysis gave us the confidence to launch the program. The results also gave us an excellent idea of how best to staff the different queues and how to adapt our staffing as we learned more about the customer population's acceptance of the Rapid program.

"I am proud to report that this program was a great success for our business, grossing nearly $2 million in a little bit less than nine months. The work done by the Onward-SFSU team played a big part in making it happen successfully."
