N.Astyl
1/16/2013
Policy
Purpose
The objective of this process is to measure the agreed indicators effectively and
translate the results into action plans that will consequently enhance and
optimize our service delivery standards.
Generally, the standard indicator list is applied, but it can be customized at the
client's request (if applicable), leading to a new indicator set when necessary.
The following paragraphs describe how these indicators are defined, measured
and used to enhance service delivery quality.
Definitions
1. Understanding Expectations (Scope of Work)
In order to set effective Service Level Agreements, the process should begin
with an in-depth understanding of the expected service output.
This understanding is reached by organizing site visits to understand the
client's business environment and by initiating workshops with the client.
2. Service Level Agreements (SLA)
The SLA defines the level of the provided service based on the output
specifications.
It is an agreement between the service provider and the client which quantifies
the minimum quality of service that meets the business need.
It also covers the standards for how the services will be operated, how fast,
and when they will be available.
3. Key Performance Indicators (KPI)
The Key Performance Indicators will be based on the SLAs and are set to
measure whether the agreed standard is reached.
4. Set Scoring Target
For each indicator, a scoring target card needs to be set up in order to track
the measurement status against the target.
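A scoring target card like this can be sketched as a small routine that maps a measured result onto the four bands used later in this document (Unacceptable / To be improved / Expected / Outstanding). The threshold percentages below are illustrative placeholders, since the actual values ("X %") are set per contract.

```python
# Hypothetical sketch of a scoring target card. The band thresholds are
# illustrative placeholders, not contractual values.

def score_band(measured_pct, bands=None):
    """Map a measured result (in %) to a target band.

    `bands` is a list of (lower_bound_pct, band_name) pairs sorted from
    highest to lowest threshold; the first bound reached wins.
    """
    if bands is None:
        bands = [
            (95.0, "4 - Outstanding"),
            (90.0, "3 - Expected"),
            (80.0, "2 - To be improved"),
            (0.0,  "1 - Unacceptable"),
        ]
    for lower, name in bands:
        if measured_pct >= lower:
            return name
    return bands[-1][1]

print(score_band(92.5))  # 3 - Expected (with the placeholder thresholds)
```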
Process diagram (continuous improvement cycle):
A. Key Indicators -> B. Indicator Measurement -> C. Reporting -> D. Action Plan
A. Key Indicators
Generally, we can distinguish five indicator groups:
Compliance:
Indicators to measure whether the service delivery complies with the
contractual terms
- Are the service level agreements strictly complied with?
- Are all preventive actions handled as forecast?
Quality:
Indicators to measure the service performance quality level
- How is the service provided?
- Is the provision innovative and proactive?
- Is the customer satisfied with the output?
Cost:
Indicators to measure financial figures
- Is the activity provided within the agreed budget/quotation?
- Is the audit report satisfactory?
- Where is there potential to reduce cost?
Quantity:
Indicators to measure the recorded number of activities
- Is the planned maintenance programme adhered to?
- Number of incidents registered and prevented
- Total visitors / occupancy (per day)
Timeliness:
Indicators to measure the timeliness of service delivery
- Is the information delivered at the appointed time?
- Is the service delivered within the agreed period of availability?
- Are emergency call-out procedures followed and recorded accurately?
B. Indicator Measurement
The information needed to measure an indicator can come from:
- records in the IFMS (e.g. response times, completed preventive planning)
- inspections and joint inspections
- customer and client surveys
- reporting and audits
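As a minimal illustration of such a measurement, the "% PPM completed" KPI could be derived from exported work-order records roughly as follows; the record layout (a status field with a "completed" value) is a hypothetical stand-in for the real IFMS export format.

```python
# Illustrative only: computes the "% PPM completed" KPI from a list of
# preventive work orders. The record layout is a hypothetical stand-in
# for whatever the IFMS export actually provides.

def ppm_completion_rate(work_orders):
    """Return the percentage of planned preventive tasks completed."""
    if not work_orders:
        return 0.0
    done = sum(1 for wo in work_orders if wo["status"] == "completed")
    return round(100.0 * done / len(work_orders), 1)

orders = [
    {"id": 1, "status": "completed"},
    {"id": 2, "status": "completed"},
    {"id": 3, "status": "open"},
    {"id": 4, "status": "completed"},
]
print(ppm_completion_rate(orders))  # 75.0
```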
C. Reporting
Reporting is fundamental to reflect the Service Delivery and Contract
Management performance.
Two conditions apply:
o First, the consolidation of the measurements in a scorecard gives a very
clear output on the performance result in each field.
o Secondly, the results are to be analysed, checked, and benchmarked both
internally and historically.
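The second condition (internal and historical benchmarking) can be sketched as comparing the current period's result against its target and against the average of previous periods; the function and figures below are illustrative only.

```python
# Sketch of the second reporting condition: a period result checked
# against a target (internal benchmark) and against its own history
# (historical benchmark). Figures are illustrative.

def benchmark(current_pct, history_pct, target_pct):
    """Summarise a result versus target and versus its historical average."""
    hist_avg = sum(history_pct) / len(history_pct)
    return {
        "meets_target": current_pct >= target_pct,
        "vs_history": round(current_pct - hist_avg, 1),  # positive = improving
    }

print(benchmark(91.0, [88.0, 89.0, 90.0], 90.0))
# {'meets_target': True, 'vs_history': 2.0}
```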
D. Action Plan
Based on the score outcome, analysis and benchmarking, clear key points
are generated on the type of improvement actions to be set.
The next step is to verify the effectiveness of these actions and to define
additional actions where necessary.
These procedures should be treated as a living document with a continuous
improvement objective. Sodexho and the client will continue to look for new
and improved ways of conducting the business and to develop and explore
new initiatives that increase value to the customers (shared benefits,
payback, etc.).
Service Delivery covers all operations related to Soft Services and Hard
Services.
[Diagram: the Scope of Work (SOW) feeds Service Delivery, Contract
Management and the Standard Operating Model]
This process will most likely be automated via an online IT system; however,
these spreadsheet files were created to illustrate the working process.
[Table: mapping from the Service Delivery scope (SOW) to SLAs, KPIs and
performance measurement tools]
Service Delivery (Soft Services and Hard Services):
- Activity & frequency -> Completed Preventive Planning -> measured by: Record / IFMS
- Response time -> Response Time / Priority Classification -> measured by: Record / IFMS
- Client quality perception -> measured by: Joined Inspection
- Customer quality perception -> measured by: Survey
- Corrective orders / Financial performance -> measured by: Reporting
Contract Management:
- Customer Satisfaction, Financial Management, HSE compliance, BCP
compliance, Client Rules & Regulation compliance, Standard Operating
Model & Reporting -> measured by: Compliant Operation Audit
Source ILA
Completed Preventive Planning (PM)
Service / Task: SLA N - Frequency: X times per season / all year - KPI:
% PPM completed - Measured by: IFMS.
Targets: 95 % / 90 % / 85 % / 80 % / 75 % of preventive planning completed
(each measured via IFMS).

Response Time (CM)
SLA (response time): each call is classified by priority - Priority 1 Emergency,
Priority 2 Urgent, Priority 3 Normal, Priority 4 Other - with a response time of
X days / hours / minutes per class (condition example: respond within 10
minutes during normal working hours and within 1 hour outside these hours).
KPI targets: 95 % / 90 % / 85 % / 80 % completed within the set response time
(each measured via IFMS).
Source ILA
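The response-time KPI above can be sketched as counting the share of calls answered within the limit set for their priority class; the limits below are placeholder values, not the contractual response times.

```python
# Illustrative response-time KPI: the share of calls answered within the
# response time set for their priority class. The limits (in minutes)
# are placeholders, not the contractual values.

RESPONSE_LIMITS_MIN = {
    "Priority 1 - Emergency": 10,
    "Priority 2 - Urgent": 60,
    "Priority 3 - Normal": 480,
    "Priority 4 - Other": 1440,
}

def response_compliance(calls):
    """calls: list of (priority, minutes_to_respond) tuples."""
    if not calls:
        return 0.0
    on_time = sum(
        1 for prio, minutes in calls
        if minutes <= RESPONSE_LIMITS_MIN[prio]
    )
    return round(100.0 * on_time / len(calls), 1)

calls = [
    ("Priority 1 - Emergency", 8),
    ("Priority 1 - Emergency", 12),
    ("Priority 2 - Urgent", 45),
    ("Priority 3 - Normal", 500),
]
print(response_compliance(calls))  # 50.0
```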
Customer & Client Satisfaction (SLA N)
Service sub-category / Output / KPI / Measured by:
- Customer satisfaction - services provided within Sodexho's scope -
x % satisfaction - Soft Service survey
- Customer satisfaction - corrective order handling - x % satisfaction with
handled corrective orders - Inspection
- Client satisfaction - services provided within Sodexho's scope -
x % satisfaction - Joined inspection
Contract Management - Compliant Operation (SLA N)
No. / Service sub-category / Expectation / KPI / Measured by:
- Finance - 95 % compliant - Reporting
- BCP - 100 % compliant - Inspection
- HSE - 95 % compliant - Inspection
- Client rules & regulations - 95 % compliant - Inspection
- Standard Operating Model - 95 % compliant - Inspection
Source ILA
Indicator list (indicator / critical / related SLA no.)
Service Delivery:
1. Response Time (CM) - not critical - SLA N
2. Completed Preventive Planning (PM) - not critical - SLA N
Service Quality:
1. Service Quality - not critical - SLA N
2. Client Quality Perception - not critical - SLA N
3. Customer Quality Perception - not critical - SLA N
Contract Management:
1. Finance - critical - SLA N
2. HSE - not critical - SLA N
3. Business Continuity Plan - critical - SLA N
4. Client Compliant - not critical - SLA N
5. SOM Compliant - not critical - KPI N / SLA N
Target (the same four-band scale applies to each of the ten indicators):
1 - Unacceptable: X %
2 - To be improved: X %
3 - Expected: X %
4 - Outstanding: X %

Assessment method / frequency per indicator:
- Response Time: measurement via table or IFMS software (% responded in time)
- Preventive Planning: measurement via table or IFMS software (% of plan accomplished)
- Service Quality: inspection (% of satisfaction)
- Client Quality Perception: joined inspection (% of satisfaction)
- Customer Quality Perception: survey (% of satisfaction)
- Finance: reporting
- HSE, Business Continuity, Client Compliant, SOM Compliant: audit
Source ILA
Scorecard (KPI no. / related SLA / critical or central / assessment method /
frequency / target / score). Each indicator group carries a weight (X % of
100 %); each indicator uses the four-band target scale (1 - Unacceptable: X %,
2 - To be improved: X %, 3 - Expected: X %, 4 - Outstanding: X %) and a
minimum "meets" threshold.

Service Delivery (weight: X % of 100 %):
- Response Time - KPI N / SLA N - central - IFM software
- Preventive Planning - KPI N / SLA N - central - IFM software

Service Quality / Customer Satisfaction (weight: X % of 100 %):
- Service Quality - KPI N / SLA N - central - inspection
- Client Quality Perception - KPI N / SLA N - central - joined inspection
- Customer Quality Perception - KPI N / SLA N - central - survey

Contract Management (weight: X % of 100 %):
- Financial - KPI N / SLA N - critical - reporting
- HSE - KPI N / SLA N - central - audit
- Business Continuity - KPI N / SLA N - critical - audit
- Customer Compliant - KPI N / SLA N - central - audit
- SOM Compliant - KPI N / SLA N - central - audit

The weighted results roll up into a total score.
Source ILA
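The scorecard roll-up can be sketched as a weighted sum: each indicator group's score (0-100) is multiplied by its weight ("X % of 100 %") and summed into the total score. The group names and weights below are illustrative placeholders, not contractual values.

```python
# Sketch of the scorecard roll-up. Group names and weights are
# illustrative placeholders.

def total_score(scores, weights):
    """Weighted total; weights must sum to 1.0 (i.e. 100 %)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 100 %"
    return round(sum(scores[k] * weights[k] for k in weights), 1)

weights = {"Service Delivery": 0.3, "Service Quality": 0.3,
           "Contract Management": 0.4}
scores = {"Service Delivery": 90.0, "Service Quality": 85.0,
          "Contract Management": 95.0}
print(total_score(scores, weights))  # 90.5
```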
[Spreadsheet extract - KPI score sheets FM 01 to FM 05, with KPIs,
frequency, targets, weightings and monthly results Sep-07 through Aug-08.
Recoverable content:]

Customer Satisfaction: target to be determined upon analysis of the results
from the first survey.

Financial Management: overall target 85 %.

Statutory Compliance:
1. Sodexho's Third Party Contractors are appropriately qualified and insured
(see Note 1) - measured by an audit of Third Party Contractor qualifications
and insurances.
2. Sodexho fulfils its statutory compliance obligations under the Agreement -
measured by provision of OH&S reports (KPIs 2A, 2B, 2C).
[Spreadsheet extract - KPI score sheets FM 06 to FM 10, monthly results
Sep-07 through Aug-08. Recoverable content:]

Reporting:
1. Report completeness - all complete (weighting 20 %).
2. Report timeliness (furnished by due date) - all on time (weighting 20 %).
3. Report accuracy - all accurate (weighting 20 %).
4. All of the above (weighting 30 %).
Reports for items 1, 2 and 3 include those listed in Schedule 24 and any
additional ad hoc reports agreed to by Westpac and Sodexho. Target: no more
than three failures in a contract year.
NOTE: In relation to the UGS financial report, if the timely delivery of this
report is missed on two occasions in the contract year, the whole Service
Level is deemed to have failed.

Procurement:
1. Sodexho's Third Party Contractors are procured under agreed procedures
(see Note 1) - measured by an audit of procurement activity showing
compliance with the agreed procedures.

Service Line:
1. Service Line service is available as per Schedule 1 [1.12(ii)] - measured by
the PABX report confirming availability within standard Serviceline operating
hours. Sub-KPIs 1A-1D cover Severity 2 (target 90 %, weighting 40 %),
Severity 3 (target 85 %, weighting 40 %) and Scheduled work (target 95 %,
weighting 10 %).

Cleaning Services:
1. Average national result on the cleaning audit shows that the service meets
the standard - target 85 % (based on the Retail national average, weighting 30 %).
2. Customer satisfaction - target 70 % (average of those surveyed, weighting 50 %).
[Spreadsheet extract - KPI score sheet FM 11, Westpac Place (Cleaning),
monthly results Sep-07 through Aug-08. Recoverable content:]

A. Result of QA audits - monthly scores roughly in the high-80s to mid-90s
percent range.
Ratings 1 to 4 as agreed, each weighted (10-30 %); several targets are marked
"to be set" or to be determined upon analysis of the results from the first
survey.