
Patch Management Process using the Six Sigma Methodology

Phoenix ISSA Chapter meeting – July 11, 2006

Dee Ramon, CISSP
Agenda
 CompanyA, CompanyB and CSC
 Six Sigma
 Six Sigma Patch Management - Case Study
 Measured Results – CompanyB only
 Summary comparisons - before & after

 Conclusion

 Questions?

 Real life challenges interspersed throughout

2
CompanyA, CompanyB and CSC
 CompanyA Inc. spun off One Sector business into a
completely separate publicly traded company.

 CompanyB has
• 22,000 Employees worldwide in 30 countries
 CompanyA outsourced its infrastructure business to CSC in
2003 in a 10-year agreement. CompanyB signed a similar
agreement
 Variety of players, new roles, and significant change in people
and responsibilities

3
Timeline
 March 2004 – CompanyB name announced
 15 May 2004 – Six Sigma project initiation
 May 2004 to January 2005 – Six Sigma project
 13 January 2005 – Network separation
 20 January 2005 – Project ends
 February 2005 to March 2006 – CompanyB-only patch management

• Competing activities made priorities interesting 4


What is Six Sigma?
Six Sigma is a process improvement methodology using data and statistical analysis to identify and fix
problem/opportunity areas. Six Sigma also refers to a deployment model that aligns employees with a
series of high-impact projects.

Over the past ten years, Six Sigma has delivered a variety of benefits to companies, e.g.,
• reducing costs
• increasing revenue
• improving process speed
• raising quality levels
• deepening customer relationships

In addition, Six Sigma has been used across a variety of industries and business models, from
manufacturing to services.

Companies using Six Sigma include:


• General Electric
• AlliedSignal
• Dow
• DuPont
• Ford Motor Company
• Merrill Lynch
• Toshiba

Six Sigma has provided billions of dollars of top-line growth and bottom-line earnings improvement.

5
Real life

 Six Sigma was very ingrained in CompanyA culture
 Six Sigma was new to CSC

 However……

6
Sigma and its practical use in the
real world
 CompanyA pioneered sigma use in 1986 to improve product quality by driving variance out of the
manufacturing processes

 1 sigma equates to 68% of values being within 1 standard deviation

 Original goal was six sigma product quality


• Under 3.4 defects per million opportunities (a 99.99966% yield)
• 100x quality improvement in five years

 Methodology was later applied to business processes

 CompanyA has saved $16 billion to date

Example sigma calculations:

Circuit boards
• Units: 1,000 circuit boards
• Opportunities per unit: 58 (1 board + 13 resistors + 4 capacitors + 2 diodes + 38 solder points)
• Defects: 18 boards
• Sigma: 4.92

Airline flights
• Units: 9.9 million airline flights in 2004
• Opportunities per unit: 1
• Defects: 25 crashes resulted in fatalities
• Sigma: 6.52

7
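The circuit-board example above can be checked with a short calculation. A minimal sketch (the function name and the use of Python's `statistics.NormalDist` are mine), assuming the conventional 1.5-sigma shift between the measured long-term defect rate and the quoted short-term sigma level:

```python
from statistics import NormalDist

def sigma_level(defects, units, opportunities_per_unit, shift=1.5):
    """Short-term sigma level from a defect count.

    DPMO = defects per million opportunities; adding the 1.5-sigma
    shift is the Six Sigma convention for quoting short-term capability.
    """
    dpmo = defects / (units * opportunities_per_unit) * 1_000_000
    # z-score of the long-term defect rate, plus the shift
    return NormalDist().inv_cdf(1 - dpmo / 1_000_000) + shift

# Circuit-board example from the slide:
# 18 defective boards, 1,000 boards x 58 opportunities each
print(round(sigma_level(18, 1_000, 58), 2))   # → 4.92
```

The same function returns roughly 6.0 for 3.4 defects per million opportunities, which is where the "six sigma" target quoted above comes from.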
Digital Six Sigma Process Roadmaps

DMADDD themes:
• Alignment & mobilization
• Clarity & motivation
• Profound understanding
• Breaking tradition
• Institutionalizing change

All three roadmaps share Define, Measure, Analyze, then diverge:
• DMAIC – Improve, Control (process improvement / productivity increase)
• DMADV – Design, Verify (product or process design)
• DMADDD – Design, Digitize, Draw Down

Slide 8
™ CompanyB™ and the CompanyB logo are trademarks of CompanyB Semiconductor, Inc. All other product or service names are the property of their respective owners. © CompanyB Semiconductor, Inc. 2005.
D M A I C
Patch Management Process

Define – What is important?
Measure – How are we doing?
Analyze – What is wrong?
Improve – What needs to be done?
Control – How do we guarantee performance?

9
D M A I C
Phase/Activity Target Date Comp Date

Define
 Schematic (Yyx Alignment)
 Team Charter
 SIPOC
 “AS-IS” Process Map
 Voice Of Business/Voice of Customer to CTQ’s
 Cause-and-Effect Diagram
 Quick Wins Identified
Measure
 Data Collection Plan
 Operational Definition
 Source of Variation Study
 Sigma Analysis
 Process Capability

10
D M A I C
Phase/Activity Target Date Comp Date

Analyze
 Pareto Analysis & Stratification
 Regression Analysis
 Root Cause Analysis
Improve
 Cost Benefit Plan
 Alternative Solutions Identified
 “SHOULD BE” Process Map
 Change Plan
 Pilot Plan and Results
Control
 Digitization Plan
 Standardization/Adoption Plan
 Lessons Learned and Feedback

11
“Define” phase – Schematic (Yyx alignment)
D M A I C

What is the Big Goal? (The Big Y)
• Reduce threat to CompanyA business through patch management

What are the inputs? (The little y’s)
• Cycle time: 7 days
• Impact: 0
• Patch compliance: 100%

What drives the y’s? (The vital x’s)
• Automation
• Quality
• Standards & policy
• Process efficiency
• Team
• Downtime
• Scope (asset database)
• Communications
• Timing
• Environmental capability (“patch-ability”): SMS failure root causes and criteria for alternative
• Customer’s or business’ schedule (system shutdowns, closures)
• IT resources (bandwidth, vacation backups)

Shortly after we finished this, the Six Sigma-trained project leader was reassigned… 12
D M A I C
“Define” phase – Team Charter

Opportunity Statement
The rate and frequency of security vulnerabilities, and the exploitation of those vulnerabilities that have disrupted CompanyA business operations in the past year, have been increasing steadily. Delays in patching vulnerable systems continue to represent significant risk and cost to CompanyA.

Business Case
Risk to CompanyA’s business assets, productivity and reliability can be decreased by reducing the cycle time of the patch management process and increasing the compliance rate. The cost of remediation can be decreased as a result of patch management cycle time reduction, as fewer CompanyA resources are diverted from their jobs to address issues resulting from vulnerable systems.

Goal Statement
Reduce patch management cycle time and impact, and increase compliance.

Project Scope
Each step in the process, from the CompA/CSC agreement to patch… to 100% of systems identified as vulnerable being patched.

Targets
• Total cycle time from patch availability to implementation: 7 days
• Impact: 0
• Compliance: 100%

Project Governance
• Sponsors: CISO VP, GIS VP
• Champions: CSC VP, CSC Ops Manager
• Project Steering Committee: CSC Security Ops Director, CompA Security Ops Director
• Process Owners: TBD after SIPOC

Project Plan
• One end-to-end project
• Digital Six Sigma DMAIC process methodology will be used
• Joint team of CompanyA, CSC and Foundstone representatives
• Identification of sub-project dependent upon acceleration of program (TBD)
• DEFINE stage completion: June 11, 2004
• MEASURE stage completion: July 9, 2004
• ANALYZE stage completion: August 13, 2004
• IMPROVE stage completion: September 1, 2004
• Dependent on resources being allocated appropriately

Team Selection
• Project Leader: CompA IT Security Black Belt
• CSC Co-Leader: CSC SMS Manager
• CompanyA Master Black Belt: GIS VP Assistant
• Finance: TBD
• Team – SMS: package lead; Servers: CSC Server Manager; Field Services: CSC regional manager; Help Desk: TBD; Non-Managed: MIPS specialist; Exceptions: MIPS Sector manager; CSC Security: CSC Ops manager, CSC Europe security manager; CompA Security: MIPS Microsoft specialist

13
D M A I C
“Define” phase – SIPOC

Start Boundary: Decision to Patch        End Boundary: Patching Completed

Suppliers | Inputs | Process | Outputs | Customers
• MCERT Team (CSC, CompA, MS) | MCERT discussion | Create deployment plan | Deployment plan / technical communication | SMS / system admins
• Microsoft | .cab file | Create / certify package | Certified package | SMS / system admins
• MCERT Team | Deployment plan / certified package | Create / send email | Communication | All CompanyA / CSC / contractors
• CSC / SMS Team | Communication / certified package | Test deployment | Patch result | Desktop owners / Field Service / MIPS
• MIPS specialist / CSC scan administrator | Foundscan / HINV / various DBs / Art Jr. | Identification of vulnerable systems | Tracking DB | MIPS / CSC / sector CIOs / sys admins
• MIPS sector managers | Failed patch process / updated tracking DB | Alternate remediation & manually patch systems | Updated tracking DB | CIOs / sys admins / MIPS sector manager
• MIPS sector managers | CIO approval / MIPS sector managers | Last-ditch effort: closure (segregation / disconnection) | (none) | Desktop owners / MIPS sector managers / CIOs / application owners

This did help define the high-level process. 14
D M A I C
“Define” phase
AS-IS process map

This was a lot of work, but valuable as few people knew the entire process; most people understood just their piece. 15
D M A I C
“Define” phase – Quick Wins

DSS Solution Kit
Project: Equipment Returns Rate Reduction
Answer yes or no as to whether the condition applies to the potential quick win opportunity.
Criteria, in order: Easy to Implement / Fast to Implement / Cheap to Implement / Within the Team’s … / Benefits will be … / Reversible / Implement?

• Increase penetration of SMS tool: YES, NO, YES, NO, YES, YES – NO
• Create standard SMS document for SMS admins: YES, YES, YES, YES, YES, YES – YES
• Define parameters of a healthy SMS client: YES, YES, YES, YES, YES, YES – YES
• Publish the policy for infrastructure compliance: YES, YES, YES, NO, YES, YES – NO
• Weekly compliance reporting status (note: we can do this easily from the central server web reporting): YES, YES, YES, YES, YES, YES – YES
• Weekly healthy client compliance (note: we can generate the reports today; is this just reporting?): YES, YES, YES, YES, YES, YES – YES
• Define a standard communications policy: YES, YES, YES, NO, YES, YES – YES
• Standard user FAQ area: YES, YES, YES, YES, YES, YES – YES
• Publish policy for patching – cycle time: NO, NO, YES, NO, YES, YES – NO
• Make available all bundles in one place: YES, YES, YES, YES, YES, YES – YES
• List of workstations to be spoon-fed into SMS (this will not include remediation of any issue): YES, YES, YES, YES, YES, YES – YES
• Pilot announcement to include specific directions on how to open tickets (specific subject like MS04-028 pilot issue): YES, YES, YES, YES, YES, YES – YES

16
Hard to get priorities to do quick wins – but most got done
D M A I C
“Measure” phase – Data collection plan

For each performance measure: operational definition, data source, collector, timing, and whether historical data is available.

Impact
• Number of pilot users – number of formal pilot users with SMS clients who have installed the patch during the pilot period. Collected by the SMS team via SMS query at the middle and end of the pilot. Won’t reflect people who have installed on their own and are not in SMS; Dee needs to check if we can do this historically. Due 8 Oct; historical: No.
• Number of unique issues reported during deployment – both the number of distinct issues reported about the patch during the deployment period and the total issues reported. Collected via a view of Monet tickets opened during the deployment period (consistent Monet profile), daily for the first week of deployment, then weekly. Try to get history on this, and publish the number of issues via FAQ. Due 8 Oct; historical: No.

Cycle-time
• Pilot cycle-time – time in days from pilot start date to pilot end date (F − E). Collected by the SMS team from SMS at the end of the pilot, via SMS.CompA.com; database covers Nov 2003 to current if available. Due 30 Sep; historical: Yes.
• Patch deployment cycle-time – time in days from package deployment start to when it passed. Sources: VirusUpdate (scanning data), Update Expert (for all records; approx. 1,000 systems, of which approx. 98% are vulnerable), SMS central server. Collected by MIPS at the end of the deployment period from the daily/weekly scan process, Nov 2003 to current if available. Due 30 Sep; historical: Yes.

Compliance
• Unhealthy SMS clients (SMS cannot patch these systems) – total machines that have SMS installed but not a healthy client (has SMS but has not returned inventory within X days; X = 7, 14, 21, 30). Collected by the SMS team from SMS snapshots from last week and this week. Also collect the total number of reachable SMS-managed machines. Don’t think we can collect this historically. Due 8 Oct; historical: unknown.
• Total Windows machines on the CompA network that should be SMS managed – includes CompA owned/leased, not personally owned or short-term contractor machines. Sources: CSC = Foundscan and CompCheck; CompA IT = browse list and BDNA. Snapshot superset of Foundscan, BDNA, CompCheck and the browse list, minus duplicates and servers; will determine the ideal short set of fields to collect (machine name and, if necessary, domain or workgroup). Oct 8: list of fields; Oct 15: have data; Oct 20: data ready for review. Not collecting historically, as the data doesn’t change that much; should start on this next week.

Fair amount of work, but useful as it defines what data is available. 17
Real life - Measure
 What we selected and originally started to measure was based on vulnerability scanning
 Accurate, but it wasn’t available for every vulnerability; also point-in-time versus continuous
 Ended up using SMS: not as good coverage, but it could be used consistently
 Had to do ‘measure’ twice

18
D M A I C
“Analyze” phase
Pareto and root cause

 This shows the best areas to focus on are:
• Unmanaged systems
• Broken / unhealthy clients
• Scheduling exceptions

[Pareto diagram of patch failure modes by relative frequency; a cause-and-effect analysis identified the high-risk areas]

Measuring this challenged some long-held assumptions 19
D M A I C
“Improve” phase
Short term recommendations

1. Standard change control window for servers
2. Standard change window for labs/factories
3. Login script to communicate patch status
4. Group policy to enforce SMS client reboot
5. Move all machines into AD domain
6. Ongoing SMS client health monitoring
7. SMS auto-discovery tools for machines in AD domains
8. Predictable reboot delay
9. Allow SMS pull during communications time
10. Add SMS installer to image
11. Increase hardware for Foundstone scanning
12. Foundstone Enterprise
16. Standard disconnection policy
17. Server ownership process
18. Network team to provide network list
19. Report on both pkg. success status and applicable reports for pending
20. Standard …
21. CSC Patchmanager
22. Standard procedures for Field Services
23. Replace machines for which automated housekeeping …
24. Automate housekeeping of machines with low disk space

[Benefit vs. effort chart plotting recommendations 1–24]

20
“Improve” phase
Long Term recommendations D M A I C

1. Network Admission Control
2. Patch Management Tool vs. SMS
3. Group policy to prevent login when not patched
4. Group policy to enforce standards
5. Forced reboot if patched and pending reboot
6. Open tickets for any machines that don’t have SMS client
7. Consistent means of detection of all Windows machines on network
8. Select business users be part of Microsoft pilot program
9. Internal Windows update infrastructure
10. IPS
11. Proactive enforcement of OS/SP standards
12. Proactive enforcement of IE standards
13. Notification via e-mail on patch success

[Benefit vs. effort chart plotting recommendations 1–13]

21
“Improve” phase – Compliance projection after 7 days
D M A I C

[Chart: sigma level (0–5) by deployment – MS03-043, MS04-007, MS04-011, MS04-022, MS04-022*, MS04-028*, MS04-032*, MS04-040* – plus projections for short term changes, long term without network admission, and long term with only network admission control]

 Sigma compliance is measured by the number of systems not patched 7 days after deployment.
 6 sigma is achievable if network admission control keeps non-compliant systems off the network.
 CompanyA estimated a cost-of-poor-quality reduction of $800K per year. 22
D M A I C
“Improve” phase
Cycle time projections

 Projection of cycle time reductions by implementing changes:
• Short term
• Long term – without network admission
• Long term – just network admission

[Bar chart of estimated cycle time in days (values shown: 66, 47, 60, 14) for: Present; Short term changes; Long term changes without network admission; Long term changes – just network admission control]

 Next steps:
• Quantifiable benefits
• ROI calculations
• Expected results

Network admission control was put off to a separate project for CompanyA 23
Measured results – CompanyB only

Emergency Patch Compliance – 2005 (approx. 17,500 systems)

[Chart: % patched vs. workdays (1–23) for each 2005 emergency patch: January (HTML), February (SMB), February (Multiple), June (SMB), July (Multiple), August (PnP), October (MSDTC), November (Graphics), December (IE)]

90% can be achieved within 7 days during accelerated scheduling

24

Measured results – CompanyB only
Improved High & Critical patch deployments

• High risk patches achieve 90% compliance: 24 days in 2005 -> 12 days in 2006
• Critical risk patches achieve 90% compliance: 12 days in 2005 -> 7 days in 2006
• 90% = 2.78 sigma

[Chart: Emergency Patch Compliance – Jan 2006, approx. 17,500 systems; % patched vs. workdays (1–15) for January (WMF) and January (TNEF), against the 2005 High and 2005 Critical baselines]
25
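The "90% = 2.78 sigma" conversion on this slide can be reproduced directly. A minimal sketch (the function name is mine), counting each system still unpatched at the deadline as one defect with one opportunity per system, plus the conventional 1.5-sigma shift:

```python
from statistics import NormalDist

def compliance_sigma(compliance_rate, shift=1.5):
    """Sigma level for a patch deployment: each system is one
    opportunity, each system unpatched at the deadline is one
    defect, and the standard 1.5-sigma shift is added."""
    return NormalDist().inv_cdf(compliance_rate) + shift

# 90% of systems patched within the 7-day window
print(round(compliance_sigma(0.90), 2))   # → 2.78
```

This also shows why the earlier projection needed network admission control to reach six sigma: 6.0 sigma corresponds to only about 3.4 unpatched systems per million.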
Measured results – CompanyB only

CompanyB Microsoft Patch Compliance – 2005 / 06 (approx. 17,500 SMS managed workstations)

[Chart, March through January: average number of patches installed vs. missing per workstation, with % compliance rising from 80.9% to 98.5%]

 1 in 5 security patches missing prior to Patch Manager implementation in May 2005.
 1 in 87 today, a 17x improvement

26
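The 17x figure is just the ratio of the two quoted missing-patch rates:

```python
# Missing-patch rates quoted on the slide
before = 1 / 5    # 1 in 5 patches missing before Patch Manager (May 2005)
after = 1 / 87    # 1 in 87 missing afterwards

print(round(before / after))   # → 17 (87/5 = 17.4, quoted as 17x)
```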
CompanyB Patch Management
Process

 Monthly patch process follows the Microsoft release schedule
 Reboot delay options
• 3 times, up to 60 minutes
 Limited use of disconnections
 Incorporated High risk patches into the monthly regular schedule
• Process is standardized
• Only one single restart per month
• Includes prior security patches if missing
 Critical risk patching follows an accelerated process
• Change control bypass approval for servers
• Single patch only

Process flow (from the slide’s diagram):
1. Notification that a security update is available arrives from an internal source or from vendor releases, CERT, etc.
2. The FIS analyst categorizes the update. If it matches High or Critical criteria, a risk assessment meeting produces a report with deployment plan options.
3. Medium / Low: the update joins the regular patch bundle; patch certification and lab test; pilot test (on failure, analyze the cause with the vendor for resolution and retest); regular monthly deployment; daily patch reports, with any problems reported.
4. High: patch bundle with accelerated certification, lab test and pilot, each initiated based on the assessment; advisory updated; deployment commences.
5. Critical: special advisory; CISO / CIO change control bypass approval; patch certification and lab test with a single EXE patch; mandatory deployment; problems reported and causes analyzed with the vendor for resolution; daily patch reports.
6. After a passing pilot, the patch is deployed as advertised updates for 2 days; by FIS decision, deployment changes to mandatory after 2 days; the patch is re-advertised 3 days after the initial deployment. IT management is informed where compliance is low.
7. Compliance reports change from daily to weekly after 10 daily reports. Systems are either scheduled for deployment or covered by an exception request; the process stops after 30 days.

27
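One reading of the escalation timeline above can be sketched as a small day-to-actions function. The function name, the exact day offsets, and the weekly-report cadence are my interpretation of the slide, not an authoritative spec:

```python
def deployment_actions(day):
    """Hypothetical sketch of the escalation schedule: day 0 is the
    initial advertised deployment; offsets are one reading of the
    slide's flow diagram."""
    actions = []
    if day == 0:
        actions.append("deploy as advertised update")
    if day == 2:
        actions.append("deployment becomes mandatory (FIS decision)")
    if day == 3:
        actions.append("re-advertise patch")
    if day < 10:
        actions.append("daily compliance report")
    elif day % 7 == 3:        # weekly reports after the 10 daily ones
        actions.append("weekly compliance report")
    if day >= 30:
        actions.append("process stops; handle exception requests")
    return actions

print(deployment_actions(2))
# → ['deployment becomes mandatory (FIS decision)', 'daily compliance report']
```

Encoding the schedule this way is one route to the "process is standardized" goal above: the same escalation logic runs for every monthly bundle.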
Conclusion
 Start with client needs – “voice of the customer”

 Get management support by following the Digital Six Sigma methodology (or another company
backed program)
• Define, Measure, Analyze, Improve, Control
• http://www.isixsigma.com
• http://sixsigmatutorial.com
 Follow consistent processes
• Reducing variability is key
 Use tools that provide fast and accurate data (correct tools for the job)

 Baseline, improvement trends and compliance


• If you can’t measure it, you can’t manage it.
 Develop solutions with a clear business case

28
Real Life Conclusions

 Measurement, process and tools are key to getting improvements; we implemented all three (with or without Six Sigma)
 Teams add a lot, but are also a challenge to keep focused
 Change can be interesting

29
Questions?

30
