
Software Security Testing

Gary McGraw, Ph.D.


CTO, Cigital

http://www.cigital.com

© 2004 Cigital
Software security is getting harder

The Trinity of Trouble

 Connectivity
  The Internet is everywhere and most software is on it
  (“The network is the computer.”)
 Complexity
  Networked, distributed, mobile code is hard
 Extensibility
  Systems evolve in unexpected ways and are changed on the fly

[Figure: a simple .NET program and the complex interface it exposes]
Commercial security is reactive

 Defend the “perimeter” with a firewall
  To keep stuff out
 Promulgate “penetrate and patch”
 “Review” products when they’re complete
  Why your code is bad
 Too much weight on penetration testing
 Over-rely on security functions
  “We use SSL”

The “fat guy with keys” does not really understand software development. Builders are only recently getting involved in security.
Making software behave is hard
 Can you test in quality?
 How do you find (adaptive) defects in code?
 What about bad guys doing evil on purpose?

 What’s the difference between security testing and functional testing?
 How can you analyze security design?
 How can you codify non-functional, emergent
requirements like security?
 Can you measure security?

More code, more defects

[Chart: “Windows Complexity” in millions of lines of code, growing from roughly 3 million lines (Windows 3.1, 1990) to roughly 40 million lines (Windows XP, 2002), with Windows NT, 95, 98, NT 4.0, and 2000 in between]
Software vulnerability growth

[Chart: software vulnerability growth; normalized (and slightly shifted) data from Geer]
Security problems are complicated

IMPLEMENTATION BUGS
 Buffer overflow
  String format
  One-stage attacks
 Race conditions
  TOCTOU (time of check to time of use)
 Unsafe environment variables
 Unsafe system calls
  System()
 Untrusted input problems

ARCHITECTURAL FLAWS
 Misuse of cryptography
 Compartmentalization problems in design
 Privileged block protection failure (DoPrivilege())
 Catastrophic security failure (fragility)
 Type safety confusion error
 Insecure auditing
 Broken or illogical access control (RBAC over tiers)
 Method over-riding problems (subclass issues)
 Signing too much code
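To make one of the implementation bugs above concrete, here is a minimal Python sketch of a TOCTOU race. The function names and file handling are illustrative, not from the slides:

```python
import os

def read_if_allowed_racy(path):
    # BUG (TOCTOU): the access() check and the open() are two separate
    # system calls; the file can be swapped (e.g., for a symlink) between
    # the time of check and the time of use.
    if os.access(path, os.R_OK):
        with open(path) as f:  # time of use: may not be the file just checked
            return f.read()
    return None

def read_if_allowed_safer(path):
    # Safer: attempt the operation and handle failure, leaving no
    # check/use window for an attacker to race.
    try:
        with open(path) as f:
            return f.read()
    except OSError:
        return None
```

The safer variant collapses check and use into one operation, which is the standard fix for this bug class.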
Can we “prove” security?

 Formal methods exist for demonstrating that certain properties hold given a particular system
 These approaches have by and large not proven to be economically viable
  B2 operating systems
  Mondex smart cards
 Embedded systems may be a decent applied target

 Short of provably secure systems, what is left?
  Risk analysis
  Security testing
  Code review (static analysis)
  Penetration testing
  (Ad hoc, yet repeatable processes)
History is quirky

1995
 Dan Farmer fired from Silicon Graphics for releasing SATAN with Wietse Venema
  FUD: possible attack tool!

2004
 Any system administrator not using a port scanner to check security posture runs the risk of being fired

Fall 2004
 John Aycock at University of Calgary publicly criticized for malware course
  FUD: possible bad guy factory

Should we talk about attacking systems?
The good news and the bad news

Good news
 The world loves to talk about how stuff breaks
 This kind of work sparks lots of interest in computer security (and software engineering)

Bad news
 The world would rather not focus on how to build stuff that does not break
 It’s harder to build good stuff than to break junky stuff
Security as Knowledge Intensive

Stick to your principles

1. Secure the weakest link
2. Practice defense in depth
3. Fail securely
4. Follow the principle of least privilege
5. Compartmentalize
6. Keep it simple
7. Promote privacy
8. Remember that hiding secrets is hard
9. Be reluctant to trust
10. Use your community resources
Knowledge: 48 Attack Patterns

 Make the Client Invisible
 Target Programs That Write to Privileged OS Resources
 Use a User-Supplied Configuration File to Run Commands That Elevate Privilege
 Make Use of Configuration File Search Paths
 Direct Access to Executable Files
 Embedding Scripts within Scripts
 Leverage Executable Code in Nonexecutable Files
 Argument Injection
 Command Delimiters
 Multiple Parsers and Double Escapes
 User-Supplied Variable Passed to File System Calls
 Postfix NULL Terminator
 Postfix, Null Terminate, and Backslash
 Relative Path Traversal
 Client-Controlled Environment Variables
 User-Supplied Global Variables (DEBUG=1, PHP Globals, and So Forth)
 Session ID, Resource ID, and Blind Trust
 Analog In-Band Switching Signals (aka “Blue Boxing”)
 Attack Pattern Fragment: Manipulating Terminal Devices
 Simple Script Injection
 Embedding Script in Nonscript Elements
 XSS in HTTP Headers
 HTTP Query Strings
 String Format Overflow in syslog()
 User-Controlled Filename
 Passing Local Filenames to Functions That Expect a URL
 Meta-characters in E-mail Header
 File System Function Injection, Content Based
 Client-side Injection, Buffer Overflow
 Cause Web Server Misclassification
 Alternate Encoding the Leading Ghost Characters
 Using Slashes in Alternate Encoding
 Using Escaped Slashes in Alternate Encoding
 Unicode Encoding
 UTF-8 Encoding
 URL Encoding
 Alternative IP Addresses
 Slashes and URL Encoding Combined
 Web Logs
 Overflow Binary Resource File
 Overflow Variables and Tags
 Overflow Symbolic Links
 MIME Conversion
 HTTP Cookies
 Filter Failure through Buffer Overflow
 Buffer Overflow with Environment Variables
 Buffer Overflow in an API Call
 Buffer Overflow in Local Command-Line Utilities
 Parameter Expansion
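Several of these patterns (Relative Path Traversal, URL Encoding, Slashes and URL Encoding Combined) defeat naive input filters that inspect the raw string before decoding. A hedged Python sketch; the function name is hypothetical:

```python
import posixpath
import urllib.parse

def is_path_traversal(user_path):
    # Decode BEFORE checking: a filter that scans the raw string for "../"
    # misses the URL-encoded form "%2e%2e%2f" and combined encodings.
    decoded = urllib.parse.unquote(user_path)
    normalized = posixpath.normpath(decoded)
    # Anything that escapes the document root is a traversal attempt.
    return normalized.startswith("..") or normalized.startswith("/")
```

Real filters must also worry about double-decoding and alternate encodings (UTF-8, Unicode), exactly as the pattern list suggests.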
Attack pattern 1: Make the client invisible

 Remove the client from the communications loop and talk directly to the server
 Leverage incorrect trust model (never trust the client)
 Example: hacking browsers that lie (Opera cookie foo)
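A minimal sketch of why server-side revalidation is the only defense once the client is removed from the loop; the order fields and limits here are hypothetical:

```python
def client_side_check(order):
    # Runs in the browser or client program. An attacker who removes the
    # client and talks to the server directly never executes this code.
    return isinstance(order.get("quantity"), int) and order["quantity"] > 0

def server_side_handler(order):
    # The server re-validates everything, trusting nothing from the client.
    qty = order.get("quantity")
    if not isinstance(qty, int) or qty <= 0 or qty > 1000:
        return "rejected"
    return "accepted"
```

The design point is the trust model: the client-side check is a usability feature, never a security control.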
Warning! Knowledge can be easily misused

Software security
 Requires input into design and implementation
 High expertise
 Design software to be secure
 Build secure code
 Security analysis
 Security testing
 Inside→Out

“Application security”
 Works for COTS software
 Low expertise
 Black box dynamic testing
 Protection of installed code
 Policy issues
 Outside→In

[Figure: the “badnessometer”, a scale running from Deep Trouble through ?? to Who Knows]
Attackers are Software People

Attackers do not distinguish bugs and flaws
 Both bugs and flaws
lead to vulnerabilities
that can be exploited

 Attackers write code to break code
 Defenders are network
operations people
 Code?! What code?

The attacker’s toolkit
 The standard attacker’s toolkit has lots of (software
analysis) stuff
 Disassemblers and decompilers
 Binary scanners
 Control flow, data flow, and coverage tools
 APISPY32
 Breakpoint setters and monitors
 Buffer overflow kits
 Shell code, payloads (multi-platform)
 Rootkits (kernel, hardware)

Attacker’s toolkit: buffer overflow foo

 Find targets with static analysis
 Change program control flow
  Heap attacks
  Stack smashing
  Trampolining
  Arc injection
 Particular examples
  Overflow binary resource files (used against Netscape)
  Overflow variables and tags (Yamaha MidiPlug)
  MIME conversion fun (Sendmail)
  HTTP cookies (Apache)

Trampolining past a canary: stack layout, top to bottom
  Function arguments
  Return Address
  Canary Value
  Frame Pointer
  Local Variable: Buffer A
  Local Variable: Pointer A
  Local Variable: Buffer B
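The stack layout above can be illustrated with a toy Python model of an unbounded copy spilling from the buffer upward through the frame pointer, canary, and return address. This only simulates the layout; it is not real memory:

```python
def make_frame(canary=0xDEADBEEF):
    # Toy model of the stack frame above (highest address first).
    return {"return_address": 0x401000, "canary": canary,
            "frame_pointer": 0x7FFF0000, "buffer": bytearray(8)}

def unsafe_copy(frame, data):
    # Models an unbounded strcpy: bytes past the 8-byte buffer spill
    # upward into the frame pointer, then the canary, then the
    # return address -- the order given by the layout on the slide.
    frame["buffer"][:8] = data[:8].ljust(8, b"\x00")
    overflow = data[8:]
    if len(overflow) >= 8:
        frame["frame_pointer"] = int.from_bytes(overflow[0:8], "little")
    if len(overflow) >= 16:
        frame["canary"] = int.from_bytes(overflow[8:16], "little")
    if len(overflow) >= 24:
        frame["return_address"] = int.from_bytes(overflow[16:24], "little")

def check_canary(frame, expected=0xDEADBEEF):
    # A stack protector aborts when the canary was clobbered.
    return frame["canary"] == expected
```

A plain smash trips the canary check before the corrupted return address is used, which is why attackers resort to tricks like trampolining and arc injection.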
Attacker’s toolkit: other miscellaneous tools
 Debuggers (user-mode)
 Kernel debuggers
 SoftIce
 Fault injection tools
 FUZZ
 Failure simulation tool
 Hailstorm
 Holodeck
 Boron tagging
 The “depends” tool
 Grammar rewriters

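FUZZ-style random testing from the list above fits in a few lines of Python; the fragile parser here is a hypothetical stand-in for a real target:

```python
import random

def fuzz(target, runs=200, max_len=64, seed=1234):
    # FUZZ-style fault injection: throw random byte strings at the
    # target and record every input that raises an exception.
    rng = random.Random(seed)
    crashes = []
    for _ in range(runs):
        data = bytes(rng.randrange(256) for _ in range(rng.randrange(max_len)))
        try:
            target(data)
        except Exception:
            crashes.append(data)
    return crashes

def fragile_parser(data):
    # Hypothetical target: trusts a length prefix that can lie
    # about the size of the payload that follows.
    if len(data) >= 1:
        declared = data[0]
        payload = data[1:]
        if declared > len(payload):
            raise ValueError("short read")
```

Tools like Hailstorm and Holodeck are far more sophisticated, but the core idea is exactly this: malformed input plus crash monitoring.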
Breaking stuff is important
 Learning how to think like
an attacker is essential
(especially for good testing)
 Think hard about the
“can’ts” and “won’ts”
 Do not shy away from
teaching attacks
 Engineers learn from
stories of failure
 Testers must deeply
understand how things
break

Stuff that Works for Cigital

Software security in the SDLC

[Figure: security touchpoints mapped onto software lifecycle artifacts]
 Requirements and use cases: abuse cases, security requirements
 Design: risk analysis, external review
 Test plans: risk-based security tests
 Code: static analysis (tools)
 Test results: penetration testing, risk analysis
 Field feedback: security breaks
Requirements phase: Abuse cases

 Use cases formalize normative behavior (and assume correct usage)
 Describing non-normative behavior is a good idea
  Prepare for abnormal behavior (attack)
  Misuse or abuse cases do this
  Uncover exceptional cases
 Leverage the fact that designers know more about their system than potential attackers do
 Document explicitly what the software will do in the face of illegitimate use
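One lightweight way to make an abuse case actionable is to encode it as a negative test. The login function below is a hypothetical example, not from the slides:

```python
def login(username, password, accounts):
    # Hypothetical authenticator used to illustrate abuse-case testing.
    stored = accounts.get(username)
    if stored is None or password != stored:
        return "denied"  # same answer for unknown user and bad password
    return "ok"

def test_abuse_cases():
    accounts = {"alice": "s3cret"}
    # Normative use case:
    assert login("alice", "s3cret", accounts) == "ok"
    # Abuse cases: attacker probes for valid usernames, or sends odd input.
    assert login("alice", "wrong", accounts) == "denied"
    assert login("mallory", "x", accounts) == "denied"  # indistinguishable
    assert login("", "", accounts) == "denied"
```

The interesting assertions are the non-normative ones: they document what the software does when used illegitimately.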
Design phase: Architectural risk analysis

 Designers should not do this
 Build a one-page whiteboard design model
 Use hypothesis testing to categorize risks
  Threat modeling/attack patterns
 Rank risks
 Tie to business context
 Suggest fixes
 Repeat
Risk analysis must be external to the team

 Having outside eyes look at your system is essential
  Designers and developers naturally get blinders on
  This is knowledge intensive
 Outside eyes make it easier to “assume nothing”
  Find assumptions, make them go away

 Red teaming is a weak form of external review
  Penetration testing is too often driven by an outside→in perspective
 External just means outside of the project
 External review must include architecture analysis
 Security expertise and experience really helps
Risk assessment methodologies

 These methods attempt to identify and quantify risks, then discuss risk mitigation in the context of a wider organization
 A common theme among these approaches is tying technical risks to business impact

Commercial
 STRIDE from Microsoft
 ACSM/SAR from Sun
 SQM from Cigital

Standards-based
 ASSET from NIST
 OCTAVE from SEI
A prototypical analysis
 Learn as much as possible about the product
 Read specs and other design materials
 Discuss/brainstorm with group
 Play with the software (if it exists)
 Study the code
 Discuss security issues surrounding the software
 Argue about how the product works
 Identify possible weaknesses
 Map out exploits and discuss possible fixes
 Report findings
 A description of the major and minor risks
 Provide info on where to spend limited mitigation resources

SQM: Cigital’s architectural risk analysis
 Identify what to protect, from whom, how long
 Identify how much it is worth to keep data protected at every
point of the data lifecycle
 Identify other relevant high level security requirements
 “At least as secure as the average competitor” (a common
requirement, and quite acceptable)
 Identify any assumptions built into the system
 Drive testing from risk results
 Develop reusable security guidelines

 Risk Analysis (especially at the design level) is knowledge intensive
Design phase: Risk analysis process

 Learn as much as possible about the system
  Read specs, discuss with group
  Play with the software (if it exists)
  Study the code
 Discuss security issues surrounding the software
  Argue about how the product works
 Identify possible weaknesses
 Map out exploits and discuss possible fixes
 Report findings
  A description of the major and minor risks
  Provide info on where to spend limited mitigation resources

[Figure: Cigital’s risk analysis process flow. Architectural analyses (attack resistance analysis, ambiguity analysis, underlying framework weakness analysis) draw on leveragable data such as guidelines and rules, attack patterns, exploit graphs, vulnerability data, and white papers; where further analyses are warranted, back-end analyses (exploit creation and exploration, code review) follow, serving as report content and appendices; results feed risk identification, mitigation, and impact analysis, and the assessment report is delivered to the client project manager]
Process: Attack resistance

 Identify general flaws
  Non-compliance: where guidelines are not followed
 Map applicable attack patterns
 Identify risks in architecture
  Consider known attacks against similar technologies

 Attack Patterns
  Pattern language
  Database of patterns
  Actual flaws from clients
 Exploit Graphs
  Ease mitigation
  Demonstrate attack paths
  Secure design

Example flaws from experience…
 Transparent authentication token generation/management
 Misuse of cryptographic primitives
 Easily subverted guard components, broken encapsulation
 Cross-language trust/privilege issues
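The exploit graphs mentioned above can be made concrete as a small graph of attacker states with a search that demonstrates attack paths. The nodes and edges here are hypothetical, for illustration only:

```python
from collections import deque

def attack_paths(graph, start, goal):
    # Toy exploit graph: nodes are attacker footholds, edges are exploit
    # steps. A BFS enumerates every cycle-free path from the entry point
    # to the asset, which is exactly what "demonstrate attack paths" needs.
    paths, queue = [], deque([[start]])
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            paths.append(path)
            continue
        for nxt in graph.get(node, []):
            if nxt not in path:  # avoid revisiting a foothold
                queue.append(path + [nxt])
    return paths

# Hypothetical system for illustration only.
GRAPH = {
    "internet": ["web_server"],
    "web_server": ["app_server", "admin_console"],
    "admin_console": ["database"],
    "app_server": ["database"],
}
```

Enumerated paths also ease mitigation: cutting any edge shared by all paths (here, internet→web_server) blocks every listed attack.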
Process: Ambiguity analysis

 Consider implications of design
 Generate separate arch. diagrams
  Unify understanding
  Uncover ambiguity
  Identify downstream difficulty (traceability)
  Unravel convolution

 Apprenticeship model
 Use system, technology experts
  Win32 knowledge
  JVM/managed code
  Language/compiler knowledge
 Previous experience

Example flaws from experience…
 Protocol, authentication problems
 Javacard applet firewall, inner class issues, instantiation in C#
 Type safety and type confusion
 Password retrieval, fitness and strength
Process: Weakness analysis

 Consider systemic flaws
  COTS
  Frameworks
  Network topology
  Platform
 Identify services
 Map weaknesses to assumptions

 Experience base
  Assessments of COTS and platforms
  Attack patterns
 Other resources
  Mailing lists
  Product documentation

Example flaws from experience…
 Browser and other VM sandboxing failures
 Insecure service provision: RMI, COM, etc.
 Debug (or other operational) interfaces
 Unused (but privileged) product “features”
 Interposition attacks: DLLs, library paths, client spoofing
Test phase: Two kinds of security testing
 Test security functionality
 Cover non-functional requirements
 Security software probing

 Risk-based testing
 Use architectural risk analysis results to drive
scenario-based testing
 Concentrate on what “you can’t do”
 Think like an attacker
 Informed red teaming

Test phase: Risk-based testing
 Identify areas of potential risk in the system
 Requirements
 Design
 Architecture
 Use abuse cases to drive testing according to risk
 Build attack and exploit scenarios based on identified risks
 Test risk conditions explicitly

 Example: Overly complex object-sharing system in Java Card

 Paper available: send e-mail

Code phase: Code review

 Code review is a necessary evil
  Better coding practices make the job easier
  Automated tools help catch silly errors
 Implementation errors do matter
  Buffer overflows can be uncovered with static analysis
 Static analysis: Fortify/dev (Cigital rules)
  Over 550 C/C++ rules
  Over 25 Java rules
 Tracing back from vulnerable location to input is critical
 Software exploits
  Attacking code
State of the art in static analysis for security

 Parse code
 Build abstract syntax tree
 Visit AST with (mostly local) visitors that enforce “rules”
 Use control and data flow analysis to refine rules
 BUG-ocentric
  Buffer overflows
  Process control
  Memory
  Simple timing issues
  Dangerous system calls

 Commercial analyzers this year
  Fortify
  Coverity
  Ounce
 Code scanning tools CANNOT find flaws
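The parse/AST/visitor pipeline above can be sketched with Python's own ast module. This toy rule flags calls to os.system, a classic dangerous system call; real analyzers are vastly more thorough:

```python
import ast

class DangerousCallRule(ast.NodeVisitor):
    # Toy local rule: flag every call to os.system (dangerous system call).
    def __init__(self):
        self.findings = []

    def visit_Call(self, node):
        func = node.func
        if (isinstance(func, ast.Attribute) and func.attr == "system"
                and isinstance(func.value, ast.Name) and func.value.id == "os"):
            self.findings.append(node.lineno)
        self.generic_visit(node)

def scan(source):
    # Parse the code, build the AST, and run the visitor rule over it.
    tree = ast.parse(source)
    rule = DangerousCallRule()
    rule.visit(tree)
    return rule.findings
```

Commercial tools layer control- and data-flow analysis on top of such local rules to cut false positives; and as the slide says, no bug-level scanner finds architectural flaws.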
Fielded system phase: Penetration testing
 A very good idea since software is bound in an
environment
 How does the complete system work in practice?
 Interaction with network security mechanisms
 Firewalls
 Applied cryptography
 Penetration testing should be driven by risks
uncovered throughout the lifecycle

 Not a silver bullet!


 Often misused in the real world

© 2004 Cigital
Stuff to work on

 How do we describe and characterize the knowledge needed for architectural risk analysis?
  Can formal methods help?
 What kinds of rules should be statically enforced for security?
  Can we move up the food chain toward guidelines?
  What about time?
 How can we analyze early lifecycle artifacts for general purpose principles (e.g., compartmentalization)?
 What about metrics?
Pointers

IEEE Security & Privacy Magazine

 See the department on software security best practices called “Building Security In”
 Also see this month’s special issue on breaking stuff

http://www.computer.org/security
Pointers

 Cigital’s Software Security Group invents and practices Software Quality Management
  WE NEED PEOPLE
 Use Exploiting Software and Building Secure Software
 Send e-mail: gem@cigital.com
