
Course Technologies for Network and Information Security

LECTURES OF THE COURSE TECHNOLOGIES FOR NETWORK AND INFORMATION SECURITY


LECTURERS: Assoc. Prof. Roumen Trifonov, PhD; Assoc. Prof. Slavcho Manolov, PhD
Technical University - Sofia


PART 21

WEB APPLICATION SECURITY


The nature of the problem


The Web (initially physically implemented as an HTTP server, a Web browser and the HTTP protocol) was born with a particular set of properties in mind, and security was initially not one of them. As the Web grew and its components were extended to support new models and ideas, it became necessary to consider the security properties of the Web. However, the massive installed base of Web agent software made it difficult to reliably deploy or enforce security properties broadly, so incremental "patches" were applied, leading to an arms race between those making the patches and those creating the attacks (particularly in the area of the "same-origin policy").

The web browser is arguably the most security-critical component in the information infrastructure. It has become the channel through which most of our information passes: banking, social networking, shopping, navigation, card payments, managing high-value cloud services and even critical infrastructures such as power networks; almost any activity you can imagine now takes place within a browser window. This has made browsers an increasingly interesting target for cyber-attacks: the volume of web-based attacks per day increased by 93% in 2010 compared to 2009, with 40 million attacks a day recorded for September 2010 (Symantec Threat Report, 2010). Many more complex threats, such as DDoS attacks using botnets, rely on flaws in web browsers which allow the installation of malware. Even if the root cause is elsewhere, the browser is often in a position to protect the user, e.g. in combating phishing and pharming.


The threat model


There are three threat classes for Web applications: passive network attackers, active network attackers, and imperfect web developers. In addition, there are two other classes of threat: phishing and malware. Threats can also be divided into those that are addressed and those that are not.

Web spoofing, also known as phishing, is a significant form of Internet crime that is launched against hundreds or thousands of individuals each day. Each attack site may be used to defraud hundreds or thousands of victims, and it is likely that many attack sites are never detected. A typical web spoofing attack begins with bulk email to a group of unsuspecting victims. Each is told that there is a problem with their account at a site such as E*Trade. Victims of the spoofing attack then follow a link in the email message to connect to a spoofed E*Trade site. Once a victim enters his or her user name and password on the spoofed site, the criminal has the means to impersonate the victim, potentially withdrawing money from the victim's account or causing harm in other ways.

In the past few years, the functionality available on the client side has grown extensively with the introduction of new APIs. Permission to use this extended functionality is typically granted to a certain origin and stored persistently until revoked by the user. This situation can make sites that have acquired such privileges highly interesting targets for attackers. Much of the functionality defined in the specifications is available in multiple browsing contexts, including restricted contexts such as a sandbox or a private browsing context. Unfortunately, the specifications are not always clear on the exact behavior of this functionality in such a restricted context. Some example problems are:
- Are permissions stored in a normal browsing context also valid in a restricted context, or vice versa?
- Can data be stored under one browsing context and retrieved under the other?
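The sketch below illustrates the first question in practice: before calling a privileged API, a script can ask the browser what decision is currently persisted for its origin. This is a minimal sketch only, assuming a browser environment with the Permissions API and geolocation available; how the answer differs in a sandboxed or private context is exactly what the specifications leave open.

```typescript
// Minimal sketch (browser environment assumed): check the persisted permission
// state for this origin before using the privileged geolocation API.
async function checkGeolocationGrant(): Promise<void> {
  // The Permissions API exposes the decision stored for this origin.
  const status = await navigator.permissions.query({ name: "geolocation" });

  if (status.state === "granted") {
    // Whether this grant also holds inside a sandboxed iframe or a private
    // window is the underspecified behavior discussed above.
    navigator.geolocation.getCurrentPosition(
      (pos) => console.log("position:", pos.coords.latitude, pos.coords.longitude),
      (err) => console.warn("geolocation failed despite grant:", err.message)
    );
  } else {
    console.log("geolocation permission state:", status.state); // "prompt" or "denied"
  }

  // React if the user revokes or grants the permission while the page is open.
  status.onchange = () => console.log("permission changed to:", status.state);
}

checkGeolocationGrant();
```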


Web break-through
The devices on which web applications run are very diverse, ranging from classic desktop systems to smartphones or embedded devices such as gaming consoles or television sets. Each of these devices runs an operating system, which may already contain security controls for specific operations, such as determining the location of the device. Stacking several security controls on top of each other may be problematic and can confuse the user. Additionally, the security controls defined in the specifications are typically more fine-grained than the underlying security controls.

At present, so-called Web break-throughs are the most widespread. In their development, browsers have moved far beyond the initial versions intended only for viewing hypertext documents. Their functionality is constantly increasing, and they have already become full components of operating systems. In parallel with this development, numerous security problems arise with the technologies they support: additional modules (plug-ins), ActiveX elements, Java applications, and scripting facilities such as JavaScript, VBScript, PerlScript and Dynamic HTML. Because these technologies are supported not only by browsers but also by email clients, and because of errors in them, a large amount of email-borne malware has appeared, as well as viruses infecting HTML files (implemented in VBScript using ActiveX objects). Trojan horses have also become widely distributed.

A Web break-through is usually carried out automatically by executable programs intended to steal or destroy computer data. They can be installed on the client computer while surfing the web and downloading files from other web sites, or, most often, during ICQ or IRC sessions. Programs of this type can be Java applets, ActiveX objects, JavaScript or Visual Basic scripts, or written in virtually any new programming language intended for designing Web pages.


Cookies
One of the dangers in Web traffic is the so-called cookie. Because cookies do not contain executable programs, they cannot by themselves cause an attack; on the other hand, they contain confidential information about a client's habits, which could therefore be read by another website through a specially crafted script or ActiveX program.

Netscape Navigator first implemented support for cookies in its version 2.0 browser, dating from 1996. Cookies offered a mechanism to allow a server to store per-client state and have the client supply a (server-assigned) pointer to that state automatically (via the client implementation) when sending any request to the cookie-specified domain and URL path. Many sites used this facility to identify a user session with the site and then stored per-user/session data (such as a shopping cart) related to the cookie identifier. Cookies became successful because they were more reliable session indicators than competing mechanisms (such as putting session state in the URI or body of an HTTP request, which requires that users do not accidentally drop the session part of the URI, for example).

In order to ensure that a cookie was sent only to the originating domain, the browser needed to be able to determine the domain associated with a document, and thus the "origin" was born: scheme, host and port defining a unique origin. The same-origin policy states that a document from one unique origin may only load resources from the origin from which the document was loaded. In particular, this applies to XMLHttpRequest calls made from within a document. Images, CSS and dynamically loaded scripts are not subject to the same-origin policy.
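As a rough illustration of that last point, the sketch below (browser environment; the host names shop.example and other.example are placeholders, not real sites) contrasts a same-origin XMLHttpRequest, a cross-origin one blocked by the same-origin policy, and an image load that is not subject to it.

```typescript
// Minimal sketch: script running in a document loaded from https://shop.example.

// 1. Same-origin XMLHttpRequest: allowed; the session cookie set by
//    shop.example is attached automatically by the browser.
const xhr = new XMLHttpRequest();
xhr.open("GET", "https://shop.example/api/cart");
xhr.onload = () => console.log("cart:", xhr.responseText);
xhr.send();

// 2. Cross-origin XMLHttpRequest: a different scheme/host/port is a different
//    origin, so reading the response is blocked by the same-origin policy
//    (unless the other site opts in via CORS, discussed later).
const crossXhr = new XMLHttpRequest();
crossXhr.open("GET", "https://other.example/api/data");
crossXhr.onerror = () => console.warn("blocked by the same-origin policy");
crossXhr.send();

// 3. Images (like CSS and <script src=...>) are not subject to the same-origin
//    policy: the request is sent, cookies for other.example included, even
//    though the embedding page cannot read the data that comes back.
const img = new Image();
img.src = "https://other.example/banner.png";
document.body.appendChild(img);
```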


Security-related issues of Web architecture (1)


Documents (representations of Web resources) are often formed of content acquired from more than one "security domain" (an environment defined by a single set of security policies). Interactions between these pieces of content must be mediated in a "sandbox" environment on the client to prevent content from one security domain causing problems with content from another security domain. Further issues include:
- Web browser redirects often take place without user input (for example, via cookies), causing unintended consequences for the user.
- Web browser state management has been based on cookies, which are a shared client (browser) resource: one site may cause another site's cookie to be sent in a request to the site which "owns" the cookie, causing that site to believe that the user is making an intentional, authenticated request when in fact this may not be true (as in a clickjacking attack); see the sketch after this list.
- Identity spoofing of Web sites is relatively easy (Referer header spoofing, DNS rebinding and cache poisoning, confusing the user with content which looks authentic but is controlled and presented by an attacker).
- Servers often depend on a client to "do the right thing" in providing security for the server (such as correctly processing Web 'origin' and 'referer' information in order to allow the server to authenticate a request), but clients are open to manipulation by servers and to software defects. Not all clients will "do the right thing", some by design.
- Authenticated protocols are built on unauthenticated ones (for example, there is no true link between SSL certificate validation and the DNS IP address for the common name in the certificate).
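The shared-cookie issue above can be made concrete with a minimal sketch (host names attacker.example and bank.example are purely illustrative): a page from one site triggers a request to another site, and the browser attaches that other site's cookie even though the user never intended the request.

```typescript
// Minimal sketch of the shared-cookie problem: script on a page served from
// https://attacker.example makes the browser send a cookie-bearing request to
// bank.example without any user intent.
const forgedRequest = document.createElement("img");
forgedRequest.src = "https://bank.example/transfer?to=attacker&amount=1000";
// The attacker page cannot read the response, but the request itself is enough
// if bank.example treats any cookie-bearing request as an intentional action.
document.body.appendChild(forgedRequest);

// The defence has to live with the site that "owns" the cookie: it must not
// treat the mere presence of the cookie as proof of intent (for example, by
// requiring a secret token in the request that an unrelated page cannot know).
```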


Security-related issues of Web architecture (2)


There are no separate "download", "install" and "execute" steps for a user: Web content is often immediately executed by the client, without giving the user a chance to approve access to sensitive or limited client resources (such as CPU and local storage). Documents, or excerpts thereof, are usually not tied to their publisher in any way that can be verified across the Web (such as by an interoperable cryptographic signature).

The desirable security properties of the Web require that:
- one Web agent does not have to inordinately trust the correct behaviour of a whole class of Web agents when exposing a resource to the Web;
- it is possible to "tie" one layer of Web protocol to other layers (the DNS IP address should be tied to the IP address of the SSL certificate, the SSL certificate key should be used to sign tokens at the application-layer protocol, etc.) so that when necessary they cannot be separated;
- it is possible to load or embed all Web resources from multiple security domains in a consistent manner, unlike the current situation where images and CSS are not subject to the same-origin policy, and where scripts may be dynamically added to a page (via the <script src=.../> tag) without being subject to the same-origin policy; see the sketch after this list.
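A minimal sketch of that last point (browser environment; the URL thirdparty.example is a placeholder): a script from any origin can be attached to the page at runtime, and once loaded it runs with the full authority of the embedding document. This is the basis of the old JSONP pattern, and also why a compromised third-party script host is a serious problem.

```typescript
// Dynamically added scripts are not constrained by the same-origin policy.
const s = document.createElement("script");
s.src = "https://thirdparty.example/widget.js"; // any origin is accepted
s.onload = () => console.log("third-party code is now running in this document's origin");
document.head.appendChild(s);
```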


Some current Web security-related standards work (1)


The standards which govern the browser, and hence its security, are currently undergoing a major transformation. In order to accommodate innovations in web applications and their business models, a raft of new standards is currently being developed. These include an overhaul of HTML (HTML5), cross-origin communication standards such as CORS and XHR, and standards for access to local data such as geolocation, local storage and packaged stand-alone applications (widgets).

The CORS specification defines a mechanism to enable client-side cross-origin requests. Specifications that enable an API to make cross-origin requests to resources can use the algorithms defined by this specification. If such an API is used on http://example.org resources, a resource on http://hello-world.example can opt in using the mechanism described by the specification (e.g. specifying Access-Control-Allow-Origin), which allows that resource to be fetched cross-origin from http://example.org.

The HSTS specification defines a mechanism enabling Web sites to declare themselves accessible only via secure connections, and/or for users to be able to direct their user agent(s) to interact with given sites only over secure connections. This overall policy is referred to as HTTP Strict Transport Security (HSTS). The web server declares that complying user agents (such as a web browser) are to interact with it using secure connections only (such as HTTPS). The policy is communicated by the server to the user agent via an HTTP response header field named "Strict-Transport-Security", and/or by other means such as user agent configuration, and it specifies a period of time during which the user agent shall access the server only in a secure fashion.
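The CORS opt-in described above can be sketched as follows. This is a minimal illustration only (Node.js with the built-in http module; the origins are the ones used in the text, the port number is arbitrary): the resource on hello-world.example echoes the allow-origin header so that example.org may read the response.

```typescript
// Minimal sketch of a CORS opt-in on the hello-world.example side.
import * as http from "http";

http
  .createServer((req, res) => {
    // Opt in only for the one origin we trust to read this resource.
    if (req.headers.origin === "http://example.org") {
      res.setHeader("Access-Control-Allow-Origin", "http://example.org");
    }
    res.setHeader("Content-Type", "application/json");
    res.end(JSON.stringify({ message: "hello from hello-world.example" }));
  })
  .listen(8080); // port chosen arbitrarily for the sketch
```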


Some current Web security-related standards work (2)


The HSTS policy helps protect website users against some passive (eavesdropping) and active network attacks: a man-in-the-middle attacker has a greatly reduced ability to intercept requests and responses between a user and a website while the user's browser has HSTS active for that site. The most important vulnerability that HSTS can fix is the SSL-stripping man-in-the-middle attack, which works by transparently converting a secure HTTPS connection into a plain HTTP connection. The user can see that the connection is insecure, but crucially there is no way of knowing whether the connection should be secure. Many websites do not use SSL, so there is no way of knowing (without prior knowledge) whether the use of plain HTTP is due to an attack or simply because the site has not implemented SSL. HSTS fixes this problem by informing the browser that connections to the site should always use SSL. Of course, the HSTS header can be stripped by the attacker if this is the user's first visit. Chrome attempts to limit this problem by including a hardcoded list of HSTS sites; unfortunately this solution cannot scale to include all websites on the Internet. A more workable solution could be achieved by including HSTS data inside DNS records and accessing them securely via DNSSEC. HSTS can also help to prevent cookie-based website login credentials from being stolen by widely available tools such as Firesheep.

The HTML5 specification defines the fifth major revision of the core language of the World Wide Web: the Hypertext Markup Language (HTML). In this version, new features are introduced to help Web application authors, new elements are introduced based on research into prevailing authoring practices, and special attention has been given to defining clear conformance criteria for user agents in an effort to improve interoperability.
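Returning to the HSTS deployment described above, the sketch below shows the usual redirect-plus-header pattern. It is a sketch under stated assumptions only: Node.js with the built-in http module, an illustrative host name and max-age value, and with the TLS setup for the HTTPS side omitted.

```typescript
// Minimal sketch of an HSTS deployment.
import * as http from "http";

// Plain-HTTP listener: never serve content, only redirect to HTTPS.
// (Browsers must ignore a Strict-Transport-Security header sent over HTTP.)
http
  .createServer((req, res) => {
    res.writeHead(301, { Location: "https://www.example.com" + (req.url ?? "/") });
    res.end();
  })
  .listen(80); // requires privileges; value illustrative

// Handler used by the HTTPS server (TLS configuration omitted in this sketch).
// Resulting header: Strict-Transport-Security: max-age=31536000; includeSubDomains
function secureHandler(req: http.IncomingMessage, res: http.ServerResponse): void {
  res.setHeader("Strict-Transport-Security", "max-age=31536000; includeSubDomains");
  res.end("served only over TLS; policy cached by the browser for one year");
}
```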


The next-generation hypertext model


The centre-piece of the model is the browser concept of a window containing a document. Visually, such a window occurs as a single browser window, a tab, a popup or a frame, and is represented by a window object. Through the window object, web pages and scripts gain access to internal properties (the URL, navigation history, ...), event handlers, the document and its associated DOM tree, and numerous client-side APIs. The browser window and its associated window object enclose a document with a specific origin and location (a URL). A window can contain multiple documents (i.e. a browsing history), but only one of these documents can be active at any given time. Since the relation between window and document at any one moment is one-to-one, we do not separate a window and a document when the distinction is not relevant.
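A minimal sketch (browser environment assumed) of the access points listed above: through the window object a script reaches the location, the navigation history, event handlers and the single active document.

```typescript
// The window object as the script's entry point into the model.
console.log(window.location.href);    // the active document's URL
console.log(window.location.origin);  // scheme + host + port of that document
console.log(window.history.length);   // entries in this window's browsing history

// Event handler registered on the window.
window.addEventListener("hashchange", () => {
  console.log("navigated within the document to", window.location.hash);
});

// Exactly one active document per window at any moment, reachable as window.document.
console.log("active document:", window.document.title);
```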


Some current Web security-related standards work (3)


New functionality introduced in the HTML5 specification allows the sandboxing of an iframe. This sandbox imposes restrictions on all the content in the iframe, as shown by the dotted line in the model. The specific features and consequences of the sandbox are part of the security analysis. The two functional blocks inside the window (Event Handlers and DOM) represent two cornerstone pieces of functionality for dynamic web pages. Event handlers are used extensively to register handlers for a specific event, such as receiving messages from other windows or being notified of mouse clicks. Access to the DOM enables a script to read or modify the document's structure on the fly.

Web gateways. By now, you might be noticing a pattern: what the traditional security industry refers to as defense-in-depth has so far been iterations of pattern-matching techniques deployed in network or host-based systems. These technologies represent an ongoing effort to augment basic port-based blocking and to overcome the inherent limitations of the previous round of signature-based or list-based security product deployments. Web gateway security is no different. As attackers shifted tactics to deliver both attacks and malware communication over the Web, organizations found a need to tighten their control over Web-based communications. As a result, Web gateways were developed. These technologies, like the ones before them, use lists of known bad URLs and do not look to the evolving, unknown threats of the future. Vendors have based their prevention capabilities on a list-based approach, preventing the transmission of Web data and Web sites known to be malicious.
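The iframe sandbox and the two functional blocks (event handlers and DOM access) described at the start of this slide can be sketched together as follows. A minimal sketch only (browser environment; the embedded URL is illustrative).

```typescript
// DOM access: create and attach a sandboxed iframe on the fly.
const frame = document.createElement("iframe");
frame.src = "https://thirdparty.example/widget.html";
// Start from the fully restrictive sandbox and re-enable only what is needed.
// Note: adding both "allow-scripts" and "allow-same-origin" would let a
// same-origin framed document escape its own sandbox.
frame.sandbox.add("allow-scripts");
document.body.appendChild(frame);

// Event handler: receive postMessage() calls from other windows, checking the
// sender's origin before trusting the data.
window.addEventListener("message", (event: MessageEvent) => {
  if (event.origin !== "https://thirdparty.example") return;
  console.log("message from the sandboxed frame:", event.data);
});
```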


Some current Web security-related standards work (4)


While Web gateways provided some initial security value, attackers have shifted tactics. They have moved to completely dynamic and obfuscated models of both attack delivery and malware communication, which render lists of malicious Web sites obsolete. Consequently, just as Web gateways were beginning to be widely adopted, they became outmoded from a security perspective. While these technologies still have utility in enforcing HR policies that limit employee Web browsing, when it comes to combating modern attacks Web gateways have been relegated to an increasingly marginal security role. The same is true of antivirus and other technologies, due to the shift in tactics by cyber criminals.

The Open Web Application Security Project (OWASP) is an open community dedicated to finding and fighting the causes of insecure software. All of the OWASP tools, documents, forums, and chapters are free and open to anyone interested in improving application security.

The usual web application architecture is a simple linear procedural script. This is the most common form of coding for ASP, ColdFusion and PHP scripts, but rarer (though not impossible) for ASP.NET and J2EE applications. The reason for this architecture is that it is easy to write, and few skills are required to maintain the code. For smaller applications, any perceived performance benefit from moving to a more scalable architecture will never be recovered in the runtime of those applications. For example, if it takes an additional three weeks of developer time to refactor the scripts into an MVC approach, those three weeks will never be recovered (or noticed by end users) through the improvements in scalability.


Some current Web security-related standards work (5)


As applications get larger, it becomes ever more difficult to implement and maintain features and to keep scalability high. Using scalable application architectures becomes a necessity rather than a luxury when an application needs more than about three database tables or presents more than approximately 20 to 50 functions to a user. Scalable application architecture is often divided into tiers and, if design patterns are used, often broken down into re-usable chunks using specific guidelines to enforce modularity, interface requirements and object re-use. Breaking the application into tiers allows the application to be distributed to various servers, thus improving the scalability of the application at the expense of complexity. One of the most common web application architectures is model-view-controller (MVC), which implements the Smalltalk-80 application architecture.

Security architecture refers to the fundamental pillars: the application must provide controls to protect the confidentiality of information and the integrity of data, and must provide access to the data when it is required (availability) and only to the right users. Security architecture is not "markitecture", where a cornucopia of security products are tossed together and called a solution, but a carefully considered set of features, controls, safer processes, and a default security posture.
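To make the MVC separation mentioned above concrete, the sketch below shows the split of responsibilities in a framework-free TypeScript example. All names are illustrative; this is a sketch of the pattern, not a prescribed implementation.

```typescript
// Model: owns the data and the rules for changing it.
class CartModel {
  private items: string[] = [];
  add(item: string): void {
    if (item.trim().length === 0) throw new Error("empty item"); // validation lives here
    this.items.push(item);
  }
  list(): readonly string[] {
    return this.items;
  }
}

// View: renders the model; knows nothing about request handling.
class CartView {
  render(items: readonly string[]): string {
    return `Cart (${items.length}): ${items.join(", ")}`;
  }
}

// Controller: translates user input into model operations and picks a view.
class CartController {
  constructor(private model: CartModel, private view: CartView) {}
  handleAdd(input: string): string {
    this.model.add(input);
    return this.view.render(this.model.list());
  }
}

const controller = new CartController(new CartModel(), new CartView());
console.log(controller.handleAdd("book"));
console.log(controller.handleAdd("pen"));
```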
