
Top 10 Strategic Technologies for 2010

Gartner Symposium ITxpo 2009
David Cearley and Carl Claunch
October 18-22, 2009
Walt Disney World Dolphin, Orlando, FL

Notes accompany this presentation. Please select Notes Page view.


These materials can be reproduced only with written approval from Gartner.
Such approvals must be requested via e-mail: vendor.relations@gartner.com.
Gartner is a registered trademark of Gartner, Inc. or its affiliates.

This presentation, including any supporting materials, is owned by Gartner, Inc.


and/or its affiliates and is for the sole use of the intended Gartner audience or
other authorized recipients. This presentation may contain information that is
confidential, proprietary or otherwise legally protected, and it may not be further
copied, distributed or publicly displayed without the express written permission
of Gartner, Inc. or its affiliates. © 2009 Gartner, Inc. and/or its affiliates. All
rights reserved.

Strategic Imperative: Companies should use the Gartner Top 10 list as a starting point and
adjust their list based on their industry, unique business needs, technology adoption model
(that is, Type A, B or C company) and other factors.

Strategic Technologies and Trends


Strategic Technologies
• Enterprise impact during the next three years
  - IT budget, business operations, business opportunities
  - Threat/risk to the business
  - Direct, derivative and combinatorial
  - Short-term or long-term planning
• Reach maturity or a tipping point
• Driver of change/disruption
  - User behavior and expectations
  - Industry models and economics
  - Technology markets and pricing
• May be existing or emerging
• Target mainstream enterprise

The Top 10 List
• Factor in Hype Cycles, surveys, client inquiries and ongoing research
• Selectively combine related items for strategic emphasis
• Evaluate for type, scope, degree, speed and size of impact
• Not a comprehensive list
• Not a list of top IT budget items
• Removal from the list does not mean it is no longer strategic
• Prefer items changing during the next two or three years that are not currently factored into most strategic plans
• Rotate slow-moving long-term and emerging items periodically

All strategic technologies need to be investigated and factored into the planning process.

A strategic technology is one with the potential for significant impact on the enterprise during the next three
years. Factors that denote significant impact include a high potential for disruption to IT or the business, the
need for a major dollar investment or the risk of being late to adopt. Companies should factor these
technologies into their strategic planning process by asking key questions and making deliberate decisions
about them during the next two years. Sometimes the decision will be to do nothing with a particular
technology. In other cases, it will be to continue investing in the technology at the current rate. In still other
cases, the decision may be to test/pilot or more aggressively adopt/deploy the technology.
A strategic technology may be an existing technology that has matured and/or become suitable for a wider
range of uses. In the case of a mainstream technology, the strategic decision will likely revolve around
product/vendor selection and the degree to which it is incorporated into the broad IT environment. A strategic
technology may also be an emerging technology that offers an opportunity for strategic business advantage for
early adopters or with potential for significant market disruption in the next five years. In the case of an
emerging technology, the strategic decision may be to request funding to evaluate the technology.
This presentation highlights technologies that will be strategic for most organizations. It is not a
comprehensive list of every technology that is ready for adoption or incorporation into the strategic planning
process. Companies should use the list as a starting point and adjust based on their industry, unique business
needs, technology adoption model (that is, Type A, B or C company) and other factors.


Key Issues
• What technologies will drive the future of applications, application development and data centers?
• What technologies will drive change and advances in the end-user environment?
• What other long-term trends and "forgotten" technologies should be evaluated?

[Figure: PC Hype Cycle, 2009]

2009 Hype Cycles related to the top 10 technology list:
• Cloud Computing
• Virtualization
• Social Software
• Data Management
• PC Technologies
• E-Commerce
• Analytic Applications
• Consumer Mobile Applications
• Software as a Service
• Business Intelligence and Information Management
• Application Development
• Mobile Devices
• Data and Application Security
• Server Technologies
• Emerging Trends

A few Oldies but Goodies: selected technologies that should already be in your strategic plan
• Service-Oriented Architecture
• IM/Presence and Collaboration
• Master Data Management

See also: "The Enterprise CIO's 2009 Agenda: Discipline, Risk and Opportunity" (G00165267)


Strategic Imperative: Companies should factor the top 10 technologies into their strategic
planning process and make deliberate decisions about them during the next two years. This
does not necessarily mean adoption and investment in all of the listed technologies.

Technologies You Can't Afford to Ignore


Top 10 Strategic Technology Areas for 2009:
1. Virtualization
2. Business Intelligence
3. Cloud Computing
4. Green IT
5. Unified Communications
6. Social Software and Social Networking
7. Web-Oriented Architecture
8. Enterprise Mashups
9. Specialized Systems
10. Servers — Beyond Blades

Top 10 Strategic Technology Areas for 2010:
1. Cloud Computing
2. Advanced Analytics
3. Client Computing
4. IT for Green
5. Reshaping the Data Center
6. Social Computing
7. Security — Activity Monitoring
8. Flash Memory
9. Virtualization for Availability
10. Mobile Applications

(The original slide color-codes items as modified, new or dropped for 2010.)

Three items were removed from the list for 2010. Unified communications is still a strategic long-term trend, but its evolution continues at a slow pace, making it less critical to highlight on this year's list. The dual trend toward specialized systems and servers beyond blades continues, with some vendors emphasizing "converged systems" that bring together network, storage and server functions. However, in 2010 we do not expect major shifts in this market, and we have replaced these technologies with other new data center technologies for consideration (flash memory, reshaping the data center). These topics augment specialized-system and "beyond blades" approaches for data center planning in 2010/2011. It should be noted that removing a technology from the list does not mean it is no longer strategic; rather, it simply reflects a shift in emphasis relative to other technologies. In some cases, technologies are rotated from year to year to expose a larger number of technologies over time.
A number of topics remain on the list but with a new emphasis. Many aspects of virtualization, including virtualized servers, storage and networking, are part of the strategic plan for many data centers, and virtualization has become an integral part of other trends on the 2010 list, such as cloud computing and client computing. In 2010, we emphasize new elements, such as live migration for availability, that have longer-term implications. Cloud computing is maturing, with emphasis placed on public cloud service consumption, private cloud environment development and Web/cloud application development. WOA and enterprise mashups are brought together as part of this topic. A particularly strategic emphasis for business intelligence is advanced analytics, which is a key technology to support pattern-based strategy. Social software and social networking are maturing over the next two to five years in many dimensions, impacting applications, application development and security; social computing picks up this theme. Green IT remains strategic, but the aspect that deals with energy consumption is increasingly an integral part of data center and client strategies, and as such is dealt with as part of reshaping the data center and client computing. In 2010, the focus shifts to how technology to support greener business (i.e., IT for Green) augments green IT strategies for improved IT energy efficiency.
In addition to flash memory and reshaping the data center, three other technologies were added. Client computing and mobile applications reflect a maturing and convergence of trends related to end-user devices and applications, as well as the locations where the devices are used. Mobile applications in particular reflect the growing importance of this market as a touch point for an enterprise's customers, and the need for IT to factor this into its strategy.


Key Issues

• What technologies will drive the future of applications, application development and data centers?
• What technologies will drive change and advances in the end-user environment?
• What other long-term trends and "forgotten" technologies should be evaluated?


Strategic Imperative: Establish a project team that includes representatives from across IT as
well as key business constituencies to define an overall approach to cloud computing and
evaluate the opportunities and risks associated with this new style of computing.

Cloud Computing: Everything as a Service Promises Agility, Innovation, Simplicity and Economic Value

Gartner defines cloud computing as "a style of computing where scalable and elastic IT-related capabilities are provided 'as a service' to customers* using Internet technologies".

5 Attributes That Support Outcomes:
1. Service-Based
2. Scalable and Elastic
3. Shared
4. Metered by Use
5. Internet Technologies

3 Focal Points for Cloud Projects:
1. Consuming Cloud Services
2. Developing Cloud-Based Applications and Solutions
3. Implementing Private Cloud Computing Environments

* Public cloud computing refers to delivery of cloud services by a third-party provider to external
customers. In private cloud computing, IT acts as the provider of services to internal customers.
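The "scalable and elastic" and "metered by use" attributes can be illustrated with a toy billing calculation: cost tracks actual consumption, so capacity can burst without a fixed capital outlay. All rates and usage figures below are invented for illustration; real providers each define their own units and pricing.

```python
# Toy illustration of "metered by use" with elastic capacity.
# All rates and usage numbers are hypothetical.

HOURLY_RATE_PER_INSTANCE = 0.10   # hypothetical $/instance-hour
RATE_PER_GB_STORED = 0.05         # hypothetical $/GB-month

def metered_cost(instance_hours_by_day, gb_stored):
    """Bill only for what was consumed, day by day."""
    compute = sum(hours * HOURLY_RATE_PER_INSTANCE
                  for hours in instance_hours_by_day)
    storage = gb_stored * RATE_PER_GB_STORED
    return round(compute + storage, 2)

# Elastic usage: 2 instances normally, bursting to 10 for a peak day.
usage = [2 * 24] * 6 + [10 * 24]          # instance-hours per day for a week
print(metered_cost(usage, gb_stored=100)) # 528 hours * 0.10 + 100 GB * 0.05
```

The burst day is billed only for the hours it actually ran, which is the contrast with the capex-dominated, build-to-peak model described in the notes below.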

During the past 15 years, IT industrialization has grown in popularity. IT services delivered via hardware, software and
people are becoming repeatable and usable by a wide range of customers and service providers. This is partly because of
the commoditization and standardization of technologies, virtualization and the rise of service-oriented software
architectures, and (most importantly) the dramatic growth in popularity/use of the Internet and the Web. These things,
taken together, constitute the basis of a discontinuity that amounts to a new opportunity to shape the relationship between
those who use IT services and those who sell them. The discontinuity implies that the ability to deliver specialized
services in IT can now be paired with the ability to deliver those services in an industrialized and pervasive way. The
reality of this implication is that users of IT-related services can focus on what the services provide them, rather than how
the services are implemented or hosted.
Just as utility companies sell power to subscribers, and telephone companies sell voice and data services, IT services such
as network security management, data center hosting or even departmental billing can now be easily delivered as a
contractual service. The buying decision then shifts from buying products that enable the delivery of some function (like
billing) toward contracting, with someone else delivering those functions. This isn't new, but it does represent a different
model from the license-based, on-premises models that have dominated the IT industry for so long. Names for this type
of operation have come into vogue at different times. Utility computing, SaaS, application service providers — all have
their place in the pantheon of industrialized delivery models. However, none has garnered widespread acceptance as the
central theme for how any and all IT-related services can be delivered globally.


Strategic Imperative: Develop unique and appropriate selection criteria and governance
models for each layer of the cloud computing service hierarchy but combine them into an
overall cloud computing strategy.

Consuming Cloud-Computing Services

[Diagram: three entry points into the cloud stack, each accessed by a client through APIs, with the service provider taking responsibility for progressively more of the stack and the service consumer retaining the rest.]

System Infrastructure Services:
• Rapid hardware provisioning and delegated hardware management
• Developer responsible for cloud optimization

Application Infrastructure Services:
• Rapid development, deployment and change
• Greater vendor lock-in

Application, Information or Business Process Services:
• Rapid application deployment and change
• Least flexibility

Warning: Hazards Ahead. Proceed With Caution.

Using cloud resources does not eliminate the costs of IT solutions, but does re-arrange some and reduce others. The cost of delivering
a business solution based in the cloud depends on many variables. One notable differentiator is the level at which the customer enters
the cloud.
System infrastructure services (e.g., Amazon EC2 and S3) deliver access to a virtualized pool of compute and storage services. The
least disruptive approach to cloud infrastructure is to use basic system infrastructure to host traditional enterprise applications. A
more sophisticated approach would be to build a program so that it will uniquely exploit cloud-centric distributed and parallel
processing capabilities, and run the resulting program on cloud system infrastructure. In both scenarios, the rest of the application
infrastructure (e.g., database, development tools) still must be acquired and installed and the consumer is typically responsible for
application life cycle management and security.
Use of application infrastructure services moves more of the infrastructure stack to a cloud model and typically uses cloud-based
services to build, store, run and manage the resulting application. Application infrastructure services can provide an abstraction layer
that reduces the complexity of creating cloud-optimized applications. This approach is being championed by a new crop of RAD and
database vendors that provide turnkey environments (e.g., application platform as a service [APaaS]) that build on the scalability of
cloud-based infrastructures. However, use of application infrastructure services and APaaS increases vendor lock-in insofar as
applications developed for one vendor offering are often not portable to another vendor's service.
Developers can also access external application, information or process services via a mashup model. The result of the external
service is brought to the internal application. An extension of this model would allow for a degree of configuration and extension to
the external service being accessed. In either of these approaches the consumer of the cloud service offloads even more of the
development and management functions in comparison to either cloud infrastructure approach. The tradeoff is that customization is
limited to the configurability and mashability of the existing application, information or process services.
None of the approaches will be sufficient for the full range of application needs for the typical enterprise. Most users will employ a
hybrid model that uses all three techniques. Users planning their budgets for cloud engagements must understand both the cloud-generated costs and the retained internal costs to arrive at the total cost estimate for such projects.
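The division of responsibility described above can be captured as a simple lookup table. This is a sketch of the pattern only: the level names and responsibility labels are paraphrased from the discussion, not a vendor taxonomy.

```python
# Consumer-retained responsibilities at each cloud entry level
# (labels paraphrased from the discussion above; illustrative only).
CONSUMER_RETAINS = {
    "system_infrastructure": [
        "application lifecycle management",
        "application security",
        "application infrastructure (database, development tools)",
        "cloud optimization of the application",
    ],
    "application_infrastructure": [
        "application design and configuration",
        "acceptance of vendor lock-in risk",
    ],
    "application_services": [
        "configuration and mashup of existing services",
    ],
}

def retained_scope(level):
    """Return what the consumer still owns at a given entry level."""
    return CONSUMER_RETAINS[level]

# Entering the stack at a higher level offloads more to the provider.
assert len(retained_scope("system_infrastructure")) > \
       len(retained_scope("application_services"))
```

The asymmetry in list lengths is the point: the higher the entry level, the less the consumer manages, at the cost of flexibility.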


Tactical Guidance: Familiarize yourself with the trade-offs of different development


approaches, and develop a plan for where and how you will use each. Don't let vendor
marketing and hype dictate your strategy.

Cloud/Web Application Development

• Why Strategic
  - SOA + Web + Cloud
  - All Web apps become cloud apps
  - New solution opportunities
  - The Citizen Developer
  - The Enterprise as Cloud Provider
• Multiple Cloud Approaches
  - Cloud Service Configure/Mashup
  - Cloud-Hosted Application
  - Cloud-Optimized Application
• Cloud Tools, Techniques/Skills
  - WOA and global-class design
  - Design for linear scalability, parallel processing and distributed data
  - Application Platform as a Service
  - Develop with social networking in mind
  - Merge Web design and development

[Diagram: developers and consumers reach enterprise-class and global-class applications through a browser/RIA or traditional client, via access-and-configuration layers over the application and system/application infrastructure.]

Cloud-based services can be exploited in a variety of ways to develop an application or solution. The least-disruptive approach is to continue using
traditional development tools and techniques, and exploit the cloud as a virtualized pool of compute and storage services. A slightly more
sophisticated model would be to build a program on internal systems that will uniquely exploit cloud-centric distributed and parallel processing
capabilities, and run the resulting program in the cloud. Developers can also create and execute applications internally, and simply access external
application, information or process services via a mashup model. The result of the external service is brought to the internal application. An extension
of this approach would allow for a degree of configuration and extension to the external service being accessed. Full exploitation of the cloud means
that external resources are used for building, storing, running and managing the application. This approach is being championed by a new crop of
RAD and database vendors that provide turnkey environments (e.g., application platform as a service [APaaS]) that build on the scalability of cloud-
based infrastructures. None of the approaches will be sufficient for the full range of application needs for the typical enterprise. Most users will
employ a hybrid model that uses all three techniques.
There is a fundamental tradeoff between using cloud system infrastructure and cloud application infrastructure to create cloud-optimized
applications. Using cloud system infrastructure, the developer is given maximum control over the environment but must build linear scalability,
distributed data management and multitenancy (if needed) into the application. Most IT groups do not have the skills to accomplish this.
By using application infrastructure services (e.g., application platform as a service), developers can insulate themselves from this complexity, but
at the expense of control over every aspect of the environment.
Web development tools have reached a point where they can address most enterprise use cases. Users should avoid religious debates on the
superiority of one scripting language over another and focus on how well the tool or framework supports developer productivity to construct and
maintain the application.
In addition to consuming cloud services, enterprises will increasingly act as cloud providers and deliver application, information or business process
services to customers and business partners. The first step in this direction is often tied to a company's website, with elements of the website
made more "mashable" by exposing its information and capabilities using RESTful APIs, RSS feeds and other WOA techniques. A
second step entails exposing other business processes and information sources that are not currently part of the company's formal website, using the
same mechanisms. Best Buy is one example of a company pursuing this approach.
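The "enterprise as cloud provider" step described above — exposing site capabilities through mashable, RESTful endpoints — can be sketched as a minimal WSGI application. The catalog data and URL path are invented for illustration; a real deployment would sit behind a proper web server and expose the company's actual information sources.

```python
import json

# Hypothetical catalog data a company might expose for mashup use.
CATALOG = {"sku-1001": {"name": "Widget", "price": 19.99}}

def app(environ, start_response):
    """Tiny WOA-style endpoint: GET /products/<sku> returns JSON."""
    path = environ.get("PATH_INFO", "")
    if path.startswith("/products/"):
        sku = path.rsplit("/", 1)[-1]
        item = CATALOG.get(sku)
        if item is not None:
            body = json.dumps(item).encode()
            start_response("200 OK", [("Content-Type", "application/json")])
            return [body]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"not found"]

# Exercise the app in-process (no network needed).
def get(path):
    status = {}
    def sr(code, headers): status["code"] = code
    body = b"".join(app({"PATH_INFO": path, "REQUEST_METHOD": "GET"}, sr))
    return status["code"], body

print(get("/products/sku-1001"))
```

Because the resource is addressable by a plain URL and returns a standard format, external developers can combine it into mashups without coordination with the provider, which is the WOA property the notes describe.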


Tactical Guideline: When deciding between public and private cloud initiatives, consider
what the scope of membership needs to be and who must own the assets.

Public, Private and Hybrid Cloud Computing — One Size Doesn't Fit All

• Private: company/entity-owned assets; scope is bounded by exclusive membership defined by the company/entity. Good control; lower value.
• Public: no owned assets; scope is open to anyone who can pay for the service as defined and delivered by the provider. Less control; high value.
• Hybrid: distinctions are made at the extremes, but many examples exist somewhere in between.

Private cloud pros:
• More control
• Less latency
• More secure
• Learning environment
• Shift "price/cost" to "price/value"

Private cloud cons:
• Continued asset ownership
• Reduced economies of scale
• Reduced sharing
• Reduced flexibility
• CapEx dominated

Private cloud computing is a style of computing where scalable and elastic IT-enabled capabilities are delivered as a service to
internal customers using Internet technologies. Note that this definition is extremely similar to our public cloud computing definition.
The focus on internal is related to who can access or use the services in question and who owns or coordinates the resources used to
deliver the services. We present two characteristics that are related to conducting private cloud computing, as opposed to public cloud
computing:
Limited Membership (that is, exclusive membership): A private cloud implementation has a bounded membership that is exclusive
(that is, only approved members can participate, and approval is contingent on some characteristic that the general public or other
general businesses cannot gain easily). For example, membership may be limited to employees and designees of a given company.
Alternatively, services may be limited to a set of businesses in a given industry, or they might be limited to an industry trade group.
Access to these services will frequently be controlled by a centralized organization (such as a company's IT organization or an
industry association), but this control is not essential to the concept of the private cloud. Note that a cloud service that only requires
customers to sign up is not considered to be exclusive.
Spectrum of Control/Ownership: A private cloud service is different from a public cloud service in that private cloud services are
implemented for an exclusive set of consumers. There is a spectrum from fully private services to fully public services that blurs
distinctions of ownership or control. There are many examples that fall somewhere between public cloud services and private cloud
services, including business partners sharing resources, private cloud services built on top of public cloud services and specialized
services limited to a small number of enterprises in a specific industry. Private cloud services can be built on top of a public cloud
infrastructure or in a hybrid model. Organizations need to focus on the services to improve clarity and direction.
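The "limited membership" criterion above can be expressed as a simple check: access is contingent on a vetted attribute the general public cannot easily obtain, so open sign-up does not qualify as exclusive. The policy and affiliation names below are hypothetical.

```python
# Hypothetical membership policy for a private cloud service:
# access requires an approved affiliation, not merely signing up.
APPROVED_AFFILIATIONS = {"acme-corp", "acme-subsidiary", "industry-consortium"}

def may_access(user):
    """Exclusive membership: contingent on a vetted affiliation."""
    return user.get("affiliation") in APPROVED_AFFILIATIONS

employee = {"name": "pat", "affiliation": "acme-corp"}
walk_in  = {"name": "sam", "affiliation": None}  # open sign-up only

print(may_access(employee), may_access(walk_in))  # True False
```

The same check generalizes to an industry trade group: swap the approved set for the group's member list, with or without a central organization administering it, matching the spectrum described above.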


Virtualization: Far Beyond Consolidation

• Live Migration — VMware or Microsoft
• Currently used for a one-time permanent move
• But … imagine a perpetual process, never done
  - What happens if the source VM fails?
  - What happens if the target VM fails?
• One mechanism for availability versus many:
  - No special approach
  - High-reliability machine
  - Separate physical machines
  - Separate hi-reliability machines
  - Fail-over clustering software
  - Fault-tolerant server
  versus
  - Change a virtualization setting

Live migration is the movement of a running virtual machine (VM), while its operating system and other software
continue to execute as if they remained on the original physical server. This takes place by replicating the state of physical
memory between the source and destination VMs, then, at some instant in time, one instruction finishes execution on the
source machine and the next instruction begins on the destination machine. The source VM stops at this point, while the
users, networks and disk activities take place to and from the destination physical machine from that point forward.
VMware names this VMotion, whereas Microsoft calls it Live Migration.
Imagine, however, if the replication of memory were to continue indefinitely, but execution of instructions remains on the
source VM. We are one instant away from switching over to the destination machine, but don't take the step. Consider
what happens if the source VM were to fail suddenly – the next instruction that was to be executed would now take place
on the destination machine, completing the live migration. The source virtual machine is now stopped and we have a
single (destination) VM running. If the destination VM were to fail, we would just pick a new destination to start the
indefinite migration. Thus, we have very high availability possible.
The key value proposition is to displace a variety of separate mechanisms with a single “dial” that can be set to any level
of availability from baseline to fault tolerance, all using a common mechanism and permitting the settings to be changed
rapidly as needed. We can dispense with expensive high-reliability hardware, with fail-over cluster software and its
administrative complexity, and perhaps even with fault-tolerant hardware, yet still meet our availability needs. This is key
to cutting costs, lowering complexity, as well as increasing agility as needs shift.
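The perpetual-migration idea described above can be sketched as a toy state machine: memory writes on the source are continuously mirrored to a standby, and a source failure simply completes the migration. The class and attribute names are illustrative only; real hypervisors replicate at the memory-page level with hardware assistance.

```python
# Toy sketch of continuous replication for availability: the standby
# tracks the source's state, so a source failure completes the
# "migration" instead of interrupting the workload.
class HostVM:
    def __init__(self, name):
        self.name = name
        self.memory = {}      # stand-in for replicated memory pages
        self.alive = True

class ReplicatedVM:
    def __init__(self, source, target):
        self.source, self.target = source, target

    def write(self, page, value):
        """Each write on the source is mirrored to the standby."""
        self.source.memory[page] = value
        self.target.memory[page] = value   # continuous replication

    def active(self):
        """On source failure, execution continues on the standby."""
        if not self.source.alive:
            self.source = self.target      # failover: migration completes
            # A real system would now pick a fresh target and resume
            # replicating, restoring the protected state.
        return self.source

pair = ReplicatedVM(HostVM("src"), HostVM("dst"))
pair.write("p1", "state")
pair.source.alive = False                  # simulate source failure
print(pair.active().name, pair.active().memory["p1"])  # dst state
```

The availability "dial" in the slide corresponds to how aggressively replication runs: turned off, this is plain live migration; left running indefinitely, it approximates fault tolerance with one common mechanism.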


Strategic Guideline: Users should adopt a pod-based approach to new data center
construction and expansion.

Reshaping the Data Center — POD Design and Power Zones

• What we knew about data center design for decades was wrong!
• POD Design
  - Incremental build-out methodology
  - Use standard components to maximize flexibility and cost-effectiveness
  - Component configuration optimizes power and cooling for a prescribed amount of raised-floor space
  - Reduce capex
• Power Zones
  - High density
  - Medium density
  - Low density
  - Based on workload mix
  - Power and cooling designed at the zone level
  - Expandable over time where needed
  - Dramatically reduce upfront capital costs

✓ Minimize overprovisioning
✓ Reduce operating expenses by up to 40%
✓ Allow very-high-density computing

In the past, design principles for data centers were simple: Figure out what you have, estimate growth for 15 to 20 years, then build to suit. Newly built data centers often opened with huge areas of white floor space, fully powered, backed up by a UPS, water- and air-cooled, and mostly empty. We have since discovered that much of what we knew is wrong: costs are lower if one builds in slices. If you expect to need 9,000 square feet during the life of a data center, design the site to support that, but build only what's needed for the next five to seven years. This POD approach — in which populated floor space may be only 5,000 square feet initially, fully supported by power, UPS, chillers and generators, with the remaining 4,000 square feet left as slab or built out to an absolute minimum (essentially, a shell) — is becoming a best practice.
Similarly, we thought the most efficient way to deal with varying densities was to intermix hot and cool devices to even out the heat across the center, but placing cooling closer to the heat producers, which are isolated into hot zones, ends up more efficient in capital and operating costs. Most customers that have considered the density-zone approach have found that high-density applications comprise, on average, 10% to 15% of the total, medium-density requirements are about 20%, and the rest of the workload is allocated to low density. In a 9,000-square-foot data center, this approach yields lower costs, initially and ongoing. Assuming 10% of floor space is designed to high density (200 W/sq. ft.), 20% to medium density (150 W/sq. ft.) and the rest to low or normal density (100 W/sq. ft.), the overall building cost would be $16.8 million (a 21% reduction). At a 50% load, yearly electrical costs would be $740,000 (a 27% reduction). If zone requirements changed and more high-density floor space was needed, then scaling up PDUs would be a simple way to increase power, and adding more on-floor CRAC units would address cooling. For larger data centers, a three- or four-pod approach creates a living facility that is continually updated with the latest power and cooling technologies, ensuring optimal performance. In the last phase (e.g., the fourth pod), plans could begin to retrofit the first pod with new technologies, beginning an evolutionary cycle for the facility and extending its useful life.
Cutting operating expenses, which are a nontrivial part of the overall IT spend for most clients, frees up the money to apply to other
projects or investments either in IT or in the business itself.
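The zone arithmetic in the notes can be sketched in a few lines. The 9,000-square-foot floor and the 10/20/70 density mix come from the example above; treating a uniform build-out at the highest density as the comparison baseline is an illustrative assumption, not a Gartner figure.

```python
# Illustrative blended-density calculation for a 9,000 sq. ft. pod-based
# data center. Zone mix and densities match the notes above; the uniform
# 200 W/sq. ft. baseline is an assumed worst case for comparison.

SQ_FT = 9_000
ZONES = {            # zone -> (fraction of floor space, design W/sq. ft.)
    "high":   (0.10, 200),
    "medium": (0.20, 150),
    "low":    (0.70, 100),
}

def design_power_watts(sq_ft, zones):
    """Total design power when each zone is provisioned only to its density."""
    return sum(sq_ft * frac * wpsf for frac, wpsf in zones.values())

def uniform_power_watts(sq_ft, peak_wpsf=200):
    """Power if the whole floor were built out to the highest density."""
    return sq_ft * peak_wpsf

zoned = design_power_watts(SQ_FT, ZONES)      # 1,080,000 W
uniform = uniform_power_watts(SQ_FT)          # 1,800,000 W
saving = 1 - zoned / uniform                  # 40% less provisioned power

print(f"zoned design load:   {zoned / 1e6:.2f} MW")
print(f"uniform design load: {uniform / 1e6:.2f} MW")
print(f"overprovisioning avoided: {saving:.0%}")
```

With these inputs, the zoned design provisions 40% less power than a uniform high-density build, which is consistent with the "reduce operating expenses by up to 40%" claim on the slide.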


IT for Green Business — Green IT Means More Than Energy-Efficient IT
Domains of Impact: product use, business operations and IT's impact, across industries (manufacturing, energy, retail, financial services, government)

• Remote communication and collaboration to reduce travel
• Analytics to optimize production and transportation of goods
• Teleworking
• Content and document management to reduce the need for paper
• Smart building technology
• Carbon tracking, management and trading
• Collation and reporting of nonfinancial and CSR data

Fix Business, Fix IT:
• How intrinsic is IT to the business?
• How much of a contribution to the overall green footprint will green IT make?
• If I don't spend on green IT, how will it affect my business?

IT can enable many green initiatives. The use of IT, particularly among the white collar staff, can greatly
enhance an enterprise's green credentials. Common green initiatives include the use of e-documents to
reduce the consumption of paper for operational and regulatory-compliance-related activities, including
filing of reports with regulatory bodies. Using e-document management systems can also serve as the
beginning steps to facilitate collaboration among geographically disparate parts of the organization.
Reducing travel, particularly in a service-centric organization, can drastically improve the overall carbon
footprint of the enterprise. An extension of this philosophy is teleworking, where employees no longer
commute to the office on a regular basis. Not only can the business gain from a reduction in energy
consumed for transportation, it could physically reduce its footprint on the environment by occupying
smaller buildings with smaller paved spaces, reducing storm-water runoff. This also makes the use of smart building technologies, such as auto-off lighting and occupancy-based HVAC, viable and cost-effective.
IT also can provide the analytic tools that others in the enterprise may use to reduce energy consumption in
the transportation of goods or other carbon management activities.
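As a rough illustration of the kind of carbon accounting IT can enable for a teleworking program, the sketch below estimates avoided commute emissions. The emission factor, headcount and commute distances are invented for the example, not measured data.

```python
# Back-of-the-envelope sketch of carbon accounting for a teleworking
# program. All input figures here are illustrative assumptions.

KG_CO2_PER_KM = 0.18   # assumed average emissions for a commuter car

def commute_emissions_avoided(employees, round_trip_km, telework_days_per_year):
    """Annual kg of CO2 avoided if each employee teleworks the given days."""
    return employees * round_trip_km * telework_days_per_year * KG_CO2_PER_KM

# 500 staff, 40 km round trip, two telework days a week for 48 weeks:
tonnes = commute_emissions_avoided(500, 40, 96) / 1000
print(f"{tonnes:.0f} tonnes CO2 avoided per year")
```

The same shape of calculation extends to the other initiatives on the slide (paper reduction, goods transportation), which is why collation and reporting of nonfinancial data appears as an IT responsibility.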


Key Issues

• What technologies will drive the future of applications, application development and data centers?
• What technologies will drive change and advances in the end-user environment?
• What other long-term trends and "forgotten" technologies should be evaluated?


Strategic Imperative: Proactively build a five- to eight-year strategic client computing road map outlining an approach to device standards, ownership and support; OS and application selection, deployment and updates; and management and security plans to manage diversity.

Client Computing: Strategic Decisions Ahead

• Virtualization changes the building blocks
• Provisioning choices expand
• Cloud-based and RIA application use increases
• Device and OS options matter less
• Major Microsoft Office upgrade decisions loom
• Non-enterprise-owned devices become safer and more desirable
• Generational shift alters user expectations

[Figure: Hype Cycle for PC Technologies, as of July 2009 (partial listing of entries): Web-Based Office Productivity Suites, PC Linux for Consumers (Mature Markets), Hosted Virtual Desktops, PC Application Streaming, PC Application Virtualization, PC Hypervisors, Hosted PC Virtualization Software, Operating System Streaming, PC Linux for Data Entry Workers, Persistent Personalization, Server-Based Computing, Open-Source Office Products and Tablet PC, plotted from the Technology Trigger through the Peak of Inflated Expectations, Trough of Disillusionment and Slope of Enlightenment to the Plateau of Productivity, with years to mainstream adoption ranging from less than two to more than 10.]

Build a strategic client-computing road map. Either your suppliers are riding your road map, or you're riding theirs.

Virtualization is bringing new ways of packaging client computing applications and capabilities: Hosted Virtual Desktops, Workspace
Virtualization, Virtual Machines, Application Virtualization. Streaming brings new ways to deliver application, OS and workspace packages to users. More efficient virtualized packaging options that allow apps and total user desktop images to be stored centrally and
used anywhere are emerging and will be mainstream in three years. As a result, the choice of a particular PC hardware platform and
eventually the OS platform becomes less critical. Web-based and cloud-based applications will encourage more hardware- and OS-agnostic application development. Previously binary decisions, like local versus centralized computing, are becoming a spectrum of
choice.
The Windows-based desktop PC's dominance as the client computing device is being challenged by alternative OS options and form factors. Use of tablets, netbooks and smartphones to augment or replace the traditional desktop PC is increasing. By 2014, we expect over 50% of the end-point devices accessing the Internet will be smartphones. Linux for data entry workers is reaching mainstream
acceptance and maturity while use for consumers in both mature and emerging markets will reach mainstream adoption in two to five
years. Macintosh is growing as a viable option for business users in leading-edge organizations that exploit advanced techniques to
manage client diversity. In the midst of this turmoil, organizations also face major upgrade decisions in 2010/2011 with the release of Windows 7 and Office 2010. Facing this decision, along with the myriad trends driving change in the
client environment, organizations are realizing that, unless they build road maps, they will be hostages to the road maps of their
suppliers. Successful enterprises are taking advantage of choice to seize control of the cadence of their investments in an area that
represents a big slice of the IT budget.
An increasing number of users are pushing for support of nonstandard and non-company-owned devices. Younger, technology-savvy workers have higher expectations about what they should be using, need less support and are more willing to find a workaround for IT restrictions. Our research indicates that most companies have 15% to 20% of these "rogue" devices on their networks, with engineering and technology-savvy environments having the largest percentages. Deciding which users to equip with what, and how, has become a much more granular challenge. The balance between standardization and overprovisioning has changed. Meanwhile,
companies are trying to reduce the capital budget and manageability overhead of PCs.


Mobile Applications
How Does It Affect You?
• Mobile applications need new servers to which they connect
• Application delivery and support complexity increases
• Management tools are immature
• Look at mobile apps as a critical enabler of B2C client interactions and increased customer satisfaction
• Mobile applications can create:
- Stickiness
- Customer behavioral inertia
- Value "real estate"
- Company differentiation
- Support for impulse interactions

Harbinger: Many tens of thousands of new and more powerful applications are coming online. This will accelerate.

By year-end 2010, 1.2 billion people will carry handsets capable of rich mobile commerce, providing a fertile environment for the convergence of mobility and the Web. Mobile devices are becoming computers in their own right, with an astounding amount of
processing ability and bandwidth.
We already see many thousands of applications for platforms like the Apple iPhone, in spite of the limited market (only for the one
platform) and need for unique coding. How much bigger would this be if many of the same applications that run in the enormous PC
market were usable on mobile devices, without having to be developed explicitly for that platform? It may take a newer version that is
designed to flexibly operate on both full PC and miniature systems, but if the operating system interface and processor architecture
were identical, that enabling factor would create a huge turn upwards in mobile application availability.
New marketplaces such as the iTunes store enable the easy dispersion of these B2C mobile applications. Potential users can visit a
reasonable number of such sites to find and download their desired programs, instead of requiring each business to promote its
application and lure customers to a unique download mechanism.
Moore's Law gives us twice as many transistors in the same chip area with each process generation, which for PCs is leading to an increasing number of cores in newer systems to employ all those added transistors. An alternative approach, however, is to implement a single simple x86 processor on an ever-shrinking chip, with both cost and size decreasing as transistor densities keep improving. This allows an x86 system to be
deployed in mobile devices and in other places that don't suit a full PC. Current activities like the Intel Atom processors show this
potential is near, while research projects and demonstrations point to its feasibility. Gartner predicts that the technical, business and
market issues will be overcome, luring processor makers like Intel, mobile device makers and software developers into making this
possibility become a future reality.


Definition: Optimization and simulation is using analytical tools and models to maximize
business process and decision effectiveness by examining alternative outcomes and
scenarios, before, during, and after process implementation and execution.

Advanced Analytics: Operational Analytics to Optimize and Simulate
Gartner's definition of optimization/simulation: Using analytical
tools and models to maximize business process and decision
effectiveness by examining alternative outcomes and scenarios,
before, during, and after process implementation and execution.

[Figure: Progression from fixed rules, to decisions supported by data, to predictive analytics; from traditional offline analytics toward in-line/embedded analytics; and from exploring and explaining events/data toward deriving and predicting them.]

Enabling Trends
• Increasing connectivity
• More-powerful mobile processors
• Sophisticated mobile applications

We have reached the point in the improvement of performance and cost at which we can afford to perform analytics and simulation for
each and every action taken in the business. Not only will data center systems be able to do this, but mobile devices will have access
to data and enough capability to perform analytics themselves, potentially enabling use of optimization and simulation everywhere
and every time.
This can be viewed as a third step in supporting operational business decisions. Fixed rules and prepared policies gave way to more
informed decisions powered by the right information delivered at the right time, whether through CRM or ERP or other applications.
The new step is to provide simulation, prediction, optimization and other analytics, not simply information, to empower even more
decision flexibility at the time and place of every business process action.
Another way to view this is as a shift in timing. Business intelligence has mainly provided us historical analysis, increasingly
powerful ways of analyzing what has already happened. We can increase the scope of the information that is analyzed and we can
reduce delays between the data creation and its analysis, but at heart this is a look backwards. The new step looks into the future,
predicting what can or will happen.
We don't want to only spot past patterns of suspicious activity that might indicate fraud had already occurred. Think of the additional
value if we could look at an action as it is taking place, predict the results, spot the fraud that will ensue and be able to stop it before it
materializes. What if we could predict the likely sales in a store from remaining inventory for various possible purchases at this
moment, and offer inducements to a customer to pick the product right now that maximizes likely total revenue for the remainder of
the day?
Advanced analytics also involves new technologies to search unstructured content and other search enhancements.
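The shift the notes describe, from backward-looking reports to prediction at the moment of the transaction, can be illustrated with a deliberately simple sketch: score each payment against the customer's own history as it happens, rather than in a nightly batch. The z-score rule, threshold and single amount feature are placeholder assumptions, not a real fraud model.

```python
# Minimal sketch of in-line predictive scoring: each transaction is scored
# against a per-customer baseline as it occurs, so suspicious activity can
# be stopped before it completes. The model here is an illustrative
# z-score outlier test, not a production fraud-detection technique.

from statistics import mean, stdev

def score_transaction(amount, history, threshold=3.0):
    """Return (z_score, blocked) for a new transaction amount, given the
    customer's prior amounts. Blocks when the amount is more than
    `threshold` standard deviations above the customer's own history."""
    if len(history) < 2:
        return 0.0, False           # not enough data to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return 0.0, amount != mu    # flat history: any change is unusual
    z = (amount - mu) / sigma
    return z, z > threshold

history = [42.0, 55.0, 38.0, 61.0, 47.0]
z1, blocked1 = score_transaction(49.0, history)    # typical amount, allowed
z2, blocked2 = score_transaction(900.0, history)   # extreme outlier, blocked
```

The same pattern generalizes to the inventory example in the notes: replace the outlier test with a revenue simulation and act on the prediction while the customer is still at the shelf.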


Strategic Imperative: Focus both on the use of social software and social media in the enterprise and on participation in, and integration with, externally facing enterprise-sponsored and public communities. Do not ignore the role of the social profile in bringing communities together.

Social Computing:
Participating in and Tapping Into the Collective

• Social software for the enterprise: blogs, wikis, social networks, e-mail integration, security, activity streams, content tagging, discussions
• Public social media: Facebook, Twitter, MySpace, bloggers, social content ratings
• Externally facing website: ratings, recommendations, search, tagging, customer communities, integration
• The social profile brings the communities together

The Battle for User Input and Knowledge Sharing

Web content management (WCM) systems, team collaboration support and portals have evolved as
complementary applications. This early evolution and system interaction became the basis of the smart
enterprise suite (SES). A WCM system deals with content creation and management, controlling who can
create documents for the Web site, how multiple authors collaborate and who approves their work. It also
determines how documents will look and be revised, protected, stored and delivered. Portals control which
users see what content under what circumstances, and collect and configure content on demand. They
maintain a list of available content and know who controls it, the permissions required to access it, and the
form and character that its authors have specified for its delivery.
Early portal products focused on information access rather than providing the workspace within which teams
would collaborate. Team workspaces evolved during the same period to provide a Web-based work
environment. However, workers do not want two distinct environments to support their work — one for their
own work products (whether personal or group) and another for accessing "external" information. The
convergence of these functions has been fairly slow, while portals focused on integration, first of information
access, and then of application access.
Action Item: Plan for the inclusion of a document repository within the portal to create the opportunity to use
this merged capability to support collaborative teamwork.

Reality Check: Organization charts seldom reveal how work really gets accomplished.

Examine Collaboration in Context Using Social Network Analysis

Social network analysis shows relationship-based business intelligence regarding:
• Intricacies of working relationships
- Which team and individual activities would benefit from technology
• Network connectors and boundary spanners
- Who are the critical people to drive technology adoption
• Social analytics
- Trends, topics, expertise

Social networks exist. The process of social network analysis is a purposeful study of social networks to glean
information, gain insight and develop an action plan based on that insight. In some situations, a company may
simply want to understand how people interact and ensure that it supports those interactions with technology.
In other situations, a company may create an intervention to change behavior once it has had a chance to
evaluate the current state of the social network in its department or organization. For more information on
SNA, see "Social Network Analysis: What a Difference an 'A' Makes" (G00160449).
Action Item: Organizations that really want to optimize their effectiveness should begin with baseline SNA and
compare it every year to see what has changed.
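One of the SNA questions on the slide, identifying network connectors and boundary spanners, can be approximated with a small sketch using only the standard library: treat spanners as people whose removal disconnects the communication graph. The edge list is hypothetical; real input would come from e-mail or collaboration logs.

```python
# Stdlib-only sketch of one SNA question: who are the boundary spanners?
# Modeled here as articulation points -- people whose removal splits the
# communication graph. The people and links are a hypothetical example.

from collections import defaultdict

def components(nodes, edges):
    """Count connected components via iterative depth-first search."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, count = set(), 0
    for n in nodes:
        if n in seen:
            continue
        count += 1
        stack = [n]
        while stack:
            cur = stack.pop()
            if cur in seen:
                continue
            seen.add(cur)
            stack.extend(adj[cur] - seen)
    return count

def boundary_spanners(nodes, edges):
    """People whose removal breaks the network into more pieces."""
    base = components(nodes, edges)
    spanners = []
    for person in nodes:
        rest = [n for n in nodes if n != person]
        kept = [(a, b) for a, b in edges if person not in (a, b)]
        if components(rest, kept) > base:
            spanners.append(person)
    return spanners

# Two tightly knit teams that talk only through "dana":
people = ["ann", "bob", "carl", "dana", "eve", "finn"]
links = [("ann", "bob"), ("bob", "carl"), ("ann", "carl"),
         ("bob", "dana"), ("carl", "dana"),
         ("dana", "eve"), ("dana", "finn"), ("eve", "finn")]
print(boundary_spanners(people, links))   # → ['dana']
```

In the intervention scenario the notes describe, such a result would suggest either supporting "dana" with technology or deliberately building additional cross-team links.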


Key Issues

• What technologies will drive the future of applications, application development and data centers?
• What technologies will drive change and advances in the end-user environment?
• What other long-term trends and "forgotten" technologies should be evaluated?


Flash Memory — Wide Uses Ahead: Why Flash Is Strategic

Advantages: performance, energy efficiency, density, ruggedness
Disadvantages: cost per byte, update limits

• Orders-of-magnitude performance enhancements in a small, rugged footprint with decreased power consumption
• Price-per-byte declines shrink the gap from the costs of traditional hard disk drives

Issues and Impact
• Adds a layer to the storage hierarchy, which changes enterprise plans and strategies
• Cost and useful-life disadvantages
• Rebalances the processor-to-storage speed ratio after years of widening

Strategic Decisions
• Evaluate storage requirements and budget feasibility versus performance, capacity, power and total cost of ownership (TCO)
• Find opportunities to leverage the unique performance to displace other hardware or solve service-level challenges

Flash memory is a semiconductor memory device, familiar to many from its use in USB memory sticks and
digital camera cards. It is much faster than rotating disk, but considerably more expensive — a few dozen
times more per byte than a disk drive of the same capacity. This differential is shrinking, however. At the rate of
price declines, the technology will enjoy more than a 100% compound annual growth rate during the next few
years and become strategic in many IT areas.
Flash has a number of advantages over the technologies it will displace. It is enormously fast compared with rotating disk drives. Its overall energy requirements are somewhat lower than rotating disk, even when very aggressive power management features are used to minimize disk drive consumption, but the power used per input/output operation is enormously lower, which is where flash really shines. That is, when comparing flash-based storage with the amount of hard-disk-based storage that would be needed to sustain the same rate of I/O operations, the energy efficiency is compelling. Flash-based storage, being solid state, is also much more rugged and takes up less space.
The disadvantages of flash-based storage are twofold: The cost per byte is considerably higher, although the gap continues to narrow, and flash devices have a finite life, with a maximum number of updates that can be performed. The latter can be ameliorated by wear-leveling techniques, but we might need to replace devices during the useful life of flash-based products when we would not with a hard-drive alternative.


Flash Memory — Changing Storage

Device   Speed   Cost   Nonvolatile
RAM      +++++   ++++   N
Flash    +++     ++     Y
Disk     +       +      Y

A blend of RAM and disks:
• Use as cache?
• Use as solid-state disk?
• Implement a new layer in the storage hierarchy?
• Internal or external location?
• A new layer suggests a unique placement philosophy
• Mechanism for placement?
- Operating system chooses
- Application/middleware picks
- Formal OS interface helpful

[Figure: New storage hierarchy; virtual-memory paging and the file cache sit above a new flash layer, which sits above the disk cache and disk drive.]

We will see huge use of flash memory in consumer devices, entertainment equipment and other embedded IT systems. In addition, it
offers a new layer of the storage hierarchy in servers and client computers that has key advantages — space, heat, performance and
ruggedness among them.
Unlike RAM, the main memory in servers and PCs, flash memory is persistent even when power is removed. In that way, it looks
more like disk drives where we place information that must survive power-downs and reboots, yet it has much of the speed of
memory, far faster than a disk drive. Instead of the milliseconds experienced with disks, we measure flash and RAM access in
nanoseconds.
The simplest and most obvious uses of flash are to increase the size of caches, thus increasing the effective performance of disk (at
least for data that can be delivered out of cache instead of from the drive itself). This is complicated because flash memory can sustain only a fairly limited number of writes before it fails, far fewer than RAM or disk. Putting mainly read-only data in a flash-based cache is a good current exploitation. Similarly, one could create a solid-state disk, something that is attached and accessed the
same as rotating disk but which internally uses flash memory to hold all the data.
The more interesting possibility is to recognize that this is neither fish nor fowl, but a new blend of both RAM and disk that can
become a new layer in the storage hierarchy. Just as we have to decide what data sits only in virtual memory in the PC and what data
is placed on some disk device, we should decide what data should be placed in flash. We can depend upon its nonvolatility, yet
benefit from its enormous speed compared to disk, the other nonvolatile choice. Key to getting maximum value is putting the right
information in the new layer — only the information or system tables that can benefit most from the speed advantage, giving the most
overall improvement to the system. This will require the operating system or application to designate the important data.
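The placement question, which data earns a spot in the flash layer, can be sketched as a toy two-tier store in which middleware promotes read-hot, write-cold data, one of the placement mechanisms the slide lists. The thresholds and class design are illustrative assumptions, not a real storage stack.

```python
# Toy model of the flash-placement decision: promote read-hot, rarely
# written data into a "flash" tier; keep write-hot data on "disk" to
# conserve flash's limited write endurance. Thresholds are illustrative.

from collections import Counter

class TieredStore:
    def __init__(self, read_hot=3, write_cold=2):
        self.disk, self.flash = {}, {}
        self.reads, self.writes = Counter(), Counter()
        self.read_hot = read_hot      # reads needed to earn promotion
        self.write_cold = write_cold  # max writes tolerated on flash

    def write(self, key, value):
        self.writes[key] += 1
        if key in self.flash and self.writes[key] > self.write_cold:
            del self.flash[key]       # demote write-hot data back to disk
        tier = self.flash if key in self.flash else self.disk
        tier[key] = value

    def read(self, key):
        self.reads[key] += 1
        if key in self.flash:
            return self.flash[key]
        value = self.disk[key]
        # promote read-hot, write-cold data into the flash layer
        if self.reads[key] >= self.read_hot and self.writes[key] <= self.write_cold:
            self.flash[key] = self.disk.pop(key)
        return value

store = TieredStore()
store.write("catalog", "product data")  # written once, then read repeatedly
for _ in range(3):
    store.read("catalog")               # third read promotes it to flash
```

A real system would make this decision in the operating system or through a formal OS interface rather than per application, but the trade-off it encodes, speed for read-hot data versus write endurance, is the same one described above.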


User Activity Monitoring — It's Going to Get Harder

• Targeted attacks are on the rise
• Applications and infrastructure are prime vulnerabilities
• Cloud computing complicates the picture
• Security is shifting from targeted protection to broad monitoring
• User activity event streams must be monitored
- Instrument internal systems
- Evaluate cloud providers' activity monitoring

[Figure: Corporate data spans your data center (applications, infrastructure) and the cloud (cloud applications, cloud infrastructure, cloud security services); user activity and privileged users must be monitored in both environments, under compliance obligations and in the face of targeted attacks and insider threats.]

We need to monitor the activity of privileged users and application users. Targeted attacks are on the rise.
Many of these attacks exploit security weaknesses in applications or infrastructure. When the attack occurs
from the application layer, the only signal that we may ever have of the breach is abnormal application
activity from a compromised user account. When the attack occurs from the infrastructure layer, the only
signal that we may ever have of the breach is abnormal system access from a compromised privileged user
account. During the next five years, the work of user activity monitoring, compliance reporting, and breach
or attack discovery will be complicated by the adoption of cloud computing. This work will become more important as organizations consume IT services that are implemented on external, shared IT infrastructures.
Most organizations will have a mix of IT functions that are consumed as a service and applications that are
implemented on their own IT infrastructure. Some corporate data will be exposed to the corporation's
privileged users, and some corporate data may be stored on a shared infrastructure and will potentially be
exposed to the privileged users of the service provider. Internal threat, fraud and attack discovery will still
depend on the ability to track user activity across transactions and across applications, but user activity will
be scattered across internally provided and externally sourced applications (which may not be instrumented
directly for monitoring).
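A minimal sketch of tracking user activity across internally provided and externally sourced applications, assuming events can be collected as (user, source, system) tuples from both the internal systems and a cloud provider's audit feed. The event shape and the "sensitive systems" set are assumptions for illustration, not a product's schema.

```python
# Sketch of cross-source user activity monitoring: merge events from
# internal and cloud-hosted applications into one per-user stream, then
# flag first-ever access to a sensitive system. Field names and the
# SENSITIVE set are illustrative assumptions.

from collections import defaultdict

SENSITIVE = {"payroll-db", "hr-records"}

def flag_anomalies(events):
    """events: iterable of (user, source, system) tuples, e.g. gathered
    from an internal SIEM and a cloud provider's audit API. Returns an
    alert for each user's first access to a sensitive system."""
    seen = defaultdict(set)
    alerts = []
    for user, source, system in events:
        if system in SENSITIVE and system not in seen[user]:
            alerts.append((user, source, system))
        seen[user].add(system)
    return alerts

stream = [
    ("alice", "internal", "crm"),
    ("alice", "cloud",    "payroll-db"),   # first sensitive access: alert
    ("alice", "cloud",    "payroll-db"),   # repeat: no new alert
    ("bob",   "internal", "wiki"),
]
print(flag_anomalies(stream))   # → [('alice', 'cloud', 'payroll-db')]
```

The hard part in practice is the input side, getting comparable activity events out of a cloud provider at all, which is exactly the instrumentation gap the notes warn about.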


User Activity and Resource Access Monitoring Technologies
Monitor:
• Identity and access policy change
• Privileged user activity
• Application user activity
• Sensitive data access and movement

Drivers:
• Anti-fraud
• Application security
• Forensics/breach detection
• Compliance reporting

User Activity Technologies (monitoring sources: network, application, database, data, system):
• Fraud detection
• Database activity monitoring (DAM)
• Security information and event management (SIEM)
• Data loss prevention (DLP)
• Identity and access management (IAM)
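Of the tools listed above, DLP is the most content-centric. A toy illustration of the pattern-matching idea, a hypothetical check for card-number-like strings, is shown below; real DLP products rely on registered-document fingerprints, dictionaries and statistical classifiers rather than a single regular expression.

```python
import re

# Toy DLP-style content check: flag text containing a 16-digit,
# card-number-like pattern (four groups of four digits, optionally
# separated by spaces or hyphens). Illustrative only.
CARD_PATTERN = re.compile(r"\b(?:\d{4}[- ]?){3}\d{4}\b")

def contains_card_number(text):
    """Return True if the text contains something shaped like a card number."""
    return bool(CARD_PATTERN.search(text))
```

Even this crude check shows the core DLP trade-off: broad patterns catch more data movement but raise false-positive triage costs for the security team.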
Information security professionals face the challenge of detecting malicious activity in a constant stream of
discrete events that are usually associated with an authorized user and are generated from multiple network,
system and application sources. At the same time, security departments are facing increasing demands for ever-
greater log analysis and reporting to support audit requirements.
A variety of complementary (and sometimes overlapping) monitoring and analysis tools help enterprises better
detect and investigate suspicious activity — often with real-time alerting or transaction intervention. SIEM, content monitoring and filtering (CMF) and data loss prevention (DLP), database activity monitoring (DAM), network behavior analysis, and fraud detection all provide improved user activity monitoring, alerting and
(often) event correlation. However, each of these tools varies in the scope of monitoring source coverage, and
while complementary, they don't necessarily interoperate. Also, in many organizations, each tool may appeal to
a different buying center or business unit. By understanding the strengths and weaknesses of these tools,
enterprises can better understand how to use them to defend the enterprise and meet audit requirements.
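The interoperability gap noted above is usually bridged by correlating each tool's events on shared keys such as user identity and time. A minimal sketch of that SIEM-style correlation, with invented event records and field names, might look like:

```python
from datetime import datetime, timedelta

# Invented events from two monitoring sources that don't interoperate
# directly: a DAM-style database log and a DLP-style endpoint log.
dam_events = [
    {"user": "bob", "what": "bulk SELECT on customers",
     "ts": datetime(2009, 10, 19, 14, 2)},
]
dlp_events = [
    {"user": "bob", "what": "USB copy of customers.csv",
     "ts": datetime(2009, 10, 19, 14, 9)},
    {"user": "eve", "what": "outbound mail with card data",
     "ts": datetime(2009, 10, 19, 9, 30)},
]

def correlate(db_events, move_events, window=timedelta(minutes=15)):
    """Pair database reads with data-movement events by the same user
    within a short time window (a sketch of SIEM-style correlation)."""
    incidents = []
    for d in db_events:
        for m in move_events:
            if d["user"] == m["user"] and abs(m["ts"] - d["ts"]) <= window:
                incidents.append((d["user"], d["what"], m["what"]))
    return incidents
```

Here the bulk database read followed minutes later by a USB copy from the same user becomes one incident, a signal neither tool would raise on its own.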
Action Plan
• Tomorrow
- Categorize vendor cloud offerings as "consumer" or "provider" focused
- Identify social profile sources and begin planning for coordination
- Use pod and zone concepts for data center refits or new builds
- Expand "Green IT" to include a focus on "IT for Green Business"
• Within 90 Days
- Establish a project team to identify public and private cloud computing
opportunities and governance models
- Determine where user activity and resource access monitoring can be
applied for maximum impact in your environment
- Find opportunities for mobile apps for both customers and workers
- Investigate flash memory to address your top performance problems
• Within 12 Months
- Build a 10-year strategic roadmap for client computing
- Exploit SNA for fraud detection and organizational development
- Build transition plans for a live-migration-based availability strategy
- Select the high-value points at which to employ operational analytics
Recommended Reading
• "Cloud Computing Services: A Model for Categorizing and Characterizing Capabilities Delivered From the Cloud" (G00163913)
• "APaaS: A 'Killer App' to Cloud Computing?" (G00168152)
• "The Spectrum of Public-to-Private Cloud Computing" (G00167187)
• "Extracting Business Intelligence from Social Networks" (G00168222)
• "Hype Cycle for Analytic Applications, 2009" (G00167256)
• "Overview of Content Analytics Projects, 2009" (G00165682)
• "Hype Cycle for PC Technologies, 2009" (G00168647)
• "Sustainability Will Be Important Even Beyond the Recession" (G00168403)
• "Cost Optimization: Cut Costs by Building Agile Data Centers" (G00167589)
• "Hype Cycle for Social Software" (G00168875)
• "Pattern Discovery With Security Monitoring and Fraud Detection Technologies" (G00170384)
• "Emerging Technology Analysis: Solid State Drives, Server Technologies" (G00168356)
• "Smartphones: Why Internet Access is Not Enough" (G00169100)
Additional Research
• "The Changing Hardware Foundation for Windows Server" (G00171107)
• "How Cloud Computing Relates to Grid Computing" (G00171353)