

Cloud 2016

Federal IT Spend Will Be
Higher for These Two Reasons
What the Army Wants in a
Cloud Vendor
The Problems with the
Pentagon’s Cloud Policy

NSA CIO Smithberger’s Secret
Sauce to Cyber in the Cloud
How Law Enforcement
Collaborates to Fight Terror


Table of Contents

Agency by agency, chief information officers
are hopping on the highway to the cloud. As
I heard it put at a recent AFFIRM event: Yes,
it’s a highway, but it’s like Virginia Rt. 66 at
rush hour, during a snowstorm—packed
with cars and not moving much.

FDA, NIH Finding Solutions to Health Care Problems in the Cloud
Cyber, Modernization Efforts Driving Expected Growth in IT Spending
Air Force Deploys Special Team to Fix HR Technology Systems
Finally an Answer Emerges for "Why IPv6?"
DoD's Cloud Policy Rains Some Risks, IG Says
18F Shines Some Light on, but Concerns Remain
Army Shops Cloud Vendors to Host Enterprise Apps
NSA's Move to the Cloud Includes Something Borrowed, Something New
National Park Service Bringing Online to the Outside
With New Threats, Law Enforcement Agencies Look for More Collaboration

From a strictly technical standpoint, it's easy for agencies to store and access data via the cloud.

What makes cloud computing hard is the
long on-ramp of thinking through the policy around balancing
accessibility with the privacy and security attributes that need to
surround the data. And then there is the potentially even longer
off-ramp of changing how individuals and groups re-imagine the
capabilities cloud can provide organizations enterprisewide.

In this Expert Edition, you'll see many agencies have made the transition to certain cloud competencies, in some cases just the "low-hanging fruit." The Food and Drug Administration is using infrastructure-as-a-service to process larger amounts of data on demand, while the National Institutes of Health is using the cloud to save time generating MRIs. And the FBI's Criminal Justice Information Services (CJIS) division is improving collaboration between law enforcement communities to deliver rich data in near real time.
As these successes multiply, so do the investments in modernizing the surviving systems. The Air Force is investing in a special team of cyber and network operations specialists to solve complex technical pay and personnel glitches. And GSA's 18F has rolled out to help agencies work more effectively with cloud service providers. Investments in creating new on-ramps that generate cloud solutions also are increasing.

The Army forms a Cloud Computing Enterprise Transformation
(ACCENT) vehicle to handle the bulk of the service’s cloud
purchases. The General Services Administration, partnering with
the Defense Information Systems Agency, will look to industry and
DoD to provide feedback on a new multiple award cloud computing
services contract by the end of FY 2016.

The speed at which government travels the cloud highway to improve internal operations and citizen services is only as fast as the slowest vehicle and the road conditions allow. But rush hours eventually end and snowstorms turn into blue skies – leaving the driving lanes open for the government to choose its ultimate destination to modernize and improve how it delivers its mission.

Lisa Wolfe
Program Director
Federal News Radio



FDA, NIH Finding Solutions to Health Care Problems in the Cloud


At the Food and Drug Administration, the National Institutes of Health and several other health care related agencies, the move to cloud computing is having a dramatic impact on their missions.

"At the FDA, we ingest large amounts of data, regulatory data, industry data and data from genomic sequencing and medical devices," Todd Simpson, the FDA chief information officer, said at the 2015 Health IT Day sponsored by AFCEA's Bethesda, Maryland chapter. "What used to take weeks to process, we can now do in literally …"

The FDA, for example, launched a new infrastructure-as-a-service platform as part of the Precision Medicine initiative.

Simpson said the infrastructure in the cloud is part of an ongoing transformation to move away from costly on-premise computing environments. The cloud lets the FDA spin up more processing power when needed and spin it down when it's not.

At the NIH, Alastair Thomson, the CIO for the National Heart, Lung, and Blood Institute, said his office is working closely with Microsoft and Amazon to use their government-only clouds at three facilities to reduce the time of an MRI scan from typically an hour to about 20 minutes. Dr. Michael Hansen of the NHLBI Division of Intramural Research is leading the MRI cloud project and said it's a huge advance.

"We realized that the raw MRI data coming off the sensors might as well be encrypted," he said. "You need advanced algorithms to do anything with it. It's de-identified when it goes away. It's never stored there. It's computed on and sent back. It makes it a lot easier."

Thomson said NHLBI is now working with Microsoft on how best to ensure the integrity of the data, because it's going to be used by doctors to make decisions.

"It's been about building understanding from the commercial provider about what they actually do, how they are protecting it and what we need to do to layer our protections on top of it," he said.

Thomson said NHLBI also will take advantage of the cloud to improve collaboration on research.

"NIH has just implemented a science DMZ [separating trusted from untrusted networks] to allow communication of data over Internet2 at extremely high speeds," he said. "There is a lot of learning we have to do about how it will work."

Internet2 is a non-profit made up of public and private sector organizations, research labs and universities around the world that provides high-speed network services and a secure testing environment.

Simpson also has an Internet2 initiative as part of FDA's new IT strategic plan.

"We have huge interoperability issues between our systems," he said. "But I have additional interoperability problems. I have labs all across the U.S. that are coming in with everything Ethernet, and we are not equipped to get those feeds. The Internet of Things is coming quickly into the labs, and we are trying to adapt from that standpoint."

Simpson said the FDA also is looking for a business intelligence-as-a-service platform, and recently launched data loss prevention capabilities.
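The elasticity Simpson describes, spinning processing power up for a burst of work and down to zero when it is idle, can be captured in a toy scaling rule. This is an illustrative sketch, not any FDA system; the function name and the jobs-per-instance model are our own assumptions:

```python
import math

# Illustrative autoscaling rule, not any agency's actual system: size the
# worker pool to the current backlog, and scale to zero when there is no work.
def instances_needed(queued_jobs: int, jobs_per_instance: int) -> int:
    if queued_jobs <= 0:
        return 0
    return math.ceil(queued_jobs / jobs_per_instance)
```

For example, a backlog of 25 processing jobs at 10 jobs per instance needs 3 instances, while an empty queue costs nothing, which is the contrast with an always-on data center.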

Cyber, Modernization Efforts Driving
Expected Growth in IT Spending


Cybersecurity and the need to modernize legacy systems are expected to push federal IT spending higher across both civilian and Defense sectors over the next few years.

"The unclassified spending is just under $80 billion, around $79.5 billion is what we are expected to see," said Robert Haas, the chairman of the Professional Services Council's Vision Federal IT Budget Outlook Team, in an interview with Federal News Radio. "We've seen a dip in the last couple of years. There has been a lot of pressure on the IT spend, along with other federal budgets. But we have optimism, supported by the interviews we've conducted over the vision process, that the 2015 numbers will have come in above what was expected, and the 2016 numbers still look to be very …"

PSC's Vision teams conducted dozens of interviews with federal IT and acquisition executives over nine months to come up with the 51st annual forecast.

PSC estimated the civilian agency IT budget will be about $49 billion in 2016 and is expected to increase at about the rate of inflation, to about $52.4 billion in 2021. The Defense Department's IT budget, which has dipped by about $3 billion over the last few years, is expected to slowly recover, reaching $30.5 billion in 2016 and rising to about $33.5 billion by 2021.

Haas said the optimism of federal executives is a
good sign especially since the trend over the last
four years has been downward with a compound
annual growth rate of negative 1.6 percent. That
was much different than the previous five years
when the compound growth rate of the IT budget
across government was 6.4 percent.
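Those projections work out to modest compound annual growth. As a sketch of the arithmetic (the helper function is ours; the dollar figures are PSC's estimates as quoted above):

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by a start value, an end value
    and the number of years between them."""
    return (end / start) ** (1 / years) - 1

# PSC's civilian estimate: $49B in 2016 growing to $52.4B in 2021 (5 years)
civilian = cagr(49.0, 52.4, 5)   # ~1.4% per year, roughly the rate of inflation
# PSC's Defense estimate: $30.5B in 2016 growing to $33.5B by 2021
defense = cagr(30.5, 33.5, 5)    # ~1.9% per year
```

Both rates sit well below the 6.4 percent growth of the earlier five-year period, which is why the forecast reads as cautious optimism rather than a boom.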

Haas said agencies are under pressure to
modernize infrastructure and applications, and
the ongoing cybersecurity challenges require an
infusion of funding.

"One of the biggest drivers we are seeing is around cybersecurity," he said. "They are funding cybersecurity projects oftentimes by taking funding from other projects that currently exist. They will cut back on maintenance, they will cut back sometimes on new starts on programs. Conversely, the other area we are starting to see is the replacement of the older systems, because in some cases they are able to save money by providing a cheaper, more cost effective solution that is easier to maintain and takes fewer …"

The focus on replacing legacy systems is coming directly from the Office of Management and Budget. Federal CIO Tony Scott said late last year that the problem with the legacy infrastructure is "a crisis bigger than Y2K."
Haas said there are several takeaways for
industry and agencies.

First to industry, he said agency leaders made it
clear they are looking for value and performance
and systems that provide for quick capabilities
without a huge investment in IT systems.

“If you can combine those features you
will probably have a winning system in the
government’s eyes,” he said. “We heard from a
number of folks we interviewed that they partner
with the mission owners from an IT perspective
to make sure the IT systems are aligned with the
processes they support.”



Air Force Deploys Special
Team to Fix HR Technology

Virtually every organization on the planet suffers from frustrating IT glitches in its human resources department from time to time. But most of them aren't managing pay and personnel functions for half a million people, and the outages don't usually last for several hours at a time.

Both of those things are true of the Air Force. So in early December
2015, the 24th Air Force — the service’s cyber and network
operations specialists — stood up a special mission team that
will undertake an end-to-end examination of the complex web
of both modern and legacy systems that make up the service’s
personnel and pay infrastructure and what can be done to make it
more reliable.


Each time a key link in the Air Force's HR IT enterprise goes offline for a few hours, the service's productivity losses are measured in man-years — not man-hours, said Bill Marion, the chief information officer for the Air Force headquarters' manpower, personnel and services organization.

"When we're down for three hours and then, with the latency, it takes us another 45 minutes to process each civilian hire, that has serious domino effects," Marion said at an annual Air Force IT conference hosted by AFCEA's Northern Virginia chapter. "We need to root out any of the inefficiencies we have."
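Marion's man-years framing is easy to reproduce. With hypothetical numbers (the Air Force has not published these figures; the 2,000-hour work year and the user count below are our own assumptions), a three-hour outage that idles even 10,000 users burns 15 work-years:

```python
WORK_HOURS_PER_YEAR = 2_000  # rough full-time baseline: ~50 weeks x 40 hours

def outage_cost_in_work_years(outage_hours: float, idled_users: int) -> float:
    """Labor lost when a system outage idles its users, in work-years."""
    return outage_hours * idled_users / WORK_HOURS_PER_YEAR
```

Here `outage_cost_in_work_years(3, 10_000)` evaluates to 15.0, which is why even short outages register in man-years for an organization this size.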



Part of the issue is sheer complexity: To process service members’ and civilians’ pay and personnel
transactions, the Air Force today uses 120 separate systems — 86 of which duplicate others that
provide similar capabilities — at 213 sites around the world.
The disaggregated nature of those IT systems means HR also is an extremely expensive and
manpower-intensive business for the Air Force, costing $1.3 billion per year and requiring one HR
specialist for every 22 airmen on the service’s payroll.

For the last five years, the Air Force has had a plan to replace several dozen of its legacy HR systems
with a commercially-based enterprise resource planning system it dubs Air Force Integrated Pay and
Personnel System (AF-IPPS).


But the $570 million program missed its target to begin delivering capability by summer 2015 and is rated at "moderately high-risk" for failure, according to the latest data on the government's IT Dashboard.

Marion said the Air Force is now reexamining its approach.

"It's about building an entire ecosystem of lifecycle support for the HR world in a way that's consistent, repeatable and agile," he said.

To that end, Marion said the Air Force had taken several steps of late to simplify the network architecture that supports its HR data, to make its systems more self-service so that they don't require as much professional support, and to get out of the business of hosting its own data centers as much as possible.
In August 2015, the Air Force got the go-ahead to host sensitive data in a commercial cloud
environment at what DoD defines as impact level four. Since then, it has become the first DoD
component to transition “critical mission information” into a nongovernment cloud.

The project involves the Air Force’s existing myPers portal, which lets airmen handle some pay and
benefits matters on a self-service basis.

“We’ve still got a few critical steps to go because it’s the first time we’ve done this within the DoD,
with new concepts like cloud access points and rules for email flows, but the core is in place,” Marion
said. “We see great benefits coming from it.”



Finally an Answer
Emerges for “Why IPv6?”


Where do agencies stand a decade after the White House first called on agencies to adopt the more secure protocol IPv6?

It was so important, at one time, that the Office of Management and Budget issued two memos requiring agencies to move to the more secure, better protocol on the network backbone. The CIO Council also stood up a working group, issued how-to guides, and there were the assorted conferences, talks and lunches about the importance of IPv6.
The first time OMB mandated agencies move
was in 2005, giving them a 2008 deadline.
Then two years after most agencies missed the
2008 deadline, OMB came up with two more
deadlines: By 2012, agencies must upgrade
public or external facing services and by 2014,
they must upgrade internal client applications

that communicate with public services or
support enterprise networks.
Nearly a decade in the making, the latest statistics from the National Institute of Standards and Technology show agencies are quietly leading industry. And the FCC became the latest in a small number of agencies to fully make the jump to IPv6.

"With the recent move of the core FCC data center to a commercial cloud provider, FCC enabled IPv6 for public facing systems," said FCC CIO David Bray, in an email to Federal News Radio.

The FCC made the move, in part, because of its decision to move to the commercial cloud.

"Our strategy at FCC is to reuse commercial cloud services going forward, which is why we're hoping commercial cloud providers adopt it for their public-facing services soon," said Christine Calvosa, the FCC's deputy chief information officer of resiliency, in a statement to Federal News Radio.

One of the reasons the public sector is way ahead of the private sector is commercial providers don't see the demand for IPv6. Even despite the alarm bells many have rung about running out of IPv4 addresses, the better security that comes with IPv6 and a host of other benefits, the private sector hasn't been overly excited about it.

Peter Tseronis, CEO and founder of Dots and Bridges and a former chief technology officer at the Energy Department, who also led the government's transition efforts to IPv6, said the inability to explain why IPv6 is important beyond the technology aspects has been the biggest obstacle for companies and agencies. But Tseronis said with the emergence over the last year or so of connected devices under the moniker of the Internet of Things, there finally is a good explanation of why moving to IPv6 is so important.

"If I was using the term Internet of Things in 2005, it would have been a lot sexier than IPv6," Tseronis said. "We struggled with what IPv6 means. We got bogged down in technical jargon and it became all about compliance …"

The good news is agencies actually are making real progress.

NIST runs a governmentwide scorecard on
agency progress to move external domains to
IPv6. The results show a strong majority of the
networks either are IPv6 enabled or on their
way. The Department of Interior, NASA and
the Social Security Administration have fully
moved their domains to IPv6. Others such as
the National Science Foundation, the Veterans
Affairs Department and the Nuclear Regulatory
Commission are close.

On the other end of the spectrum, the
departments of Agriculture, Health and Human
Services and the General Services Administration
have a long way to go.
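A quick way to spot-check any one domain from NIST's scorecard is to ask DNS whether the host publishes an IPv6 address at all. A minimal sketch in Python (a missing AAAA record is only a rough signal, since the full scorecard also grades DNS and mail services):

```python
import socket

def has_ipv6_record(addr_infos) -> bool:
    """True if any resolved address tuple belongs to the IPv6 family."""
    return any(info[0] == socket.AF_INET6 for info in addr_infos)

def supports_ipv6(hostname: str) -> bool:
    """Resolve a hostname and report whether it publishes any IPv6 address."""
    try:
        return has_ipv6_record(socket.getaddrinfo(hostname, None))
    except socket.gaierror:
        return False  # name did not resolve at all
```

Calling `supports_ipv6("nasa.gov")` from an IPv6-aware resolver is the one-line version of what the governmentwide scorecard tracks per external domain.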



DoD’s Cloud Policy Rains
Some Risks, IG Says


A Defense Department Inspector General's report found problems with the Pentagon's cloud policy that may carry monetary and cybersecurity risks.

DoD does not maintain a comprehensive list of cloud computing service contracts because the department's chief information officer failed to establish a standard, departmentwide definition for cloud computing. In addition, the DoD CIO did not develop an integrated repository that could provide detailed information used to identify cloud computing service contracts, the report stated.

As a result, DoD has no way of determining if it is actually saving money by migrating to the cloud and may not be able to effectively identify and monitor cloud computing security risks, the report stated.

"DoD's ability to track cloud computing cost savings and benefits is greatly limited if DoD is not aware what cloud computing service contracts exist within DoD … [and] unless DoD Components accurately classify their information systems as using cloud computing services, DoD CIO will not be aware what security risks are specific to those services," the report stated.

The DoD IG found inconsistencies between a
list of cloud computing service contracts kept
by the CIO and the ones kept by the military
departments for fiscal 2011-14.
For example, the Army identified
nine contracts for its cloud computing
services in that period, while the DoD CIO
only identified three. Likewise, the Navy
identified zero contracts when the CIO had
two potential contracts.

Part of that problem may have been because
of the DoD’s lack of a repository for cloud
computing service contract information.

The DoD CIO uses four different IT reporting systems to gather information on DoD cloud computing. However, the systems are not integrated and do not provide the level of detail desired by the CIO, the report stated.


The DoD CIO is taking steps to remedy the situation by looking for ways to link the systems.

The DoD IG recommended the CIO issue
guidance to establish a department-wide
definition or clarify NIST’s definition. The
DoD CIO’s office responded, saying its DoD
Cloud Computing Security Requirements
Guide (SRG) “established a standard
definition of cloud as well as requirements
and processes for assessing cloud computing
security risks.”
The DoD IG disagreed with the assessment.

The IG also recommended the DoD CIO
establish an integrated repository of
cloud computing contracts. The CIO’s
office responded stating it implemented
enhancements to its systems to better collect
contract details.

The DoD CIO was contacted for further
comments on the report but stated it had
nothing to add.

In recent years, the military has made an effort to save money by hosting less of its material on its own drives and contracting with more private companies to provide cloud services. Private companies can procure the best available technology faster than DoD, which needs congressional appropriations. The DoD CIO last year gave the military services and department components the ability to procure their own cloud services independent of the department.



18F Shines Some Light on, but Concerns Remain


The General Services Administration's 18F organization rolled out a new service,, last October.

While there was some initial confusion about what exactly is, an 18F spokesman confirmed that is a platform-as-a-service based on the open source technology called Cloud Foundry.

Cloud Foundry is a not-for-profit supported by a who's who of private sector technology companies, such as IBM, VMware, Intel, EMC and many more. The organization's website says it is built for fast-cycle innovation of cloud applications and boasts higher rates of user adoption, faster cycle time and higher reliability.

This sounds good on paper. 18F is taking advantage of a no-cost or low-cost offering to help agencies move more quickly to the cloud.

" enables 18F to deploy its cloud-based applications with baseline security and scalability concerns addressed consistently up front, without dramatically scaling the number of cloud operations experts in our organization," an 18F spokesman said by email.

But if cloud services are widely considered a commodity, then why is the government competing with the private sector to offer them?

Six vendors have earned Joint Authorization Board (JAB) approvals for PaaS and nine vendors have received agency approvals, according to the Federal Risk and Authorization Management Program (FedRAMP) website.



So if 15 vendors and agencies are offering PaaS already, why did 18F presumably spend time and money to develop another competing platform? The 18F spokesman said isn't competing with the private sector.

"By using Cloud Foundry and making available, we're raising the 'lowest common denominator' capabilities agencies should expect from any vendor to the Cloud Foundry capabilities, and enabling greater competition," he said.

The spokesman added PaaS vendors not part of Cloud Foundry also will benefit because will increase the migration of government applications to a "12-factor model, and therefore the applicability of PaaS models to government."

“12-factor applications have very little
dependency on the exact platform that is
supporting them, so these applications will
be more easily ported to non-Cloud Foundry
commercial PaaS solutions as well,” he said.
“Vendors of such systems have open-book access
to how works, and can therefore
provide easy on-ramp solutions to migrate
government applications to their platform.”
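The "12-factor" portability the spokesman describes rests largely on pushing configuration out of the code and into the environment, so the same build runs unchanged on any platform. A minimal sketch (the variable names and defaults here are illustrative conventions, not anything 18F prescribes):

```python
import os

def load_config(env=os.environ) -> dict:
    """Factor III of the twelve-factor methodology: read all deploy-specific
    configuration from environment variables, never from the code itself."""
    return {
        "database_url": env.get("DATABASE_URL", "sqlite:///local.db"),
        "port": int(env.get("PORT", "8080")),
        "debug": env.get("DEBUG", "false").lower() == "true",
    }
```

Because nothing is hard-coded, the identical artifact can deploy to a Cloud Foundry platform, a commercial PaaS or a laptop just by changing environment variables, which is the easy on-ramp the spokesman is pointing to.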
Is 18F trying to be a PaaS cloud broker?

"It looks like they are giving you templates and ways to work with a particular CSP," said one federal CIO. "The more I looked at, the more I would like to know the details. It didn't seem like they were trying to duplicate what industry was doing, but help us get to the cloud."

So why are vendors and some in government so concerned? Is it a matter of better communication from 18F? Or does the entire concept of 18F make feds and contractors uncomfortable?

It will be worth watching to see if 18F can attract customers beyond the work they are doing for agencies. It also will be interesting to see when Congress decides to take a look at the 18F concept and whether the government is unfairly competing with the private sector.


Army Shops Cloud Vendors to Host Enterprise Apps

The Army has started polling prospective cloud computing vendors on their capabilities, laying the groundwork for a new contract vehicle that's likely to handle the lion's share of the service's cloud purchases over the next three years.

A request for information, issued December 30, 2015, asks companies to describe their approaches to dozens of support services the Army believes it will need as it transitions legacy applications from government data centers to commercially-operated ones between now and the end of 2018, including authentication, network monitoring and the ability to have secondary facilities take over in case of a system failure.

The overall project, dubbed Army Cloud Computing Enterprise Transformation (ACCENT), still is in the planning stages, but a draft request for proposals released last November envisions a two-step acquisition process in which cloud vendors would first qualify for spots on ACCENT via "basic ordering agreements" (BOAs) but with no guarantee of future work. They would then compete for individual projects at the task order level.




Functionally, the BOAs would operate much like the blanket purchase agreements government agencies routinely use to buy a variety of goods and services, said Doug Haskin, the project director for enterprise services within the Army's program executive office for enterprise information systems.

"But we think the BOA is better for the high-dollar contract actions that are associated with hosting services," he said in an interview with Federal News Radio. "It also gives us more flexibility to adapt to uncertain requirements when you don't know all the requirements up front, which is the position we're in now. Keeping that flexibility is important for us because we know that the guidance and policies for commercial cloud in DoD have been changing and will continue to change. I think it's also going to benefit industry, because if they're going to commit the resources to get onto this agreement they need to know it's going to be viable and not obsolete in a year or two."

The Army envisions using the contract to buy infrastructure-as-a-service, platform-as-a-service and software-as-a-service cloud offerings. Vendors would need to meet the governmentwide FedRAMP standards for cloud security and have DoD-specific provisional authorizations to sell their wares to the government, but the additional DoD security controls they would need to comply with would be set out task order by task order.

Additionally, companies would need to support DoD's public key infrastructure to authenticate users via their common access cards, make sure all the government data they're hosting is physically stored in the U.S., and be willing to open their facilities to DoD security teams at any time in case of a cybersecurity incident or criminal investigation. And under task orders issued for DoD's highest unclassified cloud security designation — Level 5 — all government data would have to be kept physically separate from that of the hosting providers' commercial clients.
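Supporting DoD's public key infrastructure amounts, at the transport layer, to mutual TLS: the server demands a client certificate (the credential a common access card carries) and trusts only approved certificate authorities. A hedged server-side sketch in Python, with hypothetical file names standing in for real credentials:

```python
import ssl

def make_mutual_tls_context(ca_bundle=None, cert=None, key=None):
    """Build a server-side TLS context that refuses any client unable to
    present a certificate signed by an approved certificate authority."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    if cert:
        ctx.load_cert_chain(certfile=cert, keyfile=key)  # the server's own identity
    if ca_bundle:
        ctx.load_verify_locations(cafile=ca_bundle)      # e.g. an approved CA bundle
    ctx.verify_mode = ssl.CERT_REQUIRED  # no client certificate, no connection
    return ctx
```

A real deployment would load the hosting provider's server certificate and the government's CA bundle; the point is that `CERT_REQUIRED` makes possession of a CA-signed client certificate a precondition of every connection.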


The Army plans to use the ACCENT contract to follow through on guidance the service's chief information officer issued in July of 2014, telling all Army components that they had until 2018 to migrate as many of their enterprise-level applications as possible to commercial environments.



NSA’s Move to the
Cloud Includes
Something Borrowed,
Something New




The National Security Agency is taking a page from commercial service providers as it sets up three different cloud services. But NSA is doing cloud computing in its own special way.

Greg Smithberger, the NSA chief information officer, said the agency is bringing the Intelligence Community IT Enterprise (ICITE) program to reality by taking some of the concepts of the commercial cloud and applying special cybersecurity technologies on top.

"This is very similar to the sort of commercial cloud offering that everyone is familiar with, but in this case we are offering it inside our very secure environment," he said. "We're


also providing a shared data storage cloud
for the intelligence community that allows us
to integrate data from across the community
while still maintaining that very fine-grained
access control and enforcing that need to know.
That’s based on a lot of that unique technology
developed at NSA. We also are providing shared
resources for the community that allows people
from across the community to run shared data
analytics on that shared data repository while
still ensuring users only see the data they’re
personally authorized to see.”
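The fine-grained, "need to know" control Smithberger describes can be modeled simply: every record in the shared repository carries its own access markings, and a query returns only records whose markings are fully covered by the caller's authorizations. A toy sketch (the marking names below are invented for illustration, not real IC labels):

```python
def visible_records(records, user_authorizations):
    """Return only the records for which the user holds every required marking."""
    allowed = set(user_authorizations)
    return [r for r in records if set(r["markings"]) <= allowed]

# A shared repository where each record declares its own access requirements
repository = [
    {"id": 1, "markings": ["PROJECT-A"]},
    {"id": 2, "markings": ["PROJECT-A", "PROJECT-B"]},
]
```

A user authorized only for PROJECT-A sees record 1 but never record 2, even though both sit in the same shared store; that is the property that lets agencies pool data without pooling access.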

Smithberger said all three clouds are operational, but NSA and its IC partners are expanding and scaling out these initial foundational capabilities. The end goal, he said, is to get capabilities delivered to the analysts that will let them more easily and more routinely collaborate across the community.

The rate of adoption is different throughout the IC. Smithberger said NSA is using the shared storage cloud and resources for analytics aggressively, while others in the IC are working those tools into their processes.

"It is successful, but we have a ways to go before we realize the DNI's full vision for the fully collaborative joint environment for the intelligence community, and it has other components than the cloud components," he said.

NSA also is planning to adopt the shared desktop environment led by the Defense Intelligence Agency and the National Geospatial-Intelligence Agency. He said it's similar to the virtual desktop environment currently in use at NSA.

"That will probably be in the next couple of years," Smithberger said.

NSA is preparing for a new contract as part of its
typical desktop refresh cycle. Smithberger said
he expects to have a few thousand users on the
shared desktop in the coming year, but the major
transition will not happen until the contract is
awarded in 2017 or 2018.



National Park Service
Bringing Online to
the Outside


The Interior Department's National Park Service celebrates its centennial anniversary in 2016. As part of the commemoration, NPS wants to bring new technology tools and services to national parks.

Shane Compton, the National Park Service's associate CIO, said the first step is to work with the telecommunications carriers and their subcontractors to bring the signal into places like the Grand Canyon or Yosemite National Park.

NPS is focusing on increasing bandwidth at national parks because it wants to provide visitors more content — whether photos or text or maps — about the monument or park.

"We are looking to help people to pick up their own signal when they are there with their mobile device," he said. "We are looking at partnering on testing new ideas. We have new tools out there that have been demonstrated at some of our national events, where we have things like cellular on wheels or high-speed microwave. Some of those connectivity options, as IT evolves, often get cheaper and smaller."

Compton said his office is evaluating what can be done and is developing a plan so that by 2018, high-speed connections are available at all national parks.

"We are asking the parks, our regions, to come up with why you think WiFi might actually be better for the customer experience. It may not be a better customer experience to hike out in the backcountry, but if you are in a visitors' center or looking at monuments, that is where that might be beneficial. We are trying to make some conscious decisions on where it will be the best for the public."


Compton said he’s working closely with the
mission areas to ensure a better customer
experience across all areas of the park

National Park Service Director Jonathan
Jarvis launched the Call to Action plan in
2011 in preparation for the centennial in
2016. Compton said many of the IT efforts to
focus on the customer are coming from that

“IT is becoming more and more important for
the customer experience,” he said. “I can tell
you the complaints we get when somebody
goes in to a park and finds a hotel doesn’t
have [the] high-speed WiFi so they can do
their work while they are on vacation. It’s an
expectation now, so IT has to make sure it’s
available.”

But it’s not just about the external customers.
Compton is focused on internal technology
improvements as well.

He said NPS is considering moving more and
more sites and applications to the cloud.

“NPS.gov is one of the largest federal
websites. People hit it every day. They may
not hit the home page, but they will do a
Google search for a park and be on our site.
Imagine a million people on the website at
one time on one day; the cloud is perfect for
that,” he said. “We just think it’s a matter of
time before we move a lot of what we have to
the cloud.”

Compton said NPS’ goal is to move off
platforms and current servers and put
those applications into the cloud as part of a
normal refresh life cycle.



With New Threats,
Law Enforcement
Agencies Look for More
Collaboration


Recent terrorist attacks and
intelligence operations
in Paris and Beirut are
underlining the need for
better information sharing
between U.S. law enforcement agencies.

Law enforcement leaders said they
understand the power that data can have
on their ability to respond to crises, but the
path to sharing that information and developing
interoperable systems to support it has been too slow.

“We get lots of information,” said Karl Mathias,
chief information officer and assistant director
of the information technology division at the
U.S. Marshals Service, during a November 2015
panel discussion at AFCEA Bethesda’s monthly
breakfast. “I have it from all over, from many
different sources, which is the problem. It’s
taking data and turning it into knowledge that
we can execute against, [that] is really the
struggle we face.”

Too often, government thinks
in terms of programs and
systems and the agencies that
own them, said Jeff Johnson, chief
technology officer and assistant director
for IT applications and data division at the FBI.

Instead, agencies should consider the
information itself and what purpose it will serve
toward the larger goal.

“If we focus in on that data and those attributes
that we really meaningfully want to protect, it
will enable us to more broadly share in the cloud,
because we will talk about what people have
legitimate, authorized access to what data,” he
said. “What data needs to be shared with which
communities, and which protections that goes
under. Right now, we are protecting systems. We
protect devices, and then we have exceptions to
those devices.”

The devices and systems law enforcement
agencies use to gather, process and share threat
data are improving, but also at too slow a pace.

The FBI’s Criminal Justice Information Services
(CJIS) division is improving collaboration
between law enforcement communities, Johnson
said. The hope is to add richer data and deliver
it closer to real time.

The Marshals Service is working with the Bureau
of Prisons to develop an automated, paperless
data sharing program, Mathias said.

The agency also hired a chief data officer to
work with Mathias in the CIO’s office on the data
analytics side.

For Mathias, his priorities lie in perfecting the
basic technologies and services his employees
need to do their jobs, like tablet computers,
desktop support and mobile phones.

Ultimately, Johnson said the real task is making
the data the FBI collects transparent, and
then equipping the right people with the right
information to make better decisions and
intercept possible threats.

“That information doesn’t belong to a system,”
Johnson said. “It doesn’t belong to a program.
It doesn’t belong to an agency. It belongs to the
taxpayers. It belongs to the citizens of the world,
and it must be shared. In order to get there,
we must start to reshape the way we define
programs, the way we fund programs and the
way we acquire programs in the government.”


