
Colegio de San Juan de Letran Calamba

ANALYSIS OF DATA PROTECTION PRACTICES OF DEPARTMENT OF EDUCATION


(DEPED) STO. ROSA BRANCH TOWARDS THE DEVELOPMENT OF CLOUD DATA BACKUP AND
RECOVERY STRATEGY

Romualdo B. Antonio

A Proposal Presented to the Graduate School and Professional Services Department as a


Requirement for Theories and Principles in Research and Statistics

Masters in Management - Major in IT Management

Dr. Jayjay Meneses

Calamba City, Laguna


May 26, 2018
Table of Contents
I. INTRODUCTION
    Structure and Rationale of the Study
    Research Questions and Objectives
    Significance of the Study
    Scope and Limitation
    Definition of Terms
II. REVIEW OF RELATED LITERATURE AND STUDIES
    International Studies
        Cloud Services
        Efficient and Reliable Data Recovery Technique in Cloud Computing
        Concerns and Issues in Cloud Storage
        A Survey of IT Professionals
        AWS Risk and Compliance
        Data Classification
        Classification is a Business Imperative
        Data Classification for Cloud Readiness
        Best Practices on Leveraging the Public Cloud for Backup and Disaster Recovery
        Data Protection Next Level
        Backup Solutions Guidance for Schools
        Government's Cloud First Policy
        Data Privacy: Encryption
    Synthesis
    Research Gap
    Theoretical Perspective
    Conceptual Framework
    Philosophical Underpinnings/Lenses
    Assumptions
III. RESEARCH METHODOLOGY
    Research Design
    Ethical Considerations
    Research Locale
    Sampling
    Instrumentation
    Data Gathering Procedure
    Data Analysis
Appendix A
I. INTRODUCTION

Structure and Rationale of the Study

Not all information is stored in our memory. Most of it is translated into digital form, which we know as data. For most people, data are files, documents, images, or any other format that contains important information and ideas. This information must be available whenever we need it and must be provided in a timely manner. We save our files on our computers and on other media, trusting that they will always be there to retrieve.

Our files, however, are not always there for us. Catastrophic events and malicious users can disrupt an organization's day-to-day operations and damage its files. Even government agencies are not exempt from natural and man-made disasters, and government files may be lost if not properly protected. Government agencies should therefore understand the importance of backing up their important data: no matter how efficient a computer system appears to be, there is always the possibility of a malfunction wiping out valuable data. Both onsite and offsite backup options are available.

The Philippine Government launched its national cloud service, GovCloud, which can provide cloud solutions to all government agencies, including the Department of Education. The creation of this private in-country data center is intended to ensure data security and make online information available. The Philippine government has also adopted a "cloud first" approach, which considers cloud computing as part of its primary infrastructure.

Government agencies should understand that cloud data storage is not the only solution for protecting information. Although cloud services have their advantages, supplementing them with a good backup strategy on local premises gives agencies an additional layer of protection.

The study may provide information on employees' awareness of cloud computing as data storage that is technologically advantageous and available anytime because of its distributed storage system. However, without proper data handling and classification, the benefits offered by cloud computing may not be maximized. The research can therefore provide practical suggestions to assist the IT group of each government department aiming to formulate a strategic approach and understand the requirements for effective data backup and recovery. Lastly, it can support more informed decision making.

The research can contribute to DepEd's aim of achieving a data backup and recovery procedure by utilizing cloud services with proper data handling and classification, supplemented with a local onsite backup strategy for a more holistic data backup approach.

Research Questions and Objectives

A government agency like the Department of Education (DepEd) Sto. Rosa branch saves a wide range of information and data files on its local network and in cloud storage.

General Objective:

The main objective of this study is to analyze the data protection practices of the DepEd Sto. Rosa branch and to develop a data backup and recovery strategy for it.

Research Objectives
i. To determine the current types of data and existing policies implemented in the DepEd office of Sto. Rosa.
ii. To assess the data classification and current data backup practices of DepEd Sto. Rosa.
iii. To recognize data risks and problems encountered by DepEd Sto. Rosa.
iv. To develop a cloud-based backup and recovery strategy for DepEd Sto. Rosa.

Research Questions
i. What is the current state of data and existing policies implemented in the DepEd office of Sto. Rosa?
ii. What are the data classification and current data backup practices of DepEd Sto. Rosa?
iii. What are the data risks and problems encountered by DepEd Sto. Rosa?
Significance of the Study

• To IT Professionals:
IT professionals who specialize in data protection may find the results useful for understanding DepEd's current data backup strategy. This can give them insights on how to formulate their own strategies, applicable not only to government agencies but also to private organizations planning to avail of cloud services for their data storage.
• To future researchers:
The study will serve as a guide in conducting similar studies and can give them the necessary information to refer to in their own research.

• To other government agencies:
The results and recommendations from this study may help them understand the broader range of data protection techniques available with government and private clouds. It can serve as a baseline if they want to consider data backup in cloud storage.

• To government employees:
They can benefit from the study by identifying a data backup approach to store and protect their data beyond their current cloud storage service.

Scope and Limitation

The study focuses on the data backup and recovery strategy of DepEd Sto. Rosa, including the cloud service provided by any private provider. The data protection strategy of DepEd Sto. Rosa consists of a data backup and recovery strategy secondary to its cloud storage, together with its security awareness in organizing data into categories of particular importance.

Data protection involves different techniques and technologies to provide the availability and security of data. The data protection discussed in this study is limited to local and cloud data backup and recovery, and to DepEd's data classification practices. The study does not cover other security technologies and software, DepEd's data disposal practices, implementation of authentication, role-based access, administrative controls, technology controls such as Data Loss Prevention (DLP) devices and encryption gateways, or risk mitigation techniques outside the research objectives.

Definition of Terms

• Software as a Service (SaaS): SaaS is a collection of applications and software; it allows clients to subscribe to the software instead of purchasing it. The software application is presented as a service to the customer based on demand.

• Platform as a Service (PaaS): This model provides a platform as a service, allowing clients to develop their own applications using the provided tools and programming languages.

• Infrastructure as a Service (IaaS): This model provides shared resource services. It provides computing infrastructure such as storage, virtual machines, network connections, bandwidth, and IP addresses. IaaS is a complete package for computing.

• Storage-as-a-service Used for Data Protection (STaaS/dp): a cloud service offering cloud storage as data storage.

• Backup-as-a-service (BaaS): a cloud service offering cloud storage as backup storage.

• Storage replication: the process of duplicating disk storage to a separate disk storage location.

• Volume management software: real-time disk mirroring achieved by software shipped with the operating system.

• Virtual storage: the concentration of multiple storage modules (such as disks and disk arrays) into a storage pool, managed in a unified way.

• Hot standby: hot standby based on two servers in a high-availability system.

• Data: all information generated or owned by DepEd (including, but not limited to, information generated or developed by DepEd's employees, contractors, and volunteers while performing their duties and responsibilities to DepEd, unless DepEd has waived its ownership rights to the data), as well as information not generated or owned by DepEd but which DepEd has the duty to manage. This information can exist in any form including, but not limited to, print, electronic, and digital.

• Data backup strategy: the determination of steps and actions to achieve data recovery and reconstruction objectives. It can be divided into specific data backup and operating system backup.

• Data disposal: as discussed in this paper, the policies, timeframes, and methods for the secure disposal of data.

• Data owner: the designated person at DepEd assigned as the owner and decision maker for a respective set of data. The data owner sets the appropriate data classification and determines the impact of the data on the organization.

• Data privacy: ensuring data is not misused, misappropriated, or publicly exposed by those who have authorized access to it.

• Data retention: as discussed in this paper, the policies, timeframes, and methods for storing, archiving, and retrieving data.

• Data retention policy: a policy that should reflect the data classification model and the data retention rules that apply to the data being retained.

• Data recovery: as discussed in this paper, the long-term storage of data and its retrieval when it needs to be returned to service.

• Data security: ensuring data is protected from unauthorized access or interception.

• GovCloud: cloud computing services provided by the government cloud.

• Public cloud: a cloud available to the general public over the internet.

• Private cloud: a cloud dedicated to a single organization.

• RTO (Recovery Time Objective): the maximum tolerable time the user is willing to wait to recover data after a data loss.

• RPO (Recovery Point Objective): the maximum tolerable amount of data the user is willing to lose.

• Drive encryption: also referred to as full disk encryption or whole disk encryption. Drive encryption solutions encrypt the entire hard drive, including the operating system, applications, drivers, and user data.
II. REVIEW OF RELATED LITERATURE AND STUDIES

International Studies

This chapter serves as the foundation for the development of the study. It discusses the relevant literature on data classification, cloud computing services, and data backup strategy, covering both operational backup and recovery of data.

Cloud Services

According to Praveen S. Challagidad (2017), cloud computing provides access to any kind of service dynamically over the Internet on an on-demand basis. One of the most significant services provided is storage as a service. Cloud customers can store any amount of data in cloud storage, which results in huge amounts of data at the datacenter.

Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources. It deals with data storage, application, and infrastructure using service-oriented technology.

Cloud Service Models

Software as a Service (SaaS): SaaS is a collection of applications and software; it allows clients to subscribe to the software instead of purchasing it. The software application is presented as a service to the customer based on demand.

Platform as a Service (PaaS): This model provides a platform as a service, allowing clients to develop their own applications using the provided tools and programming languages.

Infrastructure as a Service (IaaS): This model provides shared resource services. It provides computing infrastructure such as storage, virtual machines, network connections, bandwidth, and IP addresses. IaaS is a complete package for computing.

Cloud Deployment Models


• Public Cloud: A public cloud is available to any user with an internet connection. It is less secure than a private cloud because it can be accessed by the general public.
• Private Cloud: A private cloud is available to a specific organization so that only users who belong to that organization can access the data. It is more secure than a public cloud because of its private nature.
• Hybrid Cloud: A hybrid cloud is a combination of no fewer than two clouds, such as a combination of private, community, or public clouds.
• Community Cloud: A community cloud allows resources and systems to be accessed by a number of associated organizations.

Roles and responsibilities in cloud computing

Microsoft (2017) notes that cloud providers must have operational practices in place to prevent unauthorized access to customer data; any compliance requirements a customer organization has must also be supported by the provider. Although cloud providers can help manage risks, customers need to ensure that data classification management and enforcement are properly implemented to provide the appropriate level of data management services.

Although customers are responsible for classifying their data, cloud providers should make
written commitments to customers about how they will secure and maintain the privacy of the
customer data stored within their cloud. These commitments should include information about
• privacy and security practices,
• data use limitations,
• and regulatory compliance

Efficient and Reliable Data Recovery Technique in Cloud Computing

According to Praveen S. Challagidad (2017), data storage is one of the most significant services provided by cloud computing technology, but recovering lost data is one of the challenging issues in cloud computing. The original data can be recovered by using data recovery techniques; however, existing recovery techniques are not always efficient and reliable, so a technique is needed that meets both efficiency and reliability requirements.

Here are a few data backup and recovery techniques in cloud computing:
• Cloud mirroring technique. This uses a mirroring algorithm. The method provides high availability, data integrity, and data recovery while minimizing data loss. It can be applied to any kind of cloud, and the cost to recover the data is also low.
• Data backup and recovery technique. This technique provides data protection against service failure and also decreases the cost of the solution. Using this technique, the migration process becomes simple and cloud vendor dependency is removed. The authors proposed an effective data backup technique to recover data from the server in case of data loss; for every business, backing up data is essential to avoid data loss.
• In the study of Somesh P. Badhel, a Seed Block Algorithm (SBA) architecture with a remote backup server was proposed. The remote backup server is a replica of the original cloud server, physically situated at a remote location. The method is based on the Exclusive-OR (XOR) operation of digital computing: the SBA uses a random number and a unique client ID associated with each client.
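The XOR property behind SBA (a ⊕ b ⊕ b = a) can be sketched as follows. This is a minimal illustration, not the full SBA architecture: the function names, the single-file flow, and the way the seed block is derived from the client ID are all simplified assumptions.

```python
# Illustrative sketch of the XOR idea behind the Seed Block Algorithm (SBA).
# A seed block is derived from a random number and a client ID; a file stored
# in the main cloud is XORed with the seed block to produce the copy kept on
# the remote backup server. XORing again recovers the original file.
import os

def make_seed_block(client_id: int, size: int) -> bytes:
    """Derive a pseudo-random seed block for one client (hypothetical scheme)."""
    rng = bytearray(os.urandom(size))
    for i in range(size):
        rng[i] ^= (client_id >> (8 * (i % 4))) & 0xFF
    return bytes(rng)

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# Client's file stored in the main cloud:
original = b"DepEd enrollment records"
seed = make_seed_block(client_id=42, size=len(original))

# The remote backup server stores only the XORed block:
backup_block = xor_bytes(original, seed)

# If the main cloud copy is lost, XORing with the seed block recovers it:
recovered = xor_bytes(backup_block, seed)
assert recovered == original
```

Because XOR is its own inverse, the remote server never holds the plaintext file, yet the file is fully recoverable whenever the seed block is available.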

Important Indicators of a Cloud Disaster Recovery Center

• Disaster recovery capability: the ability to recover in a timely manner and continue operating, using disaster recovery resources and plans, after a disaster.
• Replication technology: the core technology of disaster recovery and the main storage technique involved. Through it, data backup between the primary data center and the disaster recovery center can be achieved, and it is the basis for maintaining data synchronization and remote disaster recovery. Replication technology is divided into synchronous and asynchronous replication.
• Data backup strategy: Data backup strategy refers to the determination of steps and actions to achieve data recovery and reconstruction objectives. It can be divided into specific data backup and operating system backup.
• Data snapshots: Snapshots are also one of the key storage technologies. The storage establishes a snapshot logical unit number and a snapshot cache for the backup data, with the software disk subsystem quickly scanning the data to be backed up.
• RTO and RPO: There are two key measures in the process of disaster recovery: the RTO and the RPO. RTO (Recovery Time Objective) refers to the time allowed from the standstill of information systems or business functions to their recovery after the disaster; it embodies how long a period of system downtime users can tolerate.
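As a worked illustration of how these two objectives drive a backup schedule (all figures here are hypothetical, not DepEd's actual targets): an RPO of one hour means backups must run at least hourly, because the worst-case data loss equals the time elapsed since the last successful backup.

```python
# Hypothetical sketch: checking a backup schedule against RPO/RTO targets.
# The figures are illustrative only.

def worst_case_data_loss_minutes(backup_interval_min: int) -> int:
    """Worst-case data loss is the full interval between backups."""
    return backup_interval_min

def meets_objectives(backup_interval_min, restore_time_min, rpo_min, rto_min):
    """True if the schedule satisfies both recovery objectives."""
    return (worst_case_data_loss_minutes(backup_interval_min) <= rpo_min
            and restore_time_min <= rto_min)

# RPO = 60 min, RTO = 240 min; hourly backups with a 3-hour restore pass:
print(meets_objectives(60, 180, rpo_min=60, rto_min=240))    # True
# Daily backups would violate the 60-minute RPO:
print(meets_objectives(1440, 180, rpo_min=60, rto_min=240))  # False
```

This is why the disaster recovery levels below are distinguished by whether RPO and RTO are at the hour level, the minute level, or tend to zero.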

Gang Li (2013) suggested design principles and a classification of disaster preparedness strategies. He suggests categorizing disaster recovery strategies to adapt to the needs of different users when applying the four design principles of a disaster recovery strategy. Generally, disaster recovery strategies are divided into:

1. File-level disaster recovery backs up data to the disaster recovery system by means such as backup software. When a disaster occurs, data and operating systems must be restored in order to complete the restoration of the system.
o RPO is at the minute level or higher, RTO is at the hour level or higher, and mis-operations cannot be restored for file-level disaster recovery.
File-level disaster recovery strategies include:
• Remote mount: directly attaching a disk of the disaster recovery center to the production host at the system level, using software shipped with the operating system.
• Virtual tape library: using hard disks to imitate tape library functions once hard disk storage costs drop to a certain extent.
• Backup software: directly backing up data to the disaster recovery center using third-party software.
• Cloud storage: a system that collects a large number of different types of storage devices in the network, working together through application software to provide data storage and service access functions in common. Cloud storage can serve very well as a disaster recovery strategy for a cloud disaster center when combined with other strategies.
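The file-level idea above (backup software copying files to a recovery location, then copying them back after a loss) can be sketched minimally as follows. The paths, the flat copy scheme, and the function names are hypothetical; real backup software adds catalogs, scheduling, compression, and retention policies.

```python
# Minimal file-level backup/restore sketch (hypothetical paths and layout).
import shutil
import tempfile
from pathlib import Path

def backup(source_dir: Path, recovery_dir: Path) -> int:
    """Copy every file under source_dir into recovery_dir; return file count."""
    count = 0
    for f in source_dir.rglob("*"):
        if f.is_file():
            dest = recovery_dir / f.relative_to(source_dir)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, dest)
            count += 1
    return count

def restore(recovery_dir: Path, source_dir: Path) -> int:
    """Copy files back from the recovery location after a disaster."""
    return backup(recovery_dir, source_dir)

# Demo with temporary directories standing in for the office and the DR site:
root = Path(tempfile.mkdtemp())
src, dr = root / "office", root / "dr_site"
src.mkdir(); dr.mkdir()
(src / "records.txt").write_text("enrollment data")

assert backup(src, dr) == 1          # one file backed up offsite
(src / "records.txt").unlink()       # simulate a disaster destroying the file
assert restore(dr, src) == 1         # recovered from the DR site
assert (src / "records.txt").read_text() == "enrollment data"
```

Note how recovery here requires copying data back before the system is usable again, which is why file-level RTO sits at the hour level or higher.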

2. Data-level disaster recovery backs up the data system environment to the disaster recovery system in real time using replication technology.
o If the user's operating system has no problem, users can remotely access the data system of the disaster recovery center directly when a disaster occurs. This guarantees real-time data system recovery and data integrity, but the operating system level runs only after recovery by other means.
o RPO tends to zero, RTO is at the hour level, and mis-operations can be restored for data-level disaster recovery.

Data-level disaster recovery strategies include:
• Storage replication: directly implementing data disaster recovery using synchronous or asynchronous replication software.
• Volume management software: real-time disk mirroring achieved by software shipped with the operating system.
• Virtual storage: concentrating multiple storage modules (such as disks and disk arrays) into a storage pool, managed in a unified way.

3. Application-level disaster recovery builds an identical application system at the disaster recovery site to ensure the consistency of system and data through replication technology. When a disaster occurs, the system can be switched to the disaster recovery site within a very short period to ensure business continuity.
o RPO tends to zero, RTO tends to zero, and mis-operations can be restored for application-level disaster recovery.

Application-level disaster recovery strategies include:
• Hot standby: hot standby based on two servers in a high-availability system.

Concerns and Issues in Cloud Storage

According to Rongzhi Wang (2017), there are problems with storing data in cloud services. In cloud storage, users store their data on cloud servers and storage devices that are under the management domain of the service provider. Users thus have no control in the event of equipment failure or misconfiguration that may lead to the loss, damage, or leakage of their sensitive data.

Throughout the entire storage life cycle of the data, the user cannot monitor the behavior of the cloud service provider. This raises three questions:
1. How can strong confidentiality of data be ensured?
2. How can it be ensured that, when an accident causes data loss, the user can still complete data recovery?
3. How can data be protected when it encounters malicious tampering?

Users are concerned that the confidentiality of their data cannot be assured. To achieve the required security, data needs to be encrypted locally before it is uploaded to cloud storage.
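The encrypt-before-upload step can be sketched as follows. To keep the example dependency-free, it uses a toy SHA-256 counter keystream purely for illustration; a real deployment would use a vetted cipher such as AES-GCM, and in either case the key never leaves the local machine.

```python
# Toy sketch of client-side encryption before cloud upload.
# SHA-256 in counter mode is used here ONLY to avoid external dependencies;
# production systems should use an authenticated cipher such as AES-GCM.
import hashlib
import os

def keystream(key: bytes, length: int) -> bytes:
    """Generate a keystream by hashing key || counter (illustrative only)."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR the plaintext with the keystream."""
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR stream ciphers are their own inverse

key = os.urandom(32)                  # generated and kept locally
record = b"learner reference numbers"
ciphertext = encrypt(key, record)     # only this leaves the local machine

assert ciphertext != record           # the provider sees only ciphertext
assert decrypt(key, ciphertext) == record
```

Because the provider stores only ciphertext, even an equipment failure or misconfiguration on the provider's side cannot leak readable sensitive data.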

In a distributed storage system, data redundancy is the most basic method of ensuring system reliability and improving data availability and persistence. By storing multiple instances of the same data (a whole file or a shard) on different storage nodes, data availability is strengthened: even if some nodes become unavailable, the remaining storage nodes can still recover the original data intact.
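The redundancy idea can be sketched with a simple replica map. The three-node layout, the replication factor of two, and the hash-based placement are all hypothetical choices for illustration.

```python
# Hypothetical sketch of n-way replication across storage nodes.
# With a replication factor of 2, losing any single node still leaves
# at least one surviving copy of every file.
REPLICATION_FACTOR = 2
NODES = ["node-a", "node-b", "node-c"]

def place_replicas(filename: str) -> list[str]:
    """Pick REPLICATION_FACTOR distinct nodes for a file (hash placement)."""
    start = hash(filename) % len(NODES)
    return [NODES[(start + i) % len(NODES)] for i in range(REPLICATION_FACTOR)]

def readable_after_failure(filename: str, failed_node: str) -> bool:
    """Is at least one replica still on a healthy node?"""
    return any(n != failed_node for n in place_replicas(filename))

files = ["grades.csv", "payroll.db", "forms.pdf"]
# Every file survives the loss of any one node:
for failed in NODES:
    assert all(readable_after_failure(f, failed) for f in files)
```

Raising the replication factor trades extra storage cost for tolerance of more simultaneous node failures, which is the basic dial a distributed store offers for availability.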

According to Praveen S. Challagidad (2017), the data recovery process presents some issues, discussed below.

1. Data storage: Enterprises store large amounts of data in the cloud. To provide security, computing is distributed but storage is centralized; therefore, single points of failure and data loss are critical challenges of storing data in the cloud.
2. Data security: Users store huge amounts of data in the cloud, and the stored data may be confidential and sensitive. Providing security for these data is important.
3. Lack of redundancy: If the cloud is destroyed for any reason, a secondary site is activated to provide the data to users when the primary storage fails to do so.
4. Dependency: Customers do not have control over their systems and data. A backup service is provided to overcome this drawback.

A Survey of IT Professionals

Druva is a leader in cloud data protection and information management, leveraging the public cloud to offer a single pane of glass to protect, preserve, and discover information, dramatically increasing the availability and visibility of business-critical information while reducing the risk, cost, and complexity of managing and protecting it.

Research sponsored by Druva (2015) surveyed 214 IT professionals responsible for corporate data. The goal of the survey was to understand attitudes, approaches, and challenges in ensuring the privacy of corporate data. The key findings were:

Data privacy is important, but organizations cannot depend on employees to address it
• 99% have sensitive data
• 84% report data privacy is increasing in importance in 2015
• 82% have employees who don't follow data privacy policies

International requirements are making data privacy even more challenging to manage
• 93% face challenges ensuring data privacy
• 91% have data privacy controls, but those controls are incomplete
• 77% find it challenging to keep up with regional requirements for data privacy

Privacy isn’t viewed as a separate priority, and most resources are on external threats
• Only 20% separate data privacy and data security
• 72% put more effort into coping with threats from external sources than internal
sources

Cloud data is growing, but privacy concerns persist
• 88% expect cloud data volume to increase in 2015
• 95% have sensitive data in the cloud
• 87% are concerned about the privacy of data in the cloud

Enterprise Strategy Group (ESG) also surveyed organizations on the common uses of cloud infrastructure services. ESG is an integrated IT research, analyst, strategy, and validation firm renowned for providing actionable insight and intelligence to the global IT community.

According to ESG's Jason Buffington (2016) survey, backup and disaster recovery are two of the most frequent uses of cloud storage today, serving as a secure offsite data repository, as shown in Figure 1. Other respondents said their organizations are considering cloud services to support remote data survivability, save money, and enhance their recovery ability.

Figure 1. Top Five Cloud Infrastructure Uses

According to ESG, over the five-year span from 2012 to 2017, tape use for primary backup appears to decrease from 56% to 45% while cloud use increased. Essentially, these findings indicate that although cloud usage for data protection is increasing significantly, tape usage is not declining at a corresponding rate. Organizations that store their data in traditional tape systems intend to retain it for six years or more (often much more); however, most of those organizations also say they would store data in the cloud for three years or less.

Many organizations should consider the cloud as an addition to any existing disk-plus-tape backup strategy and not as a substitute for tape backup for long-term retention. ESG describes two solution types that are widespread in the industry today:

1. Storage-as-a-service used for data protection (STaaS/dp) is the easiest method of incorporating cloud services into a data protection strategy.
2. Backup-as-a-service (BaaS) uses the same kinds of enterprise-class data protection software that previously were available only for on-premises use. It can offer some advantages as a backup service, including:
a. remotely managed and monitored data protection jobs, and
b. improved reliability of the backup and recovery process due to the expertise provided by the service provider partner.

AWS Risk and Compliance

Amazon Web Services (AWS) is a subsidiary of Amazon.com that provides on-demand cloud computing platforms to individuals, companies, and governments on a paid subscription basis. In terms of controlling the overall IT environment, AWS and its customers share responsibility for managing it, providing services on a highly secure and controlled platform. The customer's responsibility includes configuring their IT environment in a secure and controlled manner for their purposes. While customers don't communicate their use and configurations to AWS, AWS does communicate its security and control environment as relevant to customers.

Shared Responsibility Environment

The customer shoulders responsibility for managing the guest operating system, including system updates and security patches, as well as other associated application software and the configuration of the AWS-provided security group firewall. Customers can enhance their security to meet their compliance requirements by using technology such as host-based firewalls, host-based intrusion detection/prevention, encryption, and key management.

Strong Compliance Governance

AWS customers are required to maintain adequate governance over their entire IT control
environment. Leading practices include an understanding of required compliance objectives
and requirements (from relevant sources), establishment of a control environment that meets
those objectives and requirements, an understanding of the validation required based on the
organization's risk tolerance, and verification of the operating effectiveness of the control
environment.
Evaluating and Integrating AWS Controls

AWS provides a wide range of information regarding its IT control environment to customers
through white papers, reports, certifications, and other third-party attestations. This
documentation assists customers in understanding the controls in place relevant to the AWS
services they use and how those controls have been validated.

AWS IT Control Information

AWS provides IT control information to customers in the following two ways:


1. Specific control definition. AWS customers can identify key controls managed by AWS.
Key controls are critical to the customer's control environment and require an external
attestation of their operating effectiveness to satisfy compliance requirements, such as
the annual financial audit.
2. General control standard compliance. If an AWS customer requires a broad set of
control objectives to be met, an evaluation of AWS's industry certifications may be
performed.

Control Environment

AWS manages a comprehensive control environment that includes policies, processes and
control activities that leverage various aspects of Amazon’s overall control environment. This
control environment is in place for the secure delivery of AWS’ service offerings. The collective
control environment encompasses the people, processes, and technology necessary to
establish and maintain an environment that supports the operating effectiveness of AWS’
control framework.
Data Classification

Data Classification – Taking control of your data


(Happiest Minds, Thiruvadinathan – 2017)

Data classification is one of the most fundamental ingredients of an effective and efficient
information security strategy. It directly impacts decisions, procedures and practices on what
kind of data is collected and how it is stored, used, protected, shared or disclosed. In very
simple terms, it is an exercise carried out to understand the nature, type, criticality, sensitivity
and other attributes of the data in order to distinguish between good and bad data, and
between important and non-important data.

This article focuses on:

• the key drivers for data classification,
• a high-level process to design and implement a program, and
• the challenges involved and the long-term benefits.

Key steps for beginning a Data Classification program


• Plan a pilot project for a given business process
• Identify data that the process requires
• Determine attributes such as sensitivity and criticality of data
• Identify the regulatory requirements
• Identify the requirements for security
• Determine the applicable classes or levels of classifications

Steps to be taken following a successful classification policy


• Determine how classified data will be collected, verified, created, processed, shared,
stored, transmitted, archived and disposed of
• Define security requirements for each of these stages
• Engage with data owners, custodians and users to raise awareness and monitor
internal compliance; external compliance can then fall into place naturally
• Build a strong case for management to sponsor an enterprise-wide roll-out
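The planning and follow-up steps above can be sketched as a small policy structure that records, for each data class, the security requirements at each lifecycle stage. This is a minimal illustrative sketch; the class names, stage names and control names below are hypothetical and not taken from the cited article.

```python
# Illustrative classification policy: data class -> lifecycle stage -> controls.
# All class, stage and control names here are hypothetical examples.
LIFECYCLE_STAGES = [
    "collect", "verify", "create", "process", "share",
    "store", "transmit", "archive", "dispose",
]

CLASSIFICATION_POLICY = {
    "confidential": {
        "store": ["encryption-at-rest", "role-based-access"],
        "transmit": ["tls", "dlp-scan"],
        "dispose": ["secure-wipe"],
    },
    "internal": {
        "store": ["role-based-access"],
        "transmit": ["tls"],
        "dispose": ["standard-delete"],
    },
    "public": {},
}

def required_controls(data_class: str, stage: str) -> list[str]:
    """Return the security controls required for a data class at a stage."""
    if stage not in LIFECYCLE_STAGES:
        raise ValueError(f"unknown lifecycle stage: {stage}")
    return CLASSIFICATION_POLICY.get(data_class, {}).get(stage, [])

print(required_controls("confidential", "transmit"))  # ['tls', 'dlp-scan']
```

A structure like this makes the "define security requirements for each stage" step auditable: every class-stage pair either names its controls explicitly or is visibly empty.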

Key Success Factors


• An effective mechanism to locate and determine critical and sensitive data for mission
success
• Identification of relevant regulatory and statutory requirements for data protection
• Identification of control deficiencies and excessive controls, leading to effective and
efficient security investments in controls, monitoring, review, etc.
• Determination of the Recovery Point Objective (RPO) of data, enabling assessment of
business impact and risks
• Identification of minimum security requirements for data when stored, in use and
transmitted, as well as user access control, retention and disposal requirements, audit,
review, monitoring, etc.
• Increased ability to respond effectively and efficiently to data discovery requirements
arising out of any legal proceedings.
• Increased ability to understand newer dimensions of the data, i.e., metadata, to identify
new opportunities for data use

DATA CLASSIFICATION: The First Step to Protecting Unstructured Data.


(QinetiQ Company, Boldon James- 2015)

WHERE DO YOU START? BEFORE YOU CAN DEFEND, YOU MUST DISCOVER

Before you know what protection your data requires, you need to know what you've got, where
it's stored, why you have it and who has access to it. Once you've got to grips with that, you can
identify what is of true value to the organization – what's business-critical and what's sensitive
– and then how best to treat it. This valuable data might include intellectual property such as
product designs and formulas, strategic plans, personal details, contracts and agreements,
regulated documents and plans for investment. Think about what the impact would be if a
piece of information was leaked or lost. If it was made public:
• Would it harm the business, or your customers, partners or suppliers?
• Would it put an individual’s security or privacy at risk?
• Would you lose advantage if a competitor got hold of it?
• Is it subject to any privacy or data laws, or compliance regulations?
• Would its loss breach a contract or agreement?
• Would it incur a cost?
• Would it damage the brand?
• Would you lose your job?

BEST PRACTICE: USER-DRIVEN DATA CLASSIFICATION


(QinetiQ Company, Boldon James- 2017)

Data classification involves the user attaching an appropriate identifier or label to a message,
document or file, to give the data a value and let other users know how it should be handled or
shared. By classifying data according to its value to the business, organizations can develop
more effective data-centric security approaches that safeguard against accidents and reduce
risk. Using classification tools to implement the approach allows data security controls, rules
and policies to be more consistently enforced. These tools apply clear, consistent electronic
markings to any type of file and message – for instance ‘commercial in confidence’, ‘internal
only’, ‘public’ – and then allow it to be saved or sent only in accordance with the rules that
correspond to that marking.

Metadata: the magic ingredient

Most importantly, as well as attaching the label in a visual form, classification tools apply it as
metadata, embedding a tag into document or file properties that stays with it wherever it goes.
This helps shield the business against accidental data loss – for example, a diligent employee
emailing a sensitive document to their home PC to work on at the weekend or someone saving
a confidential file in a public folder with the slip of a key. Attaching the label in two different
forms means the value of the data is clearly displayed to the user, while the metadata can be
used to direct other security and data management solutions downstream.
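The dual visual-plus-metadata labelling described above can be pictured with a small helper that prepends a visible banner to a document and writes the same label into a machine-readable sidecar. This is a hedged sketch, not how any particular classification product works; the label names and the sidecar file layout are assumptions for illustration (real tools embed the tag in the file's own properties).

```python
import json
import tempfile
from pathlib import Path

# Hypothetical label set, kept deliberately small per the guidance above.
ALLOWED_LABELS = {"public", "internal only", "commercial in confidence"}

def classify_file(path: Path, label: str) -> None:
    """Attach a classification label visually and as persistent metadata."""
    if label not in ALLOWED_LABELS:
        raise ValueError(f"unknown label: {label}")
    # Visual marking: prepend a banner line so users see the label.
    path.write_text(f"[CLASSIFICATION: {label.upper()}]\n" + path.read_text())
    # Persistent metadata: a machine-readable tag that downstream security
    # tools (DLP, gateways) can read wherever the file travels.
    Path(str(path) + ".meta").write_text(json.dumps({"classification": label}))

# Demo on a temporary file
doc = Path(tempfile.mkdtemp()) / "plan.txt"
doc.write_text("Q3 strategy notes\n")
classify_file(doc, "internal only")
print(doc.read_text().splitlines()[0])  # [CLASSIFICATION: INTERNAL ONLY]
```

Attaching the label in two forms, as the article notes, lets humans and downstream systems act on the same classification decision.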

Introducing the human element

The most effective approach involves the user in the process. The employee themselves places
the identifier on the information at the point of sending or saving it, deciding which
classification to apply within a context – something a computer just can’t do with any real
accuracy.

For user-driven data classification to work, you need to set a clear policy that enables users to
make fast and intuitive decisions about how each document, message or file should be marked.
Use clear labels and terminology that will be instantly recognizable and meaningful in the
business context, and keep the number of different identifiers to a minimum.
Information Classification - Who, Why and How
(SANS Institute, Susan Fowler - February 2003)

Companies need to protect their information today more than ever; the increasing need to
protect customer and financial information is obvious. The signs are prevalent in the news, in
publications, and in recent business and world events. The most compelling reason to classify
information is to satisfy regulatory mandates. Although information classification is not
specified as a required protection measure, it is implied by special handling requirements for
sensitive, medical and financial information.

Some companies also have contractual commitments to protect information according to
customer or business partner specifications. The obvious benefit of satisfying regulatory and
legal requirements is that it minimizes the risk of financial penalties for non-compliance.

Information classification goals


Having established that companies should classify their data, it is important to understand
what an effective information classification system should accomplish: namely, to categorize
information so as to communicate company-endorsed safeguards for information
confidentiality, integrity and availability. An effective data classification system should also be
easy to understand, use and maintain.

Implementing Information Classification

Approach for classifying information. There are many ways to implement an information
classification system. The key is to facilitate employee compliance with company-endorsed
information protection measures. To successfully implement information classification, a
company must transition from recognizing that it should classify its data to recognizing that it
can.

Step 1. Identify all information sources that need to be protected. If information sources
haven’t been compiled for other initiatives, the best sources might be developers, operating
system and database administrators, business champions, and departmental and senior
managers.

Step 2. Identify information protection measures that map to information classes. Some
common, industry-recognized information protection measures are highlighted below; their
applicability to your company depends on its business needs and information protection goals.
• Authentication
• Role based access
• Encryption
• Administrative controls. Administrative controls are also used to ensure the integrity of
information. These controls are often presumed to be implemented but may not be
because of high administrative overhead. Examples of these are formal change controls,
separation of duties, rotation of duties and cross training.
• Technology controls. There are also technology-specific controls such as virus protection;
disk, system and application redundancy; and network segregation.
• Assurance. Validating that systems are safeguarded is also a level of protection.
Examples are policy compliance monitoring, code walkthroughs, intrusion detection,
system performance monitoring, transactional monitoring, administrative monitoring,
and file access monitoring.

Step 3. Identify information classes. Information class labels should convey the protection goals
being addressed.
Step 4. Map information protection measures to information classes.
Step 5. Classify information.
Step 6. Repeat as needed.

Classification is a Business Imperative


(Titus – 2014)

1. Data Security

While technology has the potential to help enforce data security policies, without a pervasive
culture of data security users fail to use the technology properly. Whether due to a lack of
training or to the complexity of the security tools, studies continue to show that employees
frequently violate information security protocols.

To secure data, senior executives must set the foundation for a culture of information
protection, which includes executive support and involvement, user training and guidance, easy
to use technology, and data classification.

Classification is foundational to securing your information as it allows users to quickly and easily
indicate the value of the data to the organization. The classification is applied as visual markings
(to alert end users), and persistent metadata (to inform security technology systems).
2. The Data Security Imperative

The propagation of data-sharing tools such as email, social media, mobile device access and
cloud storage is making it harder for IT and data security departments to keep sensitive
information from moving outside the network perimeter. The reality is that the data security
perimeter is forever changed, as data is accessed and stored in multiple locations.

It is important to note that the insider threat is not just a malicious user or disgruntled
employee, but could also be trustworthy employees who are just trying to work more
efficiently. When workers are unfamiliar with correct policy procedures and there are no
systems in place to train, inform, and remind them, they engage in risky information handling. If
your users don’t understand the value of the data they are using they are likely to see the
technology as an impediment to their workflow, and actively seek methods to circumvent
security.

3. The Security Culture Imperative


Without executive guidance, security is transferred to IT departments that are already
struggling for proper data security funding. When senior executive sponsorship is
communicated directly to the employees it is less likely that the employees will find excuses to
resist the change.

When the CEO communicates to employees the importance of security for their jobs as well as
for the organization, employees are much more likely to comply. Once users are on board in
principle, it is important to follow up with tools that are easy to use and that provide
immediate feedback with corrective suggestions when there is a violation.

4. The Classification Imperative


Classification is the crucial foundation to data security as it allows users to identify data, adding
structure to the increasing volumes of unstructured information. When data is classified,
organizations can raise security awareness, prevent data loss, and comply with records
management regulations.
Classification is effective because it adds “metadata” to the file. Metadata is information about
the data itself, such as author, creation date, or the classification. When a user classifies an
email, a document, or a file, persistent metadata identifying the data’s value is embedded
within the file. By embedding classification metadata, the value of the data is preserved no
matter where the information is saved, sent, or shared.

Classification can also aid where compliance legislation regulates the protection and retention
of company records. By providing structure to otherwise unstructured information,
classification empowers organizations to control the distribution of their confidential
information in accordance with mandated regulations.

Data Classification for Cloud readiness


(Microsoft Trustworthy Computing – 2017)

Data classification has been used for decades to help large organizations such as Microsoft,
governments and military entities manage the integrity of their data. Risk assessments are
sometimes used by organizations as a starting point for data classification efforts.

Data classification fundamentals

Successful data classification in an organization requires broad awareness of the organization's
needs and a thorough understanding of where the organization's data assets reside. Data
exists in one of three basic states: at rest, in process, and in transit. All three states require
unique technical solutions for data classification, but the applied principles of data
classification should be the same for each. Data that is classified as confidential needs to stay
confidential at rest, in process, and in transit. At the same time, data can be either structured
or unstructured.
• Structured data, found in databases and spreadsheets, is less complex and time-
consuming to manage than unstructured data.
• Unstructured data includes documents, source code, and email.

Customers can verify the effectiveness of their cloud provider's practices. Having this
information helps customers understand whether the cloud provider supports the data
protection requirements mandated by their data classification. However, to achieve
compliance, such organizations need to remain aware of their classification obligations and to
manage the classification of the data that they store in the cloud.

Three levels of classification sensitivity are described below.
Confidential (restricted). Information that is classified as confidential or restricted includes
data that can be catastrophic to one or more individuals and/or organizations if compromised
or lost. Such information is frequently provided on a “need to know” basis and might include:

• Personal data, including personally identifiable information such as Social Security or
national identification numbers, passport numbers, credit card numbers, driver's license
numbers, medical records, and health insurance policy ID numbers.
• Financial records, including financial account numbers such as checking or investment
account numbers.
• Business material, such as documents or data that constitutes unique or specific
intellectual property.
• Legal data, including potential attorney-privileged material.
• Authentication data, including private cryptography keys, username password pairs, or
other identification sequences such as private biometric key files.

For internal use only (sensitive). Information that is classified as being of medium sensitivity
includes files and data that would not have a severe impact on an individual and/or
organization if lost or destroyed.

• Email, most of which can be deleted or distributed without causing a crisis (excluding
mailboxes or email from individuals who are identified in the confidential classification).
• Documents and files that do not include confidential data.

Public (unrestricted). Information that is classified as public includes data and files that are not
critical to business needs or operations.
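The three levels above can be pictured as an ordered scheme in which a record inherits the highest sensitivity of any data type it contains. This is a hedged sketch; the type-to-level mapping and the "default to internal" rule are assumptions for illustration, not part of the Microsoft paper.

```python
# Ordered from least to most sensitive, following the three levels above.
LEVELS = ["public", "internal", "confidential"]

# Hypothetical mapping of data types to levels (illustrative only).
DATA_TYPE_LEVEL = {
    "press_release": "public",
    "routine_email": "internal",
    "ssn": "confidential",
    "credit_card": "confidential",
    "private_key": "confidential",
}

def classify_record(field_types: set[str]) -> str:
    """A record takes the highest level of any field it contains;
    unknown types default to 'internal', per the auto-classification idea."""
    found = [DATA_TYPE_LEVEL.get(t, "internal") for t in field_types]
    return max(found, key=LEVELS.index)

print(classify_record({"routine_email", "ssn"}))  # confidential
```

The highest-level-wins rule matches the principle that confidential data must stay confidential regardless of what else travels with it.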

Define data ownership


It’s important to establish a clear custodial chain of ownership for all data assets.

• The data asset owner is the original creator of the data, who can delegate ownership
and assign a custodian. When a file is created, the owner should be able to assign a
classification, which means that they have a responsibility to understand what needs to
be classified as confidential based on their organization’s policies. All of a data asset
owner’s data can be auto-classified as for internal use only (sensitive) unless they are
responsible for owning or creating confidential (restricted) data types. Frequently, the
owner’s role will change after the data is classified. For example, the owner might
create a database of classified information and relinquish their rights to the data
custodian.

• An administrator represents a user who is responsible for ensuring that integrity is
maintained, but who is not a data asset owner, custodian, or user. In fact, many
administrator roles provide data container management services without having access
to the data. The administrator role includes backup and restoration of the data,
maintaining records of the assets, and choosing, acquiring, and operating the devices
and storage that house the assets.
• The asset user includes anyone who is granted access to data or a file. Access
assignment is often delegated by the owner to the asset custodian.

Protecting confidential data

After data is classified, finding and implementing ways to protect confidential data becomes an
integral part of any data protection deployment strategy. Protecting confidential data requires
additional attention to how data is stored and transmitted in conventional architectures as well
as in the cloud.

The data asset custodian is assigned by the asset owner (or their delegate) to manage the asset
according to agreements with the asset owner or in accordance with applicable policy
requirements. Ideally, the custodian role can be implemented in an automated system. An
asset custodian ensures that necessary access controls are provided and is responsible for
managing and protecting assets delegated to their care. The responsibilities of the asset
custodian could include:
• Protecting the asset in accordance with the asset owner's direction or in agreement
with the asset owner
• Ensuring that classification policies are complied with
• Informing asset owners of any changes to agreed-upon controls and/or protection
procedures before those changes take effect
• Reporting to the asset owner about changes to or removal of the asset custodian's
responsibilities

Rights management software


One solution for preventing data loss is rights management software. Unlike approaches that
attempt to interrupt the flow of information at exit points in an organization, rights
management software works at deep levels within data storage technologies. Documents are
encrypted, and control over who can decrypt them relies on access controls defined in an
authentication solution such as a directory service.

Encryption gateways can provide a means to manage and secure data that has been classified
as confidential by encrypting data in transit as well as data at rest.
• This approach should not be confused with that of a virtual private network (VPN);
encryption gateways are designed to provide a transparent layer to cloud-based
solutions.
• Encryption gateways are placed into the data flow between user devices and application
data centers to provide encryption/decryption services.

Data loss prevention (DLP) technologies can help ensure that solutions such as email services
do not transmit data that has been classified as confidential. Organizations can take advantage
of DLP features in existing products to help prevent data loss.

DLP technologies can perform deep content analysis through keyword matches, dictionary
matches, regular expression evaluation, and other content examination to detect content that
violates organizational DLP policies. For example, DLP can help prevent the loss of the following
types of data:
• Social Security and national identification numbers
• Banking information
• Credit card numbers
• IP addresses
Some DLP technologies also provide the ability to override the DLP configuration (for example,
if an organization needs to transmit Social Security number information to a payroll processor).
In addition, it is possible to configure DLP so that users are notified before they attempt to
send sensitive information that should not be transmitted.
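The deep content analysis described above can be approximated with regular expressions plus a validity check. This sketch detects two of the listed data types: US-style Social Security numbers by pattern, and credit card numbers verified with the standard Luhn checksum. Real DLP engines use far richer dictionaries, context analysis and policy logic; the patterns here are simplified illustrations.

```python
import re

# Simplified patterns for illustration; production DLP rules are broader.
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number: str) -> bool:
    """Standard Luhn checksum used to validate card-number candidates."""
    digits = [int(d) for d in re.sub(r"\D", "", number)][::-1]
    total = sum(d if i % 2 == 0 else (d * 2 - 9 if d * 2 > 9 else d * 2)
                for i, d in enumerate(digits))
    return total % 10 == 0

def scan_for_violations(text: str) -> list[str]:
    """Return the DLP policy labels triggered by the given text."""
    hits = []
    if SSN_RE.search(text):
        hits.append("ssn")
    # Luhn filtering cuts false positives from arbitrary digit runs.
    if any(luhn_valid(m.group()) for m in CARD_RE.finditer(text)):
        hits.append("credit_card")
    return hits

print(scan_for_violations("Card 4111 1111 1111 1111, SSN 123-45-6789"))
# ['ssn', 'credit_card']
```

A scanner like this would sit in the email or upload path and block, quarantine or warn (per the override behaviour described above) when a hit occurs.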

Best Practices for Leveraging the Public Cloud for Backup and Disaster Recovery

The following general best practices are suggested for leveraging the public cloud to
supplement current backup and disaster recovery solutions.

1. Think of local protection as the first line of defense. When it comes to performing
backup and recovery, the best performance—99 percent of the time—will be delivered
by using resources local (on premise) to the systems and data being protected.
2. Identify the systems, and the dependencies of those systems, that are critical to the
business. If local protection is the first line of defense, public cloud-based protection is
the second line of defense.
3. Don’t think only of traditional backup for disasters. Consider the ability to use
replication technologies to provide continuous data protection locally and in the cloud.
But remember that replication, although a great complement, is never a replacement
for backups.
4. Think about how you want to restore data, and back up to meet that goal. Backing up
the system and all its storage will protect everything on that OS instance, which is ideal
when restoring the entire environment in bare-metal recovery scenarios.
5. Backup at the hypervisor level may not always be enough for the best restoration
experience—consider running backup agents within the VM OS instead of just on the
virtualization host.
6. Long-term backup storage in the cloud. Data is stored for many reasons, the most
important of which is long-term archiving for corporate needs and to meet regulatory
requirements.
7. Ensure the security of public cloud data. Verify the security used in the cloud solution:
the physical security of the public cloud locations, encryption of data at rest on the
storage, logical separation of your organization's data from that of other organizations
using the same public cloud backup provider, and the encryption used to protect the
data during transmission.
8. Consider running recovery directly in the cloud. Look at options to run your systems in
virtual environments in public cloud virtual machine hosting solutions, using the
systems and data backed up in the public cloud. This approach allows your operations
to be up and running again even without your own datacenter.
9. Unified backup and management. Consider leveraging a single solution that supports a
hybrid model and enables a single management approach.
10. Test the processes periodically and any time a significant change occurs in the
infrastructure. Perform regular tests to ensure that restore processes work and that the
protected data is valid.
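Best practice 10 above, regularly proving that restores actually work, can be sketched as a checksum round trip: back up a file together with its hash, restore it to a different location, and confirm the restored bytes match. This is a simplified stand-in for a real backup product's verification job; the directory layout and file names are illustrative.

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256(path: Path) -> str:
    """Content fingerprint used to prove the restore is byte-identical."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup(src: Path, vault: Path) -> None:
    """Copy a file into the vault and record its checksum alongside it."""
    shutil.copy2(src, vault / src.name)
    (vault / (src.name + ".sha256")).write_text(sha256(src))

def verify_restore(name: str, vault: Path, restore_dir: Path) -> bool:
    """Restore from the vault and confirm the bytes match the recorded hash."""
    restored = restore_dir / name
    shutil.copy2(vault / name, restored)
    return sha256(restored) == (vault / (name + ".sha256")).read_text()

# Demo: back up, restore into a different directory, and verify
base = Path(tempfile.mkdtemp())
vault = base / "vault"; vault.mkdir()
restore = base / "restore"; restore.mkdir()
doc = base / "grades.csv"
doc.write_text("student,grade\nana,90\n")
backup(doc, vault)
print(verify_restore("grades.csv", vault, restore))  # True
```

Running a check like this on a schedule, and after any infrastructure change, is what turns a backup into a verified recovery capability.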

Data Protection Next Level

Traditional Data Protection Solutions deploy resource-intensive backup agents in the physical
server, which copy and move data from production storage to a back-end disk or tape
environment. This worked well for physical environments with limited storage capacity but is
not sufficient for a virtualized environment with high utilization rates. As such, a modern data
protection strategy is necessary.

Comprehensive Approach to Data Protection:


1. Reduce the Volume of Data Copied for Protection
• Move inactive data to a content store with built-in protection
• Reduce the amount of storage needed for protection
• Require less time for protection and recovery of less data

2. Application-aware storage-based Protection


• Leverage storage-based In-System and Remote Replication technologies
• Create, and recover from, copies instantaneously
• Interface with application for consistency / granularity

3. Protect Data Continuously and Apply Incremental Forever Strategies


• Copy data as soon as it is created
• Create multiple virtual copies from this data stream
• Discard copies / changes that are not needed for recovery / use
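The "incremental forever" idea above, protecting data as it is created and copying only what changes thereafter, can be sketched with a hash catalog: a file is scheduled for protection only when its content hash differs from the last recorded one. The file layout and catalog shape are illustrative assumptions, not a specific vendor's design.

```python
import hashlib
import tempfile
from pathlib import Path

def file_hash(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def incremental_backup(source: Path, catalog: dict[str, str]) -> list[str]:
    """Return the copy-list for one incremental-forever pass: only files
    that are new or whose content hash differs from the catalog."""
    changed = []
    for f in sorted(source.rglob("*")):
        if not f.is_file():
            continue
        key = str(f.relative_to(source))
        h = file_hash(f)
        if catalog.get(key) != h:
            changed.append(key)
            catalog[key] = h  # later passes skip unchanged files
    return changed

# Demo: full first pass, empty second pass, then one changed file
base = Path(tempfile.mkdtemp())
(base / "a.txt").write_text("first")
(base / "b.txt").write_text("second")
catalog: dict[str, str] = {}
first = incremental_backup(base, catalog)   # ['a.txt', 'b.txt']
second = incremental_backup(base, catalog)  # []
(base / "a.txt").write_text("changed")
third = incremental_backup(base, catalog)   # ['a.txt']
print(first, second, third)
```

After the initial pass, every subsequent pass moves only the delta, which is what keeps protection windows short in highly utilized virtualized environments.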
Backup Solutions Guidance for Schools

Backup Solution for Schools: Advice and Guidance


(Coventry City Council, ICT Services – 2015)

It is vital for a school to have its own backup system and to back up its files regularly so that,
in the event of theft, fire, virus infection, system failure or file corruption, its data can be
restored. Coventry City Council's ICT Services make several recommendations for school
backup software: it must be automated, with online backup and data recovery, and offer
simplified management. The backup software should automatically protect data residing on
servers, with data compressed and encrypted according to retention policies and schedules set
by the user. When data recovery is required, users can simply select the data to restore using
an intuitive interface. They recommend Redstor backup software for schools, which currently
protects data for over 60 schools within Coventry. (Coventry is a city and metropolitan borough
in the West Midlands, England.)

Benefits of Redstor Backup for Schools Include:


• No capital outlay for storage and backup equipment
• No maintenance costs
• Data is encrypted and is stored off site in Redstor high security data centers
• No tapes and tape management issues
• Maximum data availability and data security
• No need to acquire and maintain specific storage technical knowledge or management
• Consistent regular data backups with minimum human intervention
• Safeguards the confidentiality of school data through strong encryption
• Flexible data retentions to comply with internal policy regulations
• Rapid data recovery online or with an optional data recovery appliance
• Reduces the time that school admin staff will have to spend on checking backups
• Takes away the onus on schools to store tapes/data cartridges, therefore eliminating
any financial risk that could be had if tapes/data cartridges were lost/stolen

A Guide to Choosing the Right Data Backup Solution for your School
(Microsoft Authorized Education Reseller, Our ICT - March 2015)

Our ICT is the dedicated educational division of Our IT Department Ltd., a market-leading IT
services provider with offices located in Central London and on the East London/Essex border.
According to Our ICT (2015), a school's primary data storage refers to data stored on its own
computers, while secondary data storage is the process of backing up critical data in a
secondary location, such as a cloud backup and storage service. Using a cloud service is a
means of securing data while working within a school's limited budget. Some schools may still
deploy data storage on their own infrastructure, or use a combination of both onsite and
offsite storage.

What is the difference between onsite & offsite data backup?


• Onsite storage refers to any type of technology used to store data on the premises.
o Data is backed up and stored to a secondary device for future retrieval; such
devices include computers, external hard drives, servers, hard disk drives and
network-attached storage.
• Offsite data backup and storage, also known as cloud storage, is the process of storing
data in a datacenter at an offsite location.
o The data can then be accessed from any device that has an Internet connection.
o It is the modern form of offsite data storage and provides an easier way to
access and restore data.
o The cloud backup software is designed to transfer and back up data in the
background during daily operations; any data that has been modified is
automatically backed up to the cloud.
o It is a backup and storage process that is more manageable and cost-effective.

It is important for a school to maintain safe data storage, but doing so is an enormous task
when everything is placed on school premises: the school must implement the network
infrastructure, fund IT staff and pay for security technologies. Using cloud storage, schools can
still keep their data safe, meet compliance and security requirements, and have an effective
disaster recovery strategy at a fraction of the cost of maintaining data backup and storage
infrastructure on the premises.

Here are the things to consider when choosing a cloud service provider:
1. Select a service provider with a long-established track record.
2. Factors to consider include:
o Availability
▪ Most service providers use redundancy and offer the reassurance of a
Service Level Agreement (SLA).
▪ Most quality service providers guarantee an uptime of at least 99.9
percent.
o Scalability
▪ One of the advantages of working with a data backup and storage provider is that the solution is scalable; you are not locked into the amount of storage space you designated when you first subscribed to the service.
▪ The storage provider will allow you to scale up or down as necessary and as school district requirements change.
o Security
▪ Quality providers use the same security technologies as the military.
▪ Any data transmitted to their servers is fully encrypted.
▪ Invest the time to learn all you can about the security technologies they use and the compliance certifications they have been awarded.
o Recovery Time
▪ Look for an easy and straightforward process for performing single-action data recovery.
o Automated Backup
▪ The provider should be able to use an automated process to back up and store all school data.
▪ The process should not require any of your staff to be near their device when the data is backed up.
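The automated, unattended backup described above can be sketched in a few lines of Python. This is a minimal illustration, not a provider's actual mechanism: the folder paths are hypothetical, and a real deployment would run such a job from a scheduler (e.g. Windows Task Scheduler or cron) rather than by hand.

```python
# Sketch of an unattended backup job: copy a working folder to a
# timestamped backup location without any staff action required.
import shutil
import tempfile
from datetime import datetime
from pathlib import Path

def run_backup(source: Path, backup_root: Path) -> Path:
    """Copy `source` into a timestamped folder under `backup_root`."""
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    target = backup_root / f"backup-{stamp}"
    shutil.copytree(source, target)   # recursive copy of the whole folder
    return target

# Demonstration with temporary directories standing in for real paths
work = Path(tempfile.mkdtemp())
(work / "report.txt").write_text("enrollment figures")
dest = run_backup(work, Path(tempfile.mkdtemp()))
assert (dest / "report.txt").read_text() == "enrollment figures"
```

A cloud backup agent works on the same principle, except the timestamped copy is transmitted to the provider's servers instead of a local folder.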

Here are a few ways cloud storage provides safe data storage:
1. Data Security:
• The cloud service provider maintains the infrastructure, including keeping it secure.
o The service provider undergoes periodic audits to ensure its infrastructure stays compliant.
o The service provider meets strict compliance standards.
• The data is safe, and it can be accessed or restored at any time using a secure password.
• In addition to the use of passwords, all data is encrypted prior to being sent to the servers maintained by the cloud data storage provider.
• The encryption password is used to unscramble the data after it is downloaded from the server.
• Because the data is not being stored on an external device such as a flash drive or
CD, you never have to be concerned about device malfunctions or stolen data

2. Reliable Backup and Storage.
• Many schools opt to use cloud services to ensure safe data storage.
• Cloud backup uses an automated process, which means you never have to rely on a staff member to perform the backup.

3. Future-Proof Safe Data Storage Processes.
• A cloud data backup and storage service will always ensure you never run out of server space.
• Providers use redundancy for their servers and distribute server infrastructure across more than one location.
• High-quality service providers often use underground vaults to store data, which ensures recovery in the event of a natural disaster.
o Building this type of infrastructure independently would be cost-prohibitive for most school districts.

4. User Access.
• School districts never have to worry about compatibility and end-user issues.
• The school is not required to upgrade devices, software, or media on its end, since the data storage service is accessed using a secure Internet connection and a web browser.
• There are no additional expenses beyond the monthly or annual subscription.
• Easy user access is important in the event data must be recovered quickly and efficiently.

Data Classification Policy

The Data Classification Policy provides a structured and consistent classification framework for defining the university's data security levels, which establishes the foundation for appropriate access control policies and procedures. This policy is applicable to all Data, as defined in the policy. It does not apply to information that is the personal property of individuals covered by the policy.

Data Access Generally


The university assigns access based on Least Privilege Required such that users are only granted
the permissions needed to perform their specific duties and responsibilities. Data security
measures must be implemented commensurate with the sensitivity of the Data and the risk to
the university if the Data is compromised. It is the responsibility of the applicable Data Owner
to evaluate and classify Data according to the classification system.
The university has adopted the following three security classifications of Data.
Category I – Green
Publicly available information.

• General information and marketing materials about the university such as press
releases, campus maps, athletic results, information about academic program offerings.
• St. Thomas e-mail addresses.
• University reports filed with federal or state governments and generally available to the
public.
• Copyrighted materials that are publicly available.
• Student information covered as “Directory information” under FERPA if not restricted by
individual student action.
• Published research.

Category II – Yellow
• Data required by law not to be disclosed without consent of the subject of the
information, that is not covered under Category III - Red.

Category III - Red


• Data to be protected with the highest levels of security as defined and/or required by
contractual terms or applicable laws and regulations.
• Personally Identifiable Information (PII) protected by state or federal law against
unauthorized disclosure
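The three-tier scheme above can be expressed as a simple decision rule. The sketch below is illustrative only; the attribute names (`pii`, `contractually_protected`, `consent_required`) are assumptions chosen for the example and are not part of the university's actual policy text.

```python
# Illustrative mapping of a data record's attributes to the three
# security categories defined above (Green / Yellow / Red).
def classify(record: dict) -> str:
    """Return the security category for a data record."""
    if record.get("pii") or record.get("contractually_protected"):
        return "Category III - Red"    # highest protection required
    if record.get("consent_required"):
        return "Category II - Yellow"  # disclosure needs subject consent
    return "Category I - Green"        # publicly available information

assert classify({"pii": True}) == "Category III - Red"
assert classify({"consent_required": True}) == "Category II - Yellow"
assert classify({}) == "Category I - Green"
```

Note the order of the checks: a record is evaluated against the most restrictive category first, so data that is both consent-restricted and PII lands in Red, never Yellow.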

Government’s Cloud First Policy

Department of Information and Communications Technology (DICT): Government Cloud Service

The Philippine Government launched its national cloud service program. An initiative of the Philippine Government under the iGovPhil program, the Government Cloud (GovCloud) allows government agencies to take full advantage of the benefits of cloud computing. GovCloud uses a hybrid cloud strategy, ensuring data security while enabling on-demand availability of storage and computing resources.

The DICT (Department of Information and Communications Technology) is the executive department of the Philippine government responsible for the planning, development and promotion of the country's information and communications technology (ICT). It announced a Cloud First policy under which government agencies are to adopt cloud computing as the preferred ICT deployment strategy for internal administrative use and external delivery of government online services; GovCloud forms an integral part of this policy.

GovCloud was initially provided as cloud Infrastructure-as-a-Service (IaaS) on a small scale with government-developed applications, and was later offered to other government agencies in need of cloud computing. This makes government services more accessible to citizens and businesses and ultimately makes the government more transparent and accountable.

GovCloud provides the benefits of Security, Scalability and on-demand availability of storage
and computing resources. The cloud infrastructure serves as a centralized data repository, and
it will allow sharing and integration of resources among the government agencies.

The implementation of the GovCloud is pursuant to DICT Circular No. 2017-002, the adoption of the Cloud First Policy by agencies to provide better services to citizens with scalable and on-demand cloud computing.

DICT: Department Circular- Section 5

Department Circular, signed 18 January 2017, prescribes the Philippine Government’s Cloud
First Policy, which aims to promote cloud computing as the preferred ICT deployment strategy
and a means to reduce costs.

In DICT Department Circular Section 5 - CLOUD FIRST POLICY

5.1 Cloud computing has brought a new and more efficient means of managing government information technology resources. It is hereby declared the policy of the government to adopt a “cloud first” approach and for government departments and agencies to consider cloud computing solutions as a primary part of their infrastructure planning and procurement.

5.2 All government agencies shall adopt cloud computing as the preferred ICT deployment
strategy for their own administrative use and delivery of government online services, except:

5.2.1 When it can be shown that an alternative ICT deployment strategy meets special
requirements of a government agency; and

5.2.2 When it can be shown that an alternative ICT deployment strategy is more cost
effective from a Total Cost of Ownership (TCO) perspective and demonstrates at least
the same level of security assurance that a cloud computing deployment offers.
GovCloud Service Catalogue
Currently, GovCloud offers the following:

Instance      Specifications        What can you do with this capacity?

Extra Small   Compute*: 1 vCPU      • Low-traffic, static website
              Memory**: 1 GB        • DNS server
              Storage***: 10 GB

Small         Compute: 1 vCPU       • Small-size database
              Memory: 2 GB          • Low-traffic, dynamic website
              Storage: 10 GB

Medium        Compute: 2 vCPU       • Medium-size database
              Memory: 4 GB          • Application server
              Storage: 20 GB        • Multiple webhosting server

Large         Compute: 4 vCPU       • High-traffic database server
              Memory: 8 GB          • High-traffic application server
              Storage: 40 GB        • High-traffic dynamic website
                                    • Multiple webhosting server

Extra Large   Compute: 8 vCPU       • Storage-optimized
              Memory: 16 GB         • High-traffic database server
              Storage: 80 GB        • High-traffic application server
                                    • High-traffic dynamic website
                                    • Multiple webhosting server
*Compute is a variant for the term ‘cores’ or ‘CPUs’ (central processing units).
**Memory is referred here as the size of the ‘RAM’ (random access memory).
***Storage is where the databases and other files are stored (like an external hard drive).

Operating System
READILY AVAILABLE
• Red Hat Enterprise Linux 6.5 (64-bit)
• CentOS 6.5 (64-Bit)
Data Privacy: Encryption

Keeping Your Private Data Secure


(Symantec – 2015)

Why Encryption?
Regulatory compliance, data privacy concerns and brand reputation often become powerful
motivating factors for organizations to take advantage of encryption technologies. In addition,
most of the states in the U. S. have enacted Safe Harbor Laws that protect organizations if they
use strong encryption. If data is encrypted, it’s still protected in the event of a breach.

Obstacles to Encryption
• A common misperception that encryption is too expensive
• A belief that encryption solutions are difficult to deploy and manage

Approaches to File Encryption

Organizations need to deploy file encryption throughout their infrastructure. Consider these encryption approaches: endpoint encryption solutions protect data-at-rest from loss or theft.

File and Folder Encryption


While sometimes offered as part of an endpoint encryption offering, file and folder encryption encrypts designated files and folders on a device. This can include desktops, laptops, shared network drives and even cloud storage. When a file or folder is copied, archived or shared with other users, the encryption follows the file or folder, such that only authorized users can gain access to its data. Authorized users can use and open the encrypted files in applications just as they normally would.
File and folder encryption facilitates secure collaboration where team members or specific sets
of individuals within an organization need to share sensitive information. Sensitive files that
need to be shared can be individually encrypted or placed in an encrypted shared folder to
ensure only authorized individuals can view the material.
Synthesis

Data protection has different approaches that provide a more reliable strategy for securing data. Data can be structured or unstructured, important or unimportant. Data classification is the primary strategy for categorizing information and data within organizations, including government agencies. Classifying data according to its value to the business can help organizations develop a more effective data-centric security approach. Implementing data classification allows the data to have effective security controls, rules and policies.

By properly identifying data according to its impact on the organization, its treatment can vary depending on its importance. A data backup that assures availability in case of disaster can treat data differently based on its importance, with corresponding requirements for disk storage space, data retention, Recovery Point Objective (RPO) and Recovery Time Objective (RTO). Data classification can be user-driven, where the data owner defines the classification of their data, while some software can categorize data automatically based on configured parameters.

Data protection also covers data security. Encryption is the basic strategy to protect data in use or data in transit, and organizations can implement encryption before saving their data. Organizations can also implement a local backup strategy to protect the data and make it available at any time; data recovery is easier in case of data loss or file corruption when a local data restoration procedure exists. However, an onsite backup strategy should be supplemented by a cloud service.

Cloud services can provide reliable storage for organizations, owing to technological advantages that protect data from man-made or natural disasters. A cloud service has distributed cloud technology within a region, where data can be automatically replicated to another cloud region. Cloud storage can serve as the preferred primary data storage, with onsite backup as the secondary data storage. Still, many organizations should consider the cloud as a supplement to any existing disk-plus-tape backup strategy and not as an alternative to tape backup for long-term retention. Adding cloud storage to an ordinary backup strategy will not, by itself, make it outstanding.

The Philippine Government launched its cloud services. The Government declared a policy to adopt a “cloud first” approach and for government departments and agencies to consider cloud computing solutions as a primary part of their infrastructure planning and procurement. As the public sector adopts the cloud first policy, the Philippine GovCloud will continue to support agencies' efforts to adopt cloud solutions according to their requirements.

Research Gap

Government Cloud services (GovCloud) is a service that has not yet been fully explored. GovCloud has no established track record that can be compared with private cloud service providers. The Government implemented the "Cloud First" policy to encourage government agencies to consider the Government cloud in their infrastructure.

However, there are no published articles or reviews about the cloud services offered by the Government that would help us understand what distinguishes its services, or whether they offer the same level of service as private organizations. We must also understand the differences in terms of availability, security, recovery time and assurance of service level for government agencies or non-government organizations considering the cloud service as primary data storage.

Theoretical Perspective

The theoretical foundation of this study is built on process theory, and this part describes control theory. Control theory offers a good framework through which to understand the information security process, and a way to reconcile theory and practical information protection.

Control Theory (Henry C. Lucas, Jr. – 2009)

Procedure control offers a useful model for thinking about control in general. Consider Figure 2,
which shows a typical control system. In this system, a sensor determines actual conditions, and
a comparison device compares the standard with what exists. If the difference between reality
and the standard is too great, the comparison device sends a signal to act. The action taken in
turn affects the sensor and standard, and the cycle continues until the comparison device finds
agreement between sensor and standard, and stops signaling for action.

Figure 2: A Control System.
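The loop in Figure 2 can be expressed as a few lines of code. This is a minimal sketch under stated assumptions: the sensor returns a number, the "act" step is a corrective function, and all names and the example scenario are illustrative, not part of Lucas's original formulation.

```python
# Sense -> compare against standard -> act, repeating until the
# deviation falls within tolerance (the comparison device "agrees").
def control_loop(sensor, standard, act, tolerance=0):
    """Run sense-compare-act cycles; return how many actions were taken."""
    cycles = 0
    while abs(sensor() - standard) > tolerance:
        act()            # corrective action feeds back into the sensed state
        cycles += 1
    return cycles

# Example: the standard is 10 intact backup files, but only 7 are intact;
# each corrective action restores one file from backup.
state = {"intact": 7}
restore = lambda: state.__setitem__("intact", state["intact"] + 1)
assert control_loop(lambda: state["intact"], standard=10, act=restore) == 3
```

The backup-and-restore process maps onto this loop directly: the sensor is an integrity check, the standard is "all data intact and available", and the action is restoration from backup.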

Creating a backup strategy is important for a data protection plan. Managers have a concept of data integrity, and they must be aware of any deviations such as data corruption or data loss. Given an indicator that deviates from data integrity and availability, management must act accordingly to bring the data back, making it accessible and reliable. Deviations can be technical, man-made or natural disasters; yet with an efficient backup strategy, a good restoration procedure and the necessary action, there is a possibility of restoring the data to a good state.

Conceptual Framework

The government agency wants to ensure that all users of the IT structure within the organization's domain abide by the prescriptions regarding the security of data stored digitally within the boundaries over which the organization stretches its authority.

The framework shows how management and the researcher analyze and view the data security strategy to be developed, with the intent of recommending a data protection strategy in the cloud. After careful analysis of existing literature, the proposed framework in this paper focuses on the type of data, data storage, backup and recovery, and data classification. The conceptual framework is depicted in Figure 3 below.

Figure 3. Conceptual Framework

Philosophical Underpinnings/Lenses

Information technology research has usually been examined in terms of the positivist paradigm (Jakobsen – 2013). “Positivism in general refers to philosophical positions that emphasize empirical data and scientific methods”. Positivism prefers a quantitative approach and methods such as social surveys, structured questionnaires and official statistics. It is a framework using practical investigation as its basis, and it seeks to identify the regularities and interrelationships between components within the setting under review. When the research setting includes other variables over which the researcher has no control, namely social factors, this paradigm may result in unsatisfactory or incomplete explanations of the setting in which social factors play a part. Positivism is concerned with research that can be replicated to confirm its validity.

Positivism refers to an evidence-based reality that can be mathematically interpreted. However, scientists have come to the realization that all observation, including observation of objective reality, is imperfect, which led to the post-positivist paradigm.
Assumptions

• Cloud service is not a ‘one-size-fits-all’ approach to protecting data against corruption and loss.
• Government agencies are committed to their data backup and recovery strategy.
• When it comes to data backup and recovery, the IT group has implemented risk assessment on both the local network and cloud storage.
• We may assume that data deduplication, encryption, and replication are not available or implemented in DepEd Sto. Rosa.

III. RESEARCH METHODOLOGY

Research Design

The researcher used descriptive research in this study. The study aims to describe the present condition of the DepEd Sto. Rosa branch in relation to its practices in performing data backup, which will be the foundation of the analysis and recommendations for safeguarding important information. The data will be gathered from each individual employee of DepEd Sto. Rosa, and each response will be treated as an individual primary data source.

Ethical Considerations

The researcher will ask for the full consent of the DepEd Office of the Sto. Rosa branch, as well as of the respondents who will participate in the research. The researcher will ensure that the data collected will not contaminate the validity of the results, will protect the privacy of the respondents and ensure the anonymity of individuals, and will avoid any deception or exaggeration about the objectives of the research.

Research Locale
The study will be conducted at the DepEd Sto. Rosa Branch, located at 2nd Floor, Leon C. Arcillas Bldg., Barangay Market Area, City of Santa Rosa, Laguna. DepEd Sto. Rosa is one of the branches of the Department of Education, the executive department of the Philippine government responsible for ensuring access to, promoting equity in, and improving the quality of basic education.

The DepED Sto. Rosa Branch is composed of three divisions: the Office of the Schools Division Superintendent, the Curriculum Implementation Division and the School Governance and Operations Division. Each division has specific goals and functions to address different areas of concern. There are a total of 95 computer users across the three divisions; each employee under each division is a regular computer user who uses, transforms, transfers and saves data.

Sampling

The respondents from the DepED Sto. Rosa branch come from three divisions: the Office of the Schools Division Superintendent, the Curriculum Implementation Division and the School Governance and Operations Division. The total population of this office is 95 rank-and-file employees, composed of 44 computer users from the Office of the Schools Division Superintendent, 24 computer users from the Curriculum Implementation Division, and 27 computer users from the School Governance and Operations Division.

The researcher will use total enumeration, wherein all members of the population are respondents to the survey. The respondents of the study are enumerated as follows:

Office of the Schools Division Superintendent    44
Curriculum Implementation Division               24
School Governance and Operations Division        27
Total Population                                 95
Instrumentation

To answer the questions and attain the objectives of this study, a structured survey questionnaire will be used. Through this questionnaire, the researcher can identify the present data backup and recovery practices in the DepEd Office Sto. Rosa.

The set of questions aims:

i. To determine the current state of data and whether there are any existing policies and regulations implemented;
ii. To describe any existing procedures on how employees copy and back up their data, and how employees categorize their data before saving their files; and
iii. To ask the employees for information on any data risks and problems they encounter in saving, copying and storing files.

Data Gathering procedure

The questionnaire will be disseminated to all the respondents of DepEd Sto. Rosa. The instruments will be distributed manually to all the respondents, and the researcher will be present while the respondents answer the survey questionnaire. In this method, the researcher can explain any questions to the respondents when needed. The researcher will conduct the survey while the respondents are in their natural working environment.

The researcher will distribute his own questionnaire and will improve the reliability of the instrument by performing a pretest and validation through respondent debriefing. Respondent debriefing is the approach of running the survey on a small number of respondents prior to sending it out to the entire population. The questionnaire should be well understood by the respondents, and any comments and suggestions from the respondents will be entertained.

Data Analysis

Descriptive analysis is the method and technique that will be used to describe the current practices of the DepEd Office Sto. Rosa. Descriptive statistics such as frequencies, percentages and means will be used.

In analyzing and summarizing the data collected, the researcher will use the weighted mean, with the formula as follows:

Weighted Mean:

    x̄ = Σ(wX) / Σw

Where:
    x̄ = weighted mean
    X = a given value on the rating scale
    w = frequency of responses (weighting factor)
    Σ = summation (“add up”)
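The weighted-mean computation can be sketched in Python. The response frequencies below are illustrative only (they are not survey data), chosen so the total matches the study's population of 95.

```python
# Weighted mean of Likert responses: x̄ = Σ(wX) / Σw,
# where w are the response frequencies and X the scale values.
def weighted_mean(weights, values):
    """Return the frequency-weighted mean of the scale values."""
    return sum(w * x for w, x in zip(weights, values)) / sum(weights)

# Hypothetical item: 40 answered "Always" (5), 30 "Often" (4),
# 15 "Sometimes" (3), 7 "Rarely" (2), 3 "Never" (1).
freq = [40, 30, 15, 7, 3]
scale = [5, 4, 3, 2, 1]
assert round(weighted_mean(freq, scale), 2) == 4.02
```

The resulting mean (about 4.02) would then be mapped back onto the Likert scale for interpretation.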
The computed mean scores will be interpreted using a Likert scale. This tool will measure and interpret the positive and negative responses from the respondents.
The following tables show the range of scale, rating and interpretation needed to assess the
responses.

Scale Interpretation
5 Always
4 Often
3 Sometimes
2 Rarely
1 Never

Scale Interpretation
3 Agree
2 Undecided
1 Disagree
Appendix A
Survey Questionnaire

Dear Respondent,

We are conducting research on the analysis of data backup and recovery. Your responses will help us complete this research efficiently. We assure you that the data collected will be kept confidential.

I. A. Current state of data Always Often Sometimes Rarely Never


5 4 3 2 1
1. Type of data/file created by the employees
Document file
Excel file
Image/Photo file
Video File
Portable Document Format ( PDF ) File
others: Please specify _________________
2. Software application used in creating data and files
Microsoft Word
Microsoft Excel
Microsoft PowerPoint
Microsoft Project
Adobe Acrobat
Adobe Photoshop
others: Please specify _________________
3. Computer Operating System used in creating files
Windows 10
Windows 8
Windows 7
Windows Vista
Windows XP Professional
Windows XP
others: Please specify _________________
4. Storage location preferred and promoted by DepED Sto.
Rosa to save data and files
local Hard Disk
External Hard disk Storage
USB Flash drive
Cloud Storage
Network Attached Storage ( NAS )
others: Please specify _________________
5. Storage location where personal data and files are being
saved
local Hard Disk
External Hard disk Storage
USB Flash drive
Cloud Storage
Network Attached Storage ( NAS )
others: Please specify _________________
6. Storage location where corporate files are being saved
local Hard Disk
External Hard disk Storage
USB Flash drive
Cloud Storage
Network Attached Storage ( NAS )
others: Please specify _________________
7. Storage location where copies of your data are being saved
local Hard Disk
External Hard disk Storage
USB Flash drive
Cloud Storage
Network Attached Storage ( NAS )
others: Please specify _________________
8. Data and files backup by employees
Email
Document file
Excel file
Image/Photo file
Video File
Portable Document Format ( PDF ) File
others: Please specify _________________
9. Feel secure with your files saved in
local Hard Disk
External Hard disk Storage
USB Flash drive
Cloud Storage
Network Attached Storage ( NAS )
others: Please specify _________________
10. Storage that gives you complete control over your data
local Hard Disk
External Hard disk Storage
USB Flash drive
Cloud Storage
Network Attached Storage ( NAS )
others: Please specify _________________
11. What type of data do you consider putting in cloud?
Confidential files
Private files
Public Files

B. Current Data Policy and Protection Agree Undecided Disagree


3 2 1
1. DepEd Sto. Rosa has any policy or procedures about managing your
personal’s files separated from corporate files and documents
2. DepEd Sto. Rosa has existing USB policy in copying files
3. DepEd Sto. Rosa has privacy or data laws, or compliance regulations
in placed
4. In saving files to the local drive, I follow a structured directory
5. DepEd Sto. Rosa has existing process of organizing data into
categories for its most effective and efficient use
6. DepEd Sto. Rosa has protection from virus/malware
7. I have had a full backup of my data
8. I practice encrypting my files
II. A. Data Classification Practices Always Often Sometimes Rarely Never
5 4 3 2 1
1. Corporate restricted data and confidential files are
stored in
local Hard Disk
External Hard disk Storage
USB Flash drive
Cloud Storage
Network Attached Storage ( NAS )
others: Please specify _________________
2. Corporate public data are stored in
local Hard Disk
External Hard disk Storage
USB Flash drive
Cloud Storage
Network Attached Storage ( NAS )
others: Please specify _________________
3. Corporate private (not explicitly classified as Restricted
or Public ) data are stored in
local Hard Disk
External Hard disk Storage
USB Flash drive
Cloud Storage
Network Attached Storage ( NAS )
others: Please specify _________________
B. Current Data Backup and recovery strategy
1. Data is being backed up
Document file
Excel file
Image/Photo file
Video File
Portable Document Format ( PDF ) File
others: Please specify _________________
2. Preferred Data backup storage
local Hard Disk
External Hard disk Storage
USB Flash drive
Cloud Storage
Network Attached Storage ( NAS )
Tape backup
others: Please specify _________________
3. Responsibility of securing organizational data needs to be
in the hands of
IT Support
Data owner
others: Please specify _________________
4. Frequency to perform data backup
Hourly
Daily
Weekly
Monthly
Annually
others: Please specify _________________
5. What is (RPO) Recovery Point Objective or the maximum
tolerable period in which your data might be lost? Is it
tolerable to lose
1 hour of work
5 hours of work
12 hours of work
18 hours of work
1 day of work
2 days of work
1 week of work
6. What is ( RTO)Recovery Time Objective or the Maximum
tolerable time for you to wait to recover your data?
1 hour
5 hours
12 hours
18 hours
1 day
2 days
1 week
Always Often Sometimes Rarely Never
III. Current Data Backup and recovery strategy 5 4 3 2 1
1. Confidential and Corporate File/Data risk faced by
employees
Data loss
Data confidentiality
Data corruption
Accidental Data Overwritten
Data modified by other user
Data damage due to flooding
Data damage due to Fire
others: Please specify _________________
2. Experience data loss due to
Viruses/Malware
Hard disk failure
Power failure
Computer hardware malfunction/failure
Accidental deletion
Natural disaster ( flood, fire, earthquake )

3. Problems encountered when saving files to local drive


a.)______________________________________________
b.)______________________________________________
4. Problem encountered when saving files to cloud storage
a.)______________________________________________
b.)______________________________________________
5. Problems encountered when retrieving files to local drive
a.)______________________________________________
b.)______________________________________________

6. Problem encountered when retrieving files to cloud storage


a.)______________________________________________
b.)______________________________________________

7. What do you do if the file is missing or corrupted?


_________________________________________________
8. What do you do if you overwrite a file?
_____________________________________________________

Literature Cited

Praveen S. Challagidad (April 2017). Efficient and Reliable Data Recovery Technique in Cloud Computing. Basaveshwar Engineering College, Bagalkot, India.

Frank Simorjay (April 2017). Microsoft. Shared Responsibilities for Cloud Computing
Retrieved from https://gallery.technet.microsoft.com/Shared-Responsibilities-81d0ff91

Rongzhi Wang (2017). Research on data security technology based on cloud storage. Institute of Computer, Hulunbuir College, Mongolia, China.

Darren Thomson (2015). Symantec. State of Privacy Report 2015.


Retrieved from https://www.symantec.com/content/en/us/about/presskits/b-state-of-privacy-report-
2015.pdf

Amazon Web Services (May 2017). Amazon Web Services: Risk and Compliance.


Retrieved from
https://d1.awsstatic.com/whitepapers/compliance/AWS_Risk_and_Compliance_Whitepaper.pdf

Thiruvadinathan (2017 ). Happiest Minds Technologies. Data Classification – Taking control of your data
Retrieved from https://www.happiestminds.com/whitepapers/Data-Classification-Taking-control-of-
your-data.pdf

James Boldon (2017). QinetiQ Company. Whitepaper - The First Step to Protecting Unstructured Data.
Retrieved from https://www.boldonjames.com/resources/data-classification/

Boldon James ( August 2017 ). QinetiQ Company. BEST PRACTICE: USER-DRIVEN DATA CLASSIFICATION
Retrieved from https://www.boldonjames.com/tag/user-driven-classification/

Susan Fowler ( February 2003). SANS Institute. Information Classification - Who, Why and How
Retrieved from https://www.sans.org/reading-room/whitepapers/auditing/information-classification-
who-846

Titus ( 2014 ). Classification is a Business Imperative


Retrieved from
http://nbizinfosol.com/sites/default/files/product_whitepapers/Data_Classification_is_a_Business_Imp
erative.pdf

Microsoft Trustworthy Computing (2017). Data Classification for Cloud Readiness.
Retrieved from https://download.microsoft.com

CA ARCserve (2017). Best Practices on Leveraging the Public Cloud for Backup and Disaster Recovery.
Retrieved from http://www.arcserve.com/us/solutions/~/media/Files/SolutionBriefs/ca-arcserve-top-ten-practices-public-cloud.pdf

CA ARCserve (2017). Data Protection: Next Level. Advertising supplement sponsored by CA ARCserve.
Retrieved from http://www.arcserve.com/us/solutions/~/media/Files/SolutionBriefs/ca-arcserve-top-ten-practices-public-cloud.pdf

Coventry City Council, ICT Services (2015). Backup Solution for Schools: Advice and Guidance.
Retrieved from http://www.coventry.gov.uk/download/downloads/id/14590/backup_guidance_for_schools.pdf

Our ICT (March 2015). Microsoft Authorized Education Reseller. A Guide to Choosing the Right Data Backup Solution for Your School.
Retrieved from http://www.ourict.co.uk/best-school-data-backup

Department of Information and Communications Technology (DICT) (2017). Government Cloud Service.
Retrieved from http://www.dict.gov.ph/prescribing-the-philippine-governments-cloud-first-policy/

Department of Information and Communications Technology (DICT) (2017). Government Cloud (GovCloud).
Retrieved from http://i.gov.ph/govcloud/

Azure Government (2017). Microsoft. GovCloud Service Catalogue.
Retrieved from https://azure.microsoft.com/en-us/global-infrastructure/government/

Department of Information and Communications Technology (DICT) (2017). GovCloud Service Catalogue.
Retrieved from http://i.gov.ph/govcloud/service-catalogue/

Symantec (2015). Keeping Your Private Data Secure.
Retrieved from https://www.symantec.com/content/dam/symantec/docs/white-papers/keeping-your-private-data-secure-en.pdf
