www.csi-india.org

ABOUT CSI
The seed for the Computer Society of India (CSI) was first sown in 1965 by a handful of IT
enthusiasts, a computer user group who felt the need to organize their activities. They also
wanted to share their knowledge and exchange ideas on what they felt was a fast-emerging sector.
Today the CSI takes pride in being the largest and most professionally managed association of and
for IT professionals in India. The purposes of the Society are scientific and educational, directed
towards the advancement of the theory and practice of computer science and IT. The organisation
has grown to an enviable size of 100,000 members, consisting of professionals with varied
backgrounds including software developers, scientists, academicians, project managers, CIOs,
CTOs and IT vendors, to name just a few. It has spread its branches all over the country. Currently
having more than 500 student branches and rooted firmly at 73 different locations, CSI has plans of
opening many more chapters and activity centres in smaller towns and cities of the country. The idea is
to spread the knowledge and provide opportunities to as many interested people as possible.

The CSI Vision: "IT for Masses"


Keeping in mind the interests of IT professionals and computer users, CSI works towards making the
profession an area of choice amongst all sections of society. The promotion of Information
Technology as a profession is the top priority of CSI today. To fulfill this objective, the CSI regularly
organizes conferences, conventions, lectures, projects and awards. At the same time it also ensures
that regular training and skill updating are organized for IT professionals. The Education Directorate of
CSI helps physically challenged citizens by providing training through 'Punarjani'. CSI also takes a
global approach, seeking out alliances with organizations overseas that may be willing to come
forward and participate in such activities. CSI also helps governments in formulating IT strategy and
planning.

CSI Adhyayan [Oct.-Dec. 2015]


Contents
President Message ..... 4
Green Cloud - Role Of Cloud Computing In Agriculture ..... 5
Cloud Computing In Health Care ..... 10
Cloud Vendors Comparison ..... 15
Big Data Cloud Database & Computing ..... 18
Cloud Computing And The Internet Of Things: A Mixed Approach ..... 22
Cloud Computing ..... 28
Cloud Computing & Security Cephalalgia ..... 31
Cloud Computing - The 3 Ws ..... 35
Cloud Computing & Security Issues ..... 40
Stream Me In Cloud ..... 44
Google Cloud Platform ..... 50
Grid Computing ..... 58
Deep Web ..... 62
Secret Space Encryptor ..... 65
Digital Life ..... 66
Comparative Analysis Of SQL And NoSQL Cloud Database Solutions ..... 69
Spatial Database Systems And Its Application In Military: An Overview ..... 76
Blue Brain ..... 79
Puzzle ..... 83

Disclaimer: CSI Adhyayan contains information about new technologies useful for students. The information contained in this
newsletter is not advice, and should not be treated as such. You must not rely on the information in the newsletter as an
alternative to information from research journals.
We do not represent, warrant, undertake or guarantee:
- that the information in the newsletter is original, correct, accurate, complete or non-misleading;
- that the use of guidance in the newsletter will lead to any particular outcome or result.
We will not be liable to you in respect of any losses arising out of any event or events beyond our reasonable control. We
will not be liable to you in respect of any business losses, including without limitation loss of or damage to profits, income,
revenue, use, production, anticipated savings, business, contracts, commercial opportunities or goodwill. We will not be
liable to you in respect of any loss or corruption of any data, database or software. We will not be liable to you in respect
of any special, indirect or consequential loss or damage.

Prof. Bipin V Mehta
President
Computer Society of India
president@csi-india.org

Message
It gives me great pleasure to convey my best wishes on the re-launch of the CSI publication CSI
Adhyayan in digital form during the Golden Jubilee Year of CSI. CSI Adhyayan is for the
student members, by the student members and of the student members. It will immensely benefit
academicians, researchers and the student community.
CSI is promoting research and innovation in the field of IT and allied fields, considering the fact that
research and innovation play a significant role in the development of the multidisciplinary approach
leading towards the overall development of society and the country at large.
Shri Narendra Modi, Prime Minister of India, has said that technology is the demand of the time and it can
be used to solve the problems of the world, so the youth should use their creativity and
come up with technological solutions that can positively change the lives of billions of people around the
world. Technology combines the 3 Ss: speed, simplicity and service. Technology is fast, technology is
simple and technology is a brilliant way to serve people.
It was well said by the former President, the late Dr. A. P. J. Abdul Kalam: "A developed India by 2020, or
even earlier, is not a dream. It need not be a mere vision in the minds of many Indians. It is a mission
we can all take up - and succeed." His dream can become a reality through innovation and research.
Today many disruptive innovations have changed the world. As described by the academician Clayton
Christensen, disruptive innovation is a process by which a product or service starts as a simple
application at the bottom of the market and then moves up, displacing established competitors. Many
technologies and inventors have led disruptive innovations. There are many examples: driverless cars
by Google rather than by leading car manufacturers, 3D printing in manufacturing, GPS systems for
navigation, ATMs replacing cashiers, virtual reality, Internet video, social networking, Wi-Fi, cloud
storage and so on.
I feel that our students, academicians and researchers can work in this area to make more success
stories of such innovations.
The articles published in CSI Adhyayan are of high quality, satisfying the ever-increasing
need of student members to acquire more knowledge in the areas of IT and Computer Science.
I am confident that CSI Adhyayan will facilitate and provide exposure to the latest and next-generation
technologies, products and best industry practices, thus encouraging and inculcating amongst
students the attitude of solving problems through learning, teaching and practice.
My special thanks and congratulations to the entire Publication Committee for their untiring efforts to
successfully revive the digital publication of CSI Adhyayan.
My best wishes for the uninterrupted publication of CSI Adhyayan.
With Season's Greetings and Best Wishes for the New Year.

Bipin V Mehta


GREEN CLOUD - ROLE OF CLOUD COMPUTING IN AGRICULTURE

Compiled by:
Lakshmi Alekhya Kasireddy

1. INTRODUCTION
Ever since human beings learnt to grow crops, harvest them and sell them at market, they have
sought information from each other. Recent economic and social changes, and advances in the
application of IT and geospatial technologies to capture and analyse data, are transforming the agriculture
sector into a highly information-intensive one. For a farmer who is well acquainted with field information,
the availability of timely, accurate and updated information about soil conditions, weather, water availability,
nutrients, etc. plays a crucial role in decision making. Cloud computing is among the most promising,
reliable and inexpensive tools for the agriculture sector.
The Cloud concept is not new; it is drawn from innovations in, and the integration of, existing technologies like
virtual computing, cluster computing, software as a service (SaaS), platform as a service (PaaS) and
infrastructure as a service (IaaS). It can be defined as a technology which uses a network of remote servers
hosted on the internet to store, manage and process data. Clouds are of three types: public, private and
hybrid. They enable convenient, on-demand network access to a shared pool of configurable computing
resources (e.g., networks, servers, storage, applications and services).

AGRICULTURE AND IT
The introduction of IT into agriculture has hitherto been confined to areas where farmers have been more
or less obliged to use it to comply with government and distribution-industry rules, such as filing tax returns
and maintaining traceability records. It cannot be said for certain that IT has been useful in actual
agricultural production. By focusing on the following two points, there is a possibility of applying cloud
computing to agricultural production:
- Increased efficiency for agriculture as an industry
- Succession of agricultural technology
Although the involvement of cloud computing in agricultural technology is of course important, it has been
found very difficult to use IT for this purpose at this stage in our country.
One can deliver routine work reports and share information between experienced and inexperienced farm
workers, including specific information about harmful insects or growing conditions, and offer expert advice.

2. AFFINITY WITH CLOUD COMPUTING


The application of cloud computing in the field of agriculture has the following advantages:
- Reduction of initial costs


- Allocation of resources on demand without limit
- Maintenance and upgrades performed in the back end
- Easy, rapid development, including collaboration with other systems in the Cloud
- More possibilities for global service development

Based on the above, agriculturists follow the PDCA cycle, which involves successively performing and obtaining
feedback from the following actions:
Plan: Draw up production and operation plans.
Do: Gather work results (this involves performing the actual work on-site, though IT support cannot be
provided for this).
Check: Perform progress management and patrol the cultivated plots.
Act: Make any necessary modifications to the plans.
With this workflow, basic sensing and knowledge management techniques are likely to be the main ones
used to provide Cloud services. Data is routinely collected, including weather and soil data, GPS data,
image data, worker observations, and data related to cultivated plots of land. The quantity is 5-10
megabytes per case per day, and agricultural data has to be stored for 10-30 years according to a report by
the National Agriculture and Food Research Organization [5].
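Taking the figures above at face value, a quick back-of-the-envelope calculation shows why Cloud-scale storage matters even for a single plot; the 365-day year is the only added assumption:

```python
# Rough storage estimate for one cultivated plot ("case"), using the
# figures quoted above: 5-10 MB per case per day, retained 10-30 years.
MB_PER_DAY_LOW, MB_PER_DAY_HIGH = 5, 10
YEARS_LOW, YEARS_HIGH = 10, 30
DAYS_PER_YEAR = 365

# Convert cumulative megabytes to gigabytes (1024 MB = 1 GB).
low_gb = MB_PER_DAY_LOW * DAYS_PER_YEAR * YEARS_LOW / 1024
high_gb = MB_PER_DAY_HIGH * DAYS_PER_YEAR * YEARS_HIGH / 1024

print(f"Per plot: {low_gb:.0f}-{high_gb:.0f} GB over the retention period")
```

At roughly 18-107 GB per plot, a farm with hundreds of plots quickly outgrows on-site storage, which is the motivation for keeping this data in the Cloud.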
To obtain advice and recommendations by analyzing this stored data, analysis engines such as data miners
are operated on this large amount of data in the Cloud. Although it is not possible to calculate parameters
such as the correct MIPS (million instructions per second) value, suitable central processing unit and input/
output performance is required. In terms of performance, it is clear that highly efficient (high sustained
performance) parallel computer technology or the like is necessary.

3. SUGGESTED MECHANISM AND DATABASE

PLANTING SIMULATION
A mechanism to support the drafting of optimal planting plans based on knowledge management and a
cultivated land database.

PROFIT-LOSS CALCULATIONS FOR EACH PLOT OF LAND
Mobile phones with GPS functions are used to automate the collection of position and time data (which is
sent to a server automatically by 3G transmission). This is used to implement a mechanism for performing
profit-loss calculations for each plot of land, on the basis of information such as data representing which
people went where and for how long, which is used to calculate the human resources costs that make up the
bulk of indirect costs, together with material expenses obtained by mobile phones with barcode reading functions.
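The core of this per-plot calculation can be sketched in a few lines. The work records and wage rate below are hypothetical stand-ins for the position/time data the phones would actually upload:

```python
from collections import defaultdict

# Hypothetical GPS work records: (worker, plot_id, hours_on_plot).
# In the system described above these would be derived from position
# and time data sent automatically from workers' phones over 3G.
records = [
    ("worker_a", "plot_1", 3.5),
    ("worker_a", "plot_2", 1.0),
    ("worker_b", "plot_1", 2.0),
]

HOURLY_WAGE = 1200  # illustrative wage in yen, not a figure from the source

def labour_cost_per_plot(records, wage):
    """Aggregate hours worked on each plot and convert to labour cost."""
    hours = defaultdict(float)
    for _, plot, h in records:
        hours[plot] += h
    return {plot: h * wage for plot, h in hours.items()}

print(labour_cost_per_plot(records, HOURLY_WAGE))
# plot_1: (3.5 + 2.0) * 1200 = 6600; plot_2: 1.0 * 1200 = 1200
```

Material expenses captured by barcode scans could be merged into the same per-plot dictionary to complete the profit-loss picture.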

CULTIVATED LAND RECORDS
A database of static data such as land rights and land areas, together with plot characteristics, soil analysis
results, production histories, and the like. The managers of the farming corporations that cooperated with
the verification trials responded very enthusiastically, saying that implementing these measures would
change the face of Japanese agriculture. As a whole, this system is referred to as the farm work
management system.

4. MAINTENANCE

When IT systems are introduced into any new field, not just agriculture, frequent bug fixes and upgrades are
needed. In Cloud computing, instead of an engineer having to visit an office to do this work, the
maintenance work for hundreds of thousands or even millions of users can be done simply by amending and
adding to the software on a single system in the Cloud center; this could solve many of the problems
associated with maintenance. Moreover, in Cloud computing, there are no disparities in the software
versions being used by different users, which leads to improved usability in addition to reduced maintenance
problems.

DATA MANAGEMENT TECHNOLOGIES SUPPORTED:
Data management technologies are based on technologies such as sensors (weather, soil, global positioning
system [GPS]), networks (wireless local area network, third generation [3G]), and knowledge management.

DATA STORAGE
- Position and time information from mobile phones with GPS functions
- Weather/soil sensor data
- Image and audio data obtained by mobile phones (with digital camera and audio recorder applications)
- Noteworthy data extracted from the results of routine work
- Materials management data obtained using mobile phones with barcode reading functions

DATA ANALYSIS
- Registering and updating virtual models
- Data mining
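As one illustration of the kind of analysis such an engine might perform on the stored sensor data, here is a deliberately simple sketch that flags readings deviating sharply from a trailing moving average; the window, threshold and moisture values are arbitrary choices for illustration, not from the source:

```python
def flag_anomalies(readings, window=3, threshold=0.3):
    """Flag readings that deviate from the trailing moving average
    of the previous `window` values by more than `threshold`
    (expressed as a fraction of that average)."""
    flags = []
    for i, value in enumerate(readings):
        if i < window:
            flags.append(False)  # not enough history yet
            continue
        avg = sum(readings[i - window:i]) / window
        flags.append(abs(value - avg) / avg > threshold)
    return flags

# Soil-moisture series with a sudden drop at index 4, e.g. a failed
# irrigation line or a faulty sensor.
moisture = [0.42, 0.41, 0.43, 0.42, 0.15, 0.41]
print(flag_anomalies(moisture))
```

A production data miner in the Cloud would of course use far richer models, but the principle of comparing new readings against accumulated history is the same.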

5. EXPANSION INTO OTHER FIELDS

With the exception of cultivated land data management, three out of the four functions mentioned in the
previous section are not specific to agriculture but also have potential applications in various other fields
(such as medicine/nursing and maintenance work), where technologies such as GPS activity sensing, Web-based
mapping applications, and data mining are already being used. We are therefore working not only on
vertical integration of these concepts, but also on horizontal expansion into other fields.
IT resources are said to have spread throughout the world. This can certainly be said of offices and facilities
such as factories and research centers. However, for this sort of on-site work, new IT applications are likely
to be incorporated into terminal equipment other than personal computers and mobile phones. Even a
single food-related business in Japan, such as retail, eating-out, or food manufacturing, is an industry with
sales of the order of 80 trillion yen, so there is enormous potential for global development in these fields or
in other fields with a broader base than the 10 trillion yen or so of the IT industry to which we belong.

6. CONCLUSION

Agriculture has traditionally been maintained by families and communities where the passing on and sharing
of knowledge is regarded as very important. The accumulation and sharing of knowledge has resulted in
better overall efficiency and productivity. Agriculture is the embodiment of a large amount of ancient
knowledge. If the leverage effects of IT can be widely developed, then we should be able to bring about a
further leap in agriculture. It goes without saying that Cloud computing can support this process. Indeed,
one might say that the mechanism of Cloud computing is highly suited to the task of handing down human
knowledge to later generations. One can deliver routine work reports and share information between
experienced and inexperienced farm workers, including specific information about harmful insects or
growing conditions, and offer expert advice. However, as mentioned earlier, there is still a long way to go
before IT can be applied to on-site work in fields such as agriculture.

REFERENCES
1. Ministry of Agriculture, Forestry and Fisheries: Statistical data. (in Japanese). http://www.maff.go.jp/j/tokei/index.html
2. K. Yamashita: The pitfalls of agricultural cooperatives. (in Japanese), First edition, Takarajimasha,
Japan, January 2009.
3. Rural Culture Association Japan: Agricultural technology compendium: vegetables, crops. (in
Japanese). http://lib.ruralnet.or.jp/taikei/select01/index.html
4. Tricks of the trade in spreading compost and fertilizer. (in Japanese), Modern Farming, Rural Culture
Association Japan, March 2009.
5. National Agriculture and Food Research Organization, Central Agricultural Laboratory,
Agricultural Data Research Unit, Model Development Team: An automatic system for
periodically running various models and delivering their results, Kanto, Tokai and Hokuriku
research findings, 2003.
6. M. Nitanda et al., Experimental evaluation of a database of near-miss incidents. (in Japanese), 69th
National Convention of IPSJ, March 2007.
7. Y. Utashiro, Information/knowledge management: IT and knowledge management (the complete
management fundamentals). (in Japanese), First edition, Gakubunsha, March 2007.

Lakshmi Alekhya Kasireddy [CSI: 01320031] is a 3rd-year student of the Computer Engineering Department
at Andhra University College of Engineering for Women.


CLOUD COMPUTING IN HEALTH CARE

Compiled by:
Sowmya Baggam

INTRODUCTION
In the present-day digital scenario, where the whole world is moving towards the cloud environment, not only the IT
and communications field but also the healthcare industry has embraced this technology. Cloud technology can
provide access to hardware, software, IT knowledge, resources and services, all within an operating
model that drives down costs and simplifies technology adoption. The application of cloud technology in the
medical field has enabled healthcare organizations to focus their efforts on clinically relevant services and
improved patient outcomes.

1. CURRENT STATE OF HEALTHCARE

Patients today are better advocates for their own healthcare; they are more educated about their diseases and
increasingly demand access to the latest technologies. At the same time, they seek the best care at the best
cost and are willing to investigate their options. As a result, demands for access to personal patient records
are increasing and organizations need to keep up. When citizens can access bank accounts from anywhere in
the world, withdraw money, get balances and make payments, it is hard for them to understand why they
cannot have universal access to their secure health information.
IT, and particularly nowadays cloud technology, has enabled healthcare organizations to render better and
more efficient services to patients. In contrast to earlier days, there are now several healthcare systems and
health-related mobile apps through which patient information is accessible to the concerned doctor or medical
organization in a secure and private manner. The entire patient record, consolidated into a single view from
any number of different applications, provides accurate and up-to-date information upon which physicians
can make better-informed decisions. Clinics, hospitals, insurance payers and patients are all able to access
the relevant information as needed.
In addition, electronic medical records, digital medical imaging, pharmacy records and doctors' notes are all
consolidated and accessible. The ability of researchers to run analytics, better treatment options, optimal
insurance programs and the possibility of truly personalized healthcare have all become a reality.

2. IMPACTS OF CLOUD TECHNOLOGY ON HEALTH CARE

As with any industry, certain drivers need to be present in order for new technologies to be adopted. For
many years, these drivers have been minimally present in healthcare, resulting in a reluctance to change.
Recent investments and the increased visibility of healthcare on many countries' national agendas have
strengthened the drivers for cloud adoption.

3. DELIVERY OF COST-EFFECTIVE HEALTH CARE
The cost of healthcare delivery has grown to such huge proportions that governments face serious funding
issues if there is no resolution. The drive to lower the cost of healthcare delivery has become so
predominant in society that governments have risen and fallen on their healthcare platforms. Alternative
models that deliver cost savings and efficiencies must be explored in order to rein in the increasing costs.
ADMINISTRATIVE SIMPLIFICATION
Hospitals are patient care centers, not centers of technical innovation. IT departments are stretched to
accommodate the different clinical systems that are introduced into use, dealing with different vendor
systems, platforms and licensing models. Clinical departments drive the acquisition of applications without
always considering the existing infrastructure, and the result is inefficiencies. Take storage purchases as an
example. Departments typically buy 5 years of storage during the procurement cycle without any
consideration of the storage needs of other departments. This storage can sit unused but paid for, tying up
valuable capital. Add to that the need for the IT department to then manage the application's backup and
archiving needs along with those of other departments. There can be 10 to 20 different applications that need
managing, taking the IT department's time away from responding strategically to physicians' needs and
focusing it instead on day-to-day operations. Simplifying administration in the IT department allows more
time to be spent on clinical systems and less time on the infrastructure.

4. CLOUD CHALLENGES IN HEALTHCARE

We have established that healthcare lags behind other industries with respect to technology adoption, and
embracing the cloud certainly falls in that category. Healthcare providers face many challenges as they
investigate moving to a cloud model. Once these challenges have been addressed, cloud technology will
become less a question of "if" and more a question of "when."
PRIVACY CHALLENGES
Privacy and security rank at the top of the list of reasons for slow adoption rates. Putting personal health
information into a 3rd-party, remote data center raises red flags where patient privacy laws are concerned.
The possibility that patient data could be lost, misused or fall into the wrong hands affects adoption. What
recourse does an organization have should a cloud provider lose data? It has happened, and it has the
potential to be a very expensive problem to resolve. Violation of patient confidentiality carries heavy fines,
including significant costs of recovery and patient notification.
A potential solution is a private cloud model. In this case the data still resides at the customer data center
and a certain degree of control still exists for organizations to manage patient privacy.
SECURITY CHALLENGES

This may be a moot point where healthcare providers are concerned. One of the benefits of cloud
technology is the ability to access resources that would otherwise be unattainable. A cloud provider will
have security experts deploying the latest patches and software to its data center. Secure access to the
physical property will be well guarded, and many policies, processes and mechanisms will be in place to
ensure data security. Add to that the fact that any applications operating through the cloud will store all
their data in the cloud. This means there is no protected health information (PHI) residing on hospital
computers, which is a more secure situation than today's current environment.
Health and human services studies show that PHI violations have come from the theft of computers taken
from facilities, loading docks and even physicians' vehicles. These thefts have been more for the computer
and less for the PHI. This raises the question: Wouldn't it be better to have everything in the cloud? Any
transition to a cloud would require significant support from the technology partners to ensure a smooth
transition for users.

5. CASE STUDY

Take, for example, the current practice of requesting a diagnostic exam. A physician fills out a request form
with patient details, history and the reason for the exam. This gets sent to the radiology department for scheduling.
The clinical staff books the exam and informs the doctor, who advises the patient, who has a conflict with
the appointment time. Back and forth it goes. Now consider an electronic scheduling system based in the
cloud, whereby the doctor enters all the relevant information and the system determines the most
appropriate exam and notifies the patient directly of possible options. The patient logs in, selects the best
time for the exam, and the system books the exam. It seems simple, but change management is required to
ensure the transition is smooth.
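The workflow above can be sketched as a minimal booking service. The class, method names and slot format here are purely illustrative, not a real product's API:

```python
class ExamScheduler:
    """Toy model of the cloud scheduling flow: the doctor submits a
    request, the system proposes slots, the patient picks one."""

    def __init__(self, available_slots):
        self.available_slots = list(available_slots)
        self.bookings = {}  # patient -> booked slot

    def request_exam(self, patient, exam_type):
        # The doctor's request yields the currently open slots.
        # A real system would filter by exam type, equipment, etc.
        return list(self.available_slots)

    def book(self, patient, slot):
        # The patient selects a slot; the system records the booking.
        if slot not in self.available_slots:
            raise ValueError("slot no longer available")
        self.available_slots.remove(slot)
        self.bookings[patient] = slot
        return slot

scheduler = ExamScheduler(["Mon 09:00", "Mon 14:00", "Tue 10:30"])
options = scheduler.request_exam("patient_42", "MRI")
print(scheduler.book("patient_42", options[1]))  # patient picks a slot
```

The point is that the doctor, the radiology department and the patient all interact with one shared state in the cloud, eliminating the back-and-forth.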
As a part of this workflow transition, serious consideration should be given to staffing needs within the
organization's IT department. As the cloud starts to permeate the clinical environment, no longer will the
same skill sets be required. Different technology will need to be supported, new training will be required and
new skill sets will need to be defined. An organization that had staff working on managing backups and
archiving will now migrate to network connections and clinical applications.
IT staff will focus on the rollout of the electronic medical record (EMR) instead of managing the storage layer
the EMR sits upon. Access to this kind of skill set is in high demand today.

6. HEALTHCARE CLOUD SOLUTION CHECKLIST

Ultimately there are certain minimum requirements that providers need to consider when evaluating a
cloud provider.

SECURITY
To overcome current perceptions of the risks associated with using the cloud for personal health
information, cloud providers must demonstrate security measures that prevent unauthorized access to
patient data. Consideration must be given to the following:
- Secure access to the facility
- Network security
- Data security
- Staff training and regulatory compliance awareness

HIGH AVAILABILITY
Healthcare organizations are dealing with mission-critical applications where downtime can mean the
difference between a patient's life and death. Cloud providers need to be aware of and prepared for these
stringent availability requirements and should be ready to guarantee delivery of information. Consider:
- Downtime for maintenance
- Responsiveness as data volume grows
- Network latency and redundancy
- Hardware redundancy

SCALABILITY
As new systems come online, the volume of data will grow, creating a need for the cloud provider to be able
to scale up, out and deep. As the data volume grows, the impact on performance should be negligible.
Consider:
- Provisioning
- Plug-and-play growth
- Dynamic scaling

REMOTE ACCESS
Flexibility of access to the data should be considered by healthcare organizations as they look to the cloud.
Various aspects need to be taken into account to ensure adequate services are provided to the users:
- Capacity of users
- Performance at peak access times
- Flexibility of mobile devices

CONTRACTUAL ASSURANCE
As with any agreement, healthcare facilities should develop ironclad agreements that ensure the delivery of
services will not be interrupted without penalty. Contracts should include items such as:
- Curing periods for breach of contract without interruption of service
- Insurance for breach of privacy
- Service level agreements
- Migration assistance
- Scalability

7. THE FUTURE STATE

With the current state of healthcare and the many adoption challenges that it faces, it is logical to conclude
that cloud technology will be at the forefront of healthcare innovation. Data drives the new healthcare world
and access is greater than ever before. Big data becomes better managed due to cloud technology, as
storage, compute power and consolidation reach levels never before achieved. Portability of data delivers
information where it is needed, when it is needed.
Coordinated care with patients conducting their own treatment regimes becomes a possibility with cloud
technology. Patients are able to become more deeply engaged as their information is in a single index. This
means they can seek preferred treatment that addresses their state of health.
The possibility of duplicate tests and medical errors, such as contraindicated medications, can be minimized
as access to the data becomes a reality. And healthcare provider IT departments can offload the burden of
managing infrastructure and focus on supporting more patient-care-related activities. New technologies can
be quickly evaluated for their effectiveness and deployed broadly from a cloud model, allowing healthcare
providers to stay abreast of the latest and greatest tools. Ultimately, patient care will improve, which in turn
will drive down costs and improve efficiencies.
Cloud technology will be a driving force in the healthcare ecosystem for years to come. The alternative is
bankrupt facilities, healthcare costs that skyrocket to unaffordable levels and patient care delivery that relies
on an archaic and inefficient system.


Sowmya Baggam [CSI: 01320009] is a student of B.Tech (3/4) in Computer Engineering at Andhra University
College of Engineering for Women.


CLOUD VENDORS COMPARISON

Compiled by:
S. N. P. Bharadwaj and M. Vamsy Kiran
Traditionally, websites are hosted on just one server within the data centre of a business organization. High-traffic
sites are normally hosted on a dedicated server, and smaller sites are hosted on a server shared with
other websites. There is an alternative means of hosting a website which is becoming more and more popular,
called cloud hosting.

WHAT IS CLOUD HOSTING?
The idea of cloud computing has existed since the 1950s. It was founded upon the principle of sharing
resources. Consider a big firm: instead of installing an application on a large number of computers, the
company could install it on a single centralized infrastructure and have clients access it from all around the
network. This removes the need to install and update applications a large number of times.
Many large internet companies use cloud computing. Email services such as Outlook and Gmail (once
Hotmail), Facebook, Skype, YouTube, Twitter, your bank and major news websites all use cloud computing.
Due to the rise in popularity of back-up services such as Dropbox and Google Drive, most net users have a
general understanding of the term "The Cloud."

BUT WHAT DOES CLOUD HOSTING REFER TO?


Like the original idea of cloud computing, cloud hosting is all about pooling resources. In a cloud hosting setup, customers do not have a particular server assigned to them. Instead, the hosting company builds a cloud infrastructure across multiple servers. Virtualization software is then used to divide these physical servers into multiple virtual servers, aka "The Cloud." Each virtual server is set up with its own distinctive software and programs. From an administration (i.e. customer) point of view, you will not notice any difference between hosting your website on a cloud server and hosting it on a dedicated server.
Since cloud servers are spread across multiple physical servers, no physical constraints are placed on the client. Your cloud hosting package can be scaled up and down as and when necessary. While some cloud hosting companies offer pay-as-you-go rates, most still offer pre-packaged plans that allocate a certain amount of resources to customers. The crucial point is that increasing (or decreasing) a customer's resource allocation takes only seconds in cloud hosting, since resources are reassigned in software. In contrast, when a website is stored on an individual server, replacing or upgrading a component results in downtime.
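The contrast between reallocating resources in software and replacing physical hardware can be sketched with a toy model (purely illustrative; real platforms rely on hypervisors and orchestration layers, not a Python class, and the capacities and customer name below are invented):

```python
# Toy model of a cloud host: physical servers contribute capacity to a pool,
# and virtual servers are resized by bookkeeping alone, with no downtime.
class CloudPool:
    def __init__(self, physical_capacities):
        self.capacity = sum(physical_capacities)  # total GB of RAM in the pool
        self.allocations = {}                     # customer -> allocated GB

    def free(self):
        return self.capacity - sum(self.allocations.values())

    def resize(self, customer, new_size):
        """Change a customer's allocation; takes effect immediately."""
        current = self.allocations.get(customer, 0)
        if new_size - current > self.free():
            raise RuntimeError("pool exhausted")
        self.allocations[customer] = new_size

pool = CloudPool([64, 64, 128])   # three physical servers, 256 GB in total
pool.resize("shop.example", 4)    # a small site
pool.resize("shop.example", 32)   # traffic surge: scale up in seconds
print(pool.free())                # 224
```

The point of the sketch is that "scaling" is just an accounting change against a shared pool, which is why it takes seconds rather than a hardware swap.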
Cloud hosting is a practical option for both parties.
Hosting companies no longer have small websites sitting on powerful servers running at only 10% capacity, so they can run a more efficient data centre (which is also good for the environment). Website owners do not have to worry about traffic surges, as the cloud is configured to absorb them. A cloud hosting setup also reduces downtime, since sites are served from across the cloud rather than from a single machine.
Let us look at some of the best cloud hosting firms online.

Digital Ocean:
Digital Ocean supplies affordable SSD-based cloud servers, with data centers in New York, San Francisco, London, Amsterdam, and Singapore. A simple interface lets you easily create and manage your virtual servers, and Linux distributions and applications can be installed at the tap of a button. Digital Ocean proudly advertises that a new server can be live in just 55 seconds. Their cheapest plan offers 20GB storage and 512MB RAM for just $5 per month, while the most expensive pre-defined bundle retails at $640.
GoGrid:
GoGrid is a well-known cloud hosting business that gives customers managed back-ups, along with security services and server monitoring. They own three data centers in America and one in Amsterdam, and their service is backed by a guaranteed 30-minute response time for emergencies. GoGrid sells four distinct kinds of cloud hosting plans: SSD, dedicated disc, high RAM, and standard.
Site5:
Site5 is an affordable website hosting company which has data centers in 22 places across the world.
Rackspace:
Rackspace has a good standing in the cloud hosting world. Their Public Cloud service can be used to host sites or applications and offers scalability and high performance. They have three data centers in America, and one each in Hong Kong, London, and Sydney. Excluding planned maintenance and emergencies, they promise 100% uptime. Their highest-priced hosting plans run into thousands of dollars.
SingleHop:
SingleHop is a seasoned hosting business with one data center in Amsterdam and three in the United States. Their cloud servers use VMware's vCloud Suite and can run Linux or Windows.
Hyve:
Hyve is a UK-based hosting company that also has two data centers in China and two in the USA. They use VMware's SecureCloud platform and provide security features such as IPS and DDoS protection. They specialise in cloud hosting for WordPress websites: large databases can be spread across multiple servers, and static files can be served through a CDN.
Amazon EC2:
Amazon's EC2 service is a cloud hosting service that gives you complete control over your hosting. New server instances can be created and booted in minutes, allowing you to scale your capacity up and down as needed. There are also many strong security features. The service puts a large emphasis on reliability, with an agreement promising 99.95% uptime for each Amazon EC2 region.
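Uptime percentages like these translate directly into allowed downtime; a quick calculation (assuming a 30-day month for simplicity):

```python
# Convert an uptime percentage into allowed downtime per 30-day month.
def allowed_downtime_minutes(uptime_pct, days=30):
    return (1 - uptime_pct / 100) * days * 24 * 60

print(round(allowed_downtime_minutes(99.95), 1))  # 21.6 minutes per month
print(round(allowed_downtime_minutes(99.9), 1))   # 43.2 minutes per month
```

So a 99.95% commitment permits roughly 22 minutes of outage per month, against about 43 minutes at the 99.9% level more common among hosts.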
Final Thoughts
Cloud hosting has become a popular way of hosting small, medium, and large sites. It excels at handling traffic spikes and providing high levels of uptime. Generally, cloud hosting is more costly than hosting a website on a dedicated server, and those of you who run large companies may prefer the total control that a dedicated server offers.

Fire Host seems to be a respected company within the cloud hosting niche, although their managed support and security options do come at a price. If you need additional resources, you might consider alternatives such as HP Cloud, Microsoft Azure, Verizon Cloud, and Amazon. These give you more control over your server; however, someone in your business will need the technical knowledge to operate it.

SOURCE:
http://cloudcomputingcostscomparison.blogspot.in/2015/10/cloud-vendors-comparisiom.html

Mr. S.N.P.Bharadwaj [CSI-01279223] is studying in IV year of B.Tech (CSE) at LENDI Institute of Engineering & Technology, Vizianagaram, Andhra Pradesh. He is presently working as an IBM intern. His areas of interest are Cloud Computing, Big Data, programming, etc. He can be reached at snp.bharadwaj@gmail.com.

Mr. M.Vamsy Kiran [CSI-01279211] is studying in IV year of B.Tech (CSE) at LENDI Institute of Engineering & Technology, Vizianagaram, Andhra Pradesh. He is presently working as an IBM intern. His areas of interest are Cloud Computing, Artificial Intelligence, Android programming, etc. He can be reached at vamsykiran95@gmail.com.


BIG DATA CLOUD DATABASE & COMPUTING
Compiled by:
P.Manonmai

1. INTRODUCTION:

The rise of cloud computing and cloud data stores has been a precursor and facilitator to the emergence of big data. Cloud computing is the commodification of computing time and data storage by means of standardized technologies. It has significant advantages over traditional physical deployments. However, cloud platforms come in several forms and sometimes have to be integrated with traditional architectures. This leads to a dilemma for decision makers in charge of big data projects: which kind of cloud computing is the optimal choice for their computing needs, especially for a big data project? Such projects regularly exhibit unpredictable, bursty, or immense computing power and storage needs, while business stakeholders expect swift, inexpensive, and dependable products and project outcomes. This article introduces cloud computing and cloud storage, the core cloud architectures, and discusses what to look for and how to get started with cloud computing.
Big data is a broad term for data sets so large or complex that traditional data processing applications are inadequate. Challenges include analysis, capture, data curation, search, sharing, storage, transfer, visualization, and information privacy. The term often refers simply to the use of predictive analytics or certain other advanced methods to extract value from data, and seldom to a particular size of data set. Accuracy in big data may lead to more confident decision making, and better decisions can mean greater operational efficiency, cost reduction, and reduced risk.

2. CLOUD STORAGE AND COMPUTING:

Professional cloud storage needs to be highly available, highly durable, and has to scale from a few bytes to petabytes. Amazon's S3 cloud storage is the most prominent solution in this space. S3 promises 99.9% monthly availability and 99.999999999% durability per year; the availability figure corresponds to less than an hour of outage per month. The durability can be illustrated with an example: if a customer stores 10,000 objects, he can expect to lose one object every 10,000,000 years on average. S3 achieves this by storing data in multiple facilities, with error checking and self-healing processes to detect and repair errors and device failures. This is completely transparent to the user and requires no action or special knowledge.
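The durability example can be checked with a one-line expectation calculation (a sketch; it treats object losses as independent events at the quoted annual durability):

```python
# Rough arithmetic behind the durability claim: with annual durability d,
# the expected number of objects lost per year out of N is N * (1 - d).
def expected_annual_losses(num_objects: int, durability: float) -> float:
    """Expected number of objects lost per year."""
    return num_objects * (1.0 - durability)

losses = expected_annual_losses(10_000, 0.99999999999)  # eleven nines
print(losses)       # ~1e-07 objects lost per year
print(1 / losses)   # ~10,000,000 years per single lost object, on average
```

This reproduces the figure in the text: 10,000 objects at eleven-nines durability works out to one expected loss every ten million years.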
A company could build a similarly reliable storage solution itself, but it would require tremendous capital expenditure and pose operational challenges. Companies with global data centers, like Google or Facebook, have the expertise and scale to do this economically. Big data projects and start-ups, however, benefit from using a cloud storage service. They can trade a capital expenditure for an operational one, which is excellent since it requires no capital outlay or risk, and from the very first byte they get reliable and scalable storage of a quality otherwise unachievable.
This gives new products and projects a viable option to start on a small scale with low costs. When a product proves successful, these storage solutions scale virtually indefinitely; cloud storage is effectively a boundless data sink. Importantly for computing performance, many solutions also scale horizontally, i.e. when data is copied in parallel by cluster or parallel computing processes, the throughput scales linearly with the number of nodes reading or writing.
Cloud computing employs virtualization of computing resources to run numerous standardized virtual servers on the same physical machine. This gives cloud providers economies of scale, which permit low prices and billing in small time intervals, e.g. hourly.
This standardization makes the cloud an elastic and highly available option for computing needs. The availability is obtained not by spending resources to guarantee the reliability of a single instance, but through the interchangeability of instances and a limitless pool of replacements. This impacts design decisions and requires applications to deal with instance failure gracefully.
The implications for an IT project or company using cloud computing are significant and change the traditional approach to planning and utilization of resources. Firstly, resource planning becomes less important. It is still required in costing scenarios to establish the viability of a project or product, but the focus for success shifts to deploying and removing resources automatically based on demand. Vertical and horizontal scaling become viable once a resource becomes easily deployable.
Vertical scaling refers to the ability to replace a single small computing resource with a bigger one to account for increased demand. Cloud computing supports this by making various resource types available and allowing you to switch between them. This also works in the opposite direction, i.e. switching to a smaller and cheaper instance type when demand decreases. Since cloud resources are commonly paid for on a usage basis, no sunk costs or capital expenditures block fast decision making and adaptation. Demand is difficult to anticipate despite planning efforts, and in most traditional projects this naturally results in over- or under-provisioned resources. Therefore, traditional projects tend to waste money or deliver poor outcomes.
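The economics described above can be sketched with a toy calculation (the hourly price and demand curve are invented for illustration): fixed provisioning must be sized for peak demand, while usage-based billing pays only for instance-hours actually consumed.

```python
# A toy cost comparison: fixed provisioning pays for peak capacity all day,
# while elastic scaling pays only for the instances actually needed each hour.
HOURLY_RATE = 0.10  # assumed price per instance-hour

# Instances needed in each of 24 hours (bursty demand: a 4-hour spike).
demand = [1] * 20 + [10] * 4

fixed_cost = max(demand) * len(demand) * HOURLY_RATE   # provision for peak
elastic_cost = sum(demand) * HOURLY_RATE               # scale with demand

print(round(fixed_cost, 2))    # 24.0
print(round(elastic_cost, 2))  # 6.0
```

In this invented example the elastic deployment costs a quarter of the fixed one; the gap grows with how bursty the demand is.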

3. CLOUD PROVIDERS:

A decade ago, an IT project or start-up that needed reliable, Internet-connected computing resources had to rent or place physical hardware in one or several data centers. Today, anyone can rent computing time and storage of any size. The range starts with virtual machines barely powerful enough to serve web pages and extends to the equivalent of a small supercomputer. Cloud services are mostly pay-as-you-go, which means that for a few hundred dollars anyone can enjoy a few hours of supercomputer power. At the same time, cloud services and resources are globally distributed. This setup ensures a level of availability and durability unattainable by all but the largest organizations.
The cloud computing space has been dominated by Amazon Web Services until recently. Increasingly serious alternatives are emerging, such as Google Cloud Platform, Microsoft Azure, Rackspace, or Qubole, to name only a few. Importantly for customers, a struggle over platform standards is underway. The two front-running solutions are Amazon Web Services-compatible offerings, i.e. Amazon's own services or companies with API-compatible offerings, and OpenStack, an open-source project with wide industry backing. Consequently, the choice of a cloud platform standard has implications for which tools are available and which alternative providers offer the same technology.

CLOUD BIG DATA CHALLENGES


Horizontal scaling achieves elasticity by adding additional instances, with each of them serving a part of the demand. Software like Hadoop is specifically designed as a distributed system to take advantage of horizontal scaling. It processes small independent tasks at massive parallel scale. Distributed systems can also serve as data stores, like NoSQL databases, e.g. Cassandra or HBase, or filesystems like Hadoop's HDFS. Alternatives like Storm provide coordinated stream data processing in near real-time through a cluster of machines with complex workflows.
The interchangeability of the resources, together with distributed software design, absorbs failures and equally supports scaling of virtual computing instances without disruption. Spiking or bursting demand can be accommodated just as well as seasonal patterns or continued growth.
Renting practically unlimited resources for short periods allows one-off or periodic projects at a modest expense. Data mining and web crawling are great examples. It is conceivable to crawl huge web sites with millions of pages in days or hours for a few hundred dollars or less. Inexpensive tiny virtual instances with minimal CPU resources are ideal for this purpose, since most web-crawling time is spent waiting for IO. Instantiating thousands of these machines to achieve millions of requests per day is easy and often costs only a fraction of a cent per instance-hour.
Of course, such mining operations should be mindful of the resources of the web sites or application interfaces they mine, respect their terms, and not impede their service; a poorly planned data mining operation is equivalent to a denial of service attack. Lastly, cloud computing is naturally a good fit for storing and processing the big data accumulated from such operations.
Organizations faced with architecture decisions should evaluate their security concerns and legacy systems ruthlessly before accepting a potentially unnecessarily complex private or hybrid cloud deployment. A public cloud solution is often achievable. The questions to ask are which new processes can be deployed in the cloud and which legacy processes are feasible to transfer to it. It may make sense to retain a core data set or process internally, but most big data projects are served well in the public cloud due to the flexibility it provides.

4. BIG DATA COMPUTING:

Typical cloud big data projects focus on scaling or adopting Hadoop for data processing. MapReduce has become a de facto standard for large-scale data processing. Tools like Hive and Pig have emerged on top of Hadoop, making it feasible to process huge data sets easily. Hive, for example, transforms SQL-like queries into MapReduce jobs, unlocking data sets of all sizes for data and business analysts in reporting and greenfield analytics projects.
Data can either be transferred to or collected in a cloud data sink like Amazon's S3, e.g. to collect log files or export text-formatted data. Alternatively, database adapters can be used to access data in databases directly with Hadoop, Hive, and Pig.
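The map/shuffle/reduce model that tools like Hive compile queries into can be illustrated with a minimal, single-process word count (a sketch of the programming model only, not of Hadoop's actual distributed implementation):

```python
from collections import defaultdict

# The MapReduce model in miniature: map emits (key, value) pairs,
# shuffle groups values by key, reduce aggregates each group.

def map_phase(line):
    for word in line.split():
        yield (word.lower(), 1)

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data in the cloud", "the cloud scales"]
pairs = (pair for line in lines for pair in map_phase(line))
counts = reduce_phase(shuffle(pairs))
print(counts["the"])    # 2
print(counts["cloud"])  # 2
```

In a real cluster the map and reduce functions are the same shape, but the lines are split across machines and the shuffle moves pairs over the network; a Hive `SELECT word, COUNT(*) ... GROUP BY word` compiles to essentially this plan.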

Ideally a cloud service provider offers Hadoop clusters that scale automatically with the demand of the customer. This provides maximum performance for large jobs and optimal savings when little or no processing is going on. Amazon Web Services Elastic MapReduce, for example, allows scaling of Hadoop clusters. However, the scaling is not automatic with demand and requires user action. The scaling itself is also not optimal, since it does not utilize HDFS well and squanders Hadoop's strong point, data locality. This means that an Elastic MapReduce cluster wastes resources when scaling and has diminishing returns with more instances. Furthermore, Amazon's Elastic MapReduce requires a customer to explicitly request a cluster every time it is needed and remove it when it is no longer required. There is also no user-friendly interface for interacting with or exploring the data. This results in an operational burden and excludes all but the most proficient users.


P.MANONMAI [CSI: 01320026] is a student of CE (3/4) at Andhra University College of Engineering for Women.


CLOUD COMPUTING AND THE INTERNET OF THINGS: A MIXED APPROACH
Compiled by:
P. Pranusha

ABSTRACT
Cloud computing and the Internet of Things are two different technologies that are being merged to enhance applications in various fields. Their use is expected to increase further, making them important components of the future internet. This article discusses the need for their integration and the applications of CloudIoT.

1. INTRODUCTION:

Cloud computing, a buzzword five to six years ago, has ushered in a new way of computing that uses networks and operating software to provide virtually unlimited computing capability whenever it is needed [1]. It is based on the user performing computing tasks using services delivered entirely through the Internet. The Internet of Things is about devices connected to the internet that perform the processes and services supporting our basic needs, economy, health, and environment. Hence, cloud computing acts as a front end to access the Internet of Things. These two complementary technologies, merged together, are expected to disrupt both the current and future internet. We call this new paradigm CloudIoT [2]. On one hand, IoT can benefit from the virtually unlimited capabilities and resources of the Cloud to compensate for its technological constraints. Specifically, the Cloud can offer an effective solution for implementing IoT service management and composition, as well as applications that exploit the things or the data produced by them. On the other hand, the Cloud can benefit from IoT by extending its scope to deal with real-world things in a more distributed and dynamic manner.

2. CLOUD AND IOT: THE NEED FOR THEIR INTEGRATION

The two worlds of Cloud and IoT have seen an independent evolution. However, several mutual advantages deriving from their integration have been identified and are foreseen in the future.
Now, let's consider a real-life scenario:
You are on your way to the office. When you are halfway there, you realize that you did not turn off the lights in your room. As of now, you have only two options: either turn back home, or carry on and curse yourself the whole day for your blunder. But some years down the line, you'll have a third option. You can happily proceed on your way to the office, safe in the knowledge that the lights will turn off automatically. How? They will have sensors to detect whether a person is in the room, and when you leave the house, they will automatically switch off! Cool stuff. In the future, we won't have to rely on ourselves to remember

such trivial things, because everything will be automated. These sensors will learn our daily time of leaving the house and will switch the lights off at approximately that time, or remind us daily to turn them off. This is the future of the Internet of Things, but it won't be made possible by a jumble of wires. What makes it possible is cloud computing, combined with the glut of sensors and applications all around you that collect, monitor, and transfer data to where it's needed. All of this information can be sent out or streamed to any number of devices and services.
Of course, this means there is going to be an awful lot of data flying around out there, data that needs to be processed quickly. The problem is exacerbated by the fact that you're not the only one who forgets such things: think about how many millions of people are going through exactly the same routine as you. This is why the cloud is so important. The cloud can easily handle the speed and volume of the data being received. It can ebb and flow according to demand, all the while remaining accessible anywhere from any device. Cloud software is the only technology capable of handling this data and delivering it where it needs to be in real time.
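The smart-light scenario boils down to a simple rule evaluated in the cloud against incoming sensor events; a minimal sketch (the timeout value and function names are arbitrary assumptions, not a real home-automation API):

```python
# A toy version of the rule described above: a cloud service receives
# motion-sensor timestamps and switches lights off after a period of no motion.
IDLE_TIMEOUT = 15 * 60  # seconds of no motion before lights go off (assumed)

def lights_should_be_on(last_motion_ts, now):
    """Decide the light state from the most recent motion timestamp."""
    return (now - last_motion_ts) < IDLE_TIMEOUT

print(lights_should_be_on(last_motion_ts=0, now=60))       # True  (motion 1 min ago)
print(lights_should_be_on(last_motion_ts=0, now=20 * 60))  # False (idle for 20 min)
```

The logic itself is trivial; the point of the paragraph is that evaluating it reliably for millions of homes, against streams of sensor events, is what requires the cloud's elastic capacity.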
Essentially, the Cloud acts as an intermediate layer between the things and the applications, hiding all the complexity and functionality necessary to implement the latter. This will impact future application development, where information gathering, processing, and transmission will produce new challenges to be addressed, including in multi-cloud environments. In the following, we summarize the issues solved and the advantages obtained when adopting the CloudIoT paradigm.
STORAGE RESOURCES:
IoT involves a large number of information sources producing huge amounts of non-structured or semi-structured data with the three characteristics typical of Big Data: volume, variety, and velocity. This implies collecting, accessing, processing, visualizing, archiving, sharing, and searching large amounts of data. Offering virtually unlimited, low-cost, on-demand storage capacity, the Cloud is the most convenient and cost-effective solution for dealing with data produced by IoT. This creates new opportunities for data aggregation, integration, and sharing with third parties. Once in the Cloud, data can be treated in a homogeneous manner through standard APIs, can be protected by applying top-level security, and can be directly accessed and visualized from any place.
COMPUTATIONAL RESOURCES:
IoT devices have limited processing resources that do not allow on-site data processing. Collected data is usually transmitted to more powerful nodes where aggregation and processing are possible, but scalability is challenging to achieve without a proper infrastructure. The unlimited processing capabilities of the Cloud allow IoT processing needs to be properly satisfied and enable analyses of unprecedented complexity. Data-driven decision making and prediction algorithms become possible at low cost, providing increased revenue and reduced risk. Other possibilities are real-time processing; scalable, real-time, collaborative, sensor-centric applications; complex event management; and task offloading for energy saving.

COMMUNICATION RESOURCES:
One of the requirements of IoT is that IP-enabled devices communicate through dedicated hardware, and support for such communication can be very expensive. The Cloud offers an effective and cheap solution to connect, track, and manage anything from anywhere at any time using customized portals and built-in apps. Given the availability of high-speed networks, it enables the monitoring and control of remote things, their coordination, their communication, and real-time access to the produced data.
NEW CAPABILITIES:
IoT is characterized by a very high heterogeneity of devices, technologies, and protocols. Therefore,
scalability, interoperability, reliability, efficiency, availability, and security can be very difficult to
obtain. The integration with the Cloud solves most of these problems, while also providing additional features such as ease of access, ease of use, and reduced deployment costs.
NEW PARADIGMS:
The adoption of the CloudIoT paradigm enables new scenarios for smart services and applications based on the extension of the Cloud through the things:
SaaS (Sensing as a Service), providing ubiquitous access to sensor data;
SAaaS (Sensing and Actuation as a Service), enabling automatic control logic implemented in the Cloud;
SEaaS (Sensor Event as a Service), dispatching messaging services triggered by sensor events;
SenaaS (Sensor as a Service), enabling ubiquitous management of remote sensors;
DBaaS (DataBase as a Service), enabling ubiquitous database management;
DaaS (Data as a Service), providing ubiquitous access to any kind of data;
EaaS (Ethernet as a Service), providing ubiquitous layer-2 connectivity to remote devices;
IPMaaS (Identity and Policy Management as a Service), enabling ubiquitous access to policy and identity management functionalities;
VSaaS (Video Surveillance as a Service), providing ubiquitous access to recorded video and implementing complex analyses in the Cloud. [3]

APPLICATIONS:

HEALTHCARE:
IoT and multimedia technologies have made their entrance into the healthcare field. Smart devices, mobile Internet, and Cloud services contribute to the continuous and systematic innovation of healthcare and enable cost-effective, efficient, timely, and high-quality medical services. Pervasive healthcare applications generate a vast amount of sensor data that have to be managed properly for further analysis and processing. The adoption of the Cloud in this scenario leads to the abstraction of technical details, eliminating the need for expertise in, or control over, the technology infrastructure, and it represents a promising solution for managing healthcare sensor data efficiently. Moreover, it enables the execution of secure multimedia-based health services, overcoming the issue of running heavy multimedia and security algorithms on devices with limited computational capacity and small batteries.
SMART CITY:

IoT can provide a common middleware for future-oriented Smart City services, acquiring information from different heterogeneous sensing infrastructures, accessing all kinds of geolocation and IoT technologies, and exposing information in a uniform way. A number of recently proposed solutions suggest using Cloud architectures to enable the discovery, connection, and integration of sensors and actuators, thus creating platforms able to provision and support ubiquitous connectivity and real-time applications for smart cities. Frameworks can consist of a sensor platform and a Cloud platform for the automatic management, analysis, and control of big data from large-scale, real-world devices. Such a framework hides the complexity of the underlying Cloud infrastructure, whilst at the same time meeting complex public sector requirements for the Cloud, such as security, heterogeneity, interoperability, scalability, extensibility, high reactivity, and configurability. Common issues are related to security and real-time interactions.
VIDEO SURVEILLANCE:
Intelligent video surveillance has become a tool of the greatest importance for several security-related applications. As an alternative to in-house, self-contained management systems, complex video analytics require Cloud-based solutions to properly satisfy the requirements of storage (e.g., stored media that is centrally secured, fault-tolerant, on-demand, scalable, and accessible at high speed) and processing (e.g., video processing, computer vision algorithms, and pattern recognition modules to extract knowledge from scenes).
AUTOMOTIVE AND SMART MOBILITY:
IoT is expected to offer promising solutions to transform transportation systems and automobile services. A new generation of IoT-based vehicular data Clouds can be developed and deployed to increase road safety, reduce road congestion, manage traffic, and recommend car maintenance. Cloud-based vehicular data platforms merge Cloud computing and IoT technologies, aiming to provide real-time, cheap, secure, and on-demand services to customers through different types of Clouds, including temporary vehicular Clouds. Vehicular Clouds are designed to expand conventional Clouds, increasing on demand the overall Cloud computing, processing, and storage capabilities by using under-utilized facilities of vehicles. More generally, Ethernet and IP-based routing are claimed to be very important technologies for future communication networks in electric vehicles, enabling the link between the vehicle electronics and the Internet.
SMART ENERGY AND SMART GRID:
IoT and Cloud can be effectively merged to provide intelligent management of energy distribution and consumption in both local and wide-area heterogeneous environments. In the first case, for instance, lighting could be provided only where and when strictly necessary, by exploiting the information collected by different types of nodes. Such nodes have sensing, processing, and networking capabilities, but limited resources; hence, computing tasks should be properly distributed among nodes where more complex and comprehensive decisions can be made. In the second case, the problem of alternative and compatible energy use can be addressed by integrating system data in the Cloud, while providing self-healing, mutual operation and participation of users, optimal electricity quality, distributed generation, and demand response. This increases concerns about security and privacy: consumers need to gain confidence in sharing data to help improve and optimize the services offered.

3. CONCLUSION:
Cloud computing can be used anytime and anywhere, as long as the device is connected to the Internet. The cloud is the only technology suitable for filtering, analyzing, storing, and accessing this information in useful ways. The IoT will dramatically change the way we live our daily lives and what information is stored about us.

REFERENCES:
1. http://ibmcai.com/2014/11/20/iot-internet-of-things-will-go-nowhere-without-cloudcomputing-and-big-data-analytics/
2. H.-C. Chao, "Internet of things and cloud computing for future internet," in Ubiquitous Intelligence and Computing, Lecture Notes in Computer Science, 2011.
3. http://wpage.unina.it/walter.dedonato/pubs/iot_ficloud14.pdf
4. https://en.wikibooks.org/
P. PRANUSHA is a student of B.Tech (CE) at AUCEW.


CLOUD COMPUTING
Compiled by:
A.Vanathi and Korlapati Sri Harsha
Cloud Computing simply means the use of computing resources as a service through networks, typically the internet. Cloud Computing is a combination of software- and hardware-based computing resources delivered as a networked service. This model of IT-enabled services enables anytime access to a shared pool of applications and resources.
With Cloud Computing, companies can scale up to massive capacities in an instant without having to invest in new infrastructure, train new personnel, or license new software. Cloud Computing is of particular benefit to small and medium-sized businesses. Service consumers use what they need on the Internet and pay only for what they use.

DISTRIBUTED COMPUTING:
A distributed computing system consists of multiple computers that run as a single system. The
computers can be physically close together and connected by a local network, or geographically
distant and connected by a Wide Area Network.
There are three types of Distributed Computing.
1. Cluster Computing
2. Grid Computing
3. Cloud Computing

CLUSTER VS GRID:
Cluster: two or more computers connected through a LAN to solve a common problem. It is a homogeneous
system of resource types. A full computer is needed to create a cluster, and the resources are in the
same location.
Grid: more than one computing resource connected through a LAN or WAN to solve a common problem. It is
a heterogeneous system of resource types. A grid can make use of the spare computing power of a
desktop computer, and its resources may be geographically distributed.

GRID VS CLOUD:

Grid: a grid handles computationally intensive operations; jobs cannot all be serviced at once and
need to be scheduled. Any standard operating system can be used for Grid Computing. Failure management
is limited (often failed tasks or applications are simply restarted), and resource management is
distributed.
Cloud: the computational focus is on standard and high-level instances, and Cloud Computing provides
real-time services. It requires a hypervisor on which multiple operating systems run as VMs. Failure
management is strong (VMs can easily be migrated from one node to another), and resource management is
centralized or distributed.

CLOUD COMPUTING MODELS:


INFRASTRUCTURE AS A SERVICE(IAAS):
IaaS providers offer computers and more often virtual machines and other resources as a service. It provides
the infrastructure or storage required to host the services ourselves i.e. makes us the system administrator
and manage hardware, software, networking and computing resources.

PLATFORM AS A SERVICE(PAAS):
Cloud providers deliver a computing platform including operating system, programming language execution
environment, database and web server.

SOFTWARE AS A SERVICE (SAAS):


SaaS allows users to access a large variety of applications over the Internet that are hosted on the
service provider's infrastructure.

NETWORK AS A SERVICE (NAAS):


It is the category of cloud services in which the capability provided to the cloud service user is the
use of network or transport connectivity services. NaaS involves optimizing resource allocation by
considering network and computing resources as a whole.

COMMUNICATION AS A SERVICE (CAAS):


CaaS evolved along the same lines as SaaS. CaaS is an outsourced enterprise communications solution
that can be leased from a single vendor. The CaaS vendor is responsible for all hardware and software
management and offers guaranteed Quality of Service (QoS). It allows businesses to selectively deploy
communication devices and modes on a pay-as-you-go, as-needed basis. This approach eliminates large
capital investments.

CHARACTERISTICS OF CLOUD COMPUTING:

High Scalability
Agility
High Availability and Reliability
Multi Sharing

Services in Pay-Per-Use Mode
Virtualization
Performance
Maintenance

ADVANTAGES OF CLOUD COMPUTING:


Cost Efficiency
Almost Unlimited Storage
Backup and Recovery
Automatic Software Integration
Easy Access to Information
Quick Deployment

CHALLENGES TO CLOUD COMPUTING:

Confidentiality
Integrity
Availability
Governance
Trust
Legal Issues and Compliance
Privacy
Audit
Data-Stealing
Architecture
Identity Management and Access Control
Incident Response
Software Isolation and Application Security

FUTURE:
Many of the cloud-based works are still centralized and are yet to be transformed to a distributed paradigm.
Adoption and shifting entirely to cloud computing will require additional modifications and standardizations
of the existing techniques. Interoperability of systems should be explored and worked upon, and
environmental friendliness should be pursued through green computing.

Mrs. A.Vanathi [CSI: I1500987] is currently pursuing a Ph.D at Acharya Nagarjuna University,
Guntur. She has served as a Lecturer and Assistant Professor, and is currently working as Associate
Professor and Head of the Department of CSE, Aditya Engineering College, Surampalem, AP, India. Her
research interests include Information Security and Mobile Computing.

KORLAPATI SRI HARSHA is studying in III B.Tech CSE at Aditya Engineering College and is interested in
Big Data, Cloud Computing, Networking, Web Development and Databases. He was selected as a Microsoft
Student Partner, organised a workshop on Big Data in his college technical fest VEDA, and works on
projects as a team lead.


CLOUD COMPUTING & SECURITY


CEPHALALGIA
Compiled by:
Anupam Tiwari
Cloud Computing: the buzzing word that in recent times has been the most sought-after techno term for
a major part of IT enthusiasts, corporate houses, engineering students, researchers and technology
fanciers. And why not? The intersection of cheap computing, pervasive mobility and virtualization
technologies has created a possibility for prompter and more cost-efficient applications and IT
infrastructure than could ever be imagined before. Did we ever imagine, a few years back, computing
being available as a service just like any other product? Now, when we say computing as a service,
what does it actually mean? I will try to make this clearer before I get more specific on the security
issues related to clouds. Computing as a service means the customer pays for what he uses, just like
electricity. For example, a typical customer in a residential colony gets an electricity bill on a
monthly basis based on whatever electric units he has used during the month. Now extrapolate the same
to computing domain services, viz. hardware, software, storage, applications, etc. So the cloud offers
the following variants:
(a) Platform as a Service, which provisions a computing platform that allows web applications to be
created quickly and easily, without the complexity of buying and maintaining the software and
infrastructure underneath.
(b) Software as a Service, wherein applications designed for end-users are delivered over the web.
(c) Infrastructure as a Service, which delivers infrastructure, i.e. servers, storage, network and OS,
as an on-demand service. Rather than purchasing servers, software, datacenter space or network
equipment, clients instead buy those resources as a fully outsourced service on demand.
Cloud computing isn't so much a technology as it is the compounding of many pre-existing technologies,
which matured at different rates and in different settings and were not designed as a coherent whole;
however, they have come together to create a technical ecosystem for cloud computing. The above gives
us a brief overview of what exactly cloud computing offers, and it means a great deal for any user.
But sadly, what stops a typical user or organization from welcoming this and adopting it with open
arms is the paranoia surrounding it regarding privacy and security issues. And I must share my view
here: this is not mere paranoia; the user concerns are real and genuine.
No specific reason exists for most of us labeling Cloud Computing as good and Cloud Security as bad.
On one side we have eager clouds waiting to be exploited, and on the other is a security-conscious
user who does not want to risk anything for fear of losing critical bits, though much of it may
actually be paranoia surrounding the buzz. Cloud as a technology is ready to be adopted, but are we,
keeping in mind the security paranoia surrounding it?
There are a horde of issues from a security point of view in the context of clouds, a few of which are
discussed below:

TRANSPARENCY IS A NECESSITY
A few years back, when the Cloud was still at its inception stage with big promises, nobody was
actually so concerned and serious about how the whole thing works; what mattered to everyone was
delivery of the service. But today, when the whole thing is ready to be served on the platter,
steaming hot, questions have arisen: how has it been made, how has it been designed, what are the
ingredients, what is the source, and who all are sharing it? Do we say this is the wrong time to ask?
The CSP today has no answers to such questions, which include how virtual machines sharing a physical
server are isolated from each other, how the data is protected and the network secured, or how the
network architecture that the CSP provisions for the user is protected. Any vulnerability existing at
any point is a single point of failure that can have cascading effects, to say the least. On the other
hand, even if a CSP makes the entire thing transparent, there will be few users able to check the
claims and intervene with amendments. The transparency factor for any user and CSP is intensified by
multitenancy and the availability of shared environments. In an ideal condition, a typical CSP must be
able to instantly produce fine-grained security specifications and details for the entire stack,
including software versions, patch levels, firewall configuration settings, server snapshots, user
access rights, and so on.

LACK OF STANDARDS
Cloud computing is like a powerful, proven wonder child in a land of technology where past technology
giants already stand strong and stable, with proven security standards and the confidence of users who
have relied on them for long years without major setbacks. Standards, practices, regulations,
processes and, most especially, settled expectations still drive the industries. The lack of mature
cloud standards continues to be the prime reason for much of the doubt and resistance faced by
organizations bidding to embrace cloud computing. These reasons are genuine, since the amount of
reputation and finances at stake in case of a failure is enough to crash an established organization
on any given day. The immediate prospects of seeing such standards arrive and settle are also not very
bright, since standards by nature move at a slow pace. Regulators find it all but impossible to
provision even draft standards, since the variety of domains the cloud affects is enormous. The
security aspect of the overall cloud environment becomes more challenging with applications and
services spread over diverse intercontinental locations, whilst regulations differ for each zone.

SECURITY AUDITS AND FORENSICS


What happens in a typical IT security breach incident? The figure below shows a typical sequence that
is supposed to be followed:

Imagine the above fields, as seen in the figure, in the context of Cloud Computing. The versatile
characteristics of cloud computing simply perplex the basic prerequisites of forensics in the Cloud.
Cloud storage is not local; it can span continents. So, in a typical case as above, where does the
forensic expert look to extract remnants of logs and data? What will he confiscate? In an orthodox
computer forensics case, investigators have full hold over the evidence, including router logs,
process logs and hard disks, present right in front of their eyes. Regrettably, in the case of Cloud
Computing, the hold over data varies with the diversity of service models available, viz. SaaS, PaaS
and IaaS. Even the delivery model, Public, Private or Hybrid, makes a lot of difference. Thus the
dispersed nature of Cloud-based systems directly affects control over the functional layers.
Besides, the diversity of logs in different formats and their subsequent correlation has been a known
issue in network forensics, and this is exacerbated in cloud environments because it is exceedingly
difficult to combine these varieties from different sources and derive useful analytics. Currently, in
the real world, corporate houses invest in very thorough, one-time "snapshot" security audits, which
is an approach that leads nowhere. A security audit is supposed to be a real-time, continuous process,
performed every minute, which currently has technological limitations.
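One way to picture such a continuous audit, as opposed to a one-time snapshot, is a small drift check that repeatedly compares the live configuration against an approved baseline. Everything below, the baseline settings and their values, is a hypothetical sketch, not any CSP's actual tooling:

```python
# Minimal sketch of continuous (rather than one-time "snapshot") auditing:
# compare the live configuration against an approved baseline and report
# drift. The baseline and the settings are hypothetical placeholders.
BASELINE = {"firewall": "deny-all-inbound", "tls": "1.2+", "root_login": "disabled"}

def audit(live_config: dict, baseline: dict = BASELINE) -> list:
    """Return (setting, expected, actual) tuples for every drifted setting."""
    return [(k, v, live_config.get(k)) for k, v in baseline.items()
            if live_config.get(k) != v]

# One audit pass; in practice this would run on a schedule or on change events.
findings = audit({"firewall": "deny-all-inbound", "tls": "1.0", "root_login": "disabled"})
print(findings)  # [('tls', '1.2+', '1.0')]
```

The point is not the check itself but its frequency: run continuously, drift is caught in minutes instead of at the next annual audit.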

MOBILE/BYOD IS ONLY ADDING TO CHAOS


BYOD, or bring your own device, has become a widely adopted term for employees who bring their own
computing devices, such as smartphones, laptops and PDAs, to the workplace for use on the
assumed-secure corporate network. BYODs have brought in appreciable security vulnerabilities and
complexity, since they are based in cloud environments. Using a single device, be it a phone, tablet
or laptop, for personal work, including access to legitimate social networking sites and at times
malicious ones, as well as for office work, gives forensics experts a run for cover in most cases.
Corporate houses that allow BYODs need to harden and enforce strict security configurations, realizing
the kind of vulnerability they are exposed to. This is getting more and more difficult as applications
become increasingly cross-platform. For example, WhatsApp, which used to work primarily on
smartphones, has now moved to WhatsApp Web, with calling features too. File transfers and calls made
over this application raise immense security issues wherever breach incidents are involved. Servers
located across countries with different regulations and cyber policies, combined with typical cloud
characteristics, only add to the challenges.

ENTROPY IN VIRTUAL ENVIRONMENTS IS NOT EASY


Any device that needs to authenticate uses private encryption keys, and to generate these private keys
an initial random seed is required, which is used as the base from which the private keys are
mathematically computed: an initialization vector, as it is technically known. Producing these random
number blocks is referred to as entropy. Private keys need to be as unique as possible; if keys are
generated using the same seed, then it would be trivial to compromise the system, or for a malicious
system to falsely appear as a trusted device. Encryption systems depend upon entropy. Virtualization,
which is inherent to any cloud setup, is invariably a key attribute of this entropy, and it can
introduce vulnerabilities if keys and seeds can be predicted.
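The danger of predictable seeds can be demonstrated directly: a deterministic PRNG seeded with the same value yields identical "key" material, which is why real keys must come from an OS entropy source (here via Python's standard secrets module). The weak_key helper below is purely illustrative:

```python
import random
import secrets

# Two "devices" seeded with the same value produce identical key material:
# a deterministic PRNG plus a shared seed means no real entropy at all.
def weak_key(seed: int, nbytes: int = 16) -> bytes:
    rng = random.Random(seed)          # deterministic, NOT cryptographic
    return bytes(rng.randrange(256) for _ in range(nbytes))

assert weak_key(42) == weak_key(42)    # same seed -> same "private" key

# A proper source draws from the OS entropy pool instead:
k1, k2 = secrets.token_bytes(16), secrets.token_bytes(16)
assert k1 != k2                        # independent draws differ
```

In a virtualized cloud, cloned VM images that boot from identical state risk exactly the weak_key situation, which is why guests must reseed from the host's entropy source.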

INFRASTRUCTURE SECURITY @THE NETWORK LEVEL


Public cloud services, before being adopted, must be assessed for the following significant risk
factors: SLAs that ensure the confidentiality and integrity of the organization's data-in-transit are
taken care of. A simple mention here of taking care of C, I, A has huge connotations; a great deal
goes into ensuring these three, i.e. Confidentiality, Integrity and Availability.
Strict access controls, including authentication, authorization and auditing of the resources used via
the services offered by the CSP, must also be ensured.
An example is an Amazon Web Services (AWS) security vulnerability reported back in December 2008,
wherein a flaw was found in the digital signature algorithm used when "... making Query (aka REST)
requests to Amazon SimpleDB, to Amazon Elastic Compute Cloud (EC2), or to Amazon Simple Queue Service
(SQS) over HTTP". Although the use of HTTPS (instead of HTTP) would have mitigated the integrity risk,
users not using HTTPS (but using HTTP) did face an increased risk that their data could have been
altered in transit without their knowledge.
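The reported weakness can be sketched in simplified form: the version 1 scheme concatenated the sorted parameter names and values with no delimiter before applying HMAC-SHA1, so two different requests could collapse to the same string-to-sign. The code below is a simplified illustration of that delimiter flaw, not Amazon's exact algorithm, and the secret key is a made-up placeholder:

```python
import hashlib
import hmac

SECRET = b"example-secret-key"   # hypothetical key, for illustration only

def sign_v1_style(params: dict) -> str:
    """Concatenate sorted name/value pairs with NO delimiter (the flaw),
    then HMAC the result - a simplified sketch of the reported weakness."""
    msg = "".join(k + v for k, v in sorted(params.items()))
    return hmac.new(SECRET, msg.encode(), hashlib.sha1).hexdigest()

# Two different requests collapse to the same string ("ab"+"cd" == "a"+"bcd"),
# so they carry the same signature: parameters can be altered in transit.
sig1 = sign_v1_style({"ab": "cd"})
sig2 = sign_v1_style({"a": "bcd"})
assert sig1 == sig2   # different requests, identical signature
```

Later signature versions fixed this by inserting unambiguous delimiters (and by signing over more of the request), which is why parameter boundaries can no longer be shifted without breaking the signature.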

DATA SECURITY
Data security becomes more important when using cloud computing at all levels, i.e. IaaS, PaaS or
SaaS. Various facets of data security include data-at-rest, data-in-transit, data provenance, data
lineage, data remanence, data processing, etc. Each of these has a different level of sensitivity
depending on the topology of the cloud being used by the organization.

CONCLUSION
The few issues brought out above are actually minuscule when seen against the horde of others, many of
which I have not touched, such as infrastructure security at the host and application levels, data
security and storage, data security mitigation, identity and access management, trust boundaries,
relevant standards and protocols for cloud services, security management in the cloud, access control,
vulnerability, patch and configuration management, and audit and compliance. As mentioned at the
beginning of this article, I would re-emphasize that cloud computing is a change in business model,
and not a new technology. The most critical still-unresolved issue pertaining to security in clouds is
the use of shared resources, or multitenancy. The real source of concern for information security
practitioners is that it is not clear where and how to define and standardize trust boundaries. The
need of the hour is to take calculated risks and start with the exploitation, for risks will always be
there. 100% security is a myth and is going to remain so for many more years to come. Till then, why
leave this powerful tool unexploited?

REFERENCES:
1. http://www.rackspace.com/knowledge_center/whitepaper/understanding-the-cloud-computingstack-saas-paas-iaas
2. http://www.webopedia.com/TERM/B/BYOD.html
3. http://www.activestate.com/blog/2015/02/locking-down-cloud-18-security-issues-faced-enterpriseit
4. http://www.daemonology.net/blog/2008-12-18-AWS-signature-version-1-is-insecure.html
5. Cloud Security and Privacy by Tim Mather, Subra Kumaraswamy, and Shahed Latif, O'Reilly

Anupam Tiwari [CSI: 01129557] is a CDAC Certified Cyber Security Professional, GFSU Certified Cyber
Security Professional and Certified Ethical Hacker 8, with basic qualifications of B.E. and M.Tech
(Computer Science) from JNTU Hyderabad. He has over 12 years of experience in the cyber security
domain, which is his passion and which he looks forward to as an independent alternate career. He is a
senior member of the Computer Society of India with a deep interest in cyber security domains. He has
participated in national and international seminars as a guest speaker, with papers published in
leading defence and technical journals. He is working with the Ministry of Defence. He can be reached
at anupam.tiwari@nic.in


CLOUD COMPUTING - THE 3 Ws


Compiled by:
C.Shanmukhi Lavanya

ABSTRACT
Cloud Computing is evolving as a key technology for sharing resources. Grid computing, distributed
computing, parallel computing and virtualization technologies define the shape of a new era.
Traditional distance learning systems lack reusability, portability and interoperability. This paper
sees the cloud computing ecosystem as a new opportunity for designing cloud computing educational
platforms, where learning actors can reuse learning resources handled by cloud educational operating
systems. To enhance the portability and interoperability of learning objects, not only should cloud
computing API standards be advocated by the key cloud providers, but learning resource standards
should also be defined by the Open Cloud Computing Education Federation proposed in this paper.
Keywords: cloud computing, cloud educational operating system, learning actors, open cloud computing
education federation.

WHY?? WHAT?? WHERE??
Learning is about changing people's behavior and lives; it is a transformation process. Existing
Distance Learning (DL) systems have limited synergistic capabilities because they lack reusability,
portability and interoperability. We need a new technology that would do more, and could be accessible
to more, with less. Cloud Computing emerges. Cloud is a metaphor describing the web as a space where
computing has been pre-installed and exists as a service; data, operating systems, applications,
storage and processing power exist on the web, ready to be shared. A new reality is being defined, and
terms like SaaS (Software as a Service) and HaaS (Hardware as a Service) characterize the new
ecosystem. Cloud computing can provide a context where a Learning Object is transformed into a
Learning Object Service; in this way, cloud computing technology could shape a new domain, that of
cloud-computing-based education, looking at Education as a Service (EaaS). SaaS, HaaS and EaaS share
the same cloud characteristics of elasticity, flexibility, efficiency and reliability.

WHY??
CLOUD COMPUTING IS ABOUT LEARNING
The existing educational platforms designed and implemented by the different DL organizations should
be re-engineered to take the cloud technology into consideration; already, the main components and
resources of traditional DL systems can be found openly and freely in the cloud, offered by cloud
computing apps with the highest personalization. The educational dimension of cloud computing is here.
Cloud computing has the potential to change the face of the entire DL industry.

LEARNING OBJECT AND LEARNING OBJECT SERVICE
A typical digital learning environment encapsulates Learning Objects (LO) and Learning Object Services
(LOS). A Learning Object is a construct made of data and operations. Learning Objects are structured
within digital learning environments like Blackboard and Moodle. A Learning Object exists because it
satisfies an educational goal, which may be autonomous or collaborative. By increasing their rate of
reuse, their value increases as well. A Learning Object is transformed into one or more Learning
Object Services if it exists ready to be reused, no matter what client environment the end user
accesses the resource from. Cloud computing provides a technological context for implementing open,
virtual Learning Object Service collaborative systems. It is now more imperative than ever that
Learning Objects be designed and developed following standards defined by a consortium of the Distance
Learning key players.

SPECIFICATIONS
If Learning Object Service environments are to be provided, then we need specifications and standards
for what constitutes a Learning Object, such as a course or learning unit. All DL infrastructures
comply with the following: DL courses have a duration of 10 weeks each, with each week covering a
specific unit; each unit consists of the following learning objects: learning guide, reading
assignment, professor's notes, self-quiz questions, written assignment, discussion forum question,
learning journal, and assignment solutions. All DL systems store these learning objects in specific
formats; any attempt to reuse LOs in another context requires considerable effort. This situation is
backed by strong economic interests; it is a greedy attitude, where sharing, reusing and contributing
do not exist as a means of forming a new social status. For a learning object to be transformed into
an LOS, it must be ready to serve upon request at no cost except perhaps a fee. This means it is
preinstalled in an environment, ready to be reused. If we decide upon quality object characteristics,
then we can design Cloud Educational Operating Systems where objects have been preinstalled, set up
and made ready to be reused.

WHAT??
CLOUD COMPUTING
This is a technology that allows anyone connected to the internet to use hardware and software on demand.
In its description of cloud characteristics, the US National Institute of Standards and Technology
(NIST) lists the following: on-demand self-service, ubiquitous network access, resource pooling, rapid
elasticity (resources can be scaled up and down easily), metered service (resource usage is measured)
and pay-as-you-consume business models. Staffing, upgrading, crisis situations, licensing,
depreciation, maintenance, back-up and recovery become the service provider's concern, leaving the
internal team to concentrate on strategic decisions. The traditional notion that hardware and software
are capital expenditures is now being challenged, as IT service providers can now rent hardware and
software. The market is driven by the efficiencies provided by improved communication, a pay-per-use
pricing model and technologies such as virtualization and outsourced storage. Many argue that IT
should behave as a utility that can be turned up or down as usage fluctuates.
Cloud computing technology is based on grid computing, distributed computing and parallel computing,
coupled with virtualization. The virtualization technology defines images of the operating systems,
middleware and applications, created and pre-allocated to physical machines or slices of a server
stack.

HAAS
Hardware on demand defines Hardware as a Service (HaaS), which runs a web-based operating system
offering WebOS services. This new OS should drive not only the new application development race but
also the development of interoperable Cloud Educational Operating Systems (CEOS). Hardware as a
Service (HaaS) is similar to licensing. We started with the MSP model; according to this model, a
Managed Service Provider (MSP) remotely monitors and administers hardware on a client's site on a
subscription basis. Because not all hardware lends itself to this model, some MSPs provided, and still
provide, on-site hardware for clients and build its cost into their fees. However, this arrangement
puts a financial burden on the MSP. The HaaS model, in which the vendor allows customers to license
the hardware directly, was developed to alleviate this burden. The ability to build new systems and
re-engineer existing ones without interfering with internal hardware is making Hardware as a Service
attractive. To increase availability and security, private clouds are developed: the provision of a
cloud computing architecture over the corporate network rather than openly across the internet. IBM,
Google, Amazon, Yahoo and Microsoft have established many cloud computing centers worldwide for
customers working out their cloud strategies.

SAAS
Software as a Service marks a considerable change in how we see software: no capital expenditure, only
service cost. Software, just like processing power and storage, is seen as a utility that clients pay
for only as needed. The goal is to centralize administrative tasks while improving scalability and
workloads. The following illustrates this technology. Building large-scale web commerce software has
always been a challenge. Amazon solves this problem by providing a ready-made infrastructure that has
resulted in the biggest worldwide online store. Amazon hides complexity and offers a simple API. The
Amazon team takes full responsibility for the storage, lookup and management of data, and turns them
into pay-per-fetch and pay-per-space web services. Business Week points out, "Wall Street is not going
to jump on this. But the SmugMug photo service did and other startups and small businesses will
follow. So even if large corporations will not come, there is plenty of money to be made". However,
according to this paper, this is only the start: Wall Street companies are going to jump on this as
the complexity and associated costs of IT increase dramatically while revenues decrease. Compromises
will follow hard economic and social decisions. A huge new market emerges as many organizations and
individuals attack it aggressively. Different charging schemes are designed based on GBytes for
storage services, CPU-hours for computational services, or per node server if one has a clustered
application.

LEARNING ACTORS IN CLOUD COMPUTING


A Learning Actor is any entity involved in the learning process, such as management, students,
instructors, lab staff, etc. There are five types of resources that can be provisioned and that a
Learning Actor can consume over the Internet.
1) Infrastructure resources including computing power, storage, and machine provisioning.
2) Software resources including middleware (cloud-centric operating systems, application servers,
databases) and development resources (development, testing tools, and deployment tools).
3) Application resources. Educational Software applications are delivered through Software As A Service
(SaaS) model or mashups of value-added applications.
4) Learning processes. Applications exposed as utilities or tasks. Learning process sharing is the
learning-driven application outsourcing that supports provisioning, reuse and composition.
5) Learning Objects. This paper supports the open cloud manifesto; however, it is still a manifesto:
it does not provide any standards yet, and as a result cloud computing facilities and services do not
support interoperability. On the other hand, cloud computing provides the context where DL can
flourish openly and widely. Cloud Learning Objects and Cloud Learning Processes will benefit greatly
from the following two key technologies, which will play very important roles in this revolutionary
phase: virtualization technology and Service-Oriented Architecture (SOA).
1) The virtualization technology manages:
a) the imaging of the operating systems, middleware and applications;
b) the pre-allocation of all the resources to the right physical machines or server stack slices;
ideally, images should be moved around and put into the production environment on demand;
c) the licensing mechanism of all software layers in the cloud computing platform.
2) The SOA supports component-based software development improving reusability, extensibility, and
flexibility. In order to construct scalable cloud computing platforms, we need to leverage SOA to build
reusable components, standard-based interfaces, and extensible solution architectures. Creating a cloud
computing platform is crucial in enabling sharing and reusing of its resources. Building a unified, scalable and
reusable Cloud Computing Educational architecture to support sharing of all types of resources still faces big
challenges in the areas of technology breakthroughs and best industry practices. The idea is that when
new learning objects are needed, we should be able to consume (reuse with the least effort) existing
resources and assemble new courses running on a Unified Cloud Computing Educational infrastructure.
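The consume-and-assemble idea above can be sketched with a toy repository of reusable learning objects; all class, object and content names here are hypothetical illustrations, not an actual CEOS API:

```python
# Hypothetical sketch: assembling a new course by reusing existing
# Learning Objects exposed as services, per the SOA reuse idea above.
class LearningObject:
    def __init__(self, name: str, kind: str):
        self.name, self.kind = name, kind

    def serve(self) -> str:
        """A Learning Object Service simply delivers its content on request."""
        return f"serving {self.kind}: {self.name}"

# An existing repository of reusable objects (contents are illustrative).
repository = {
    "guide": LearningObject("Unit 1 learning guide", "learning guide"),
    "quiz": LearningObject("Unit 1 self quiz", "self quiz"),
}

# A new course is assembled by composing references, not by copying content:
# the same objects can appear in many courses with no extra effort.
course = [repository["guide"], repository["quiz"]]
print([lo.serve() for lo in course])
```

The key property is that the course holds references into the shared repository, so improving one learning object improves every course that reuses it.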

WHERE??
APPLICATIONS
A. Collaboration: Cloud apps give employees access to their information from anywhere around the globe.
All you need is an Internet connection. This allows more collaborative working as multiple people can view
and edit the same information at once, ensuring your team works efficiently.
B. Automatic Updates: Software as a Service (SaaS) allows companies to ensure all users of their
application are on the same version of the software, because updates can be pushed to cloud applications
automatically rather than waiting for users to install them. This also helps with support, as the
company knows which version of the software is in use when issues are logged.
C. Everyone Benefits: Cloud apps allow companies to push new developments to all users at once, ensuring
everyone benefits at the same time.

CONCLUSION
We are at the beginning of the cloud era; however, the lessons of the past do not yet drive cloud strategic decisions.
In this race the learning domain should frame its own moment by designing a Cloud Educational Operating
System and an open, minimalist API, and by providing free learning objects following standards designed by
OCCEF. The vision is a world in which the desire to learn is fully met by the opportunity to do so anywhere in
the world, where everyone, everywhere is able to access affordable, educationally and culturally
appropriate opportunities to gain whatever knowledge or training they desire. The OCCEF consortium should
act to realize this vision by addressing one issue, that of access to high-quality, portable, and interoperable
educational material, and by partnering with organizations addressing related problems, such as cloud security,
that must also be solved to make this vision a reality.

C. Shanmukhi Lavanya is graduating in B.Tech from the Computer Science Department, Andhra University College of Engineering for Women.

CSI Adhyayan [Oct.-Dec. 2015]
www.csi-india.org

CLOUD COMPUTING & SECURITY ISSUES


Compiled by:
U.Tejaswi

1. INTRODUCTION:
Cloud computing, or simply "the cloud", focuses on maximizing the effectiveness of shared resources,
which are usually not only shared by multiple users but also dynamically reallocated per demand. The
term cloud has been used to refer to platforms for distributed computing: the cloud was a metaphor
for the Internet, and a standardized cloud-like shape was used to denote a network in computer network
diagrams. With cloud computing, multiple users can access a single server to retrieve and update their data
without purchasing licenses for different applications.
Cloud computing has now become a highly demanded service or utility due to the advantages of high
computing power, low cost of services, high performance, scalability, accessibility, and availability.
Cloud vendors are experiencing appreciable growth rates in their business.[2] But being still in its
infancy, cloud computing has some pitfalls that need proper attention to make its services
more reliable and user friendly.

Cloud computing and storage solutions provide users and enterprises with various capabilities to store and
process their data in third-party data centers. It relies on sharing of resources to achieve coherence and
economies of scale, similar to a utility over a network.[1]

2. CLOUD CHARACTERISTICS:
Cloud computing exhibits the following key characteristics:
Agility improves with users' ability to re-provision technological infrastructure resources.
Cost reductions claimed by cloud providers. A public-cloud delivery model converts capital expenditure to
operational expenditure.
Device and location independence [3] enable users to access systems using a web browser regardless of their
location or what device they use (e.g., PC, mobile phone). As infrastructure is off-site (typically provided by a
third-party) and accessed via the Internet, users can connect from anywhere.
Maintenance of cloud computing applications is easier, because they do not need to be installed on each
user's computer and can be accessed from different places.

Multitenancy enables sharing of resources and costs across a large pool of users, thus allowing for:
centralization of infrastructure in locations with lower cost;
peak-load capacity increases; and
utilization and efficiency improvements for systems that are often only 10-20% utilized.
Performance is monitored and consistent, and loosely coupled architectures are constructed using web services as the system interface.[4]
Productivity may be increased when multiple users can work on the same data simultaneously, rather than waiting for it to be saved and emailed. Time may be saved as information does not need to be re-entered when fields are matched, nor do users need to install application software upgrades on their computers.[5]
Reliability improves with the use of multiple redundant sites, which makes well-designed cloud computing
suitable for business continuity and disaster recovery.
Scalability and elasticity via dynamic provisioning of resources on a fine-grained, self-service basis in near
real-time without users having to engineer for peak loads.
Security can improve due to centralization of data, increased security-focused resources, etc., but concerns
can persist about loss of control over certain sensitive data, and the lack of security for stored kernels.
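The scalability and elasticity characteristic above can be sketched as a toy provisioning rule: size the instance pool to the current load instead of engineering permanently for the peak. The per-instance capacity figure below is an invented example, not any provider's real metric.

```python
import math

def instances_needed(load_rps: float, capacity_rps: float, min_instances: int = 1) -> int:
    """Fine-grained, near-real-time provisioning: return how many instances
    the current load requires, never dropping below a minimum pool size."""
    return max(min_instances, math.ceil(load_rps / capacity_rps))

# One (hypothetical) server handles 100 requests/s; demand varies all day.
for load in (50, 250, 1200):
    print(load, "rps ->", instances_needed(load, capacity_rps=100), "instances")
```

A real autoscaler would add hysteresis and cooldown periods so the pool does not thrash, but the sizing decision reduces to this ceiling-division rule.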

3. SECURITY ISSUES ASSOCIATED WITH THE CLOUD:


Cloud computing and storage solutions provide users and enterprises with various capabilities to store and
process their data in third-party data centers. There are a number of security issues/concerns associated
with cloud computing but these issues fall into two broad categories:
Security issues faced by cloud providers (organizations providing software-, platform-, or infrastructure-as-a-service via the cloud), and
Security issues faced by their customers (companies or organizations who host applications or store data on
the cloud).[6]
The provider must ensure that their infrastructure is secure and that their clients' data and applications are
protected, while the user must take measures to fortify their application and use strong passwords and
authentication measures.
When an organization elects to store data or host applications on the public cloud, it loses its ability to have
physical access to the servers hosting its information. As a result, potentially business sensitive and
confidential data is at risk from insider attacks. According to a recent Cloud Security Alliance Report, insider
attacks are the third biggest threat in cloud computing. Therefore, Cloud Service providers must ensure that
thorough background checks are conducted for employees who have physical access to the servers in the
data center.
The extensive use of virtualization in implementing cloud infrastructure brings unique security concerns for
customers or tenants of a public cloud service.[7] Virtualization alters the relationship between the OS and
underlying hardware. This introduces an additional layer - virtualization - that itself must be properly
configured, managed and secured. While these concerns are largely theoretical, they do exist. For example, a
breach in the administrator workstation with the management software of the virtualization software can
cause the whole datacenter to go down or be reconfigured to an attacker's liking.[9]

SECURITY AND PRIVACY

Identity management
Every enterprise will have its own identity management system to control access to information and
computing resources. Cloud providers either integrate the customer's identity management system into
their own infrastructure, using federation or SSO technology or a biometric-based identification system, or
provide an identity management solution of their own.
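As a rough sketch of the federation idea, the toy code below has the enterprise identity system sign an assertion with a key shared with the provider, which the provider then verifies instead of keeping its own password store for the customer's users. Real deployments use standards such as SAML or OpenID Connect; the HMAC scheme and key here are purely illustrative assumptions.

```python
import hashlib
import hmac

SHARED_KEY = b"enterprise-idp-secret"  # assumption: exchanged out of band

def sign_assertion(user_id: str) -> str:
    """Enterprise identity system issues a signed assertion for a user."""
    return hmac.new(SHARED_KEY, user_id.encode(), hashlib.sha256).hexdigest()

def provider_accepts(user_id: str, signature: str) -> bool:
    """Cloud provider trusts the federated assertion rather than holding
    credentials itself; comparison is constant-time to resist timing attacks."""
    expected = sign_assertion(user_id)
    return hmac.compare_digest(expected, signature)

token = sign_assertion("alice@corp.example")
```

The essential point is the trust relationship: the provider verifies proof issued by the customer's identity system instead of duplicating it.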
Physical security
Cloud service providers physically secure the IT hardware (servers, routers, cables etc.) against unauthorized
access, interference, theft, fires, floods etc. and ensure that essential supplies are sufficiently robust to
minimize the possibility of disruption. This is normally achieved by serving cloud applications from 'world-class' data centers.
Personnel security
Various information security concerns relating to the IT and other professionals associated with cloud
services are typically handled through pre-, para- and post-employment activities such as security screening
potential recruits, security awareness and training programs, proactive security monitoring and supervision,
disciplinary procedures and contractual obligations embedded in employment contracts, service level
agreements, codes of conduct, policies etc.
Availability
Cloud providers help ensure that customers can rely on access to their data and applications, at least in part.
Application security
Cloud providers ensure that applications available as a service via the cloud (SaaS) are secure by specifying,
designing, implementing, testing and maintaining appropriate application security measures in the
production environment.
Privacy
Providers ensure that all critical data are masked or encrypted and that only authorized users have access to
data in its entirety. Moreover, digital identities and credentials must be protected as should any data that
the provider collects or produces about customer activity in the cloud.
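A minimal sketch of the masking idea, assuming a policy of showing only the last four characters of a sensitive value to unauthorized viewers; the record layout and values are invented for illustration.

```python
def mask(value: str, visible: int = 4, fill: str = "*") -> str:
    """Mask all but the last `visible` characters of a sensitive field,
    so unauthorized viewers never see the value in its entirety."""
    hidden = max(0, len(value) - visible)
    return fill * hidden + value[hidden:]

# A record as stored versus as shown to an unauthorized user.
record = {"name": "A. Customer", "card": "4111111111111111"}
masked = {**record, "card": mask(record["card"])}
```

Masking complements encryption: encryption protects data at rest, while masking limits what even a legitimate screen or log ever displays.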

4. CLOUD SECURITY CONTROLS:


Cloud security architecture is effective only if the correct defensive implementations are in place.
An efficient cloud security architecture should recognize the issues that will arise with security management,
which addresses these issues with security controls. These security controls usually
fall into one of the following categories: [8]
Deterrent controls
These controls are intended to reduce attacks on a cloud system. Much like a warning sign on a fence or a
property, deterrent controls typically reduce the threat level by informing potential attackers that there will
be adverse consequences for them if they proceed. (Some consider them a subset of preventive controls.)
Preventive controls

Preventive controls strengthen the system against incidents, generally by reducing if not actually eliminating
vulnerabilities. Strong authentication of cloud users, for instance, makes it less likely that unauthorized users
can access cloud systems, and more likely that cloud users are positively identified.
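Strong authentication as a preventive control can be sketched with salted password hashing, so that even a leaked credential store does not reveal the passwords themselves. The iteration count below is an illustrative choice, not a mandated value.

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted PBKDF2 hash; store (salt, digest), never the password."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
```

In a cloud setting this would sit behind the provider's login endpoint, typically combined with a second factor.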
Detective controls
Detective controls are intended to detect and react appropriately to any incidents that occur. In the event of
an attack, a detective control will signal the preventative or corrective controls to address the issue. System
and network security monitoring, including intrusion detection and prevention arrangements, are typically
employed to detect attacks on cloud systems and the supporting communications infrastructure.
Corrective controls
Corrective controls reduce the consequences of an incident, normally by limiting the damage. They come
into effect during or after an incident. Restoring a system from backup in order to rebuild a compromised system
is an example of a corrective control.

5. CONCLUSION:
Cloud computing has reached a maturity that leads it into a productive phase. This means that most of the
main issues with cloud computing have been addressed to a degree that makes clouds interesting for
full commercial exploitation. The major cloud technology developers continue to invest billions a year in
cloud R&D. More industries are turning to cloud technology as an efficient way to improve service quality,
given its ability to reduce overhead costs and downtime and to automate infrastructure deployment.

REFERENCES:
1. "The NIST Definition of Cloud Computing" (PDF). National Institute of Standards and Technology.
2. "The economy is flat so why are financials Cloud vendors growing at more than 90 percent per
annum?". FSN. March 5, 2013.
3. "Jeff Bezos' Risky Bet". Business Week.
4. "A Self-adaptive Hierarchical Monitoring Mechanism for Clouds". Elsevier.com.
5. King, Rachael (2008-08-04). "Cloud Computing: Small Companies Take Flight". Bloomberg Business
Week.
6. "Swamp Computing a.k.a. Cloud Computing". Web Security Journal. 2009-12-28.
7. Winkler, Vic. "Cloud Computing: Virtual Cloud Security Concerns". TechNet Magazine, Microsoft.
8. Hickey, Kathleen. "Dark Cloud: Study finds security risks in virtualization". Government Security News.
9. Winkler, Vic (2011). Securing the Cloud: Cloud Computer Security Techniques and Tactics. Waltham,
MA USA: Elsevier. p. 59. ISBN 978-1-59749-592-9.

U. Tejaswi [CSI: 01320007] is a 3rd-year student of the Computer Engineering Department, studying at Andhra University College of Engineering for Women.

STREAM ME IN CLOUD
Compiled by:
Ms. Shruthi V. and Tharak K.

ABSTRACT
In this paper we present cloud mobile media, a cloud-centric multimedia streaming system. We
aim to provide a streaming service for users' own multimedia content on smart-phones at any time and
anywhere. Cloud mobile media provides management and streaming services for users' multimedia content
using local storage in the user's personal NAS, the user's cloud storage, a cloud-centric media network, and media
analytics. It spans cloud resource management and control at the infrastructure-as-a-service level and novel cloud-based
systems and applications at the software-as-a-service level. We investigate the feasibility of cloud mobile media to
provide a mobile cloud service model based on the cloud-centric media network and media analytics. A
comparison was performed using the HTTP and FTP internet protocols over 3G and Wi-Fi
network interfaces. Our results show that cloud mobile media provides smart-phones with much
functionality and saves smart-phone energy.

1. INTRODUCTION

Smartphones are becoming increasingly popular because of their capabilities and functionalities. Their small
size and light weight make them very easy to carry, and they provide useful services as they run PC-like
applications. Multimedia applications such as image viewing and video playback are very resource intensive in terms
of processing and data rates. Consequently, they consume much energy and drain the Smartphone battery very
quickly. Smartphones have unique constraints such as limited battery energy, processing power, and memory capacity.
These limitations can be eased in the era of cloud computing by offloading heavy tasks to the cloud. We
address multimedia cloud computing, which shows how a cloud can perform distributed multimedia
processing and storage and provide quality-of-service provisioning for multimedia services. In particular, a
heavy energy-consuming application is offloaded to the cloud for Smartphone energy saving.
Offloading files to the cloud reduces the memory and energy demands on the Smartphone, thus
increasing its performance. The data is stored in the cloud, so the user can store or
retrieve data without using the mobile storage. The data is stored in a secure way and is protected from
other users. The user can retrieve data even if the internal or external memory gets crashed. Files are
viewed using a date filter: the user enters a date, and the files uploaded on that particular
date are displayed. The energy cost of multimedia applications on Smartphones connected to
the multimedia cloud is evaluated.
Cloud computing is the use of computing resources (hardware and software) that are delivered as a service
over a network (typically the Internet). The name comes from the use of a cloud-shaped symbol as an
abstraction for the complex infrastructure it contains in system diagrams. Cloud computing entrusts remote
services with a user's data, software and computation. Cloud storage is a model of networked online
storage where data is stored in virtualized pools of storage which are generally hosted by third
parties. Hosting companies operate large data centers, and people who require their data to be hosted buy
or lease storage capacity from them. The data center operators, in the background, virtualize the resources
according to the requirements of the customer and expose them as storage pools, which the customers can
themselves use to store files or data objects. Physically, the resource may span across multiple servers. The
safety of the files depends upon the hosting websites.
It is a model for enabling convenient, on-demand network access to a shared pool of configurable computing
resources that can be rapidly provisioned and released with minimal management effort or service provider
interaction. Location-based applications have become increasingly popular on Smartphone over the past
years. The active use of these applications can however cause device battery drain owing to their power
intensive location-sensing operations. The design principles of the framework involve substitution,
suppression, piggybacking, and adaptation of applications' location-sensing requests to conserve energy. The
design principles are implemented on Android-based Smartphone as a middleware. The evaluation results
show that the design principles reduce the usage of the power-intensive GPS (Global Positioning System).

2. RELATED WORK
2.1 WORLD MOBILE APPLICATIONS MARKET - ADVANCED TECHNOLOGIES
The healthcare industry has begun to utilize web-based systems and cloud computing infrastructure to
develop an increasing array of online personal health record (PHR) systems. Although these systems provide
the technical capacity to store and retrieve medical data in various multimedia formats, including images,
videos, voice, and text, individual patient use remains limited by the lack of intuitive data representation and
visualization techniques.
As such, further research is necessary to better visualize and present these records, in ways that make the
complex medical data more intuitive. In this study, we present a web-based PHR visualization system, called
the 3D medical graphical avatar (MGA), which was designed to explore web-based delivery of a wide array of
medical data types including multi-dimensional medical images; medical videos; text-based data; and spatial
annotations. Mapping information was extracted from each of the data types and was used to embed spatial
and textual annotations, such as regions of interest (ROIs) and time-based video annotations. Our MGA itself
is built from clinical patient imaging studies, when available.

2.2 DEVELOPMENT PLATFORMS FOR MOBILE APPLICATIONS: STATUS AND TRENDS


The rapid proliferation of mobile computing technology has massive potential for providing access to
different services at any time and from anywhere. The mobile telephone is for more than just making calls: it
allows accessing several applications and services via an internet connection or through stand-alone
applications. The mobile telephone has a considerable effect on tourism by allowing the user to access
content from the Internet or from an installed application on the mobile device. Existing tourist guide
applications use the latest technologies to enhance application quality by satisfying users'
requirements. These applications encounter great challenges because of limited mobile resources. Several
development platforms for mobile applications are used to design tourist guide applications, owing to
mobile device incompatibility.

3. PROBLEM STATEMENT
In the existing system we can store multimedia files on internal or external storage. The user can
view the content only if the storage is available. If the internal or external storage gets crashed or erased, we can't
retrieve the data. The user can use only a limited amount of storage on the mobile phone.

4. PROBLEM IDENTIFICATION
The user cannot retrieve data if the internal or external memory gets erased or crashed. There is no
security for the user's data, and anybody can view it. If the internal or external memory is exhausted, the user may
not be able to save more data, and storing large amounts of data on the smart phone reduces its
performance and battery life.

5. PROPOSED MECHANISM
In the proposed system we use a cloud mobile multimedia streaming system to store data in the cloud. Here we use
cloud storage and a Cloud-Centric Media Network for communication from mobile to cloud. The user
can open the data anywhere using the cloud, store and retrieve data without using mobile
storage, and search for files on cloud storage using a date filter.

5.1 ADVANTAGES
Users can upload their personal files to the cloud and view them. The user can access
multimedia files from anywhere because they are located in cloud storage. Users can view the files
using our application itself. It provides security for personal multimedia files, so others cannot view
them. User data cannot be lost or erased.

6. SYSTEM ARCHITECTURE

The architectural diagram (Figure 1) depicts the MCC (Mobile Cloud Computing) project. It consists of three
sections: the Smartphone, the Internet interface, and the Cloud. These three constitute the world of
MCC, and each section is critical for the transfer of files. File transfer is done through a
technique called OFFLOADING, which involves transferring heavy tasks to a remote
server, commonly called the cloud, and thereby reducing the power wasted on the client. The multimedia cloud
is referred to as the server, and handheld devices like Smartphones are considered clients. These
two act as the base for the transfer of files.
The Smartphone holds multimedia files such as text, images, audio, and video. The Smartphone
we use for our project is the HTC NEXUS ONE. The main drawback of storing these files on the
Smartphone is the energy consumption, which can be overcome by the offloading technique mentioned
above. The offloaded files are stored in a centralized database in the cloud. The user first has to
go through an authentication process, entering a username and password; only a user whose request
has been granted after providing these details is recognized as authentic.
This is the first step involved in MCC.

The second step, after a successful login, is the uploading of files. This file transfer is done
using two protocols, FTP (File Transfer Protocol) and HTTP (Hyper Text Transfer Protocol), which
play an important role in transferring multimedia files. Of these, FTP is considered
efficient since it involves faster data transfer, while HTTP is the standard protocol used in file-transfer
environments. To provide more security, HTTPS can be used.
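The HTTP leg of this transfer can be sketched with Python's standard library. The endpoint and file name are stand-ins, and a tiny in-process server plays the role of the cloud so the example is self-contained; an FTP variant would use `ftplib.FTP.storbinary` analogously.

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib import request

def http_upload(url, data):
    """Upload raw bytes with an HTTP PUT and return the status code."""
    req = request.Request(url, data=data, method="PUT")
    with request.urlopen(req) as resp:
        return resp.status

class UploadHandler(BaseHTTPRequestHandler):
    """Minimal in-process receiver standing in for the cloud endpoint."""
    received = {}

    def do_PUT(self):
        length = int(self.headers["Content-Length"])
        UploadHandler.received[self.path] = self.rfile.read(length)
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # keep the demo output quiet
        pass

# Start the stand-in server on a free local port, upload, then shut down.
server = HTTPServer(("127.0.0.1", 0), UploadHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = "http://127.0.0.1:%d/video.mp4" % server.server_port
status = http_upload(url, b"fake multimedia payload")
server.shutdown()
```

Switching the URL scheme to `https://` (against a TLS-enabled endpoint) gives the more secure variant mentioned above.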

Figure 1: system architecture


Once the file to be uploaded is chosen, the internet access network responsible for
transferring files from one end to the other has to be chosen. The access network can be either a wired or a
wireless network. In our project we use a wireless network, specifically the Wi-Fi and 3G interfaces;
the diagram shows these network interfaces connecting the Smartphone and the cloud server.
Wi-Fi is considered more efficient than 3G because of its higher data rates, but some people
prefer 3G since Wi-Fi covers only short distances.
The last step is the storage of files in the cloud, where different multimedia files are kept. The current
tag location is shown for text using GPS (Global Positioning System), an advanced feature
which increases the interactivity and user friendliness between the user and the device. Multimedia files
in the cloud are viewed using a date filter: the user specifies a date, and only the files
uploaded on that date are retrieved. The multimedia functionalities interact, partially or fully,
with the corresponding MCC services.
MODULE DESCRIPTION
1. Linking for Cloud Storage
2. Uploading Multimedia Files in Cloud
3. Searching the File Using a Date Filter
4. Viewing Files on Cloud

6.1.1 LINKING FOR CLOUD STORAGE


Cloud storage is used to store, access, and manage files online. To upload multimedia files to cloud
storage, create an account with a cloud storage provider. Different types of cloud storage are available; here we
use Dropbox to store the multimedia files. Create an account on Dropbox and link
it with our application to sign in.

6.1.2 UPLOADING FILES
The main aim of our project is to upload multimedia files to cloud storage using the offloading method.
Offloading is mainly used to upload data to a server or cloud over 3G or Wi-Fi. It is an
effective method for extending the lifetime of handheld mobile devices by executing some components of
applications remotely (e.g., on a server in a data center or in a cloud). For end users, the purpose of
mobile data offloading is data service cost control and the availability of higher bandwidth.
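The offloading trade-off can be sketched with a simplified energy model in the spirit of Kumar and Lu's analysis: offloading pays off when the energy of computing locally exceeds the energy spent idling while the cloud works plus the energy to ship the data. All parameter names and numbers below are illustrative assumptions, not measurements.

```python
def energy_saved_by_offloading(instructions, data_bytes,
                               mobile_speed, cloud_speedup,
                               bandwidth, p_compute, p_idle, p_transfer):
    """Simplified offloading trade-off (joules).
    Positive result -> offloading saves energy on the handset."""
    e_local = (instructions / mobile_speed) * p_compute          # compute here
    e_idle = (instructions / (cloud_speedup * mobile_speed)) * p_idle  # wait
    e_tx = (data_bytes / bandwidth) * p_transfer                 # ship data
    return e_local - e_idle - e_tx

# Illustrative case: a heavy task with relatively little data to move.
saving = energy_saved_by_offloading(
    instructions=4e9, data_bytes=1e6,
    mobile_speed=1e9, cloud_speedup=10,
    bandwidth=1e6, p_compute=0.9, p_idle=0.3, p_transfer=1.3)
```

The model makes the intuition concrete: compute-heavy, data-light tasks favor offloading, while data-heavy tasks may cost more to transmit than to run locally.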

6.1.3 SEARCHING FILES


Usually there are thousands of files stored on a device, and looking for something in such a large
collection manually is very hard. A better, more efficient way to look for a file is to use a
filter tool.
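The date filter described above amounts to selecting the files whose upload date matches the requested one. A minimal sketch, with invented file names and dates standing in for the cloud store's index:

```python
from datetime import date

# (filename, upload date) pairs, as the cloud store might index them.
files = [
    ("trip.mp4", date(2015, 10, 2)),
    ("song.mp3", date(2015, 10, 2)),
    ("notes.txt", date(2015, 11, 7)),
]

def filter_by_date(entries, wanted):
    """Return only the files uploaded on the given date."""
    return [name for name, uploaded in entries if uploaded == wanted]

matches = filter_by_date(files, date(2015, 10, 2))
```

In a real system the same predicate would be pushed down to the storage service's query API rather than evaluated client-side.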

6.1.4 VIEW FILES ON CLOUD


Using Dropbox Cloud Connect, you can edit a document online. Every time you sync a document, its
revisions are stored, so you can easily roll back to any prior version at any time. Cloud Files
allows users to store and retrieve files via a simple API key. Users can have virtually unlimited
file sizes in Cloud Files by using Large File Support, which allows customers to upload as many
files as they wish. When searching for files or folders, setting up the proper filters plays a
vital role in quickly locating the required file or document. The date filter lets you apply
separate dates for files and sub-folders, as well as add specific extensions to exclude from the search.

7. CONCLUSION

The problem of running multimedia applications on Smartphones is addressed, and the benefits of using
the MCC framework in this regard are investigated. MCC appears promising for filling the gap between
Smartphone performance limitations and user expectations through the Energy-as-a-Service (EaaS) model.
Experiments were conducted to evaluate the benefit of MCC in overcoming Smartphone constraints. The
results reveal the potential of MCC to reduce Smartphone energy consumption for multimedia
applications, which clearly indicates that offloading multimedia applications from the Smartphone to the MCC is
beneficial. MCC significantly reduces energy consumption on the Smartphone through the EaaS service.
The limitation is that a network connection is needed to upload files into the cloud; otherwise, files cannot
be uploaded. The time taken to upload a file over a 3G network is greater than over Wi-Fi.

7.1 FUTURE ENHANCEMENT


More experiments are to be conducted to generalize our findings to other workloads, such as games.
Optimal algorithms, architectures, and implementations of the offloading technique are needed to approach the
best offloading case. This study opens up new opportunities for investigation. Finally, modeling the MCC to
handle offloading is important for efficient implementation.

REFERENCES
[1] W. Zhu, C. Luo, J. Wang, and S. Li, "Multimedia Cloud Computing", IEEE Signal Processing Magazine, vol. 28, no. 3, pp. 59-69, 2011.
[2] Zhiyuan Li, Cheng Wang, Rong Xu, "Computation Offloading to Save Energy on Handheld Devices: A Partition Scheme", in Proc. ACM Int. Conf. CASES, Nov. 2001.
[3] Ripal Nathuji, Canturk Isci, Eugene Gorbatov, "Exploiting Platform Heterogeneity for Power Efficient Data Centers".
[4] G. Chen, B. Kang, M. Kandemir, N. Vijaykrishnan, M. J. Irwin, R. Chandramouli, "Energy-Aware Compilation and Execution in Java-Enabled Mobile Devices".
[5] Han Qi, Abdullah Gani, "Research on Mobile Cloud Computing: Review, Trend and Perspectives".
[6] Lin Zhong, Bin Wei, Michael J. Sinclair, "SMERT: Energy-Efficient Design of a Multimedia Messaging System for Mobile Devices".
[7] Karthik Kumar and Yung-Hsiang Lu, "Cloud Computing for Mobile Users: Can Offloading Computation Save Energy?", IEEE Computer Society, April 2010.
[8] H. Falaki, D. Lymberopoulos, R. Mahajan, S. Kandula and D. Estrin, "A First Look at Traffic on Smartphones", in Proceedings of the 10th Annual Conference on Internet Measurement, 2010, pp. 281-287.
[9] J. F. M. Bernal, L. Ardito, M. Morisio and P. Falcarin, "Towards an Efficient Context-Aware System: Problems and Suggestions to Reduce Energy Consumption in Mobile Devices", in Proc. Ninth Int. Conf. on Mobile Business and 2010 Ninth Global Mobility Roundtable (ICMB-GMR), 2010, pp. 510-514.
[10] K. Naik, "A Survey of Software Based Energy Saving Methodologies for Handheld Wireless Communication Devices", Dept. of ECE, University of Waterloo, Waterloo, ON, Canada, Tech. Rep. 2010-13, 2010.

Ms. Shruthi V. [CSI-01316783] is studying in the III year of B.Tech (IT) at Valliammai Engineering College, Kancheepuram, Tamilnadu. Her areas of interest are programming, cloud computing, big data, etc. She can be reached at ragavanachu0000@gmail.com.
Tharak K. is studying in the III year of B.Tech (IT) at Valliammai Engineering College, Kancheepuram, Tamilnadu. Her areas of interest are programming, cloud computing, big data, etc. She can be reached at tharak14196@gmail.com.


GOOGLE CLOUD PLATFORM


Compiled by:
Koneti Yamini
Generally, when we talk about cloud computing we refer to taking applications and running
them on infrastructure other than our own. This is the general idea of cloud computing that
usually everyone has. Many companies have implemented cloud computing, built their own
platforms, and achieved success and development in their projects, as cloud computing has made
work easy for people in the modern era of technology.
"We believe we're moving out of the ice age, the iron age, the industrial age, the information age, to
the participation age. You get on the net and you do stuff. You IM (instant message), you blog, you
take pictures, you publish, you podcast, you transact, you distance learn, you telemedicine. You are
participating on the Internet, not just viewing stuff. We build the infrastructure that goes in the data
center that facilitates the participation age. We build that big frigging webtone switch. It has security,
directory, identity, privacy, storage, compute, the whole web services stack." - Scott McNealy

1. CLOUD COMPUTING
Cloud Computing is the model for enabling universal, convenient, on-demand access to a shared pool of
configurable computing resources. Cloud computing has given storage solutions which provide the users and
enterprises with various capabilities to store and process their data in third-party data centres. It relies on
sharing of resources to achieve consistency and economies of scale, similar to a utility over a network.
Cloud computing is budding into a new way of computing. A large amount of information is retained in one
memory, the cloud, and accessing it is done effortlessly. By moving data to the cloud, one can
free the team from pressure and let them focus on development rather than maintaining servers.
This results in saving the high costs of on-premises infrastructure.
Cloud computing has become far more secure than traditional computing, because many companies retain
high-quality cyber-security personnel. Security issues are always present everywhere, though we can
try to overcome these situations and move on from traditional computing.

2. WHY CLOUD COMPUTING


Cloud computing has benefits of its own that support modern technology, whether one is storing data, recovering lost data, or simply accessing data from the cloud. The main benefits are the following:

FLEXIBILITY
When a company needs more bandwidth than usual, a cloud-based service can instantly meet the demand because of the vast capacity of the service's remote servers. This ability to quickly meet business demands is an important reason to move to cloud computing.

CSI Adhyayan [Oct.-Dec. 2015]

Page 50

www.csi-india.org
DISASTER RECOVERY
When information is stored on a single system and that system fails, data may be lost or leaked. When companies rely on cloud-based services instead, they no longer need to maintain complex disaster recovery plans: cloud providers take care of most of what is required to recover data quickly. Recovery is reported to be nearly four times faster than for businesses that don't use the cloud, with mid-sized businesses showing the best recovery times of all, taking almost half the time of larger companies.

AUTOMATIC SOFTWARE UPDATES


Software must be updated to keep pace with the progress of applications, but worrying about updates can disturb the concentration one needs for coding. Cloud computing suppliers therefore handle server maintenance, including security updates, freeing up their customers' time and resources for other tasks.

CAP-EX FREE
Cloud computing services are typically pay-as-you-go, so there is no need for capital expenditure at all. And because cloud services are much faster to deploy, businesses enjoy minimal project start-up costs and predictable ongoing operating expenses.
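The pay-as-you-go point can be made concrete with a little arithmetic. All figures below are hypothetical, a sketch of how upfront capital expenditure compares with usage-based billing for a lightly loaded project:

```python
# Toy comparison of upfront capex vs. pay-as-you-go opex.
# All prices and usage figures are hypothetical, for illustration only.

def on_premises_cost(servers, price_per_server, monthly_upkeep, months):
    """Upfront hardware purchase plus ongoing maintenance."""
    return servers * price_per_server + monthly_upkeep * months

def pay_as_you_go_cost(hours_used, rate_per_hour):
    """Cloud billing: pay only for the hours actually consumed."""
    return hours_used * rate_per_hour

# A small project needing capacity only ~200 hours/month for a year:
capex = on_premises_cost(servers=4, price_per_server=3000,
                         monthly_upkeep=400, months=12)
opex = pay_as_you_go_cost(hours_used=200 * 12, rate_per_hour=0.50)

print(capex)  # 16800
print(opex)   # 1200.0
```

The exact break-even depends on utilization: a server that is busy around the clock can favour ownership, which is why the comparison matters most for bursty or unpredictable workloads.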

INCREASED COLLABORATION
Cloud computing increases collaboration by allowing all employees, wherever they are, to sync up and work on documents and shared apps simultaneously, and to follow colleagues and records to receive critical updates in real time.

WORK FROM ANYWHERE


If employees have internet access, they can work from anywhere. This flexibility positively affects knowledge
workers' work-life balance and productivity.

DOCUMENT CONTROL
If a company doesn't use the cloud, workers have to send files back and forth over email, meaning only one person can work on a file at a time and the same document ends up under countless names and formats.
Cloud computing keeps all files in one central location, and everyone works from one central copy. Employees can even chat with each other while making changes together. This makes collaboration stronger, which increases efficiency and improves a company's bottom line.

SECURITY
When everything is stored in the cloud, data can still be accessed no matter what happens to a machine.

COMPETITIVENESS
The cloud grants SMEs (Small and Medium-sized Enterprises) access to enterprise-class technology. It also
allows smaller businesses to act faster than big, established competitors.

ENVIRONMENTALLY FRIENDLY
Businesses using cloud computing only use the server space they need, which decreases their carbon footprint. Using the cloud results in at least 30% less energy consumption and carbon emissions than using on-site servers. Again, SMEs benefit the most: for small companies, the cut in energy use and carbon emissions is likely to be around 90%.

GOOGLE CLOUD COMPUTING


One of Google's core missions is to organize the world's information and make it universally accessible and useful. Whether creating the next hit game, storing massive amounts of data, or running critical business applications, teams are under constant pressure to do more with small budgets and few employees. With Cloud Platform, one can save dramatically over the high costs of on-premises infrastructure and allow teams to focus on developing apps, not maintaining servers.
Google Cloud Platform is a cloud computing platform by Google that offers hosting on the same supporting infrastructure that Google uses internally for end-user products like Google Search and YouTube. Cloud Platform provides developer products to build a range of programs from simple websites to complex applications.
Google Cloud Platform is part of a suite of enterprise solutions from Google for Work and provides a set of modular cloud-based services with a host of development tools, for example hosting and computing, cloud storage, data storage, translation APIs and prediction APIs.

WHY GOOGLE CLOUD PLATFORM


Google Cloud Platform is a set of services that enables developers to build, test and deploy applications on Google's reliable infrastructure. Choose from computing, storage and application services for your web, mobile and backend solutions.
Run On Google's Infrastructure
Global Network
Google has one of the largest and most advanced computer networks. Google's backbone network has thousands of miles of fibre-optic cable, uses advanced software-defined networking and has edge-caching services to deliver fast, consistent and scalable performance.
Redundancy
Multiple points of presence across the globe provide strong redundancy. Data is automatically mirrored
across storage devices in multiple locations.

Cutting-edge Computer Science
Infrastructure innovation isn't just about hardware. Google has led the industry with innovations in software infrastructure such as MapReduce, BigTable and Dremel.
Focus On The Product
Managed Services
Google takes care of database administration, server configuration, sharding and load balancing while users focus on their code. No need to carry a pager or write boilerplate code.
Developer Tools and SDKs
Google integrates with familiar development tools like Eclipse and provides API client libraries and a command-line interface, making it easy to build the way users want.
Console and Administration
See and manage all of your applications from a single console. View the performance of your applications and manage your account and billing with a simple interface.
Mix And Match Services
Compute
Cloud Platform offers both a fully managed platform and flexible virtual machines, allowing users to choose a system that meets their needs. Use App Engine, Google's Platform-as-a-Service, when you just want to focus on code and not worry about patching or maintenance. Get access to raw virtual machines with Compute Engine and have the flexibility to build anything you need.
Storage
Google Cloud Platform provides a range of storage services that give users easy and quick access to their data. With Cloud SQL and Datastore, users get MySQL or schemaless NoSQL databases, while Cloud Storage provides flexible object storage with global edge caching.
Services
Use Google APIs and services to quickly enable a wide range of functionality for an application. Users don't need to build these from scratch; they benefit from easy integration within Cloud Platform.
Scale To Millions Of Users
Scale-up
Cloud Platform is designed to scale like Google's own products, even when an application experiences a huge traffic spike. Managed services such as App Engine or Cloud Datastore provide auto-scaling that enables an application to grow with its users.
Scale-down
Just as Cloud Platform allows you to scale up, its managed services also allow you to scale down. There is no need to pay for computing resources that are not required.
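The scale-up/scale-down behaviour described above can be sketched as a tiny simulation. The per-instance capacity and the bounds below are hypothetical, not actual Cloud Platform parameters:

```python
# Toy auto-scaler: pick an instance count from the current request rate.
# REQUESTS_PER_INSTANCE and the min/max bounds are hypothetical figures.
import math

REQUESTS_PER_INSTANCE = 1000  # assumed capacity of one instance
MIN_INSTANCES, MAX_INSTANCES = 1, 20

def instances_needed(requests_per_second: int) -> int:
    """Scale up for spikes, back down in quiet periods, within bounds."""
    wanted = math.ceil(requests_per_second / REQUESTS_PER_INSTANCE)
    return max(MIN_INSTANCES, min(MAX_INSTANCES, wanted))

print(instances_needed(250))    # 1  (quiet period: scaled down to the floor)
print(instances_needed(7500))   # 8  (traffic spike: scaled up)
print(instances_needed(50000))  # 20 (capped at the configured maximum)
```

A managed platform runs a loop like this for you, measuring load and adjusting capacity, which is precisely why there is no need to pay for idle resources.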

Performance You Can Count On
CPU, memory, disk
Google Cloud Platform provides fast and consistent performance across its range of computing, storage and application services. With powerful processing, fast memory access and high IOPS (Input/Output Operations Per Second), applications deliver consistent performance to their users. One can enjoy the benefits of reduced latency and avoid noisy-neighbour problems.
Global network
Google uses software-defined networking technology to route packets across the globe and enable fast
edge-caching so that the data is where it needs to be to serve the users. When every millisecond of latency
counts, Google makes sure that the content is delivered quickly.
Transparent maintenance
Virtual machines never go down for scheduled maintenance thanks to built-in live-migration technology. Users have peace of mind knowing that hosts are patched and data centres are maintained without the headache of downtime.
Get The Support You Need
Free community-based support
All customers have access to free community-based support, including resources, training content and documentation. Google uses first- and third-party resources to actively monitor and answer questions on Stack Overflow, which helps build a highly curated, reliable source of public support information.
24x7 Phone Support
Trained experts are available 24x7 over the phone, in English or Japanese, with the comprehensive 24x7 Gold Support package. Google maintains rapid response times to make sure issues are addressed with the highest priority. They also offer consultation on application development, best practices and architecture reviews for specific use cases. For the most mission-critical applications, secure direct access to their Technical Account Management team is available for an even higher level of service.
GOOGLE CLOUD PLATFORM SERVICES
The Google Cloud Platform is composed of a family of products, each including a web interface, a command-line tool and a REST API.

Figure 1: Google Cloud Platform Services

COMPUTE:
Google App Engine is a Platform as a Service for sandboxed web applications. App Engine offers automatic
scaling with resources increased automatically to handle server load. It was released as a preview in April
2008.
Google Compute Engine is the Infrastructure as a Service component of the Google Cloud Platform that
enables users to launch virtual machines (VMs) on demand. It was released in December 2013.
Google Container Engine makes it easy to run Docker containers on Google Cloud Platform.
STORAGE:
Google Cloud Storage is an online storage service for files. It was launched in May 2010.
Google Cloud Bigtable is a fast, fully managed, massively scalable NoSQL database service that's ideal for
web, mobile, and Internet of Things applications requiring terabytes to petabytes of data.
Google Cloud Datastore is a fully managed, highly available NoSQL data storage for non-relational data that
includes a REST API.
Google Cloud SQL is a fully managed MySQL database that lives in the Google Cloud infrastructure. It was
released in February 2014.
Google Load Balancing balances traffic between Compute Engine instances, using either HTTP or Network (TCP/UDP) load balancing.
Google Interconnect connects your network directly to Google's, or via your carrier or a VPN.
APP SERVICES:
Google Cloud DNS is a DNS service hosted in the Google Cloud infrastructure.
Google BigQuery is a data analysis tool that uses SQL-like queries to process big datasets in seconds.
Google Dataflow enables reliable execution for large scale data processing scenarios such as Extract,
Transform, Load (ETL), analytics, real-time computation, and process orchestration.
Google Cloud Pub/Sub connects to the services with reliable, many-to-many, asynchronous messaging
hosted on Google's infrastructure.
Google Cloud Endpoints is a tool to create services inside App Engine that can be easily connected from
iOS, Android and JavaScript clients.
Google Translate API is a way to create multilingual apps and translate text into other languages
programmatically. Thousands of language pairs are available.
Google Prediction API uses Google's machine learning algorithms to analyse data and predict future outcomes using a familiar REST interface.
Google Cloud Monitoring gives insight into the performance and availability of cloud-powered applications.
Google Cloud Logging manages all log data for Compute Engine and App Engine, so users can investigate and debug system issues, gain operational and business insights, and meet security and compliance needs.

Google Cloud Deployment Manager allows developers to easily design, share, deploy and manage complex
Cloud Platform solutions using simple, declarative templates.

HOW GOOGLE CLOUD PLATFORM WORKS


"Cloud is about how you do computing, not where you do computing." - Paul Maritz
Google has always found success through innovative ideas. To bring change to the world, the change should start from us. This statement is well proven by the Google Cloud Platform: cloud computing is now widely known, and much software has been developed over the years through the effort of programmers. Google has advanced in the world of cloud computing, which is becoming the universal storage space of future generations.
Google Cloud Platform is a set of modular cloud-based services that provides the building blocks to quickly develop anything from simple websites to complex applications.
Operational Approach of Google Cloud Computing:
Market innovative projects faster
While infrastructure is the foundation of apps, maintaining it won't differentiate the business, bring in new revenue, or delight users. Limited engineering resources are generally better spent on projects that grow the business and deliver value to the organization.
Spend on business growth, not server hardware
Building on-premises infrastructure requires heavy upfront capital expenditure: server hardware is expensive to purchase, install, configure and maintain. With Cloud Platform, you pay as you go and only for the resources you use; there is no need for upfront investment.
Lower on-going server and labour costs
Unlike on-premises infrastructure, cloud pricing is based only on the resources you use, and those resources are decreasing in cost over time. With Cloud Platform, applications run on the same infrastructure that powers Google; the infrastructure is managed for you, and you save on the labour cost of maintaining servers.
Scalable capacity reduces risk of launching new apps
Predicting the usage of your applications can be difficult. In the on-premises world, if you buy too many servers you end up with expensive unused capacity; if you buy too few, customers have a bad experience with the application. With Cloud Platform, developers can easily scale up capacity to meet demand, and scale down during quiet times to save cost.
Google Cloud Platform has reached new heights through its growth, becoming a successful platform whose applications are easy and simple to understand while meeting customers' needs.
In July 2012 Google created the Google Cloud Platform Partner Program. Google later announced its biggest price drop, affecting all products by between 30% and 85%. In March 2014 Google announced Managed
Virtual Machines, a new feature that overcomes the traditional limitations of Google App Engine. These innovative moves helped Google reach new heights and break previous records.

CONCLUSION
Cloud computing is changing the way IT departments buy IT. Businesses have a variety of paths to the cloud, including infrastructure, platforms and applications available from cloud providers as online services. Cloud solutions are simple to acquire, don't require long-term contracts and are easier to scale up and down as needed. Proper planning and migration services are needed to ensure a successful implementation, and security, compliance and monitoring are achievable with careful planning and analysis.
Google has introduced its version of cloud computing, which is used worldwide and has set its own benchmark in the world of computing. Apart from Google, many companies are maturing in the world of cloud computing, such as Amazon, Azure, IBM, Force.com and Rollbase. Anyone can choose software from these providers according to their needs and have those needs met by the cloud computing landscape, without compromising on storage.

REFERENCES
1. Google Cloud Platform - Wikipedia: https://en.wikipedia.org/wiki/Google_Cloud_Platform
2. Google Cloud Platform Developers Portal: https://cloud.google.com/developers
3. Google Developers Global Portal: https://developers.google.com
4. Google Cloud Platform Products list: https://cloud.google.com/products/compute-engine/
5. Understanding Google APIs: https://fethidilmi.blogspot.in/2013/01/understanding-google-apis.html

Miss Koneti Yamini [CSI-01320029] is studying in III year of B.Tech (CSE) at Andhra
University College of Engineering for Women, Visakhapatnam (Andhra Pradesh). Her areas of interest are
Web Development, Data Structures, Algorithms, programming. She can be reached at
yamini.koneti6@gmail.com.


GRID COMPUTING
Compiled by:
S. Varshini

1. INTRODUCTION
Today's world has seen major advancements in the field of technology. Computers have made our lives easier in more ways than one, so optimising them has always been at the forefront of technological development. Everyone wants faster access to the internet, higher speeds and rapid computing, and this is where Grid technology comes into play. Designed to run complex analyses and larger programs and to provide a flexible computing infrastructure, Grid technology has become indispensable to organizations and individuals.

2. WHAT IS GRID COMPUTING?


Grid computing is a collection of computer resources from various systems organised as a grid, which is used as a common resource pool. It facilitates the sharing of resources among individuals and organizations by acting as a centralized control point for the distributed resources. For instance, doctors may be able to access terabytes of data on rare diseases and their cures from the grid, data which would not otherwise have been available to them. Grid computing provides a transparent mode of accessing a wide range of resources. The Search for Extraterrestrial Intelligence (SETI) project is one of the earliest grid computing systems.
A grid may be a local one, confined to a specific organization for its own use and entrusted to a few; this is known as an Intragrid. Another type is a consortium of resources from various partner organizations, known as an Extragrid; this can be connected through a VPN (Virtual Private Network) and is available to more people. The third type is the Intergrid, wherein multiple extragrid clusters are integrated. The distinction between the types of grid is based on configuration details such as security, the scope of policies, and obligations to users of the infrastructure.

[Figure: Types of grids - an Intragrid within a single organization; an Extragrid federating partner grids (Grid1, Grid2, other grids); and an Intergrid integrating multiple grid clusters.]

STRUCTURE OF A GRID:

[Figure: Structure of a grid - Systems #1 to #6 connected through a central connecting node running the middleware.]

3. WHY IS GRID COMPUTING NECESSARY?

In many organizations there is a surplus of underutilized computing resources, whether desktop machines or the servers themselves. The Grid makes use of idle CPU, storage or any other component of a system whose potential can be tapped for useful work. It provides a framework to exploit these resources, for example collecting unused storage into a virtual data store with better performance and reliability than a single system can offer.
The Grid also reduces the time taken to obtain required output through faster computing. It does this through parallel processing: a problem is distributed into parts that run simultaneously, each part executed separately, to reduce total time. The same approach can schedule a large number of tasks by using the processor's idle time, i.e. its unused cycles.
To sum up, the grid provides a dependable way to balance load across a wide range of resources.

4. FEATURES

The computing process in the grid promotes human-to-human interaction rather than the usual human-to-machine interaction: the virtual space where data and applications are shared is modified and accessed by humans, while the grid acts as an interface that facilitates easy access.
One of the main features is heterogeneity: the grid may contain varied data from various geographical locations and domains. The grid is also adaptable and fault tolerant; errors and hardware or software faults are dealt with by the resource manager without impacting the data. It is scalable, i.e. it can accommodate a large number of systems or nodes in its network.

5. HOW DOES GRID COMPUTING WORK?

There are three main components in a Grid.

SERVER
The server handles the administrative tasks and acts as a control node for the network of systems.

NETWORK OF COMPUTERS
Each computer has a dual role. They can act as an interface for the user as well as the storehouse of
resources which can be accessed by the user.

MIDDLEWARE
This is the main component of the grid: software that allows a user to run a process or application across a network of machines, and that acts as a scheduler deciding which grid jobs will run, where, and when. There is no fixed format for middleware; examples include ARC, DIET, EMI, gLite, the Globus Toolkit, GridWay and Alchemi.
When a machine on the grid has a large task to perform, some steps are followed before execution begins. Firstly, the program must be parallelized. Secondly, the flow of the program is analysed so that it can be separated into small independent modules. Next, the modules are sent to different machines for execution. Finally, the results are sent back to the original machine, where they are combined into a whole.
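The four steps above can be sketched in a few lines of Python. Threads stand in for separate grid machines here, and the workload (summing squares) is purely illustrative:

```python
# Sketch of the grid workflow: split a large job into independent
# modules, run them on separate workers, then combine the results.
# Threads stand in for remote grid machines in this toy.
from concurrent.futures import ThreadPoolExecutor

def split(data, parts):
    """Separate the job into roughly equal independent modules."""
    size = -(-len(data) // parts)  # ceiling division
    return [data[i:i + size] for i in range(0, len(data), size)]

def run_module(chunk):
    """Work done on one grid machine: here, summing squares."""
    return sum(x * x for x in chunk)

def run_on_grid(data, workers=4):
    modules = split(data, workers)          # step 1-2: parallelize and split
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = list(pool.map(run_module, modules))  # step 3: dispatch
    return sum(partials)                    # step 4: combine on the origin

print(run_on_grid(range(10)))  # 285
```

Real middleware adds what this sketch omits: scheduling across heterogeneous machines, retrying failed modules, and moving data securely between nodes.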

6. GRID ARCHITECTURE

APPLICATIONS (Business, Web services, Media)
PLATFORMS (Storage: Database/File; Software: Java/Python/.NET)
INFRASTRUCTURE (Security, Scheduling, Computing and Storage)
HARDWARE (CPUs, Memory and other resources)

7. APPLICATIONS OF GRID

The Grid is used in fields such as biomedical research, engineering research, architecture, industrial research, astronomical surveys, the study of weather and climatic conditions, chemical compound identification, medical research, advanced physics applications, data communication, space research organizations, government business and financial services, web services, access to scientific instruments and readings, and business and commercial applications.

8. ADVANTAGE OF GRID OVER CLOUD

The Grid has an advantage over the Cloud in the storage of data. Since the Grid's infrastructure is owned by the company and the servers holding the data sit on company premises, there are no problems concerning the location of the data, and technical issues can be addressed by the company itself without calling in an external agency. The main concern of big organisations, the location of their data, is thus put to rest by Grid technology, assuring the security of the data when information security is the organisation's first priority.
The cost of storing data is also reduced when the grid is used, and resources within the company become easier to use. This is especially useful when the grid is a local one and the data maintained is privy to limited people.

9. ADDRESSING THE CONCERNS ABOUT GRID COMPUTING

Linking two or more systems raises a number of concerns among users. The major concerns are:

Data privacy
Data security and integrity
Access control
Judicious usage of system resources

The answer to all of these concerns is the middleware. The middleware is software specially designed to provide system and data security through encryption and user authentication. It also distributes the process components to available resources for proper allocation and usage of system resources.
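The user-authentication role of the middleware can be sketched with a message-authentication code. This is a toy with a hypothetical shared secret; real grid middleware typically relies on certificate-based security rather than a single shared key:

```python
# Toy sketch of middleware-style request authentication using a shared
# secret and an HMAC. Only illustrates verifying who submitted a job;
# real middleware uses certificate-based schemes.
import hashlib
import hmac

SECRET = b"grid-shared-secret"  # hypothetical key known to user and middleware

def sign_request(user: str, job: str) -> str:
    """Client side: tag the request so the middleware can verify its origin."""
    msg = f"{user}:{job}".encode()
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()

def verify_request(user: str, job: str, tag: str) -> bool:
    """Middleware side: accept the job only if the tag checks out."""
    expected = sign_request(user, job)
    return hmac.compare_digest(expected, tag)

tag = sign_request("alice", "run-simulation")
print(verify_request("alice", "run-simulation", tag))    # True
print(verify_request("mallory", "run-simulation", tag))  # False
```

The constant-time comparison (`hmac.compare_digest`) matters: it prevents an attacker from guessing a valid tag byte by byte through timing differences.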

10. CONCLUSION

Optimization of computing, operational flexibility, resilience and availability of infrastructure were the goals kept in mind in developing the concept of the Grid. Use of this technology has increased not only in upcoming fields of scientific and technological research but also in several commercial and management applications. What was once a concept has become a reality, and it has made a huge difference to the scientific community in its pursuit of knowledge. Research projects, complex computing algorithms for critical applications, and access to data across different platforms, all with data security, have been made easier and faster with this technology.


DEEP WEB
Compiled by:
G.V.Seshasai
The web we surf today is roughly estimated to be 1% of the web that exists; the remaining 99% is called the deep web or dark net. This is commonly explained with the example of an iceberg in the ocean: judging the size of the iceberg from the part above the surface is pointless. The deep web came into existence to provide privacy to users, as content posted or uploaded to it is anonymous.
Since no regular search engine can index the deep web, one requires special software such as Tor, which helps the user stay anonymous there. As with every great invention or discovery, the deep web has users who employ it for both good and evil deeds.

1. CLOUD COMPUTING ON DEEP WEB


In today's ever-changing world, everything around us is connected to everything one can possibly think of. The World Wide Web has taken over markets, finance, services and technological innovation in science, medicine and media, where information technology and being connected on the web play a major role. But setting up a new business or getting started with a web-based application, and competing with the leading industries or technological giants, is highly expensive: it requires a web server to hold your data, infrastructure to keep all the servers and computers cool, and a well-trained technical staff to handle those servers. Even this might not be enough, as nearly every piece of software gets updated often, and keeping up can become far too complicated and expensive for a small business.
The solution to all these problems is cloud computing, through which small businesses, start-ups and new applications can use shared servers to run or access their files without all the technical handling and infrastructure. Many companies provide shared-server or cloud computing services, in which all the infrastructure, technical assistance, the latest software upgrades and a few additional services are provided on monthly fee plans.
This lets an organization put its major resources into the development of the organization, business expansion, application development, R&D, customer support, etc.
The only problem with the deep web or dark net is that it requires a special browser to access its web sites; nonetheless, the deep web has a large number of data-server providers that can change the picture for small businesses, which can use such free services for their internal needs.
Tor, the onion router, was developed at the United States Naval Research Laboratory in the mid-1990s by Paul Syverson, Michael Reed and David Goldschlag, with the purpose of protecting U.S. intelligence communications online. The Tor network launched on 20th September 2002, and in 2004 the Naval Research Laboratory released the software under a free licence. In 2006, Mathewson and six others founded the Tor Project, a Massachusetts-based research-education non-profit organization responsible for maintaining Tor.

The Tor project has been funded by the EFF, the U.S. International Broadcasting Bureau, the University of Cambridge, Google and many others. Tor has been praised for providing anonymity and privacy to vulnerable users such as political activists who fear surveillance, arrest or other consequences, and users who seek to circumvent censorship.

2. PURPOSE
The deep web/dark net exists as a unique answer to a major concern of the modern world, where connectivity grows ever higher while privacy falls day by day: censorship by many legal bodies, and the constant harvesting of user data by major corporations in the name of providing a better product, are in fact invading users' privacy.
The deep web/dark net is filled with users who need privacy or a censorship-free internet. Privacy is a basic need for every human, and censorship-free, anonymous internet usage is helpful and important for many users. There are also various situations in which a person might choose to withhold their identity. Acts of charity have been performed anonymously when benefactors do not wish to be acknowledged. A person who feels threatened might attempt to mitigate that threat through anonymity. A witness to a crime might seek to avoid retribution, for example by anonymously calling a crime tip line. Criminals might proceed anonymously to conceal their participation in a crime. Anonymity may also be created unintentionally, through the loss of identifying information due to the passage of time or a destructive event.
Anonymous commercial transactions can protect the privacy of consumers. Some consumers prefer to use
cash when buying everyday goods, to prevent sellers from aggregating information or soliciting them in the
future. Credit cards are linked to a person's name, and can be used to discover other information, such as
postal address, phone number, etc. The electronic cash system was developed to allow secure anonymous
transactions. Another example would be Enmity, which actually makes a purchase on a customer's behalf.
When purchasing taboo goods and services, anonymity makes many potential consumers more comfortable
with or more willing to engage in the transaction. Many loyalty programs use cards that personally identify
the consumer engaging in each transaction, or that act as a numerical pseudonym, for use in data mining.
Most commentary on the Internet is essentially done anonymously, While these usernames can take on an
identity of their own, they are frequently separated and anonymous from the actual author. According to
the University of Stockholm this is creating more freedom of expression, and less accountability. However,
the Internet was not designed for anonymity: IP addresses serve as virtual mailing addresses, which means
that any time any resource on the Internet is accessed, it is accessed from a particular IP address. This
address can be mapped to a particular Internet Service Provider (ISP), and this ISP can then provide
information about which customer that IP address was leased to. This does not necessarily implicate a
specific individual but it provides regional information and serves as powerful circumstantial evidence.
Anonymizing services such as I2P and Tor address the issue of IP tracking. In short, they work by encrypting
packets within multiple layers of encryption. The packet follows a predetermined route through the
anonymizing network. Each router sees the immediate previous router as the origin and the immediate next
router as the destination. Thus, no router ever knows both the true origin and destination of the packet. This
makes these services more secure than centralized anonymizing services.
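The layered-encryption idea behind these networks can be sketched in a few lines of Python. This is a toy model only: real onion routing (e.g. Tor) negotiates per-hop AES keys over telescoping circuits, whereas here each hop's "encryption" is a simple XOR keystream and the route and keys are hard-coded for illustration.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Derive a repeatable pseudo-random keystream from a hop's key.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor_layer(data: bytes, key: bytes) -> bytes:
    # Toy symmetric "encryption": XOR the data with a key-derived stream.
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

def onion_wrap(message: bytes, hop_keys: list) -> bytes:
    # The sender wraps the packet once per hop along the predetermined route.
    packet = message
    for key in reversed(hop_keys):
        packet = xor_layer(packet, key)
    return packet

def relay(packet: bytes, key: bytes) -> bytes:
    # Each router peels exactly one layer; it sees only its two neighbours.
    return xor_layer(packet, key)

hop_keys = [b"entry-node", b"middle-node", b"exit-node"]
packet = onion_wrap(b"hello, hidden service", hop_keys)
for key in hop_keys:
    packet = relay(packet, key)
print(packet)  # the final relay recovers b'hello, hidden service'
```

Because each relay removes exactly one layer, the entry node sees who sent the packet but only ciphertext, and the exit node sees the message but not who sent it, which is the property described above.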
Cognitive computing based on cloud computing with the help of the Deep Web:
Cognitive computing is the next major step for innovation in information technology. By eliminating a large amount of the time needed to process the complex data that cognitive computing requires, the high-powered processors and high bandwidth provided by our cloud service

CSI Adhyayan [Oct.-Dec. 2015]

Page 63

www.csi-india.org
provider will help developers build algorithms and applications that empower new businesses, innovations and ideas.

3. EXAMPLE:
Consider that you run a restaurant and have newly adopted a web-based application to provide all your services on a mobile application platform. With the help of cloud computing you can store and analyse all the user-shared data to provide the best and fastest service possible, and with the help of cognitive computing you can understand users' requirements, develop your products and services accordingly, anticipate user requirements, and suggest the products and services they might desire.
This not only helps in the improvement and development of the organization but also helps to create better customer relationships and loyal partners; the data gathered from the deep web is completely private to the organization, and developers can test-run their applications for better customer reviews.

Mr. G. V. Seshasai [CSI-01328817] is studying in IV year of B.Tech (IT) at MLR Institute of Technology, Hyderabad (Telangana). His areas of interest are Deep Web/Dark Net, cloud computing, AI (artificial intelligence), web technologies, programming, etc. He can be reached at kanna.seshasai@gmail.com.


SECRET SPACE ENCRYPTOR


Compiled by:
Sarath. P
In this modern world we cannot be sure that sensitive data such as passwords and other confidential information is secure on our Android devices. The only way to keep our data secure is through encryption.
Nowadays most Android smartphones ship with Android's own native encryption feature, but encrypting data with it affects the performance of the phone, which we would not wish to compromise.
Fortunately there is an app called Secret Space Encryptor, by Paranoia Works. It is basically an encryption app used to encrypt sensitive and confidential information on Android smartphones and tablets. It is cross-platform, which means it can be used on Android devices and on PCs running Windows, Linux and Mac. It is very easy to use and is available as a free download from the Google Play Store. The app also lets us encrypt text data so that no one can read it without the key to decrypt it. Last but not least, the app can encrypt entire folders. The most interesting thing about this app is that it uses strong 256-bit encryption algorithms. Some of the algorithms used in this app are AES, RC6 and Serpent.
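The app's internals are not documented here, but password-based encryption tools of this kind typically stretch the user's passphrase into a 256-bit key before handing it to a cipher such as AES or Serpent. A minimal sketch using Python's standard-library PBKDF2 follows; the function name `derive_key` and the parameter choices are illustrative assumptions, not taken from Secret Space Encryptor:

```python
import hashlib, os

def derive_key(password: str, salt: bytes, iterations: int = 100_000) -> bytes:
    # PBKDF2-HMAC-SHA256 stretches a password into a 256-bit (32-byte) key.
    return hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"),
                               salt, iterations, dklen=32)

salt = os.urandom(16)          # stored alongside the ciphertext; not secret
key = derive_key("correct horse battery staple", salt)
assert len(key) == 32          # 256 bits, the size AES-256 or Serpent expects
# The same password and salt always reproduce the same key for decryption:
assert derive_key("correct horse battery staple", salt) == key
```

The many iterations deliberately slow down brute-force guessing of the password, which is why such apps remain useful even when an attacker has the encrypted files.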
The Secret Space Encryptor app has three main modules to secure data:
1) Password Vault
2) Message Encryptor
3) File/Dir Encryptor
The Password Vault is used to store passwords and other sensitive information. The Message Encryptor is used to encrypt text data. The File/Dir Encryptor is used to encrypt entire files and folders.
With all these amazing features, Secret Space Encryptor is a must-have app for every Android device.

Mr. SARATH.P [CSI-01295310] is studying in II year of MCA at Amrita School of Arts and
Sciences College, Ernakulam (Kerala). His areas of interest are Cryptography, Image
Processing, Big Data processing etc. He can be reached at sarathpoduval6@gmail.com.


DIGITAL LIFE
Compiled by:
R M Sanbui

A nation is called a developed nation when it is scientifically and technologically advanced and has achieved a national living-index standard some 67.7% higher than that of an underdeveloped country like ours. Undoubtedly, the citizens of developed nations, namely the USA, the UK and all the Western countries, are enjoying such facilities. The level of development is such that their standard-of-living index is some 200 years ahead of that of underdeveloped countries like our India and others, namely Indonesia, Malaysia, Brazil and Argentina, and more than 300 years ahead in the case of undeveloped countries, namely the African countries, Bangladesh, Tibet, Nepal, Myanmar, etc.
The citizens of the USA and of other developed countries, such as those of Europe, New Zealand and Australia, enjoy the fruits of a developed country with a living index some 67.7% higher; on the contrary, we Indians lag by about 100-120 years behind even the citizens of countries whose living-index standards are nearest to our nation's, like Italy and Ireland. It is acknowledged all over the world that we entered the group of the five-nation nuclear club in 1998, having achieved the manufacturing capability for it. The rest of the nations, such as the African countries and the Southeast Asian and Middle East countries, fall into the category of undeveloped nations, their development capability, including national living index, being much lower.
All developed countries are advanced mostly because they have long enjoyed the benefits of scientific advances in their daily living standards: their industrialisation started in the Middle Ages, with inventions such as the steam locomotive, the circular (wheel) concept, lighting and fire, and continued beyond with the invention of some 101 important gadgets and the application of scientific appliances, together with the digital inventions of the 20th century, which have made their lives easier and more comfortable. We Indians, by contrast, need further improvement in all our spheres, mainly in data-storage capability, through which we can take a greater leap in our progress towards the living status of developed countries. Presently our government has undertaken tremendous planning efforts and activities to catch up with the developed countries' living-index criteria through the application of digital life, even at this late stage, 69 years after Independence. We got our independence in 1947, after 150-200 years of colonial status under British rule.
Since then we have humbly started our progress in all fields, especially with the advent of the five-year-plan concept from the Soviet Union, applied in all spheres of life, urban, rural and even tribal (adivasi). We need to follow these processes and techniques in all walks of life in the following manner, apart from all the other developments that were already in progress through industrial collaborations, the Public Sector Units (PSUs) established under both Central and State governments, as per our government's plan of action touching all aspects of life across every cross-sectional layer of our society: education, dam building, health, roads, hospitals and so on. But progress did not reach the expected target levels of China, Japan or even Korea, and our country's GDP growth hardly reached 5-7%.
At this juncture, we have taken a determined stand on the use of digital life as the guru mantra for development, aiming to become the leader in applying software to improve various aspects of our day-to-day life. Our engineers have already proved to the world how much we can

become better off than the developed world in applying software to modernise the control systems for complicated mechanisms in all infrastructure developments during the new millennium, be it industrialisation in railways, manufacturing and adaptation of infrastructure, or developments in aerospace and missile technology. What we have achieved today in science and technology has been helped by advanced collaboration with developed countries, namely Transfer of Technology (TOT). Hence digital life is giving India a big leap in its advancement, growing faster towards the status of a developed nation compared with those so far called undeveloped countries.
Here we would like to cite the example of the rise of Mr. Sundar Pichai of IIT Kharagpur, who this year became the CEO of the most advanced digital application company, Google, itself a masterpiece of the developed countries' achievements in digital life.
Our achievements count too: for a nation of 1.25 billion, the use of computer applications in the Indian Railways network system, be it for passenger booking or for freight of any size and distance, using supercomputers to handle phenomenal volumes of data even for periods of six months or so at a stretch, is a first of its kind for the world's third-largest railway system, which carries not less than about three hundred million Indians in a single day, 24x7. This has been achieved by India in the last two to three decades, adding a golden feather to the cap of the progress towards a digital India, and remarkably without a single severe failure of computer data.
Digital life finds further application in our banking and economic sectors, benefitting agricultural and rural development, in health, education and road communication at the national level, and including the census coverage of 1.25 billion Indian citizens through the 'Aadhaar' operation, by virtue of digital storage of biometric records by the Government of India, which started two or three years ago and is still continuing. These systems are now in operation, and by virtue of them our citizens have become part of the present government's subsidy-distribution schemes, such as the Atal Pension Yojana and the Pradhan Mantri Jan Dhan Yojana, to name a few.
Digital life is used in all our data operations and distribution for day-to-day life, apart from vital scientific data such as climatic-warning and radar-warning signalling data from our own satellites manufactured under government plans; whether for the Indian Army, Air Force or naval operations, all of it serves to maintain logistic and administrative data storage scientifically, and satellite images of non-vital parameters are also shared with undeveloped neighbouring countries for civilian use. Now we have to spread this usage in operational data storage with manifold applications: storage of memory in the form of tablets, microfilm, liquid crystal and LED applications, and what not. By virtue of these applications one can minimise the physical space that storage occupies in our day-to-day life, making more of our life digital in what we call the paperless office.
Even before the digital application in all activities of the railway system, India had visualised the use of digital medical practice of a critical nature, telemedicine, which is being transferred from developed countries for use among underdeveloped countries. In practice, one can make life easier in the following ways:
Use of digital communications, in the form of a smartphone, even in remote areas where electricity is not available.
Use of digital data storage on tablets, holding texts even the size of the Ramayana, the Mahabharata and the Koran within a physical area of 0.5 cm x 0.5 cm.
Use of railway booking of any kind, keeping the whole railway network system, covering six hundred thousand kilometres, within its working area.

Use of boarding-card data entry during the purchase of air tickets using biometric data capture, and application to predetermined missile paths, including the targeting of any submarine-launched missile with an accuracy of a millionth of a second.
For supremacy in daily life we need further build-out of digital life, now under development, covering rural and urban digital advancement as follows:
Online data transfer for any application required by any rural citizen, who need no longer travel to the main city, with the facility of scanning and submitting attachments as required.
Full information available online for railway booking from rural areas.
Transfer of account details to any other account through upgraded banking data systems.
A student can even copy book materials from a fellow student with the help of tablets.
All government instructions can be received online by rural citizens in no time.
All serious weather-forecast bulletins can be captured online from a distance within a short time.

CONCLUSIONS
These are the milestones of achievement that need to be taken up by the present government to fulfil the needs and aspirations of Indian citizens left far behind by the previous government's indifference to developing digital life, while the present government is determined to cater to the needs of Digital India.
A highlight to achieve Digital India practically (at, say, Rs. 500 crore):
The proposed project concept can bring India to the forefront as a developed digital nation, and would eventually ease traffic congestion in all metro cities. India has been expert in satellite launching for several decades, placing its own and other countries' satellites into critical predetermined orbits. The scientists of ISRO/VSSC can now plan to utilise this expertise to establish a so-called micro-GPS system, designed, manufactured and launched indigenously: at least three (or a multiple of three) satellites covering 60 degrees of longitude, giving full coverage of the operating area at an altitude of, say, 40-50 km above the earth's surface and covering all metro cities, with operating channel frequencies in the Ka and Ku transmission bands (by M/s BEL), employing software data in Google Maps format representing metro-city locations and positions, synchronised so as to track the movement of the earth's surface at the Indian metro cities in step with the satellites' movements and positions. Monitoring stations on the surface of the earth also need to be suitably positioned, keeping the earth's rotational position synchronised accurately, effectively and efficiently.
The overall project concept can be pursued in collaboration, in line with the Japanese concept or like the other advanced technologies brought into India under collaboration from the sixties to the present (the Sukhoi aircraft, aeronautics, cryogenic engines, avionics). Collaboration, or a derivative process, would reach the goal in the shortest time and at the lowest project cost, whereas starting from an innovative concept would cost more in both money and time. So the concept of TOT is recommended.
[Brief contributions of the author in the development of the country's Defence Public Sector, working with M/s BEL & HAL from 1967 to 1997:
Designer of a man-pack version of a 15-watt Tx/Rx communication system (in a design team of four) at M/s Bharat Electronics, Bangalore, 1967-1970.
Designer of cockpit panels and airborne fuel gauging for Cheetah and Chetak helicopters at M/s HAL, Lucknow Division, 1976-1980.
Series production of the inertial guidance system for the PRITHVI missile (IRBM) of DRDL, designed at HAL Hyderabad Division, 1991-1996.]


COMPARATIVE ANALYSIS OF SQL AND NOSQL CLOUD DATABASE SOLUTIONS
Compiled by:
Syed Zubeen Qadry

ABSTRACT
With an exponential rise in the production of Big Data, and the prediction of the International Data
Corporation, that forecasts a 22.8% rate of growth for the IT Cloud services, Cloud computing is increasingly
becoming an integral part of data storage by providing databases that run on a Cloud computing platform to
developers. Cloud computing platforms like Microsoft Azure, Amazon EC2 and Rackspace have provided
Cloud database services to its users, each having its own pros and cons. These Cloud databases can be
classified in terms of their data storage model, i.e. SQL and NoSQL. This paper provides an insight on Cloud
database solutions with reference to this classification and presents a comparison between the two.

1. INTRODUCTION
Since its inception, the Internet has provided a host of services to the user, and the growth it has seen in the past decade has been immense. With the rapid development of the Internet, the needs of the user have also grown. From providing access to information through standalone web pages and checking e-mail to storing large chunks of personalized data on Clouds, the Internet has come a long way. With the increasing needs of users, it has become difficult to process, handle and store large volumes of data so that users can access it whenever they want. Storage of data has always been of prime concern to all
organizations. Modern day data centers are well equipped for handling large volumes of data transactions,
but the cost of maintaining a data center is very high. Cloud computing has provided a low cost alternative to
this problem by providing data storage capabilities in the form of Cloud storage. In recent years we have
seen phenomenal advances made in the field of Cloud Computing. Cloud database is one such new trend
that helps to employ databases using a Cloud Computing platform.
The Relational approach of implementing databases works fine for data sets that are not very large in
volume and are still being conveniently used by a large number of organizations to store and modify their
data. For handling such databases a DBA (Database Administrator) is generally hired to monitor the database
and keep it updated. Problems generally arise when the data sets become very large and complex, and it
becomes very difficult to implement them using the conventional RDBMS. If the relational approach is
persisted with, it results in employing more personnel to configure, implement, update and maintain a
database holding a large amount of data with huge complexities. This in turn increases the cost of storage
and maintenance of data within an organization and also affects the ease of accessibility of the data. Cloud
Computing provides services in the form of Cloud databases wherein users can run the databases on the
Cloud independently or can make use of the DaaS (Database as a service) which is provided as a paid service
by the Cloud database service providers. These databases provided by the Cloud Computing platform can be either SQL based or built on a NoSQL data model [1].


SQL BASED DATABASE


SQL databases are basically relational databases that deal with structured data using SQL as the querying language. These databases store data in various tables having complex relationships and relevant attributes. SQL databases support ACID (Atomicity, Consistency, Isolation, Durability) transactions, thereby ensuring consistency for stored data at all times. Having a centralized architecture, these types of databases are suited to handling critical user data as they offer reliability and data consistency. It is generally more difficult to scale SQL databases than NoSQL databases, as distributing the data stored in such databases across different servers is not feasible. Some of the
commonly available SQL based Cloud databases are mentioned below.
Amazon RDS
Amazon Relational Database Service or Amazon RDS [3] is a fully managed SQL database service designed for
developers and business organizations. This Cloud based service enables users to access MySQL, PostgreSQL,
Oracle and SQL Server database engines on their Amazon RDS Cloud-based database instance. Users can
launch any MySQL, PostgreSQL, Oracle instance directly without any added configuration. In addition to
providing highly scalable databases, if you prefer using SQL Server, then the Amazon RDS gives a storage
capacity of 1TB along with an IOPS (Input/Output per second) of 10,000 per database instance. Automated
database backup facility is provided with Amazon RDS which enables point-in-time recovery of any DB
instance. Along with this it provides Snapshot recovery tool, which is basically a user initiated DB backup.
This backup created is stored with the Amazon RDS and can be used to create a new DB instance as and
when the user wants.
One of the good features of Amazon RDS is that it provides a built-in replication tool using Multi-AZ (Availability Zone) deployment. Apart from this, replication can also be carried out using the MySQL or PostgreSQL engine's default replication functionality. Being one of the oldest providers of Cloud database services, Amazon provides ease of access to the user through RDS, requiring only an active AWS account to run an Amazon RDS instance. Another important feature of Amazon RDS is that it automatically patches the database software to keep it up to date. Amazon RDS allows users to create, delete or modify any relational database instance using the AWS Management Console.
Google Cloud SQL
Currently operating for the Google App Engine, the Google Cloud SQL [2] is fully functional MySQL based
database that works on the Google Cloud platform. Hosting MySQL database versions 5.5 and 5.6 on the
Google Cloud, it allows users to create, modify and update relational databases for App Engine Applications
that have been written in either Python, Java, PHP or Go. Being a fully managed service, the Google Cloud
SQL is thus very easy to use. It provides import and export functionalities so that users can move their entire
database to the Cloud and can use them with the App Engine. Offered as a paid service, developers can
make use of this Cloud based relational database for as low as $1 per month. Some of the basic features include: support for 16 GB of RAM and 500 GB of data storage per Cloud SQL instance; Java and Python compatibility; synchronous or asynchronous replication among various physical locations; automated backups; instance management and access through a command-line interface or a web console; easy migration of data to the Cloud; connections over both IPv4 and IPv6; encryption of data; import and export of databases using mysqldump; and flexible pricing plans.
In spite of all these offerings, Google Cloud SQL has a few limitations:

User defined functions are not supported
Language restrictions
Query size has been limited to 16 MB
Some MySQL statements such as CREATE FUNCTION, LOAD_FILE(), LOAD DATA INFILE are not supported
Instances have been capped at 100GB
Microsoft Azure SQL Database
For handling, storing and manipulating relational data, Microsoft Azure [4], the Cloud computing platform created by Microsoft, provides DaaS (Database-as-a-Service) functionality through the recently unveiled Microsoft Azure SQL Database. The Cloud based database is made available using the PaaS (Platform-as-a-Service) model, which gives customers the option of either running the SQL database on the Cloud or purchasing virtual machine instances. SQL Server instances are installed on machines at the Microsoft Data Centers, which enables Microsoft to provide SQL Server and SQL Server Management Services to customers.
Clustered index is required for entering data into tables of the database. Several copies of the database are
stored on servers to ensure availability of data at all times.
Based on SQL Server, Microsoft Azure follows a multi-tenant architecture, which in general refers to a partitioning scheme ensuring that each tenant's data is secure and the applications are scalable. To ensure
the scalability of databases, SQL database makes use of federation. Federation uses Sharding, which is a
technique used to split large databases into many smaller databases for a greater utilization of resources.
Upon splitting up a federation member, the resulting two federation members have the same schema.
Further the two federation members can be merged together to form the parent member again. Azure SQL
provides support for ACID transactions.
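The sharding idea behind federation can be illustrated with a small, self-contained sketch: rows are routed to one of several partitions by hashing their key, so each partition holds only a fraction of the data and partitions can live on separate servers. The class and method names below are hypothetical for illustration, not Azure APIs:

```python
import hashlib

class ShardedStore:
    """Toy horizontal partitioning: rows are routed to shards by key hash."""

    def __init__(self, n_shards: int):
        # Each dict stands in for one federation member / database server.
        self.shards = [dict() for _ in range(n_shards)]

    def _shard_for(self, key: str) -> dict:
        # A stable hash of the key picks the same shard every time.
        h = int(hashlib.md5(key.encode()).hexdigest(), 16)
        return self.shards[h % len(self.shards)]

    def put(self, key: str, row: dict):
        self._shard_for(key)[key] = row

    def get(self, key: str):
        return self._shard_for(key).get(key)

store = ShardedStore(n_shards=4)
for i in range(1000):
    store.put(f"customer-{i}", {"id": i})

print(store.get("customer-42"))        # {'id': 42}
print([len(s) for s in store.shards])  # rows spread across the four shards
```

Splitting one shard into two (each keeping the same schema, as the federation members above do) is then a matter of re-hashing its keys over the enlarged shard list.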
NoSQL BASED DATABASE
When comparing SQL and NoSQL (Non-SQL) based databases, the primary comparison is carried out keeping in mind two important concepts: the ACID and BASE (Basically Available, Soft state, Eventually consistent) models. ACID refers to a set of properties that ensure that the database transactions carried out are consistent and reliable. While most SQL databases provide support for ACID transactions, NoSQL databases follow the BASE model approach. BASE, which is an alternative to ACID, ensures that the database remains in a workable state whether or not every transaction is consistent. It relaxes the conditions of ACID and works primarily on three governing factors:
Basic Availability
Soft State
Eventual Consistency
Basic availability ensures that even if multiple failures occur, the data remains available at all times, irrespective of whether it is consistent. Soft state implies that, over the period in which the database undergoes changes on its way to eventual consistency, the state of the system is always soft. Eventual consistency refers to the fact that, no matter what happens, sooner or later the data in the database will become consistent once it stops receiving inputs. Some common examples of Cloud data management systems operating on the BASE model are CouchDB, SimpleDB and Amazon DynamoDB.
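The BASE behaviour described above can be simulated in a few lines: a write is acknowledged by one replica immediately (basic availability), the other replicas lag behind (soft state), and a background sync brings them into agreement once writes stop (eventual consistency). The class below is an illustrative toy, not any real database's API:

```python
import collections

class EventuallyConsistentKV:
    """Toy BASE-style store: writes land on one replica and propagate lazily."""

    def __init__(self, n_replicas: int):
        self.replicas = [dict() for _ in range(n_replicas)]
        self.pending = collections.deque()   # writes not yet propagated

    def write(self, key, value):
        self.replicas[0][key] = value        # acknowledged immediately
        self.pending.append((key, value))    # replication deferred: soft state

    def read(self, key, replica: int):
        return self.replicas[replica].get(key)   # may return stale data

    def anti_entropy(self):
        # Background sync: once writes stop, every replica converges.
        while self.pending:
            key, value = self.pending.popleft()
            for r in self.replicas[1:]:
                r[key] = value

kv = EventuallyConsistentKV(n_replicas=3)
kv.write("cart", ["book"])
print(kv.read("cart", replica=2))   # None: replica 2 has not caught up yet
kv.anti_entropy()
print(kv.read("cart", replica=2))   # ['book']: eventual consistency reached
```

The stale read in the middle is exactly the trade-off BASE accepts in exchange for availability, where an ACID system would have blocked until all copies agreed.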


Characteristics              | Google Cloud SQL                                   | Amazon RDS                                         | Microsoft Azure SQL Database
Release                      | 2014                                               | 2009                                               | 2009
DB size limit                | 500 GB                                             | 1 TB                                               | 500 GB
Automated patching           | Yes                                                | Yes                                                | Yes
DB engines supported         | MySQL                                              | MySQL, Oracle, SQL Server, PostgreSQL              | SQL Server
Scalability                  | Less scalable                                      | Less scalable                                      | More scalable
Backup/Restore               | Automated backups and point-in-time recovery       | Automated backups and DB Snapshots                 | Automated backups and point-in-time restore
Application language support | Java and Python                                    | All languages supported                            | .NET, Java, Ruby, PHP, Python, Node.js
Replication                  | Master-slave relationship: data is replicated from the master instance to slave instances | Provided using two features: Multi-AZ Deployments and Read Replicas | Asynchronous replication provided using standard geo-replication
Data encryption              | Yes                                                | Yes                                                | Yes
Provisioned IOPS             | No                                                 | Yes                                                | No
Query language               | GQL                                                | SQL                                                | T-SQL
Pricing                      | Packages and per-usage plans (2 GB DB ~$0.0067/hour) | Per-usage plans (2 GB DB ~$0.068/hour)           | Per-usage plans (2 GB DB ~$0.122/hour)
SLA                          | 99.95%                                             | 99.95%                                             | 99.99%

Over a period of time many organizations have tended to adopt the NoSQL versions of databases, so that they can store the unstructured data that SQL databases do not support. NoSQL databases are classified into three categories [6].

Key-value stores, column-oriented databases and document-oriented databases.
Key-value stores refer to databases whose values are indexed by keys; the keys are used for accessing values during retrieval. Amazon's SimpleDB is an example of a key-value store. Column-oriented databases group a set of attributes together and store them as one column family; Cassandra is an example of a column-oriented database. Document-oriented databases, used for semi-structured data, store data in the form of documents; the documents, as well as their fields, can be of any length. MongoDB is a document-oriented database. NoSQL databases on the whole have faster data processing speeds than SQL based databases, which are governed by the ACID properties. Since NoSQL databases don't use SQL for query processing, manual query programming is required for carrying out transactions, which can sometimes be difficult for a complex query [6].
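A key-value store of the SimpleDB kind can be sketched as a tiny in-memory class: each "domain" maps item names to free-form attribute-value pairs, with no fixed schema shared across items. The class and method names below are illustrative, not the actual SimpleDB API:

```python
class Domain:
    """Toy SimpleDB-style domain: items hold free-form attribute-value pairs."""

    def __init__(self, name: str):
        self.name = name
        self.items = {}    # item name -> dict of attribute-value pairs

    def put_attributes(self, item: str, attrs: dict):
        # Create the item on first write; merge attributes on later writes.
        self.items.setdefault(item, {}).update(attrs)

    def get_attributes(self, item: str) -> dict:
        return self.items.get(item, {})

users = Domain("users")
# Unlike rows in a relational table, items need not share the same attributes:
users.put_attributes("u1", {"name": "Asha", "city": "Pune"})
users.put_attributes("u2", {"name": "Ravi", "plan": "pro"})
print(users.get_attributes("u2"))   # {'name': 'Ravi', 'plan': 'pro'}
```

The lookup is a single hash access on the item key, which is where the speed advantage over join-heavy relational queries comes from, at the cost of expressing complex queries by hand.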
Amazon SimpleDB
Amazon SimpleDB is an auto-indexed NoSQL database provided by Amazon. Data sets are stored in the form of domains, which are basically collections of items described by attribute-value pairs [11]. The data stored in domains is similar to tables stored in relational databases, the only difference being that domains in SimpleDB can have a different set of attributes for each item. Like other NoSQL databases, SimpleDB performs asynchronous replication and provides two read-consistency options: 1. Eventually Consistent Reads; 2. Consistent Reads. Eventual consistency maximizes performance by lowering latency and increasing throughput. Amazon's pay-per-usage policy allows users to decide the number of domains they require; however, Amazon SimpleDB caps each domain at 10 GB. Written in Erlang, this NoSQL Cloud database is easily scalable and extensible.
MongoDB
Developed by the software company MongoDB Inc., this NoSQL database was initially planned as a PaaS product but later shifted to an open-source model. Written in C, C++ and Java, this widely popular document-oriented database was developed to be platform independent. It carries out replication by creating mongod instances. A replica set is formed from a set of these instances, with one mongod instance acting as the primary and receiving all write operations [10]; the primary is the only member of a replica set that accepts writes. To increase scalability and handle large data sets, MongoDB implements automatic sharding, or horizontal scaling, by dividing data sets over multiple servers known as shards, thereby reducing the data each shard has to handle. Since the shards store the data, each shard in turn acts as a replica set, a group of mongod instances, to provide high data consistency and availability. MongoDB doesn't support multi-document transactions but provides atomic operations on a single document. The database uses a B-tree data structure for its indexes and provides a number of different index types to support different types of data and queries.

Characteristics      | Amazon SimpleDB   | MongoDB                          | CouchDB
Release              | 2007              | 2009                             | 2005
Programming language | Erlang            | C++, JavaScript, C               | Erlang
Query language       | Similar to SQL    | JSONiq                           | JavaScript
Database model       | Key-value store   | Document-oriented                | Document-oriented
Replication          | Yes               | Supports master-slave replication | Supports master-master replication
Integrity model      | Atomic operations | Atomic operations on fields      | Multi-Version Concurrency Control (MVCC)
Automatic sharding   | No                | Yes                              | No
Triggers             | No                | No                               | Yes
License              | Commercial        | Open Source                      | Open Source

CouchDB
Written in Erlang, Apache CouchDB, or simply CouchDB, is a NoSQL database released in 2008. Couch, which
serves as an acronym for Cluster Of Unreliable Commodity Hardware, offers a schema-free, document-oriented database model. A document-oriented database model contains a series of documents
in which the data resides, rather than related tables as in relational databases. Being a schema-free
database, CouchDB relies on views to create and manage relationships between documents.
Queries, which are referred to as views in the context of CouchDB, are carried out
with the help of a B-tree storage engine. Views in CouchDB are constructed using JavaScript:
the view function takes a CouchDB document as an argument and performs a set of computations
to generate the desired output of the view. Adhering to the ACID properties, the file
layout of CouchDB ensures data consistency at all times [7]. Replication in this NoSQL
database is asynchronous, providing multiple copies of the
data that can be modified at any given point of time [8]. Replication in CouchDB is not affected by
failures: it continues the process from the point of failure the next time
it runs. CouchDB also guarantees eventual consistency to its users, meaning that any
changes made to a data item in the database are copied to all its replicas, and if no change has
occurred then all the replicas of the database are logically equivalent [9].
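The view mechanism described above can be approximated in a few lines. This is a Python sketch of the idea only; real CouchDB views are JavaScript map functions executed by the server, which `emit()` rows that are then kept sorted by key in a B-tree index:

```python
def map_fn(doc):
    # Analogue of a CouchDB view function: emit (key, value) rows
    # for the documents we want indexed.
    if doc.get("type") == "book":
        yield (doc["author"], doc["title"])

def build_view(docs):
    # Rows are kept sorted by key, as CouchDB's B-tree index would keep them.
    rows = [row for doc in docs for row in map_fn(doc)]
    return sorted(rows)

docs = [
    {"type": "book", "author": "Kernighan", "title": "The C Programming Language"},
    {"type": "note", "text": "not indexed"},
    {"type": "book", "author": "Armstrong", "title": "Programming Erlang"},
]
view = build_view(docs)
print(view)
# [('Armstrong', 'Programming Erlang'), ('Kernighan', 'The C Programming Language')]
```

Because rows are stored sorted by key, range lookups ("all books by authors between A and K") reduce to cheap ordered scans, which is exactly what the B-tree storage engine provides.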

2. CONCLUSION

A good database incorporates rich features such as maintaining strict consistency at all times and
providing a high degree of scalability to the consumer or organization, to mention a few. Although the type
of database to be used primarily depends upon the type of data to be stored, making a well-informed
choice also means considering other important factors such as data model, performance,
capacity, pricing plans, community support and reliability. This paper provides an overview of some of
the SQL and NoSQL based options currently available to consumers and compares them on several
aspects. Based on these aspects, consumers can decide for themselves which database suits their
requirements.

REFERENCES
1. Cloud database, http://en.wikipedia.org/w/index.php?title=Cloud_database&oldid=627334290
2. Google Cloud SQL, https://Cloud.google.com/sql/
3. Amazon RDS, http://aws.amazon.com/rds/
4. Microsoft Azure, http://msdn.microsoft.com/en-us/library/azure/
5. Amazon RDS documentation, http://docs.aws.amazon.com/AmazonRDS
6. Leavitt, Neal. "Will NoSQL databases live up to their promise?" Computer 43.2 (2010): 12-14.
7. CouchDB, http://docs.couchdb.org
8. Cattell, Rick. "Scalable SQL and NoSQL data stores." ACM SIGMOD Record 39.4 (2011): 12-27.
9. Bailis, Peter, and Ali Ghodsi. "Eventual consistency today: limitations, extensions, and beyond." Communications of the ACM 56.5 (2013): 55-63.
10. MongoDB, https://docs.mongodb.org
11. Amazon SimpleDB, http://aws.amazon.com/simpledb

Mr. Syed Zubeen Qadry [CSI-I1501972] has done his post-graduation in Mathematics and
Computer Science from Jamia Millia Islamia, New Delhi. Currently he is working as a Subject Matter Expert
for Chegg Inc. His areas of interest include Databases, Cloud Computing and Data Mining. He can be
reached at zubeenqadry@gmail.com.


SPATIAL DATABASE SYSTEMS AND THEIR APPLICATION IN MILITARY - AN OVERVIEW
Compiled by:
Abdul Wahid and Md Tanwir Uddin Haider

INTRODUCTION
There are various fields where we need to store and process data related to spatial objects. A spatial object may
be 2-dimensional (for example in GIS, road networks, VLSI design, etc.) or 3-dimensional (the universe,
molecular structures, etc.), or any point of interest such as hotels, hospitals, rivers, shopping malls, etc.
Spatial database systems can handle both spatial and non-spatial data in their data model.
Spatial data means data related to space; it is processed via spatial attributes that have a location
on the surface of the earth. Non-spatial data, on the other hand, are alphanumeric data, for example
name, email id, contact no., etc.
In other words, spatial database systems are the same as traditional database systems with
the additional ability to handle spatial data by using spatial data types and spatial operators. To handle
spatial data they use (i) spatial data types, (ii) spatial operators and (iii) spatial indexing, as shown in
Fig. 1.

Fig.1 Spatial database system.


Spatial database systems have the ability to handle spatial as well as non-spatial data concurrently, which
distinguishes them from other traditional/conventional database systems. In traditional/conventional

relational database systems there was no provision to store and process spatial data represented by
spatial data types such as points, lines, and polygons. To store and process spatial data, object-oriented
and object-relational databases allow us to define abstract data types. Several vendors of
database software use this ability to handle spatial data.
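As a minimal illustration of what a spatial data type and a spatial operator look like, the sketch below models a 2-D point and a rectangular window query. The names (`Point`, `within`) are hypothetical, chosen for this example, and do not correspond to any particular vendor's spatial API:

```python
from dataclasses import dataclass

@dataclass
class Point:
    # A minimal spatial data type: a 2-D point, e.g. a point of interest.
    x: float
    y: float

def within(p: Point, xmin: float, ymin: float, xmax: float, ymax: float) -> bool:
    # A simple spatial operator: containment in a rectangular query window.
    return xmin <= p.x <= xmax and ymin <= p.y <= ymax

# Window query: find points of interest inside a map window.
pois = {"hotel": Point(2, 3), "hospital": Point(8, 1), "mall": Point(4, 4)}
hits = sorted(name for name, p in pois.items() if within(p, 0, 0, 5, 5))
print(hits)  # ['hotel', 'mall']
```

A real spatial database would evaluate the same query through a spatial index (such as an R-tree) instead of scanning every object, which is why spatial indexing is listed as the third essential component.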

APPLICATION OF SPATIAL DATABASE SYSTEMS IN MILITARY


In the real world there are various fields (like transportation, military applications, management,
medicine, Geographic Information Systems (GIS), etc.) where we use spatial data and spatial queries. Spatial
data plays a very important role in military operations, helping decisions to be taken in an efficient way. In this
article we present military applications where spatial data is used, together with GIS and GPS (Global Positioning
System), to support operations. In the military, all operations (like command, control,
communication, etc.) depend on accurate information and are mainly concerned with spatial data,
which makes it easy to take decisions. Major General Gurbaksh Singh, VSM, wrote about
the importance of spatial information for taking appropriate decisions in military applications
in Electronic Today, November 1996:
"The lessons gained from military history indicate that the key to military victory lies (regardless of
the size of the opposing forces) in remaining ahead of the enemy in the time-sensitive SCORE loop of the
C4I2 (command, control, communication, coordination, information and interoperability) process.
If a defending force or weapon system can, with some accuracy and sufficient warning, find out where
the attacker is or what his future course of action would be, it would be easier to defeat him by occupying
a position of advantage or by massing a superior force at the point of decision."
GIS also plays a very important role in military operations, and military forces use GIS in various
fields like terrain analysis, monitoring terrorist activity, battlefield management, finding the location of the
enemy, remote sensing, etc., to minimize the risk of errors.
Terrain analysis: In terrain analysis, military commanders use land maps with accurate information about the
land to improve the effectiveness of a mission and optimize resource utilization. Any land-based military operation
relies on accurate information about road network conditions and communication paths, which
makes it easy to take decisions. The ministry of defense of any country gathers this information and, after
filtering and analysis, takes decisions. Due to random deployment and flexible response, errors may
endanger the operation.
Air operation: Air operations need all the information required for land operations (like target locations,
civilian areas, and terrain analysis), with the addition of accurate height information for
targeting. In every operation, whether in the air, on land or at sea, weather plays a very
important role on the battlefield. For the successful completion of any battlefield task, it is
necessary to know the weather, wind, visibility and temperature conditions of the battlefield.
Naval operation: At sea, naval vessels fix their position using various electronic gadgets.
Various recent technologies enable navigation with accurate position information. Electronic Chart
Display and Information Systems (ECDIS) support the navigator in all weather conditions.

The electronic navigation chart (ENC) is used as a tool for navigation; unlike a conventional
chart, it takes information related to depth and hazards within the area as input. In military applications,
GIS uses the electronic navigation chart as its spatial database.
Military operations depend mainly on GIS and GPS for taking decisions. GIS with satellite imagery is used in
military service to understand a terrain area, which plays a major role in determining ideal
locations, hiding equipment/weapons and troops, supplying weapons
on demand, finding the best routes, etc.

REFERENCES
1. Albert K.W. Yeung and G. Brent Hall, Spatial Database Systems: Design, Implementation and Project Management, Springer, ISBN-13 978-1-4020-5392-4.
2. http://en.wikipedia.org/wiki/Spatial_database
3. http://www.gislounge.com/military-and-gis
4. http://en.wikipedia.org/wiki/Geographic_information_systems_in_geospatial_intelligence

Abdul Wahid completed his B.Tech in Information Technology from Muzaffarpur
Institute of Technology, Muzaffarpur, under Aryabhatta Knowledge University, Patna, in 2014. He is
currently pursuing an M.Tech in the Computer Science and Engineering Department at National Institute of
Technology Patna, Bihar. His research interests include Spatial Databases, Distributed Databases and
Intelligent Transport Systems (ITS).

Md. Tanwir Uddin Haider has done his B.E., M.Tech and Ph.D. degrees in Computer Science
and Engineering and is currently working as an Assistant Professor in the Computer Science and Engineering
Department at NIT Patna. His research areas include Databases, the Semantic Web, E-Learning Systems, and
Spatial Databases.


BLUE BRAIN
Compiled by:
Salomi Thomas

ABSTRACT
Blue Brain is the world's first virtual brain, that is, a machine that can function as a human brain.
The human brain is the most valuable creation of God; man is called intelligent because of the brain. Within 30
years, we will be able to scan ourselves into computers, so even after the death of a person we will
not lose the knowledge, intelligence, personality, feelings and memories of that person, and these can be used
for the development of human society.

1. INTRODUCTION
The Blue Brain Project is an attempt to reverse engineer the human brain and recreate it at the cellular
level inside a computer simulation. The project was founded in May 2005 by Henry Markram at the
EPFL in Lausanne, Switzerland. The goals of the project are to gain a complete understanding of the brain
and to enable better and faster development of treatments for brain diseases. The human brain is the most
valuable creation of God, and man is called intelligent because of the brain, but we lose the knowledge of a
brain when the body is destroyed after death. BLUE BRAIN is the name of the world's first virtual
brain, that is, a machine that can function as a human brain.
BLUE BRAIN: Blue Brain would be the world's first virtual brain. IBM is now developing a virtual
brain known as the Blue Brain. Within 30 years, we will be able to scan ourselves into computers.

WHAT IS VIRTUAL BRAIN?

A machine that can function as a brain.
It can take decisions.
It can think.
It can respond.

It can keep things in memory.
WHY WE NEED VIRTUAL BRAIN?
To upload contents of the natural brain into it.
To keep the intelligence, knowledge and skill of any person forever.
To remember things without any effort.
FUNCTIONING OF BRAIN
Sensory input: Receiving input such as sound, images, etc. through sensory cells.
Interpretation: Interpretation of the received input by the brain by defining the states of neurons in the
brain.
Motor output: Receiving electric responses from the brain to perform an action.
Now there is no question of how the virtual brain will work; the question is how the human brain
will be uploaded into it. This too is possible thanks to fast-growing technology.
UPLOADING HUMAN BRAIN
The uploading is possible by the use of small robots known as nanobots. These robots are small enough
to travel throughout our circulatory system. Travelling into the spine and brain, they will be able to
monitor the activity and structure of our central nervous system.


They will be able to provide an interface with a computer while we still reside in our biological form.
Nanobots could also carefully scan the structure of our brain, providing a complete readout of its
connections. This information, when entered into a computer, could then continue to function as us.
Thus the data stored in the entire brain will be uploaded into the computer.


NANOBOTS

EXAMPLE OF BLUE BRAIN


A very good example of the utilization of the blue brain is the case of short-term memory loss, and for this
reason we need a blue brain. A simple chip installed into the human brain could compensate for the loss of
short-term and volatile memory in old age.

2. BLUE BRAIN PROJECT OBJECTIVES


The project will search for insights into how human beings think and remember. Scientists think that the Blue Brain
could also help to cure Parkinson's disease. The brain's circuitry is in a complex state of flux, with the brain
rewiring itself every moment of its existence. If scientists can crack open the secret of how and why
the brain does this, the knowledge could lead to a new breed of supercomputers.
RESEARCH WORK
IBM is developing the Blue Brain. IBM, in partnership with scientists at Switzerland's École
Polytechnique Fédérale de Lausanne's Brain and Mind Institute, will begin simulating the brain's biological
systems.
HARDWARE AND SOFTWARE REQUIREMENT

A supercomputer
Processor
A very wide network
Very powerful nanobots


ADVANTAGES

Remembering things without any effort.
Making decisions without the presence of the person.
Using the intelligence of a person after death.
Understanding the activities of animals and allowing the deaf to hear via direct nerve
stimulation.

DISADVANTAGES
We become dependent upon the computer.
Others may use technical knowledge against us.
Regaining the memory is a very costly procedure.

3. CONCLUSION
We will be able to transfer ourselves into computers at some point. This will bring both benefits and harms
to human society.

REFERENCES
1. www.artificialbrains.com/blue-brain-project
2. https://en.wikipedia.org/wiki/Blue_Brain_Project
3. www.slideshare.net/NnVvy/blue-brain-25892678

Ms. Salomi S. Thomas is studying in the II year of B.Tech (CSE) at Dronacharya College of
Engineering, Gurgaon (Haryana). Her areas of interest are Programming, Biotechnology,
Bioinformatics, Artificial Intelligence, Cloud Computing, Biomedical Science, etc. She can be reached
at Salomist1996@gmail.com or Salomi.17125@ggnindia.dronacharya.info.


PUZZLE
Compiled by:
Vishnu Sanker
FIND WORDS:

B A A I R F R A M E
C F O K B O K T U V
L G P G N M T C A E
U K I A E U A A N E
S S Q H S L R M I I
T T A F Y T M P A N
E U Z P M I C R A T
R M T A G T H M N R
T U D C H E O A E E
S V J H L N K A P K
A T W E I A A S S C
A K K U M N K A T O
D V M N A T O N A D


Solution:

1. Airframe: An open source cloud computing platform targeted at organizations in the thinking
stage of adopting a private cloud services model or evaluating options and alternatives for
private cloud solutions.
2. Cluster: A group of linked computers that work together as if they were a single computer, for
high availability and/or load balancing.
3. Eucalyptus: Free and open-source computer software for building Amazon Web Services
(AWS)-compatible private and hybrid cloud computing environments.

4. DAAS: Desktop as a Service (DaaS) is a cloud service in which the back-end of a virtual desktop
infrastructure (VDI) is hosted by a cloud service provider.
5. MULTITENANT: The existence of multiple clients sharing resources (services or applications) on
the same physical hardware. Due to the on-demand nature of the cloud, most services are
multi-tenant.
6. CAMP: CAMP, short for Cloud Application Management for Platforms, is a specification ... and
deployment -- across public and private cloud computing platforms.
7. DOCKER: Open-source software that automates the deployment of applications inside
virtualized software containers.
8. IAAS: Cloud infrastructure services in which a virtualized environment is delivered as a service
by the cloud provider. This infrastructure can include servers, network equipment, and
software, including a complete desktop environment such as Windows or Linux.

Mr. Vishnu Sanker M [CSI-9037247080] is studying in the I year of Int. MCA at Amrita School
of Arts and Sciences, Kochi (Kerala). His areas of interest are Network Security, Hacking, and
Programming. He can be reached at vishnusankar7@gmail.com.


Call for Contributions in CSI Adhyayan


(A National Publication dedicated to IT Education, Research and
Student Community)
India's IT sector has continued on a trajectory of high growth since the 1990s. Our
education system, the prime mover of industrial growth and modern
development, has seen phenomenal growth in terms of quantity and quality,
making it the third largest education system in the world after the US and
China. With double-digit economic growth demanding a sustained supply of
knowledge workers, India has emerged as one of the world's largest
consumers of education services.
India has the potential to provide the best education services, with strong
relationships among the education, research and industry sectors.
Today, IT is a trillion dollar opportunity, and so is higher education. We can
proudly say that both Indian IT and the Indian guru are now revered
globally. Both have the potential and ability to scale up with a global mindset.
With regard to emerging technologies, they typically follow a strategy of "start
small, grow real fast and attempt to conquer". Against this backdrop,
and with a view to consolidating the achievements of more than four decades of
the Computer Society of India (CSI) and the new-found vitality in the education and
research community, we have revived our publication CSI Adhyayan after
a gap.
CSI Adhyayan is being positioned as a national publication dedicated to IT
education, research and the student community. This quarterly electronic
publication performs the functions of a newsletter, a magazine and a journal.
We take this opportunity to invite contributions to this venture. Your
invaluable contributions, suggestions and wholehearted support will be
highly appreciated. We appeal to all our chapters, student branches and
member academic institutions to encourage and motivate students to
contribute innovative ideas, explore new vistas of knowledge
and share new findings through CSI Adhyayan.
We especially invite news and updates from our member institutions and
student branches.
Please send your article to csi.adhyayan@csi-india.org.
For any kind of information, contact Dr. Vipin Tyagi via
email at dr.vipin.tyagi@gmail.com.

On behalf of CSI Publication Committee


Prof. A.K. Nayak
Chairman
Email - aknayak@iibm.in

Computer Society of INDIA


HEAD OFFICE
Samruddhi Venture Park, Unit No. 3, 4th Floor, MIDC,
Andheri (E), Mumbai - 400093
Maharashtra

CONTENTS COMPILED AND EDITED BY :

DR. NILESH MODI AND DR. VIPIN TYAGI

CONTACT : CSI.ADHYAYAN@CSI-INDIA.ORG
CSI Education Directorate :
CIT Campus, 4th Cross Road, Taramani,
Chennai-600 113, Tamilnadu, India
Phone : 91-44-22541102
Fax : 91-44-22541103 : 91-44-22542874
Email : director.edu@csi-india.org
