
Available online at www.sciencedirect.com

ScienceDirect

Procedia Computer Science 122 (2017) 1061–1068

www.elsevier.com/locate/procedia

Information Technology and Quantitative Management (ITQM 2017)

Managing Cloud Service Evaluation and Selection


Nitin Upadhyay*
Information Technology, Goa Institute of Management, India

Abstract

This paper addresses an important issue in evaluating the performance of cloud services, given the prevalence of cloud
computing today. It proposes an evaluation and ranking framework for cloud services. First, the importance of cloud
computing and the significance of the quality of service (QoS) based selection problem are introduced. Then a framework
is presented to illustrate the QoS evaluation approach. The paper uses an example adapted from the literature to evaluate
the framework.
© 2017 The Authors. Published by Elsevier B.V.
Peer-review under responsibility of the scientific committee of the 5th International Conference on Information Technology
and Quantitative Management, ITQM 2017.

Keywords: Quality of Service, Cloud Service, Cloud Computing, Cloud Service Evaluation, Performance

1. Introduction

Cloud computing (CC) is a paradigm shift in the computing model [1]. In the CC model, a cloud provider (CP)
provides a cloud service (CS) to the cloud consumer (CCon) on a pay-as-you-go, anytime, anywhere basis [2-3].
The model provides many benefits to small, medium and large scale industries. In the conventional business model,
small and medium enterprises (SMEs) had to invest heavily in upfront capital expenditure (CAPEX) to procure IT
infrastructure and to hire and retain skilled developers and system administrators [4]. Moreover, SMEs that were
not in a mainstream IT business had to bear the burden of a high total cost of ownership. CC has emerged as a
paradigm to deliver on-demand resources and services (e.g., infrastructure, platform, software, etc.) to cloud
customers/consumers [5-7]. CC also gives the CCon the flexibility to use and/or invoke a cloud service anytime,
anywhere. Generally, the CC architecture delivers three main services - Infrastructure as a Service (IaaS), Platform
as a Service (PaaS) and Software as a Service (SaaS) - according to the requirements of the CCon [4, 8-9]. Apart
from these three main services, the CC architecture also provides services under the term “XaaS”, where ‘X’ is a
variable with which various entities can be associated [10].

* Dr. Nitin Upadhyay, Information Technology, Goa Institute of Management. Tel.: +91-08322366751; fax: +91-0832.
E-mail address: upadhyay.nitin@gmail.com

1877-0509 © 2017 The Authors. Published by Elsevier B.V.


Peer-review under responsibility of the scientific committee of the 5th International Conference on Information Technology and
Quantitative Management, ITQM 2017.
10.1016/j.procs.2017.11.474

For example, “Data” can be substituted for ‘X’ to obtain “DaaS” - Data as a Service. CC aims to deliver a network
of virtual services that the CCon can access from anywhere, at any time, under a competitive pricing model [11].
The CC model provides significant benefits to CCon(s): since they do not need to pay large CAPEX, they can focus
more on innovation and on creating business value for their services. The growth of public Cloud offerings has
resulted in a proliferation of similar cloud services from different cloud providers [12-13]. Given such offerings, it
has become a challenge for a CCon to select a particular service from a CP, as one provider might be cheap for a
storage service but expensive for computation; the offerings of Amazon EC2 [14-16] are an example, where IaaS
services differ in pricing based on the region in which the service is offered. It is now of paramount importance
for the CCon to understand the functional and non-functional requirements of a CP's service quantitatively, so that
services can be benchmarked for quality against the services provided by the multitude of CPs [17]. Therefore,
given the multitude of CPs and their Cloud service offerings, it is not sufficient for a CCon merely to identify
multiple cloud providers for a similar service; the CCon must also select the optimal and most suitable service as
per the requirements. Moreover, deciding which service best fits the required functional and non-functional
requirements is a decision problem [2, 7, 18]. In this context, this paper proposes a QoS evaluation and ranking
framework for cloud services. The framework is flexible enough to accommodate multiple quality attributes of
interest to the CCon when evaluating and ranking cloud services.
This paper is organized as follows. Section 2 discusses related work. In Section 3, a QoS evaluation framework
is presented. Section 4 presents an illustrative example. Section 5 concludes the paper.

2. Related Work

In this section, related work is reviewed in order to understand the gaps in the evaluation and ranking of QoS for
cloud services. Few frameworks have been developed to evaluate and rank cloud services, since cloud computing is
a comparatively new concept compared to web services; the most closely related work is reported in [6-7, 19]. At
any given time, the CC model can offer services to thousands of users located in different geographical locations.
These consumers may access diverse types of services [20] that have dynamic requirements [21].
The review of the related work is divided into three main streams - evaluation frameworks and their applications,
evaluation techniques, and evaluation criteria and metrics.

2.1 Evaluation framework and its application

Researchers have mainly focused on the performance of various types of applications - scientific computing, e-
commerce and web applications. For example, the work of Iosup et al. [22] focuses on analyzing many-task
applications on Clouds by considering performance parameters only. The CloudCmp framework [23] compares the
performance of different Cloud services such as Amazon EC2, Windows Azure and Rackspace by considering
low-level performance parameters such as CPU and network throughput. To measure quality and prioritize
services, the authors of [14, 24] have proposed frameworks; these frameworks do not take a flexible view of quality
and thus limit the overall evaluation. Saravanan and Kantham [25] proposed a novel framework for ranking and
advance reservation of cloud services using Quality of Service (QoS) attributes; however, not all of the QoS
characteristics that have been discussed are considered for the evaluation. In [21], the authors proposed a QoS
ranking prediction framework for Cloud services that considers the historical service usage data of consumers;
this framework facilitates inexpensive real-world service invocations. In [26], the authors proposed a generic QoS
framework consisting of four components for Cloud workflow systems. However, the framework is not suitable for
solving complex problems such as multi-QoS-based service selection, monitoring and violation handling.

2.2 Evaluation techniques

One of the popular decision-making techniques used for cloud service evaluation and ranking is the Analytic
Hierarchy Process (AHP). The framework proposed in [14, 24] uses an AHP-based ranking mechanism that can
evaluate Cloud services for different applications depending on their QoS requirements. Various other works
reviewed in [2] have applied the AHP technique to cloud resources. In [17], various SaaS products are evaluated
based on AHP and an expert scoring system. In [14, 24], AHP is used to evaluate IaaS products. Although other
techniques have also been employed to evaluate and rank cloud services, they are limited in their ability to handle
both qualitative and quantitative criteria and to compute effectively [2]. Although AHP is an effective decision-
making tool, disadvantages such as complex pairwise comparisons and subjectivity make it a cumbersome tool;
the comparisons also become computationally intensive and unmanageable when the number of criteria and
alternatives is large. Thus, this paper employs the Technique for Order Preference by Similarity to Ideal Solution
(TOPSIS), which can consider a large number of criteria (qualitative and quantitative) and alternatives to evaluate
and rank cloud services. The technique is computationally effective and easily manageable.
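
To make the mechanics concrete, the following is a minimal sketch of a TOPSIS ranking over a decision matrix, assuming vector normalization of each criterion, a benefit/cost flag per criterion and Euclidean separation measures; the function and variable names are illustrative and are not prescribed by the proposed framework.

```python
import numpy as np

def topsis_rank(matrix, weights, is_benefit):
    """Rank alternatives (rows) against criteria (columns) using TOPSIS.

    matrix     : (m alternatives x n criteria) array of raw scores
    weights    : length-n sequence of criterion weights
    is_benefit : length-n booleans; True = higher is better, False = cost criterion
    """
    X = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                                   # normalize the weights

    # Vector-normalize each criterion column, then apply the weights
    V = (X / np.linalg.norm(X, axis=0)) * w

    # Ideal (best) and negative-ideal (worst) value per criterion
    ideal = np.where(is_benefit, V.max(axis=0), V.min(axis=0))
    anti_ideal = np.where(is_benefit, V.min(axis=0), V.max(axis=0))

    # Euclidean separation from the ideal (S+) and the negative-ideal (S-)
    s_plus = np.linalg.norm(V - ideal, axis=1)
    s_minus = np.linalg.norm(V - anti_ideal, axis=1)

    # Closeness coefficient Ci* = S- / (S+ + S-); larger means closer to the ideal
    closeness = s_minus / (s_plus + s_minus)
    order = np.argsort(-closeness)                    # alternatives listed best first
    return s_plus, s_minus, closeness, order
```

The alternative with the largest closeness coefficient is ranked first; this is the same Ci* reported later in Table 4.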

2.3 Evaluation criteria and metrics

In [22], metrics for measuring the scalability, cost, peak-load handling and fault tolerance of Cloud environments
are proposed. The framework proposed in [14, 24] covers only quantifiable QoS attributes such as Accountability,
Agility, Assurance of Service, Cost, Performance, Security, Privacy and Usability; it is not suitable for non-
quantifiable QoS attributes such as Service Response Time, Sustainability, Suitability, Accuracy, Transparency,
Interoperability, Availability, Reliability and Stability. Garg et al. [14] proposed metrics for Data Center
Performance per Energy, Data Center Infrastructure Efficiency, Power Usage Efficiency, Suitability,
Interoperability, Availability, Reliability, Stability, Accuracy, Cost, Adaptability, Elasticity, Usability, Throughput
and Efficiency, and Scalability. A few of these metrics, such as suitability, are not computationally effective. The
general criteria mentioned and covered in many studies fall under the following: cost, performance, availability,
reliability, security, usability, agility and reputation [20].

3. QoS Evaluation Framework

In this section, a QoS evaluation and ranking framework for cloud services is proposed. Fig. 1 illustrates the QoS
evaluation and ranking framework, in which a plurality of cloud services can be evaluated and ranked.

Fig 1. QoS evaluation and ranking framework



The framework includes the following components - cloud administrator, cloud data discovery, cloud service
discovery and cloud services. The cloud administrator is composed of a cloud service measurement component and
a cloud manager component; it communicates with the cloud data discovery component to obtain the required
service parameter data. The cloud data discovery component is composed of a cloud monitor component and a
history manager component. In evaluating and ranking a cloud service, the proposed framework does not
necessarily select the most cost-effective, i.e. least expensive, cloud service provider, because the service
measurement depends on multiple other parameters that directly or indirectly affect the cost of the service.

The cloud administrator component is responsible for computing the QoS of a cloud service and generating the
cloud service ranking in the form of indices. The cloud service measurement component receives the customer's
request for cloud service evaluation; it collects all of the customer's requirements and performs the discovery and
ranking of suitable services using the other components. It uses one or more QoS parameters to generate a service
index indicating which cloud service best fits the user's service request requirements. The cloud manager
component keeps track of customers' SLAs with Cloud providers and their fulfillment history; it also manages the
smooth gathering of information from the cloud administrator component and delegates it to the cloud service
measurement component. The cloud data discovery component gathers the requisite service-level data for
computing the QoS-based ranking of cloud services. The cloud monitor component first discovers the Cloud
services that can satisfy the user's essential QoS requirements and then monitors their performance, gathering data
such as VM speed, memory, scaling latency, storage performance, network latency and available bandwidth; it also
keeps track of how the SLA requirements of previous customers have been satisfied by the Cloud provider. The
history manager component stores the history of services provided by each cloud provider, including past customer
feedback, interaction and service experience information for each cloud vendor. The cloud service discovery
component stores the services and their features advertised by the various Cloud providers, i.e. the information
about vendors who provide cloud-related services. The cloud service measurement component applies the
decision-making method described in Fig. 2 in a structured way.

Fig 2. Evaluation of Cloud Services
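
As a reading aid, the sketch below summarizes the division of responsibilities described above as minimal Python interfaces. All class and method names are illustrative assumptions; the paper specifies the components only at a conceptual level.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class ServiceRecord:
    """A cloud service advertised by a provider, as held by cloud service discovery."""
    provider: str
    name: str
    features: Dict[str, float]   # measured/advertised QoS attribute values

class CloudMonitor:
    """Discovers candidate services and gathers their runtime QoS data."""
    def discover(self, essential_qos: Dict[str, float],
                 catalogue: List[ServiceRecord]) -> List[ServiceRecord]:
        # Simplified filter: treat every essential requirement as a minimum threshold
        return [s for s in catalogue
                if all(s.features.get(k, 0.0) >= v for k, v in essential_qos.items())]

class HistoryManager:
    """Stores past SLA fulfilment and customer feedback per provider."""
    def __init__(self) -> None:
        self.records: Dict[str, List[dict]] = {}

class CloudServiceMeasurement:
    """Computes QoS indices (e.g., via TOPSIS) and ranks the candidate services."""
    def rank(self, candidates: List[ServiceRecord],
             weights: Dict[str, float]) -> List[ServiceRecord]:
        # Placeholder: build the decision matrix from the candidates' features and
        # apply a routine such as topsis_rank() from Section 2.2
        raise NotImplementedError
```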



4. Illustrative Example

To validate the proposed framework, experiments were conducted for three cloud services - Service1 (S1),
Service2 (S2) and Service3 (S3) - and seventeen quality attributes (Qi1, Qi2, ..., Qi17) under two different user
requirements. The QoS data were collected from various evaluation studies of three IaaS Cloud providers: Amazon
EC2, Windows Azure and Rackspace [14, 16, 22, 24]. The data for the attributes shown in Table 1 and Table 2
were largely taken from [24] in order to maintain consistency in the measured values of the quality attributes used
for cloud service evaluation and ranking. The preferences, expressed as weights (WQi1, WQi2, ..., WQi17)
associated with the respective quality attributes, were taken from [14, 16]; see Table 3. These weights were also
approved by key expert stakeholders. Accountability (Qi1), CPU capacity (Qi2), memory capacity (Qi3), disk
capacity (Qi4), availability (Qi5), CPU service stability (Qi6), memory service stability (Qi7), free support
serviceability (Qi8), type of support serviceability (Qi9) and security (Qi10) are benefit attributes. Elasticity time
(Qi11), service stability upload time (Qi12), on-going VM cost (Qi13), on-going data cost (Qi14), on-going storage
cost (Qi15), service response time range (Qi16) and service response average value (Qi17) are cost attributes.
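
Using this classification, the benefit/cost flags needed by a TOPSIS-style routine (such as the sketch in Section 2.2) could be encoded as follows; this is purely illustrative and simply follows the ordering Qi1-Qi17.

```python
# True  -> benefit attribute (higher is better): Qi1-Qi10
# False -> cost attribute (lower is better):     Qi11-Qi17
is_benefit = [True] * 10 + [False] * 7
```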

Table 1. QoS attributes and values of cloud services - Service 1 (S1), Service 2 (S2) and Service 3 (S3)
(adapted from [14, 16, 22, 24])

Group            QoS attribute                      Service 1               Service 2               Service 3
Accountability   Level: 0-10                        4                       8                       4
Capacity         CPU                                9.6                     12.8                    8.8
                 Memory                             15                      14                      15
                 Disk                               1690                    2040                    630
Agility          Elasticity time                    80-120                  520-780                 20-200
Availability     ---                                99.95%                  99.99%                  100%
Service          Upload time                        13.6                    15                      21
stability        CPU                                17.9                    16                      23
                 Memory                             7                       12                      5
Assurance        Serviceability - Free support      0                       1                       1
                 Serviceability - Type of support   24/7, Phone, Urgent     24/7, Phone, Urgent     24/7, Phone,
                                                    Response, Diagnostic    Response, Diagnostic    Urgent Response
                                                    Tools                   Tools
Cost             On-going VM cost                   0.68                    0.96                    0.96
                 On-going data cost                 10                      10                      8
                 On-going storage cost              12                      15                      15
Performance      Service response time - Range      80-120                  520-780                 20-200
                 Service response time - Average    100                     600                     30
Security         Level: 0-10                        4                       8                       4

Table 2. QoS attribute values required by the consumers - Requirement 1, Requirement 2 (adapted from [14, 16])

Group            QoS attribute                      Requirement 1           Requirement 2
Accountability   Level: 0-10                        4                       4
Capacity         CPU                                6.4 GHz                 9 GHz
                 Memory                             10 GB                   12 GB
                 Disk                               500 GB                  700 GB
Agility          Elasticity time                    60-120 sec              70-120 sec
Availability     ---                                99.9%                   99%
Service          Upload time                        ---                     ---
stability        CPU                                ---                     ---
                 Memory                             ---                     ---
Assurance        Serviceability - Free support      ---                     ---
                 Serviceability - Type of support   24/7, Phone,            24/7, Phone
                                                    Urgent Response
Cost             On-going VM cost                   < 1 $/hr                < 1 $/hr
                 On-going data cost                 100 GB/month            120 GB/month
                 On-going storage cost              1000 GB                 1000 GB
Performance      Service response time - Range      60-120 sec              70-120 sec
                 Service response time - Average    ---                     ---
Security         Level: 0-10                        4                       4

Table 3. Consumer requirement weights for the quality attributes (Qi1, Qi2, ..., Qi17) (adapted from [14, 16])

                     WQi1  WQi2  WQi3  WQi4  WQi5  WQi6  WQi7  WQi8  WQi9  WQi10  WQi11  WQi12  WQi13  WQi14  WQi15  WQi16  WQi17
User requirement 1   0.05  0.5   0.3   0.2   0.7   0.4   0.3   0.7   0.3   0.05   0.6    0.3    0.6    0.2    0.2    0.5    0.5
User requirement 2   0.05  0.1   0.2   0.3   0.9   0.4   0.3   0.7   0.3   0.05   0.6    0.5    0.6    0.2    0.2    0.5    0.5
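
For reference, the weight vectors in Table 3 can be passed directly to a TOPSIS-style routine such as the sketch in Section 2.2; the listing below is illustrative and follows the ordering Qi1-Qi17.

```python
# Weights for Qi1..Qi17 as read from Table 3
weights_req1 = [0.05, 0.5, 0.3, 0.2, 0.7, 0.4, 0.3, 0.7, 0.3,
                0.05, 0.6, 0.3, 0.6, 0.2, 0.2, 0.5, 0.5]
weights_req2 = [0.05, 0.1, 0.2, 0.3, 0.9, 0.4, 0.3, 0.7, 0.3,
                0.05, 0.6, 0.5, 0.6, 0.2, 0.2, 0.5, 0.5]
```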

In the current work, a cloud environment is created for three Cloud services from three Cloud providers and two
user requirements (requirement 1, requirement 2) [16]. The result of the computation performed on the three
services is shown in Table 4.

Table 4. Case study - QoS evaluation and ranking of cloud services - Service 1 (S1), Service 2 (S2) and
Service 3 (S3) (adapted from [14, 16, 24])

                          S+       S-       Ci*      Ranking
User requirement 1   S1   0.5292   0.7950   0.6003   2
                     S2   0.8888   0.500    0.3822   3
                     S3   0.2691   1.0109   0.7897   1

User requirement 2   S1   0.523    0.805    0.6064   2
                     S2   0.889    0.557    0.3855   3
                     S3   0.290    1.0108   0.7765   1
The proposed framework is flexible enough to consider a multitude of qualitative and quantitative data for the
evaluation and ranking of the QoS of cloud services. The results show that for user requirement 1 the three cloud
services are ranked as S3 > S1 > S2 (see Fig. 3), and for user requirement 2 the ranking is likewise S3 > S1 > S2
(see Fig. 4). The results produced by the framework are consistent with those reported in [14, 16, 24], but the
proposed framework is effective, manageable, computationally economical and simple to use. The framework
benefits the different stakeholders in cloud service evaluation, e.g. architects, decision makers, modelers, analysts,
designers, developers, quality analysts, domain experts, testers, product managers and consultants, in designing,
acquiring, composing and offering effective cloud services. The framework is capable of considering multiple and
extended quality attributes of interest to the stakeholders in order to evaluate and rank cloud services.
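
As a quick check of Table 4, the standard TOPSIS closeness coefficient Ci* = S- / (S+ + S-) and the resulting ranking can be recomputed from the reported separation values; the snippet below uses the user requirement 1 figures and is illustrative only (small deviations, as for S2, would stem from rounding in the reported S+ and S- values).

```python
# Recompute Ci* = S- / (S+ + S-) and the ranking from the Table 4 separations
separations = {              # service: (S+, S-) for user requirement 1
    "S1": (0.5292, 0.7950),
    "S2": (0.8888, 0.500),
    "S3": (0.2691, 1.0109),
}
closeness = {s: s_minus / (s_plus + s_minus)
             for s, (s_plus, s_minus) in separations.items()}
ranking = sorted(closeness, key=closeness.get, reverse=True)
print(ranking)   # ['S3', 'S1', 'S2'], matching the S3 > S1 > S2 ordering above
```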

Fig 3. QoS ranking of cloud services for User requirement 1
Fig 4. QoS ranking of cloud services for User requirement 2

5. Conclusion

Cloud computing (CC) has enabled organizations and industries of all sizes - small, medium or large - to outsource
their various IT needs. The CC model presents cloud consumers with the challenging task of selecting the optimal
and best-suited cloud service, as per their requirements, from a multitude of cloud services. In this paper, a
systematic QoS evaluation and ranking framework for cloud services is presented. The proposed framework is
computationally effective, easily manageable and flexible enough to consider a multitude of qualitative and
quantitative data for the evaluation and ranking of cloud services. An illustrative case study is described to validate
and demonstrate the applicability of the proposed framework. In future work, the framework will be extended to
capture uncertainty in the QoS requirements for the cloud.

References

[1] Cloud Computing Innovation Council of India. 2014. Cloud Computing Innovation in India: A Framework and Roadmap - White
Paper 2.0 , IEEE-SA, USA.
[2] Whaiduzzaman, M., Gani, A., Anuar, N. B., Shiraz, M., Haque, M. N., and Haque, I. T. 2014. “Cloud service selection using
multicriteria decision analysis,” The Scientific World Journal (2014), pp. 1-10.
[3] Kossmann, D., Kraska, T., and Loesing, S. 2010. “An Evaluation of Alternative Architectures for Transaction Processing in the Cloud,”
Proc. 2010 Int. Conf. Manag. data - SIGMOD ’10, pp. 579
[4] Wu, L., Garg, S. K., and Buyya, R. 2012. “SLA-based admission control for a Software-as-a-Service provider in Cloud computing
environments,” J. Comput. Syst. Sci., (78:5), pp. 1280–1299.
[5] Hashem, I. A. T., Yaqoob, I., Anuar, N. B., Mokhtar, S., Gani, A., and Khan, S. U. 2015. “The rise of ‘big data’ on cloud computing:
Review and open research issues,” Inf. Syst. (47), pp. 98–115
[6] Vouk, M. A.. 2008. “Cloud computing - Issues, research and implementations,” Proc. Int. Conf. Inf. Technol. Interfaces, ITI, pp. 31–40.
[7] Jula, A., Sundararajan, E., and Othman, Z. 2014. “Cloud computing service composition: A systematic literature review,” Expert Syst.
Appl., (41:8), pp. 3809–3824.
[8] Giessmann, A. 2013. “Business Models of Platform as a Service (PaaS) Providers: Current State and Future Directions,” Journal of
Information Technology and Application, (13:4), pp. 31–55.
[9] Alam, M. I., Pandey, M., and Rautaray, S. S. 2015. “A Comprehensive Survey on Cloud Computing,” I.J. Inf. Technol. Comput. Sci.,
(2:2), pp. 68–79.
[10] Le, S., Dong, H., Hussain, F. K., Hussain, O. K., and Chang, E. 2014. “Cloud service selection: State-of-the-art and future research
directions,” J. Netw. Comput. Appl., (45) pp. 1–17
[11] Esposito, C., Ficco, M., Palmieri, F., and Castiglione, A. 2015. “Smart Cloud Storage Service Selection Based on Fuzzy Logic, Theory
of Evidence and Game Theory,” IEEE Trans. Comput., 1–14.
[12] Puthal, D., Sahoo, B. P. S., Mishra, S., and Swain, S. 2015. “Cloud Computing Features, Issues, and Challenges: A Big Picture,” 2015
Int. Conf. Comput. Intell. Networks, no. Cine, pp. 116–123.
[13] Tao, F., Zhang, L., Venkatesh, V. C., Luo, Y., and Cheng, Y. 2011. “Cloud manufacturing: a computing and service-oriented
manufacturing model,” Proc. Inst. Mech. Eng. Part B J. Eng. Manuf., (225), pp. 1969–1976.
[14] Garg, S. K., Versteeg, S., and Buyya, R. 2012. “A framework for ranking of cloud computing services,” Futur. Gener. Comput. Syst.,
(29:4), pp. 1012–1023.
[15] Gass, S. I. 2005. “Model world: The great debate - MAUT versus AHP,” Interfaces (Providence)., (35:4), pp. 308–312.
[16] Mamoun, M. H., and Ibrahim, E. M. 2014. “A Proposed Framework for Ranking and Reservation of Cloud Services,” (4:9), pp. 536–
541.
[17] Godse, M., and Mulik, S. 2009. “An approach for selecting Software-as-a-Service (SaaS) product,” CLOUD 2009 - 2009 IEEE Int.
Conf. Cloud Comput., pp. 155–158.
[18] Menzel, M., and Ranjan, R. 2012. “CloudGenius: Decision Support for Web Server Cloud Migration,” WWW 2012 – Sess. Web Eng.
2, vol. abs/1203.3, pp. 979–988.
[19] Kokash, N. and Andrea, V. D., (2007) “Evaluating Quality of Web Services: A Risk-driven Approach.” Lecture Notes in Computer
Science, (4439), pp.180-194.
[20] Bardsiri, A. K. and Hashemi, S. M. 2014. “QoS Metrics for Cloud Computing Services Evaluation,” Int. J. Intell. Syst. Appl., (6:12),
pp. 27–33.
[21] Zheng, Z., Wu, X., Zhang, Y., Lyu, M. R., and Wang, J. 2013. “QoS ranking prediction for cloud services,” IEEE Trans. Parallel
Distrib. Syst., (24:6), pp. 1213–1222.
[22] Iosup, A., Ostermann, S., and Yigitbasi, N. 2011. “Performance Analysis of Cloud Computing Services for Many-Tasks Scientific
Computing,” IEEE Transactions on Parallel and Distributed Systems (22:6), pp. 1–16.
[23] Li, A., Yang, X., Kandula, S., and Zhang, M. 2010. “CloudCmp: Comparing Public Cloud Providers,” Proc. 10th Annu. Conf. Internet
Meas. - IMC ’10, p. 1.
[24] Garg, S. K., Versteeg, S., and Buyya, R. 2011. “SMICloud: A framework for comparing and ranking cloud services,” Proc. - 2011 4th
IEEE Int. Conf. Util. Cloud Comput. pp. 210–218.
[25] Saravanan, K. and Kantham, M. L. 2013. “An enhanced QoS Architecture based Framework for Ranking of Cloud Services,” Int. J.
Eng. trends Technol., (4:4), pp. 1022–1031.
[26] Liu, X., Yang, Y., Yuan, D., Zhang, G., Li, W., and Cao, D. 2011. “A generic QoS framework for cloud workflow systems,” Proc. -
IEEE 9th Int. Conf. Dependable, Auton. Secur. Comput. DASC 2011, pp. 713–720.
