
Benchmarking: An International Journal

Comparative evaluation of Indian technical institutions using distance based approach method

Manik Chandra Das, Bijan Sarkar and Siddhartha Ray
Article information:
To cite this document:
Manik Chandra Das Bijan Sarkar Siddhartha Ray, (2013),"Comparative evaluation of Indian technical
institutions using distance based approach method", Benchmarking: An International Journal, Vol. 20 Iss 5
pp. 568 - 587
Permanent link to this document: http://dx.doi.org/10.1108/BIJ-06-2011-0030
Downloaded by UNIVERSITY OF OTAGO on 10 November 2016, at 07:38 (PT)
References: this document contains references to 54 other documents.




Comparative evaluation of Indian technical institutions using distance based approach method

Manik Chandra Das
Automobile Engineering Department, MCKV Institute of Engineering, Liluah, India

Bijan Sarkar
Production Engineering Department, Jadavpur University, Kolkata, India, and

Siddhartha Ray
Mechanical Engineering Department, National Institute of Technical Teachers’ Training & Research, Kolkata, India

Received 16 June 2011; revised 1 November 2011 and 19 November 2011; accepted 19 November 2011

Abstract
Purpose – Due to liberalization, privatization and globalization, the need for competent technical manpower at an economical cost is increasing rapidly. Many foreign multinationals are focusing on India for employable talent. Many technical institutions with cutting edge technologies and leading edge techniques are being set up through foreign collaboration and national and private initiatives. The objective of this study is to propose a model for performance evaluation and benchmarking of Indian technical institutions from the perspective of all stakeholders.
Design/methodology/approach – For the proposed framework, a multiple criteria decision-making tool, the distance-based approach (DBA) methodology, is applied for performance evaluation of seven Indian technical institutions, taking into account selected criteria: faculty strength (FS), student intake (SI), number of PhDs awarded (PhD), number of patents applied for (Patent), the campus area in acres (CA) and tuition fee per semester in rupees (TF). These evaluation criteria were selected by consulting experts in various fields with the help of a questionnaire and aggregating their views in an ameliorated nominal group technique session. The subjective weights of the criteria are determined using the analytic hierarchy process (AHP). For the analysis, the required data are collected from the annual report published by the Ministry of Human Resource Development (MHRD) for the year 2007-08.
Findings – In this paper, we have chosen seven centrally funded technical institutions for study and the institutions are coded as A, B, C, D, E, F and G. The result of the study reveals that A is the best and F is the worst. The ranking we get is in the order A ≻ B ≻ E ≻ C ≻ G ≻ D ≻ F. From the result it is understood that A can be considered as a benchmark for B, C and E (which form the second group) and this second group can be considered as an improvement target for the rest. At the end a holistic technical education system model (HTESM) is proposed.
Originality/value – This paper is one of the few studies that evaluate the performance of technical institutions in India. The novelty of the approach is that DBA and AHP are used as a benchmarking technique in a simple methodology which is generic in nature.
Keywords Indian technical institutions, Performance evaluation, Benchmarking, DBA, Ranking, Performance appraisal, India
Paper type Research paper

Benchmarking: An International Journal, Vol. 20 No. 5, 2013, pp. 568-587
© Emerald Group Publishing Limited, 1463-5771
DOI 10.1108/BIJ-06-2011-0030
1. Introduction
To make India a knowledge-based society, the most important area that needs to be addressed first is education. Education makes man and therefore it builds the nation. As envisaged by erudite Indian scholars around 1,000 years ago, education is a never ending journey from darkness to light (Tamaso Ma Jyotirgamaya). It is the manifestation of the perfection already present in man. The Indian educational system has a glorious past. Nalanda University, which was visited by the great Chinese Buddhist monk Hsüan-tsang, is one of the most ancient centers of higher learning in India. It is a wonder in today’s day and age how the reputation of such an institution spread far and wide without the aid of telephone, fax, internet, etc.
The primary objective of education is to impart knowledge so as to improve the quality of life. The United Nations Development Programme (UNDP), under the leadership of Prof. Amartya Sen and Prof. Mahbub ul Haq, devised a composite index called the Human Development Index (HDI) to measure the quality of life. The components of HDI are life expectancy, literacy rate and gross domestic product (GDP) of the country. According to the 1997 HDI value, India ranks 139th out of 174 countries. Education can improve the HDI value. We need education to be enlightened and upgraded. The Indian patriotic monk Swami Vivekananda said, “we want that education by which character is formed, strength of mind is increased, the intellect is expanded and by which one can stand on one’s own feet”.
Technical education plays a vital role in the human resource development of the country by creating skilled manpower, enhancing industrial productivity and improving the quality of life. The technical education system in the country can be broadly classified into three categories: Central Government funded institutions, State Government/State funded institutions and self-financed institutions. In 2007-2008, there were 52 centrally funded institutions (CFI) of technical and science education. The breakup of these 52 institutions is furnished in Table I. These institutions function following the guidelines stipulated by the All India Council for Technical Education (AICTE) and the Council of Architecture. As of now 2,300 engineering colleges are running in India and 600,000 students are graduating each year (Biswas et al., 2010). As uniform quality output has become the prime concern today, performance evaluation and benchmarking of these technical institutions have become a research issue. All the stakeholders want to get optimum benefits in the shortest period of time and at an economical cost to improve the quality of life. Therefore, this is the high time to do the performance evaluation of these technical institutions. In this context

Table I. List of CFI

Name of the institutions                                                   Number of institutions
Indian Institutes of Technology (IITs)                                      7
Indian Institutes of Management (IIMs)                                      7
Indian Institute of Science (IISc), Bangalore                               1
Indian Institutes of Science, Education & Research (IISERs)                 3
National Institutes of Technology (NITs)                                   20
Indian Institutes of Information Technology (IIITs)                         4
National Institutes of Technical Teachers’ Training & Research (NITTTRs)    4
Others                                                                      6
Total                                                                      52
benchmarking has been considered one of the popular management practices adopted by organizations to understand how well they are performing relative to their competitors. The idea of benchmarking is not new. It was conceptualized long ago by the English poet P. B. Shelley (1792-1822). In his poem Ozymandias, he wrote “My name is Ozymandias, king of kings: look on my works, ye Mighty, and despair!” The term “king of kings” is synonymous with the concept of “best of best” in benchmarking. Benchmarking is not a destination but an endless journey towards continuous quality improvement.
Benchmarking is defined as the process of identifying best practice in relation to both products and the processes by which those products are created and delivered. The search for best practice can take place both inside a particular organization and also in other organizations. The objective of benchmarking is to understand and evaluate the current position of an organization in relation to best practice and to identify areas and means of performance improvement. Over the years various benchmarking concepts have been developed (Augusto et al., 2008; Kablan and Dweiri, 2003; Andersen and Jordan, 1998) by many researchers. The concept of benchmarking can be used in a fuzzy environment also (Maravelakis et al., 2006). As seen in Figure 1, this concept can be applied to almost all sectors of society due to its generic nature. However, the benchmarking process may change depending on the area of its application – such as R&D benchmarking (Hurmelinna et al., 2002), airport services benchmarking (Schmidberger et al., 2009), maintenance performance benchmarking (Komonen, 2002), etc.

Figure 1. Benchmarking tools and areas of applications
(Tools: regression analysis, balanced scorecard, DEA and other MCDM tools; application areas: administration, education, healthcare, finance, telecommunication and transportation, all centered on performance measurement and benchmarking.)
Through an exhaustive literature survey, we come to know about a lot of performance measurement and benchmarking tools. Of these tools, data envelopment analysis (DEA) is most commonly found in the literature (Debnath and Shankar, 2008; Hilmola, 2011; Singh et al., 2011; Beriha et al., 2011; Kumar, 2011; Goncharuk, 2011; Sufian, 2011). Other performance evaluation and benchmarking tools are the balanced scorecard (Punniyamoorthy and Murali, 2008; Doloi, 2010; Valmohammadi and Servati, 2011; Kollberg and Elg, 2011; Khan et al., 2011), regression analysis (Gomes and Yasin, 2011; Bahri et al., 2011), etc. Apart from these, some other MCDM tools like AHP (Tavana, 2004; Sarmiento and Thomas, 2010), ANP (Kirytopoulos et al., 2008), TOPSIS and goal programming (Mehregan et al., 2010) are also used for the purpose of benchmarking.
During the past two decades a considerable volume of research has been conducted regarding university evaluation worldwide. Most of the studies have focused on the UK or Australia. The use of league tables (Yorke, 1997, 1998; Herbert and Thomas, 1998) is found to rank academic institutions in the UK. League tables are generally used to compare the academic performance of various institutions by considering a set of well-defined criteria such as student satisfaction, research assessment/quality, entry standards, student-staff ratio, academic services spend, facilities spend, good honours, graduate prospects and completion rate. A statistical technique known as Z-transformation is applied to each criterion to create a score for that criterion. Weighted Z-scores on each criterion help to determine the final rank of the institution.
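This weighted Z-score aggregation can be sketched as follows. The institution names, scores and weights below are invented for illustration; the exact criteria and weightings vary between published league tables.

```python
import statistics

def weighted_z_rank(scores, weights):
    """Rank institutions by a weighted sum of per-criterion z-scores.

    scores: {institution: [value per criterion]}; weights: one per criterion.
    Returns institutions sorted best-first (higher weighted z-score = better).
    """
    names = list(scores)
    totals = {name: 0.0 for name in names}
    for j, w in enumerate(weights):
        column = [scores[name][j] for name in names]
        mean = statistics.mean(column)
        sd = statistics.stdev(column)  # sample standard deviation
        for name in names:
            totals[name] += w * (scores[name][j] - mean) / sd
    return sorted(names, key=totals.get, reverse=True)

# Hypothetical data: two criteria (student satisfaction, research quality)
table = {"U1": [4.1, 3.2], "U2": [3.8, 3.9], "U3": [3.5, 2.8]}
print(weighted_z_rank(table, [0.5, 0.5]))  # -> ['U2', 'U1', 'U3']
```

Standardizing each criterion before weighting keeps criteria measured on different scales from dominating the aggregate score.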
Apart from the concept of the league table, research on universities in the UK includes that by Athanassopoulos and Shale (1997), Glass et al. (1995), Johnes (1996, 2006), Casu and Thanassoulis (2006) and Flegg et al. (2004). In the UK, Portela and Thanassoulis (2001) have investigated the efficiency of schools also. Plenty of studies have been reported on efficiency analysis of Australian universities; among the authors that have written about it we can mention Avkiran (2001), Abbott and Doucouliagos (2003), Worthington and Lee (2008), etc. Kao and Hung (2008) have concentrated on performance evaluation of academic departments in Taiwan. Fandel (2007) makes a study on German universities. Korhonen et al. (2001) analyse 18 research units at the Helsinki School of Economics in Finland. Elsewhere, Hashimoto and Cohn (1997) have investigated Japanese universities, and McMillan and Datta (1998) have investigated Canadian universities. In India, Tyagi et al. (2009) have done a similar study dealing with assessment of academic departments of the Indian Institute of Technology (IIT) Roorkee. All the studies mentioned above use various DEA models for the purpose.
In any list of the best technical institutions in India, the first name that comes to mind is the group of institutions called the IITs. The purpose of this paper is to assess the relative performance of these IITs based on multiple criteria. In this paper, we have considered seven IITs located at Kharagpur, Bombay, Madras, Kanpur, Delhi, Guwahati and Roorkee for study, and these are coded as A, B, C, D, E, F and G, respectively. These institutions are declared “institutions of national importance”. The main objective of the IITs is to impart world class education in engineering and technology, to conduct research in the relevant fields, and to further the advancement of learning and dissemination of knowledge. As mentioned earlier, a lot of research work dealing with performance evaluation of academic institutions worldwide has been reported in the last 20 years. Several approaches have been applied for this purpose, like performance indicators, parametric methods (such as the ordinary least squares method and the stochastic frontier method) and non-parametric methods, such as various DEA models.
Each method has its strengths and limitations. For the single input and single output case the ratio style performance indicators can work well. But when multiple criteria (which may be conflicting in nature) exist, they are unable to draw the right inference. Parametric methods require an explicit functional form for the technology. In this paper we suggest an integrated multi-criteria decision-making model consisting of the analytic hierarchy process (AHP) and the distance-based approach (DBA) method to assess the relative performance of the IITs. The contribution of the present work is that this model is robust; it is easy to deal with; complex mathematics is not required; and the evaluation criteria encompass stakeholders’ preferences. Computation of the degree of relative importance of the evaluation criteria is made through AHP. The DBA method helps to compute an overall score on the basis of which benchmarks are identified.
The paper is organized as follows: Section 2 describes the application procedure of the proposed method. Section 3 gives information about data and computation. A holistic technical education system model (HTESM) is proposed in Section 4. In the last section the conclusion is given.

2. Research design
2.1 Selection of evaluation criteria
The performance of technical institutions in an absolute sense is very difficult to measure. There are a lot of factors/criteria/attributes/objectives that affect the performance of the institutions, and the measurement result is very sensitive to the selection of the criteria. In the literature mentioned above, the criteria are categorized as either inputs or outputs to conform to DEA algorithms. Thus, the selection of criteria plays a crucial role in performance evaluation. According to Barros (2005, p. 504):
[. . .] the criterion of available data is frequently used, since it encompasses the other two criteria applied to the selection of the determinants. The first of the two is the literature survey, which is a way to ensure validity of research and therefore a criterion to take into account. The remaining criterion for measurement selection is the professional opinion of senior management.
For our research we form an expert committee. Aggregating the views of the committee members by conducting an ameliorated nominal group technique session, we select the following attributes for the performance evaluation of technical institutions:
. faculty strength (FS);
. student intake (SI);
. number of PhDs awarded (PhD);
. number of patents applied for (Patent);
. the campus area in acres (CA); and
. tuition fee per semester (TF) in rupees.

Instead of classifying the criteria into outputs and inputs, in our study we call these beneficial or non-beneficial criteria, respectively. In most of the literature FS and SI are considered inputs, whereas we consider them beneficial criteria, i.e. the higher the better, according to Taguchi’s concept. The reason is that India is the second most populous country in the world. This country is blessed with the availability of human resources in the working age group. The challenge before the country today is that this available manpower has to be made employable by imparting the necessary training and skill through technical education, to cater to the needs of an expanding economy. It is understood from published news that the IITs are suffering from an acute shortage of qualified faculty. The desired teacher-student ratio, as prescribed by AICTE (an autonomous body under the Ministry of Human Resource Development (MHRD)), should be 1:15. Many institutions are running with a lower value of this ratio. Therefore, these two criteria are considered beneficial criteria from a socioeconomic point of view. In this paper we consider TF in rupees as the only non-beneficial criterion.
The data used in this paper are all public data. These are collected from the Annual Report (2007-2008) published by MHRD. Some of the data are taken from the internet and the individual web sites of the institutions. Figure 2 shows the hierarchy of the criteria for the assessment of technical institutions using the DBA method.
2.2 The proposed model
Multiple criteria decision making is not an esoteric subject. Irrespective of field, it can be employed to select and prioritize the alternatives in a set. A lot of multiple criteria analysis tools, like AHP (Saaty, 1980), TOPSIS (Hwang and Yoon, 1981), DEA (Charnes et al., 1978), ELECTRE (Roy, 1968), MOORA (Brauers and Zavadskas, 2006), etc., are available for performance evaluation and ranking of alternatives. In this paper we use AHP to determine the weights of the evaluation criteria, and the DBA (Kumar and Garg, 2010) method, which is a deterministic quantitative model, for performance evaluation of technical institutions. Figure 3 shows a schematic view of the proposed model, which can be divided into three phases. Phase I deals with the team working. The weights of the criteria are determined in phase II and finally the alternatives are ranked in phase III.
2.2.1 DBA method. To work with this method, in the beginning, we define the optimal state of the overall objective, which is basically the set of optimum simultaneous attribute values. This vector can be represented as RP(x1, x2, . . . , xn).

Figure 2. Hierarchy of the criteria
(Goal/objective: performance based ranking of institutions; criteria: faculty strength (FS), student intake (SI), number of PhDs awarded (PhD), number of patents applied for (Patent), campus area (CA) and tuition fee per semester (TF); alternatives: A, B, C, D, E, F and G.)
Figure 3. Schematic view of the proposed model
(Goal/objective: performance evaluation. Phase I: expert committee formation; ameliorated nominal group technique session; selection of evaluation criteria and list of alternatives, drawing on a knowledge base. Phase II: determination of criteria weights by AHP. Phase III: performance evaluation by DBA and final ranking.)

In a multidimensional space, this vector RP is called reference point (RP). The optimal
good value is basically the best value that exists within the range of values of
attributes. It seldom happens that a particular alternative has the best value for all
criteria. The numerical difference of each alternative from the so-called RP indicates
the effectiveness of alternatives to achieve the optimal state of the objective. The
smaller the difference, the closer the alternative towards the RP, and vice versa.
Therefore, the decision problem becomes:

    Minimize  d\{A_i(x_{ij}), RP\}
    subject to  x_{ij} \le x_j

where A_i(x_{ij}) and d represent the decision alternative in the multidimensional space and the distance from the RP, respectively. In this way the entire optimization problem depends on the choice of the RP and the distance metric d. Therefore, in the multidimensional space, the distance metric d can be written as:

    d = \sqrt{\sum_{j=1}^{n} \{RP_j - A_i(x_{ij})\}^2}

where:
i = 1, 2, . . . , m are the decision alternatives; and
j = 1, 2, . . . , n are the decision attributes/criteria.

This distance can be shown graphically, as in Figure 4, for the two dimensional case.
In general, the above method can be implemented in the following steps.
Step 1. A decision matrix is formed and expressed as follows:

    D = [x_{ij}] = \begin{bmatrix}
    x_{11} & x_{12} & \cdots & x_{1j} & \cdots & x_{1n} \\
    x_{21} & x_{22} & \cdots & x_{2j} & \cdots & x_{2n} \\
    \vdots &        &        &        &        & \vdots \\
    x_{i1} & x_{i2} & \cdots & x_{ij} & \cdots & x_{in} \\
    \vdots &        &        &        &        & \vdots \\
    x_{m1} & x_{m2} & \cdots & x_{mj} & \cdots & x_{mn}
    \end{bmatrix}

with rows corresponding to the alternatives A_1, . . . , A_m and columns to the criteria F_1, . . . , F_n.

Figure 4. Distance of a real vector from the reference point
(In two dimensions, the alternative A_i lies at distance δ from the reference point RP, with components RP(x_{i1}) − A_i(x_{i1}) along X_1 and RP(x_{i2}) − A_i(x_{i2}) along X_2.)
where A_i represents the alternatives, i = 1, 2, . . . , m; F_j represents the jth factor or attribute or criterion, j = 1, 2, . . . , n, related to the ith alternative; and x_{ij} indicates the performance of each alternative A_i with respect to each criterion F_j.
Step 2. The above matrix is converted into the adjusted decision matrix, D′ = [X_{ij}], where:

    X_{ij} = x_{ij} - \min(x_{ij}),  for benefit criteria, and    (1)
    X_{ij} = \max(x_{ij}) - x_{ij},  for cost criteria.           (2)
Step 3. To avoid the influence of the different dimensions of the attributes, the adjusted matrix is converted into standardized form using the z-score statistic as expressed below:

    Z_{ij} = \frac{X_{ij} - \bar{X}_j}{s_j}    (3)

where:
\bar{X}_j = mean value of the jth attribute over all alternatives = (1/m) \sum_{i=1}^{m} X_{ij}; and
s_j = standard deviation = \sqrt{(1/m) \sum_{i=1}^{m} (X_{ij} - \bar{X}_j)^2}.

The standardized matrix can be presented as:

    Z = [Z_{ij}] = \begin{bmatrix}
    Z_{11} & Z_{12} & \cdots & Z_{1j} & \cdots & Z_{1n} \\
    Z_{21} & Z_{22} & \cdots & Z_{2j} & \cdots & Z_{2n} \\
    \vdots &        &        &        &        & \vdots \\
    Z_{i1} & Z_{i2} & \cdots & Z_{ij} & \cdots & Z_{in} \\
    \vdots &        &        &        &        & \vdots \\
    Z_{m1} & Z_{m2} & \cdots & Z_{mj} & \cdots & Z_{mn}
    \end{bmatrix}

Step 4. The next step is to obtain the reference point (RP) vector as the set of best values for all the attributes in the standardized matrix Z_{ij}; say RP = (Z_{bj}), where “b” stands for best value and j = 1, 2, . . . , n. Then we compute the difference or distance from each alternative to the RP by subtracting each element of the alternative set from the corresponding element of RP. This results in the following interim matrix:

    Z'_{ij} = \begin{bmatrix}
    Z_{b1} - Z_{11} & Z_{b2} - Z_{12} & \cdots & Z_{bj} - Z_{1j} & \cdots & Z_{bn} - Z_{1n} \\
    Z_{b1} - Z_{21} & Z_{b2} - Z_{22} & \cdots & Z_{bj} - Z_{2j} & \cdots & Z_{bn} - Z_{2n} \\
    \vdots          &                 &        &                 &        & \vdots          \\
    Z_{b1} - Z_{i1} & Z_{b2} - Z_{i2} & \cdots & Z_{bj} - Z_{ij} & \cdots & Z_{bn} - Z_{in} \\
    \vdots          &                 &        &                 &        & \vdots          \\
    Z_{b1} - Z_{m1} & Z_{b2} - Z_{m2} & \cdots & Z_{bj} - Z_{mj} & \cdots & Z_{bn} - Z_{mn}
    \end{bmatrix}
In the above matrix all the attributes are considered to be equally important. However, if the aggregated preference weight W_j for selection criterion j is introduced, the revised form of the above interim matrix will be:

    Z''_{ij} = \begin{bmatrix}
    (Z_{b1} - Z_{11})W_1 & (Z_{b2} - Z_{12})W_2 & \cdots & (Z_{bj} - Z_{1j})W_j & \cdots & (Z_{bn} - Z_{1n})W_n \\
    (Z_{b1} - Z_{21})W_1 & (Z_{b2} - Z_{22})W_2 & \cdots & (Z_{bj} - Z_{2j})W_j & \cdots & (Z_{bn} - Z_{2n})W_n \\
    \vdots               &                      &        &                      &        & \vdots               \\
    (Z_{b1} - Z_{i1})W_1 & (Z_{b2} - Z_{i2})W_2 & \cdots & (Z_{bj} - Z_{ij})W_j & \cdots & (Z_{bn} - Z_{in})W_n \\
    \vdots               &                      &        &                      &        & \vdots               \\
    (Z_{b1} - Z_{m1})W_1 & (Z_{b2} - Z_{m2})W_2 & \cdots & (Z_{bj} - Z_{mj})W_j & \cdots & (Z_{bn} - Z_{mn})W_n
    \end{bmatrix}

where W_j is the weight of the jth attribute, which can be determined by applying Saaty’s AHP. This tool, first introduced by T. L. Saaty, works on an eigenvalue approach to pairwise comparison. In this method the relative preferences of the qualitative factors are expressed in terms of Saaty’s nine-point scale. The AHP method is based on three principles: first, structuring of the model; second, comparative judgment of the alternatives and the criteria; third, synthesis of the priorities. A brief description of AHP is given in the Appendix. As observed in the review paper on AHP (Ho, 2008), due to its simplicity and wide applicability it has been successfully combined with other tools, like mathematical programming, quality function deployment (QFD), meta-heuristics, SWOT analysis and DEA.
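A minimal sketch of the eigenvalue approach to AHP weights is given below. The 3 × 3 pairwise comparison matrix is hypothetical (and perfectly consistent, so CR = 0); note that published random index (RI) tables vary slightly — Saaty’s own table gives about 1.24 for n = 6, while this paper uses 1.32.

```python
import numpy as np

# Saaty's random index (RI) for matrix sizes 1..6; values for n = 6 differ
# slightly between published tables (this paper uses 1.32).
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}

def ahp_weights(pairwise):
    """Priority weights from a pairwise comparison matrix via the principal
    eigenvector, together with lambda_max, CI and CR."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = int(np.argmax(eigvals.real))          # principal eigenvalue index
    lam_max = eigvals.real[k]
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                           # normalize weights to sum to 1
    ci = (lam_max - n) / (n - 1)              # consistency index
    cr = ci / RI[n] if RI[n] else 0.0         # consistency ratio
    return w, lam_max, ci, cr

# Hypothetical, perfectly consistent 3-criterion comparison (4:2:1 priorities)
w, lam, ci, cr = ahp_weights([[1,   2,   4],
                              [1/2, 1,   2],
                              [1/4, 1/2, 1]])
print(np.round(w, 4))  # -> [0.5714 0.2857 0.1429]
```

A CR below 0.1 is conventionally taken as an acceptable level of inconsistency in the expert judgments.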
Step 5. Finally the Euclidean composite distance, d, between each alternative and the RP is derived using the following formula:

    d = \sqrt{\sum_{j=1}^{n} \{(Z_{bj} - Z_{ij})W_j\}^2}    (4)

On the basis of this distance metric, d, the technical institutions will be compared and ranked. The higher the composite distance of an individual institution from the RP, the lower its rank.
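Steps 1-5 can be sketched end-to-end as below. The small decision matrix, criterion types and weights are invented for illustration; the sample standard deviation (divisor m − 1) is used, which is what reproduces the s_j values this paper reports later in Table III.

```python
import numpy as np

def dba_rank(X, benefit, weights):
    """Distance-based approach, steps 1-5: adjust the decision matrix,
    standardize it column-wise with z-scores, take the best standardized
    value per criterion as the reference point (RP), and rank alternatives
    by weighted Euclidean distance from the RP (smaller = better)."""
    X = np.asarray(X, dtype=float)
    benefit = np.asarray(benefit)
    W = np.asarray(weights, dtype=float)
    # Step 2: adjusted matrix (benefit: x - min; cost: max - x)
    adj = np.where(benefit, X - X.min(axis=0), X.max(axis=0) - X)
    # Step 3: z-score standardization (sample SD)
    Z = (adj - adj.mean(axis=0)) / adj.std(axis=0, ddof=1)
    # Step 4: reference point = best (largest) standardized value per criterion
    rp = Z.max(axis=0)
    # Step 5: weighted Euclidean composite distance, equation (4)
    d = np.sqrt((((rp - Z) * W) ** 2).sum(axis=1))
    return d, d.argsort()  # argsort lists alternatives best-first

# Invented example: 3 alternatives; two benefit criteria, one cost criterion
d, order = dba_rank([[9, 80, 5],
                     [7, 60, 3],
                     [4, 90, 8]],
                    benefit=[True, True, False],
                    weights=[0.5, 0.3, 0.2])
```

Here `order` gives the alternatives from smallest to largest composite distance, i.e. best-first.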

3. Data and computation
For the purpose of performance evaluation of the technical institutions, the quantitative data used are shown in Table II. In the table the institutions are considered as the alternatives and are placed in the rows; the criteria or attributes are placed in the columns. A graphical representation of the same is also shown in Figure 5. Table III shows the adjusted decision matrix D′, the elements of which are calculated using equations (1) and (2). The mean and standard deviation values for each attribute column of D′ are also shown in the last two rows of the same table. The standardized matrix, determined using the z-score statistic formula in equation (3), is shown in Table IV.
Table II. Quantitative data for performance evaluation of alternatives

Criteria →                 FS      SI     PhD   Patent       CA       TF
Optimization direction →  Max.    Max.   Max.    Max.      Max.     Min.
A                          519   1,901    167      13    2,100.0   22,596
B                          433   1,550    152      10      548.2   22,601
C                          392     967    125      13      622.9   20,400
D                          311     914     86      24    1,046.5   25,542
E                          419   1,550    145      14      320.0   22,305
F                          193     796     16       5      710.1   22,800
G                          367   1,467    107       0      360.0   26,820
Figure 5. Graphical display of criteria values of the alternatives
(Bar chart of FS (’10s), SI (’100s), PhD, Patent, CA (’100 acres) and TF (’1,000 Rs) for alternatives A to G.)

Table III. Adjusted decision matrix, D′

Alternatives       FS        SI       PhD      Patent        CA           TF
A                 326     1,105       151        13      1,780.0       4,224
B                 240       754       136        10        228.2       4,219
C                 199       171       109        13        302.9       6,420
D                 118       118        70        24        726.5       1,278
E                 226       754       129        14          0.0       4,515
F                   0         0         0         5        390.1       4,020
G                 174       671        91         0         40.0           0
Mean \bar{X}_j   183.2857  510.4286    98      11.28571   495.3857   3,525.14286
SD s_j           102.9542  413.8643   51.27052  7.565586  615.8943   2,162.21418
In the next step we obtain the reference point (RP) vector as the set of best values for all the attributes in the standardized matrix Z_{ij}. The elements of the RP vector are: 1.386192, 1.436634, 1.033732, 1.680542, 2.085771 and 1.33883922, respectively. Instead of giving equal importance to the attributes, we determine the aggregated preference weights W_j according to Saaty’s AHP method. For this we consult several experts in various fields across the country and, aggregating their views, we get the set of weights for the attributes shown in Table V, with an allowable consistency ratio.
Considering these weights for the attributes, we compute the difference or distance from each alternative to the RP by subtracting each element of the alternative set from the corresponding element of RP. This results in the final composite distance matrix Z″_{ij} given in Table VI. From that table we determine the row-wise sum of squares of the elements, which gives the metric d², and finally we determine the composite distance metric d as explained in equation (4). The results are shown in Table VII.

Table IV. Standardized matrix, Z_{ij}

Alternatives      FS          SI         PhD        Patent       CA           TF
A              1.386192    1.436634    1.033732    0.22659     2.085771    0.32321365
B              0.550869    0.58853     0.741167   −0.16994    −0.43382     0.32090121
C              0.152634   −0.82014     0.214548    0.22659    −0.31253     1.33883922
D             −0.63412    −0.94821    −0.54612     1.680542    0.37525    −1.0392786
E              0.414886    0.58853     0.604636    0.358767   −0.80434     0.45779792
F             −1.78026    −1.23332    −1.91143    −0.83083    −0.17095     0.22886592
G             −0.09019     0.387981   −0.13653    −1.49172    −0.73939    −1.6303394
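As a reproducibility check, Tables III and IV can be recomputed directly from the raw data in Table II. Note that the sample standard deviation (ddof = 1) is what matches the s_j row reported in Table III.

```python
import numpy as np

# Raw data from Table II; rows A-G, columns FS, SI, PhD, Patent, CA, TF
X = np.array([
    [519, 1901, 167, 13, 2100.0, 22596],
    [433, 1550, 152, 10,  548.2, 22601],
    [392,  967, 125, 13,  622.9, 20400],
    [311,  914,  86, 24, 1046.5, 25542],
    [419, 1550, 145, 14,  320.0, 22305],
    [193,  796,  16,  5,  710.1, 22800],
    [367, 1467, 107,  0,  360.0, 26820],
])
benefit = np.array([True, True, True, True, True, False])  # TF is a cost

# Equations (1)-(2): adjusted decision matrix D' of Table III
adj = np.where(benefit, X - X.min(axis=0), X.max(axis=0) - X)
# Equation (3): standardization; sample SD (ddof = 1) reproduces s_j
Z = (adj - adj.mean(axis=0)) / adj.std(axis=0, ddof=1)
print(np.round(Z[0], 6))  # row A of Table IV
```

The first row of `Z` agrees with row A of Table IV to the reported rounding, and the column maxima of `Z` give the RP vector quoted above.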

Table V. Results obtained with AHP

Attributes   Weights (W_j)
FS              0.4000
SI              0.0960
PhD             0.2742
Patent          0.1473
CA              0.0434
TF              0.0391
λ_max = 6.3195, CI = 0.0639, RI = 1.32, CR = 0.0484

Table VI. Final composite distance matrix, Z″_{ij}

Alternatives      FS          SI          PhD         Patent       CA           TF
A              0           0           0           0.2141081   0            0.0397382
B              0.3341529   0.0813847   0.0802209   0.2725012   0.1094171    0.0398287
C              0.4934584   0.216562    0.2246186   0.2141081   0.1041501    0
D              0.8081838   0.2288509   0.433193    0           0.0742821    0.0930482
E              0.3885499   0.0813847   0.1176574   0.1946437   0.1255075    0.0344724
F              1.2666727   0.256211    0.8075574   0.3698231   0.0980016    0.0434297
G              0.5905959   0.1006295   0.3208837   0.467145    0.1226871    0.1161746
Table VII shows the composite distance metric (d) for each technical institution. The overall ranking is determined on the basis of d: the institution with the lowest d is given the first rank, that with the second lowest d is given the second rank, and so on. In this way we obtain the preference order of the alternative institutions as A ≻ B ≻ E ≻ C ≻ G ≻ D ≻ F. Figure 6 shows the histogram of the reciprocal of the composite distance metric d, which is taken as the performance score. In the figure we see that the (1/d) value is highest for institution A and changes only a little across institutions B, C and E. Therefore, institution A can be considered the benchmark, or improvement target, for B, C and E, which form the second group. Similarly, this second group can be considered the improvement target for the remaining institutions, i.e. D, F and G.
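As a cross-check, the pipeline from the standardized matrix (Table IV) and the AHP weights (Table V) to the composite distance metric and final ranking can be sketched as follows. The weighted-distance step — each entry of Z″ij taken as the weight times the gap to the column-best standardized score, with all six criteria treated as beneficial — is inferred from the tables shown and should be read as an illustrative reconstruction, not the authors' exact code.

```python
import math

# Standardized decision matrix Zij (Table IV); columns: FS, SI, PhD, Patent, CA, TF
Z = {
    "A": [1.386192, 1.436634, 1.033732, 0.22659, 2.085771, 0.32321365],
    "B": [0.550869, 0.58853, 0.741167, -0.16994, -0.43382, 0.32090121],
    "C": [0.152634, -0.82014, 0.214548, 0.22659, -0.31253, 1.33883922],
    "D": [-0.63412, -0.94821, -0.54612, 1.680542, 0.37525, -1.0392786],
    "E": [0.414886, 0.58853, 0.604636, 0.358767, -0.80434, 0.45779792],
    "F": [-1.78026, -1.23332, -1.91143, -0.83083, -0.17095, 0.22886592],
    "G": [-0.09019, 0.387981, -0.13653, -1.49172, -0.73939, -1.6303394],
}
W = [0.4000, 0.0960, 0.2742, 0.1473, 0.0434, 0.0391]  # AHP weights (Table V)

# All six criteria are beneficial (higher-the-better), so the ideal value
# per criterion is the column maximum of the standardized matrix.
best = [max(row[j] for row in Z.values()) for j in range(len(W))]

# Z''_ij = W_j * (best_j - z_ij): weighted distance from the ideal (Table VI);
# d_i = sqrt(sum_j Z''_ij^2): composite distance metric (Table VII).
d = {
    name: math.sqrt(sum((W[j] * (best[j] - row[j])) ** 2 for j in range(len(W))))
    for name, row in Z.items()
}
ranking = sorted(d, key=d.get)  # smallest d ranks first
print(ranking)  # ['A', 'B', 'E', 'C', 'G', 'D', 'F']
```

Running this reproduces the d values of Table VII to within rounding and the preference order reported above.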
4. A holistic technical education system model

In this paper, a holistic technical education system model (HTESM) that should be globally competitive is proposed. In the proposed model we consider the impact of the political, economic, social and technological (PEST) environments on the technical education system. Since independence in 1947, India has directed its education policies to meet the challenges of economic and technological development. A large number of technical institutions have come up in the last decade to cater to the huge, diverse population of India. All the technical institutions function following the guidelines stipulated by AICTE, which is an
Alternate institutions   A           B           C           D           E           F           G
Sum (d²)                 0.0474214   0.2125324   0.3975433   0.9073658   0.2262644   2.4705123   0.7086692
d                        0.2177646   0.4610124   0.6305104   0.9525575   0.4756726   1.5717863   0.8418249
Rank                     1           2           4           6           3           7           5

Table VII. Rank of alternate institutions
[Figure 6. Histogram of the reciprocal of the composite distance metric d, (1/d), plotted for technical institutions A-G, together with the mean of (1/d).]
autonomous body under MHRD. A stable government with a positive vision and mission is required to spread technical education all over the country, so that, irrespective of socioeconomic position, most of the people of the country can enjoy the benefits of technical education. At the same time, no political intervention is desired in the field of education in respect of manpower deployment, revision of curriculum or any other activity related to education.
India is a society with sharp divisions along caste and class lines. Though at present the GDP growth rate of the country is expected to be 8.5-9 percent, 37 percent of its population is still living below the poverty line. Although the government has implemented a reservation policy for people belonging to the scheduled castes, scheduled tribes and other backward classes, to bring them into the mainstream of technical education, it has been observed that the participation of these categories of people in technical education is
not satisfactory. The reasons behind this may be lack of public awareness, the high tuition fees charged by technical institutions, and so on. The social aspect of technical education is to improve the quality of life of the people in terms of increasing HDI. To become globally competitive, newly developed technologies should be eco-friendly and sustainable. Therefore, as shown in Figure 7, it is high time for the government to reframe its technical education policies from the PEST viewpoint.
The human resources for the HTESM should be developed in such a way that there is provision for the necessary orientation, training and upgradation programmes. On behalf of the government, there should be a monitoring system for continuous reporting and assessment of the technical institutions. The institutions should be interconnected with other institutions in India and abroad through networking and partnership. The HTESM should be flexible enough to incorporate any strategic changes, on a short- or long-term basis, as requirements demand.
It is understood from Figure 7 that we need five prime inputs to develop the HTESM: students, teachers, curriculum, infrastructure and management. These inputs, which form the inner core of the HTESM, are connected to it by an AND gate. Infrastructure has a meaning in education: world-class infrastructural facilities, i.e. libraries, laboratories, classrooms, hostels, etc., help to attract good students. The curriculum of the HTESM should meet global requirements. The most important input for the HTESM is the number of qualified teachers in the institution. The management of the institutions should make a positive effort to attract the best talent into the teaching field. From the part of the truth table shown in Table VIII we see that the HTESM will evolve if and only if all five inputs are positive.
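The five-input AND-gate relation described above can be sketched in a few lines; the function name and argument order below are illustrative, not from the paper:

```python
from itertools import product

def htesm_response(teacher, student, curriculum, infrastructure, management):
    """The HTESM emerges only if every one of the five inputs is positive
    (a five-input AND gate, as in Table VIII)."""
    return all([teacher, student, curriculum, infrastructure, management])

# Enumerating the full truth table shows that exactly one of the
# 2**5 = 32 input combinations yields a positive response.
positive_rows = [row for row in product([False, True], repeat=5)
                 if htesm_response(*row)]
print(len(positive_rows))  # 1 -- only the all-positive row
```

Table VIII in the paper lists the all-positive row plus the five rows in which a single input is negative; the enumeration above covers all 32 combinations.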

5. Conclusion
The primary mission of a technical institution is to explore and transmit knowledge. The former is achieved via research, while the latter is achieved via teaching. In this age of the knowledge economy, technical institutions play a key role in the development of a country. As the number of young people enrolling in technical education courses in India is increasing, the resources received from the government must be used more efficiently to meet the increasing demand for education. Hence, the institutions should investigate not only the educational outputs produced by the departments, but also the resources utilized in producing those outputs. There has been national concern about improving the standards of technical education. Many manufacturing and service organizations are feeling the need to introduce modern research facilities to nurture
[Figure 7. Holistic technical education system model. The inner core inputs (student, teacher, curriculum, infrastructure and management) are connected to the HTESM, which sits within the four PEST quadrants (political, economic, social and technological) and is supported by four surrounding processes: development and training, assessment and reporting, strategies for change, and networking and partnership.]
Inputs
Teacher   Student   Curriculum   Infrastructure   Management   Response
+         +         +            +                +            +
-         +         +            +                +            -
+         -         +            +                +            -
+         +         -            +                +            -
+         +         +            -                +            -
+         +         +            +                -            -

Table VIII. Part of the truth table

a research culture in most of the technical institutions, so that these can produce manpower with very high levels of creativity and innovative ability, and thus help Indian industries to increase the competitiveness of their products and services. Many initiatives have been taken in the last few years, both by the government and the private sector, to improve and maintain the standard of technical education. Since we are in the knowledge domain, we feel that there is an urgent need for performance evaluation and benchmarking, as part of the continuous quality improvement of Indian technical institutions, based on some relevant criteria. This paper applies an integrated AHP and DBA methodology to evaluate the performance of seven technical institutions in India. This technique is a sound surrogate for the traditional techniques. The study reveals that institution A is the best and F is the worst. However, institutions F and G need special attention to improve their performance with respect to the criteria PhD and Patent. According to the proposed benchmarking method, the ranking we obtain is A ≻ B ≻ E ≻ C ≻ G ≻ D ≻ F. Almost all the published studies on performance evaluation of universities use various DEA models; here we propose a different MCDM model for the same purpose. The advantage of this model is that the evaluation criteria need not be classified as inputs and outputs. The alternatives, which are called DMUs in a DEA study, maximize their relative efficiency by minimizing input utilization, and criteria like SI and FS are always considered as inputs in earlier studies using DEA. Keeping in mind the socioeconomic condition of our country, we consider these criteria beneficial, i.e. higher-the-better type. The proposed model is robust in the sense that it is not concerned with the input-output classification of the evaluation criteria. The model is also generic in nature: it can be applied for performance evaluation, benchmarking and ranking of a group of organizations irrespective of field. However, the evaluation criteria should be selected very carefully, because they play a vital role in determining the relative ranking.

Appendix. AHP method

In this method a pairwise comparison matrix (PWCM) of criteria is constructed by translating subjective judgments into numerical values ranging from 1 to 9, as shown in Table AI. The elements of the PWCM are determined by asking the experts questions such as which criterion is more important with regard to the decision goal, and to what extent (1-9). These answers form an (n × n) PWCM, which is defined as:
    A = (aij)n×n =

        | a11  a12  ...  a1n |
        | a21  a22  ...  a2n |
        | ...  ...  ...  ... |
        | an1  an2  ...  ann |

where the rows and columns are indexed by the criteria C1, C2, ..., Cn.

Here aij represents an element of the matrix, with aii = 1 and aij = 1/aji for i, j = 1, ..., n. Let W be the column vector of relative weights wi; then A is consistent if AW = nW. Where A is not consistent, the relative weight wi is approximated by the average of the ith row elements of the normalized PWCM. If W is the computed weight vector, then the maximum eigenvalue λmax can be calculated from the equation:

    AW = λmax W                                  (A1)

The closer λmax is to n, the more consistent the comparison matrix A.
The consistency of the matrix A is investigated by calculating the consistency ratio (CR), as defined in equation (A2):

    CR = CI / RI                                 (A2)

Importance intensity   Definition
1                      Equally important or preferred
3                      Slightly more important or preferred
5                      Strongly more important or preferred
7                      Very strongly more important or preferred
9                      Extremely more important or preferred
2, 4, 6, 8             Intermediate values to reflect compromise
Reciprocals            Used to reflect dominance of the second alternative over the first

Table AI. 9-point scale for pairwise comparison
where CI stands for the consistency index and RI stands for the random consistency index. The values of CI and RI are determined by the following formulas:

    CI = (λmax − n) / (n − 1)                    (A3)

    RI = 1.98(n − 2) / n                         (A4)

If CR ≤ 0.1 the pairwise comparison matrix is considered to have acceptable consistency; otherwise, it needs to be revised.

About the authors

Manik Chandra Das received the Bachelor's degree in Mechanical Engineering from Bengal Engineering College (DU), India, in 2000. He finished his Master's degree in Manufacturing Technology at the National Institute of Technical Teachers' Training and Research, India, in 2007. He is now pursuing a PhD in the Production Engineering Department, Jadavpur University, Kolkata, India. Manik Chandra Das is the corresponding author and can be contacted at: cd_manik@rediffmail.com
Bijan Sarkar received BE, ME and PhD degrees from Jadavpur University, Kolkata, India. He received an outstanding paper award from Emerald Publications (UK). He was a visiting research scholar at Aston University during 2006-2007 and 2007-2008. He has published papers in Neurocomputing, International Journal of Production Research, Applied Soft Computing, International Journal of Industrial Engineering and IEEE Transactions on Engineering Management.
Siddhartha Ray received the PhD degree in Mechanical Engineering from Jadavpur University, India, in 1975. He served industry in various positions for more than 30 years. For the last nine years he has been with the National Institute of Technical Teachers' Training and Research (NITTTR) as Professor and Head of the Mechanical Engineering Department.
