
E-Metrics:

Assessing Electronic Resources


Willian S.A. Frias
Librarian, AGH Law Library
De La Salle University
2401 Taft Avenue, Manila 0922

Parts of the Lecture


1. Objectives
2. Overview
1. Definition
2. Brief Background
3. Advantages/Disadvantages
3. E-metrics in Action
4. Summary
5. Conclusion/Points to Ponder

Objectives
To familiarize participants with assessing electronic collections using e-metrics;
To present the different capabilities of e-metrics as an assessment tool;
To develop in the participants a framework for identifying useful data in assessing
electronic collections; and,
To guide participants in the preparation of collection assessment studies/reports.

Overview (1/2)

Why e-metrics?
Proliferation of electronic resources in libraries
Unmanageable growth of electronic resources
Continuous increase of subscription prices
24/7 data capture (exhaustiveness of data)
Easy extraction of data (accessibility of data)

Overview (2/2)

Uses of e-metrics
Collection management
Budget decisions
Marketing and promotions
Planning for additional services
Measuring student learning
Predicting ILL/DDS outcomes

Definition
E-metrics are standardized measurements producing quantitative data
extracted from the use of electronic resources. These standard measures serve
as a tool to determine the effectiveness, efficiency, performance and/or quality
of patron-accessible electronic resources through different assessment methods.

Brief Background (1/2)

In 1999, ARL funded the ARL E-Metrics Project, prompted by:
Rising acquisition prices (1992-1993 to 1999-2000)
Additional costs for infrastructure and personnel
Absence of authoritative data for electronic resources
Inconsistencies in collecting and analyzing data

Brief Background (2/2)

2002: Project COUNTER (Counting Online Usage of NeTworked Electronic Resources)
brought librarians, publishers and vendors together to develop standards for
recording and reporting online usage data.

2008: The COUNTER Code of Practice for Journals and Databases was published.
92 vendors provided JR1: full-text article requests by journal
26 vendors provided JR2: total searches and sessions by database
41 vendors provided JR3: successful item requests and turnaways by journal and page type

Advantages vs Disadvantages

Advantages:
Standardized
Unobtrusive
Data are easily captured
Objective
Focused on a specific format of collection

Disadvantages:
Vendor-dependent indicators and reports
Issues with federated searching
Issues with open access use

E-Metrics in Action
Trend Analysis
Patterns of Use/User behavior
Planning for marketing and advertising
Measuring learning outcomes

Efficiency Studies
Usage correlation studies
Ratio analysis

Cost-Benefit Analysis
Return on Investments (ROI)
Cost per Article Reading (CPR)

Trend Analysis
Profiling/Mapping
The simplest way of assessing e-resources
Data used are from transaction logs, such as:
Full-text article requests
Total searches and sessions
Analysis is done by means of usage reports

Online Database Annual Usage Report: Number of Downloads
SY 2013-2014 to SY 2015-2016*

Database/Aggregator    SY 2013-2014   SY 2014-2015   SY 2015-2016     Total   Average
Business Database             3,492          4,929          5,827    14,248     4,749
Interdisciplinary 1          13,587         13,555         13,576    40,718    13,573
Interdisciplinary 2          20,468         18,808         15,188    54,464    18,155
Psychology Database             925            940            955     2,820       940
STEM Database                 3,771          8,108         13,551    25,430     8,477

*Not real data, for discussion purposes only
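The totals and averages above are simple arithmetic; a minimal Python sketch, using two of the sample (not real) rows, is:

    # Three-SY download totals and averages per database (sample data only).
    downloads = {
        "Business Database":   [3492, 4929, 5827],   # SY 2013-2014, 2014-2015, 2015-2016
        "Psychology Database": [925, 940, 955],
    }
    for db, per_sy in downloads.items():
        total = sum(per_sy)
        average = round(total / len(per_sy))
        print(db, total, average)   # Business: 14,248 and 4,749; Psychology: 2,820 and 940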

Graphed Online Database Annual Usage Report: Number of Downloads
SY 2013-2014 to SY 2015-2016*
[Line chart of annual downloads per database across SY 2013-2014, 2014-2015 and 2015-2016]

Probable Observations (patterns of use):
1. Decline in the use of Interdisciplinary 2
2. Almost the same number of uses of Interdisciplinary 1 and the Psychology Database over the past 3 SYs
3. Increasing usage patterns for the Business and STEM Databases

Probable Analysis:
1. The least used database is the Psychology Database; the most used database is Interdisciplinary 2.
2. Marketing and Advertising: Interdisciplinary 2 needs to be marketed the most, as shown by its sharp
   downward line in the 3rd SY, followed by Interdisciplinary 1 and Psychology, because no improvement
   in their use occurred over the 3 SYs.
3. Learning Outcomes: After two years of offering Information Literacy sessions, students learned to use
   the STEM Database the most, judging by the steepness of its upward line.

Other factors to consider:
1. Population (FTE) of potential users
2. Size of the database/aggregator (number of journals, subjects covered)

*Not real data, for discussion purposes only

Efficiency Studies
Studies that measure how efficient an electronic resource is, based on extracted
transaction logs and other indicators:

Usage (downloads)
Actual searches
Population (FTE)
Number of journals per database

Methods: use-ratio and transaction-log analyses

Online Database Annual Usage Report: Utilization Rate
SY 2015-2016*

Database/Aggregator    Log-ins   Searches   Downloads   Population (FTE)   Utilization Rate (%)
Business Database        1,614      7,283       5,827              2,381                  67.78
Interdisciplinary 1     13,555     12,921      13,576              4,656                 291.13
Interdisciplinary 2     15,410    125,238      15,188             10,250                 150.34
Psychology Database      1,470        927         955                280                 153.93
STEM Database           10,093     19,984      13,551              5,230                 192.98

Utilization Rate = (Log-ins / Population (FTE)) x 100%

*Not real data, for discussion purposes only
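A minimal Python sketch of this rate, using three of the sample rows above (not real data; the names and figures are illustrative only):

    # Utilization Rate = (Log-ins / Population FTE) x 100%
    logins = {"Business Database": 1614, "Interdisciplinary 1": 13555, "STEM Database": 10093}
    fte    = {"Business Database": 2381, "Interdisciplinary 1": 4656,  "STEM Database": 5230}

    for db in logins:
        rate = logins[db] / fte[db] * 100
        print(f"{db}: {rate:.2f}%")   # ~67.8%, ~291.13% and ~192.98%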

Online Database Annual Usage Report: User Satisfaction Rate
SY 2015-2016*

Database/Aggregator    Log-ins   Searches   Downloads   Population (FTE)   Satisfaction Rate (%)
Business Database        1,614      7,283       5,827              2,381                  244.73
Interdisciplinary 1     13,555     12,921      13,576              4,656                  291.58
Interdisciplinary 2     15,410    125,238      15,188             10,250                  148.17
Psychology Database      1,470        927         955                280                  347.07
STEM Database           10,093     19,984      13,551              5,230                  259.10

User Satisfaction Rate = (Downloads / Population (FTE)) x 100%

*Not real data, for discussion purposes only
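Expressed as a small Python helper (a sketch only; the two calls use sample rows from the table above):

    def user_satisfaction_rate(downloads: int, fte: int) -> float:
        """User Satisfaction Rate = (Downloads / Population FTE) x 100%."""
        return downloads / fte * 100

    print(user_satisfaction_rate(5827, 2381))    # Business Database   -> ~244.73
    print(user_satisfaction_rate(13576, 4656))   # Interdisciplinary 1 -> ~291.58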

Online Database Annual Usage Report: Database Efficiency Rate
SY 2015-2016*

Database/Aggregator    Log-ins   Searches   Downloads   Efficiency Rate (%)
Business Database        1,614      7,283       5,827                 80.00
Interdisciplinary 1     13,555     12,921      13,576                105.07
Interdisciplinary 2     15,410    125,238      15,188                 12.13
Psychology Database      1,470        927         955                103.02
STEM Database           10,093     19,984      13,551                 67.81

Database Efficiency Rate = (Downloads / Searches) x 100%

*Not real data, for discussion purposes only
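The same pattern in Python (a sketch; sample rows from the table above):

    def database_efficiency_rate(downloads: int, searches: int) -> float:
        """Database Efficiency Rate = (Downloads / Searches) x 100%."""
        return downloads / searches * 100

    print(database_efficiency_rate(15188, 125238))   # Interdisciplinary 2 -> ~12.13
    print(database_efficiency_rate(13551, 19984))    # STEM Database       -> ~67.81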

Online Database Annual Usage Report: Database Usability Rate
SY 2015-2016*

Database/Aggregator    Log-ins   Searches   Downloads   Journals/Database   Usability Rate (%)
Business Database        1,614      7,283       5,827              10,631                54.81
Interdisciplinary 1     13,555     12,921      13,576               4,656               291.58
Interdisciplinary 2     15,410    125,238      15,188               4,295               353.62
Psychology Database      1,470        927         955                  72             1,326.38
STEM Database           10,093     19,984      13,551               2,101               644.98

Database Usability Rate = (Downloads / Number of Journals per Database) x 100%

*Not real data, for discussion purposes only
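And the corresponding Python sketch (sample rows only):

    def database_usability_rate(downloads: int, journals: int) -> float:
        """Database Usability Rate = (Downloads / Number of journals per database) x 100%."""
        return downloads / journals * 100

    print(database_usability_rate(955, 72))       # Psychology Database -> ~1,326.4
    print(database_usability_rate(13551, 2101))   # STEM Database       -> ~644.98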

Overall Database Performance Ranks Based on Transaction Logs
SY 2015-2016*

Database/Aggregator    Usage   Utilization   Satisfaction   Efficiency   Usability   Total   Overall Rank
Business Database          4             5              4            3           5      21            5th
Interdisciplinary 1        2             1              2            1           4      10            1st
Interdisciplinary 2        1             4              5            5           3      18            4th
Psychology Database        5             3              1            2           1      12            2nd
STEM Database              3             2              3            4           2      14            3rd

Legend: ranks per indicator (Usage, Utilization Rate, User Satisfaction Rate, Database Efficiency
Rate, Database Usability Rate), with 1 = best; the lowest total gives the best overall rank.

Probable Conclusions:
1. Interdisciplinary 1 is the most important database;
2. The Business database is the least important database.

Probable Recommendations:
3. Sustain the subscriptions to Interdisciplinary 1, the Psychology Database and the STEM Database.
4. Stop subscribing to the Business and Interdisciplinary 2 databases.

Important factor to consider: financial factors
1. Return on Investments (ROI)
2. Cost per Article Reading (CPR)

*Not real data, for discussion purposes only
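A sketch of one way the composite ranks above could be derived: each database is ranked on every indicator (1 = best) and the ranks are summed, with the lowest total ranking 1st overall. The figures are the sample values from the preceding tables.

    # Rank each database on every indicator (1 = best value), then sum the ranks.
    indicators = {
        "Usage":        {"Business": 5827, "Interdisciplinary 1": 13576,
                         "Interdisciplinary 2": 15188, "Psychology": 955, "STEM": 13551},
        "Utilization":  {"Business": 67.78, "Interdisciplinary 1": 291.13,
                         "Interdisciplinary 2": 150.34, "Psychology": 153.93, "STEM": 192.98},
        "Satisfaction": {"Business": 244.73, "Interdisciplinary 1": 291.58,
                         "Interdisciplinary 2": 148.17, "Psychology": 347.07, "STEM": 259.10},
        "Efficiency":   {"Business": 80.00, "Interdisciplinary 1": 105.07,
                         "Interdisciplinary 2": 12.13, "Psychology": 103.02, "STEM": 67.81},
        "Usability":    {"Business": 54.81, "Interdisciplinary 1": 291.58,
                         "Interdisciplinary 2": 353.62, "Psychology": 1326.38, "STEM": 644.98},
    }

    totals = {db: 0 for db in indicators["Usage"]}
    for values in indicators.values():
        ordered = sorted(values, key=values.get, reverse=True)   # highest value first
        for rank, db in enumerate(ordered, start=1):
            totals[db] += rank

    for db in sorted(totals, key=totals.get):   # lowest total = most important
        print(db, totals[db])
    # Interdisciplinary 1 (10), Psychology (12), STEM (14), Interdisciplinary 2 (18), Business (21)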

Cost-Benefit Analysis
A measure of the efficiency of a database based on costs and other financial
factors.
Popular methods of analyzing benefits:
Return on Investments (ROI)
Cost per Article Reading (CPR)

Return on Investments (ROI)
A performance measure used to evaluate the efficiency of an investment
(Investopedia, 2012). It presents the gains or losses of an investment:
a positive ROI means gains from the investment;
a negative ROI means losses from the investment.

Return on Investments (ROI)

E-books
ROI per Title (ROIT)
ROIT = [(Cost of e-book) x (Savings cost*) x (Total usage)] - Cost of e-book

ROI for the E-book Collection (ROIC)
ROIC = Total ROIT
Rate of ROIC = (Total ROIT / Total Acquisition Cost) x 100%

*Savings cost is set to 50% on the assumption that access to a volume by borrowing it from the
library costs 50% of what it would cost to purchase the book (Habre, 2012).
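A small Python sketch of these e-book formulas, using the 50% savings-cost assumption from Habre (2012) and two illustrative rows consistent with the sample table that follows:

    SAVINGS_COST = 0.5   # assumption: borrowing costs ~50% of purchasing (Habre, 2012)

    def roi_per_title(cost: float, usage: int, savings_cost: float = SAVINGS_COST) -> float:
        """ROIT = (cost of e-book x savings cost x total usage) - cost of e-book."""
        return cost * savings_cost * usage - cost

    # Sample titles (not real data): acquisition cost in Php and usage count
    ebooks = {"Intereligiosity": (8410.00, 5), "Everyday Microbiology": (12555.00, 1)}

    roit = {title: roi_per_title(cost, usage) for title, (cost, usage) in ebooks.items()}
    roic = sum(roit.values())                                        # ROIC = total ROIT
    rate_of_roic = roic / sum(cost for cost, _ in ebooks.values()) * 100
    print(roit)                        # {'Intereligiosity': 12615.0, 'Everyday Microbiology': -6277.5}
    print(round(roic, 2), f"{rate_of_roic:.2f}%")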

Return on Investments (ROI)

For individual e-books:

E-book Title             Acquisition Cost (Php)   Savings Cost   Usage    ROI (Php)
Defining Tourism                       7,614.00            0.5       2         0.00
Everyday Microbiology                 12,555.00            0.5       1    -6,277.50
Intereligiosity                        8,410.00            0.5       5    12,615.00
World Politics                         9,470.00            0.5       3     4,735.00
GMO in 2020                           10,093.00            0.5       0   -10,093.00

ROIT = [(Cost of e-book) x (Savings cost) x (Total usage)] - Cost of e-book

*Not real data, for discussion purposes only

Return on Investments (ROI)

For the e-book collection:

E-book Title             Acquisition Cost (Php)    ROIT (Php)
Defining Tourism                       7,614.00          0.00
Everyday Microbiology                 12,555.00     -6,277.50
Intereligiosity                        8,410.00     12,615.00
World Politics                         9,470.00      4,735.00
GMO in 2020                           10,093.00    -10,093.00
Total                                 48,142.00      9,979.50

ROIC = Total ROIT = Php 9,979.50
Rate of ROIC = (Total ROIT / Total Acquisition Cost) x 100%
             = (Php 9,979.50 / Php 48,142.00) x 100% = 20.73%

Return on Investments (ROI)

E-journals (databases)
ROI per Database (ROID)
ROID = [(Average cost per title) x (Total usage)] - Subscription Price

Cost per Title = Subscription Price / Number of Titles per Database

Rate of ROI = (ROI / Subscription Price) x 100%
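A short Python sketch of the database-level ROI, using the Interdisciplinary 1 sample figures from the next table (the per-title cost is rounded to two decimals, as in that table):

    def roi_per_database(subscription_price: float, titles: int, usage: int) -> tuple[float, float]:
        """ROID = (average cost per title x total usage) - subscription price."""
        cost_per_title = round(subscription_price / titles, 2)   # rounded, as in the sample table
        roid = cost_per_title * usage - subscription_price
        rate_of_roi = roid / subscription_price * 100
        return roid, rate_of_roi

    # Interdisciplinary 1 (sample data): Php 134,550 subscription, 4,656 titles, 13,576 downloads
    roid, rate = roi_per_database(134550.00, 4656, 13576)
    print(f"ROID = {roid:,.2f}  Rate of ROI = {rate:.2f}%")   # ~257,796.40 and ~191.60%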

Return on Investments (ROI)

Online Database Annual Usage Report: Return on Investments per Database
SY 2015-2016*

Database/Aggregator    Downloads   Subscription Cost   Average Cost/Title   ROI per Database   Rate of ROI (%)
Business Database          5,827        1,118,000.00               105.16        -505,232.68            -45.19
Interdisciplinary 1       13,576          134,550.00                28.90         257,796.40            191.60
Interdisciplinary 2       15,188        1,131,927.00               263.55       2,870,870.40            253.63
Psychology Database          955          631,800.00             8,775.00       7,748,325.00          1,226.38
STEM Database             13,551        1,285,975.76               612.08       7,008,320.32          5,433.83

ROI = [(Average cost per title) x (Total usage)] - Subscription Price
Rate of ROI = (ROI / Subscription Price) x 100%

*Not real data, for discussion purposes only

Return on Investments (ROI)

E-journals (databases)
ROI for the E-resources Collection (ROIC)
ROIC = Total ROID

Rate of ROIC = (ROIC / Total Subscription Price) x 100%
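In code, the collection-level roll-up is just a sum and a ratio; a sketch with two of the sample databases:

    roid = {"Business Database": -505232.68, "Interdisciplinary 1": 257796.40}          # Php, sample data
    subscription = {"Business Database": 1118000.00, "Interdisciplinary 1": 134550.00}  # Php, sample data

    roic = sum(roid.values())                                   # ROIC = total ROID
    rate_of_roic = roic / sum(subscription.values()) * 100      # as % of total subscription price
    print(f"ROIC = {roic:,.2f}  Rate of ROIC = {rate_of_roic:.2f}%")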

Return on Investments (ROI)

Online Database Annual Usage Report: Return on Investments, Whole E-Collection
SY 2015-2016*

Database/Aggregator    Subscription Cost   ROI per Database
Business Database           1,118,000.00        -505,232.68
Interdisciplinary 1           134,550.00         257,796.40
Interdisciplinary 2         1,131,927.00       2,870,870.40
Psychology Database           631,800.00       7,748,325.00
STEM Database               1,285,975.76       7,008,320.32
Total                       4,302,252.76      17,380,079.44

ROIC = Total ROID = Php 17,380,079.44
Rate of ROIC = (Total ROID / Total Subscription Price) x 100%
             = (Php 17,380,079.44 / Php 4,302,252.76) x 100% = 403.98%

*Not real data, for discussion purposes only

Cost per Article Reading (CPR)

A cost-benefit measure that determines the value of an electronic resource
(database).
Simple formula for cost per article reading (CPR):
CPR = Subscription Cost / Number of Downloads
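A minimal Python sketch of this formula (the two calls use sample rows from the next table):

    def cost_per_reading(subscription_cost: float, downloads: int) -> float:
        """CPR = subscription cost / number of downloads."""
        return subscription_cost / downloads

    print(cost_per_reading(134550.00, 13576))   # Interdisciplinary 1 -> ~Php 9.91
    print(cost_per_reading(631800.00, 955))     # Psychology Database -> ~Php 661.57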

Cost per Article Reading (CPR)

Online Database Annual Usage Report: Cost per Article Reading
SY 2015-2016*

Database/Aggregator    Downloads   Subscription Cost (Php)   CPR (Php)
Business Database          5,827              1,118,000.00      191.86
Interdisciplinary 1       13,576                134,550.00        9.91
Interdisciplinary 2       15,188              1,131,927.00       74.53
Psychology Database          955                631,800.00      661.57
STEM Database             13,551              1,285,975.76       94.90

CPR = Subscription Cost / Number of Downloads

*Not real data, for discussion purposes only

Cost per Article Reading (CPR)

Online Database Annual Usage Report: Cost per Article Reading (adjusted for readings)
SY 2015-2016*

Database/Aggregator    Downloads   Subscription Cost (Php)   CPR (Php)
Business Database          5,827              1,118,000.00      255.13
Interdisciplinary 1       13,576                134,550.00       13.21
Interdisciplinary 2       15,188              1,131,927.00       99.37
Psychology Database          955                631,800.00      882.09
STEM Database             13,551              1,285,975.76      126.53

CPR = Subscription Price / Number of Readings
Number of Readings = Number of Downloads - (Number of Downloads x 25%)
                   = Number of Downloads x 75%
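A sketch of the adjusted calculation, which assumes only 75% of downloads are actually read:

    def adjusted_cpr(subscription_cost: float, downloads: int, read_share: float = 0.75) -> float:
        """CPR based on estimated readings = downloads x 75% (25% assumed unread)."""
        readings = downloads * read_share
        return subscription_cost / readings

    print(adjusted_cpr(134550.00, 13576))    # Interdisciplinary 1 -> ~Php 13.21
    print(adjusted_cpr(1285975.76, 13551))   # STEM Database       -> ~Php 126.53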

Combining Cost-Benefit Analyses*

Database/Aggregator    ROI per Database (Php)   CPR (Php)
Business Database                 -505,232.68      255.13
Interdisciplinary 1                257,796.40       13.21
Interdisciplinary 2              2,870,870.40       99.37
Psychology Database              7,748,325.00      882.09
STEM Database                    7,008,320.32      126.53
Total / Average          Total 17,380,079.44   Ave. 275.27

*Not real data, for discussion purposes only

Summary (1/2)
E-metrics is a quantitative analysis used in assessing electronic resources.
Three main uses of e-metrics as an assessment tool:
Trend Analysis
Efficiency Studies
Cost-Benefit Analysis
Important data to gather:
Usage (log-ins, searches, downloads)
Potential population of users (FTE)
Number of journal titles per database
Subscription cost

Summary (2/2)
Important formulas

Efficiency Studies:
Utilization Rate = (Log-ins / FTE) x 100%
User Satisfaction Rate = (Downloads / FTE) x 100%
Database Efficiency Rate = (Downloads / Searches) x 100%
Database Usability Rate = (Downloads / Number of Journals per Database) x 100%

Cost-Benefit Studies:
ROI = [(Average cost per title) x (Total usage)] - Subscription Price
Rate of ROIC = (Total ROID / Total Subscription Price) x 100%
CPR = Subscription Price / Number of Downloads

Limitations of E-Metrics
The quality of sessions is not measured
Length of each session
Users' interaction with the resource is not defined
Results can over-emphasize or hide real results
Dependent on vendor-supplied data if an in-house transaction log counter is not
available

Points to Ponder
E-metrics is an assessment tool based on culled usage statistics; it only provides
conclusions based on the formulas and their results.

Not all e-metrics results should be readily accepted; librarians have to look at the
broader picture and correlate usage results with other rational factors in order to
arrive at the most unbiased and dependable results.

Librarians may not agree with the results of the assessment, but these results will
play big roles in decision-making activities.

There is no perfect assessment measure for every situation, but librarians can find a ...

"The important question is not how assessment is defined, but whether assessment
information is used."
- Palomba & Banta

References
Association of Research Libraries. (2002). Measures for electronic resources (e-metrics).
Åström, F. A., Hansson, J. A., Olsson, M. A., & Linnéuniversitetet. (2011). Bibliometrics and the changing role of the university libraries.
Cox, J. (2003). Value for money in electronic journals: A survey of the early evidence and some preliminary conclusions. Serials Review, 29, 83-88. doi:10.1016/S0098-7913(03)00041-8
Fowler, F. C. (Ed.). (2007). The next steps in developing usage statistics for e-serials. In Usage statistics of e-serials (pp. 245-260). Binghamton: Haworth Information Press.
Habre, C. (2012, April 7). ROI @ LAU Libraries. In 9th Annual AMICAL Meeting and Conference. Retrieved from http://www.slideshare.net/Cendrella1/roilau-libraries
Holmström, J. (2004). The cost per article reading of open access articles. D-Lib Magazine, 10(1). doi:10.1045/january2004-holmstrom
King, D. W., Boyce, P. B., Montgomery, C. H., & Tenopir, C. (2003). Library economic metrics: Examples of the comparison of electronic and print journal collections and collection services (Academic libraries). Library Trends, (3), 376.
Kinman, V. (2009). E-metrics and library assessment in action. Journal of Electronic Resources Librarianship, 21(1), 15-36. doi:10.1080/19411260902858318
Matthews, J. (2011). What's the return on ROI? The benefits and challenges of calculating your library's return on investment. Library Leadership and Management, 25(1).
Montgomery, C. H., & King, D. W. (2002). Comparing library and user related costs of print and electronic journal collections: A first step towards a comprehensive analysis. D-Lib Magazine, 8(10).
Nagra, K. A. (2009). The evaluation of use of electronic resources and services in academic libraries: A study of e-metrics and related methods for measurement and assessment. Journal of the Library Administration & Management Section, 5(3), 28-41.
Palomba, C. A., & Banta, T. W. (1999). Assessment essentials: Planning, implementing, and improving assessment in higher education. San Francisco, CA: Jossey-Bass Publishers.
Pesch, O. (2004). Usage statistics: Taking e-metrics to the next level. Serials Librarian, 46(1-2), 143-154.
Plasmeijer, H. W. (2002). Pricing the serials library: In defence of a market economy. Journal of ...

Thank you!
And keep safe as you follow any of these roads...