
DATA QUALITY ASSESSMENT TOOL
For Assessment & Capacity Building
Purpose of the DQA
The Data Quality Assessment (DQA) Protocol is designed:
to verify the quality of reported data for key indicators at selected sites;
to verify that appropriate data management systems are in place in countries; and
to contribute to M&E systems strengthening and capacity building.
DQA Components
Determine the scope of the data quality assessment
- Suggested criteria for selecting the Program/project(s) & indicators
Engage the Program/project(s) and obtain authorization for the DQA
- Templates for notifying the Program/project of the assessment
- Guidelines for obtaining authorization to conduct the assessment
Assess the design & implementation of the Program/project's data collection and reporting systems
- Steps & protocols to identify potential threats to data quality created by the Program/project's data management & reporting system
Trace & verify (recount) selected indicator results
- Protocol with special instructions based on the indicator & the type of Service Delivery Site (e.g. health facility or community-based)
Develop and present the assessment Team's findings and recommendations
- Instructions on how and when to present the DQA findings
- Recommendations to Program/project officials on how to plan for follow-up activities to ensure strengthening measures are implemented
Example: Indicator Selection

Disease | Indicator | Reporting period | Source of reported numbers
HIV/AIDS | Number of patients on ARV | 3-month period (1-Nov-05 to 31-Jan-06) | National numbers
TB | Number of smear-positive TB cases registered under DOTS who are successfully treated | 3-month period (1-Oct-04 to 31-Dec-04) | National numbers
Malaria | Number of insecticide-treated bed nets (ITNs) distributed (i.e., number of vouchers redeemed) | 6-month period (1-Nov-2005 to 30-Apr-2006) | Numbers reported to the Global Fund
Chronology and Steps of the DQA

The DQA is implemented chronologically in 6 phases:
Phase 1: Preparation and Initiation (multiple locations)
Phase 2: M&E Management Unit
Phase 3: Service Delivery Sites / Organizations
Phase 4: Intermediate Aggregation Levels (e.g. District, Region)
Phase 5: M&E Management Unit
Phase 6: Completion (multiple locations)

Steps:
1. Select indicators and reporting period
2. Obtain national authorizations and notify the Program
3. Assess data management and reporting systems
4. Select/confirm Service Delivery Points to be visited
5. Trace and verify reported results
6. Draft initial findings and conduct close-out meeting
7. Draft and discuss the assessment report
8. Initiate follow-up of recommended actions

Assessments and verifications take place at every stage of the reporting system: the M&E Management Unit, the Intermediate Aggregation Levels (Districts, Regions) and the Service Delivery Sites.
PROTOCOL 1: Assessment of Data Management and Reporting Systems
Step 3 (Assess Data Management and Reporting Systems) is carried out at the M&E Management Unit (Phase 2), the Service Delivery Sites / Organizations (Phase 3) and the Intermediate Aggregation Levels, e.g. District, Region (Phase 4).
Purpose
Identify potential risks to data quality created by the data management & reporting systems at:
the M&E Management Unit;
the Service Delivery Points;
the Intermediate Aggregation Levels (District or Region).
The DQA assesses both the design and the implementation of data management & reporting systems.
The assessment covers 8 functional areas (HR, Training, Data Management Processes, etc.).
SYSTEMS ASSESSMENT QUESTIONS BY FUNCTIONAL AREA
Functional Areas of an M&E System that Affect Data Quality

I. M&E Capabilities, Roles and Responsibilities
1. Are key M&E and data-management staff identified with clearly assigned responsibilities?
II. Training
2. Have the majority of key M&E and data-management staff received the required training?
III. Data Reporting Requirements
3. Has the Program/Project clearly documented (in writing) what is reported to whom, and how and when reporting is required?
IV. Indicator Definitions
4. Are there operational indicator definitions meeting relevant standards, and are they systematically followed by all service points?
V. Data-collection and Reporting Forms and Tools
5. Are there standard data-collection and reporting forms that are systematically used?
6. Are source documents kept and made available in accordance with a written policy?
VI. Data Management Processes
7. Does clear documentation of collection, aggregation and manipulation steps exist?
VII. Data Quality Mechanisms and Controls
8. Are data quality challenges identified and are mechanisms in place for addressing them?
9. Are there clearly defined and followed procedures to identify and reconcile discrepancies in reports?
10. Are there clearly defined and followed procedures to periodically verify source data?
VIII. Links with National Reporting System
11. Does the data collection and reporting system of the Program/Project link to the National Reporting System?
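As a rough illustration of how an assessment team might tabulate these summary questions, the sketch below groups the 11 questions by functional area and rolls up a set of hypothetical yes/partly/no responses into a per-area score. The response scale, the scoring weights and the roll-up are assumptions made for illustration; they are not the official DQA scoring method.

```python
# Illustrative sketch only: tabulate responses to the 11 summary questions
# by functional area. The yes/partly/no scale and the 1.0/0.5/0.0 weights
# are assumptions, not part of the DQA protocol.

# Functional area -> summary question numbers, as listed in the table above
FUNCTIONAL_AREAS = {
    "I. M&E Capabilities, Roles and Responsibilities": [1],
    "II. Training": [2],
    "III. Data Reporting Requirements": [3],
    "IV. Indicator Definitions": [4],
    "V. Data-collection and Reporting Forms and Tools": [5, 6],
    "VI. Data Management Processes": [7],
    "VII. Data Quality Mechanisms and Controls": [8, 9, 10],
    "VIII. Links with National Reporting System": [11],
}

# Hypothetical responses recorded by the assessment team
responses = {1: "yes", 2: "partly", 3: "yes", 4: "yes", 5: "yes", 6: "no",
             7: "partly", 8: "no", 9: "no", 10: "partly", 11: "yes"}

SCORES = {"yes": 1.0, "partly": 0.5, "no": 0.0}

# Average score per functional area highlights where strengthening is most needed
area_scores = {
    area: sum(SCORES[responses[q]] for q in questions) / len(questions)
    for area, questions in FUNCTIONAL_AREAS.items()
}
for area, score in sorted(area_scores.items(), key=lambda kv: kv[1]):
    print(f"{score:.2f}  {area}")
```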
PROTOCOL 2: Trace and Verify Indicator Data
Step 5 (Trace and Verify Reported Results) is carried out at the M&E Management Unit (Phase 2), the Service Delivery Sites / Organizations (Phase 3) and the Intermediate Aggregation Levels, e.g. District, Region (Phase 4).
PURPOSE: Assess, on a limited scale, whether Service Delivery Points and Intermediate Aggregation Sites are collecting and reporting data accurately and on time.
The trace and verification exercise has two stages:
In-depth verifications at the Service Delivery Points; and
Follow-up verifications at the Intermediate Aggregation Levels (Districts, Regions) and at the M&E Unit.
DQA Protocol 2: Trace and Verification (worked example, number of patients on ARV)
Service Delivery Site monthly reports, each recounted from its source document (Source Document 1):
SDS 1: 45; SDS 2: 20; SDS 3: 45; SDS 4: 75; SDS 5: 50; SDS 6: 200.
District monthly reports aggregate the site reports:
District 1 (SDS 1 + SDS 2): 45 + 20 = 65
District 2 (SDS 3): 45
District 3 (SDS 4): 75
District 4 (SDS 5 + SDS 6): 50 + 200 = 250
The M&E Unit / National monthly report aggregates the district reports:
District 1: 65; District 2: 45; District 3: 75; District 4: 250; TOTAL: 435.
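To make the re-aggregation concrete, the sketch below recounts the example numbers from the diagram above up through the district and national levels and compares them with the reported totals. Expressing the comparison as a verified/reported ratio is an assumption made for illustration; the DQA's verification factor is described later in this deck as the % difference between reported and re-aggregated numbers.

```python
# Illustrative sketch (not part of the DQA tool itself): re-aggregate the example
# site counts and compare each level's reported total with the recounted total.

# Recounted (verified) numbers from source documents at each Service Delivery Site
site_counts = {"SDS 1": 45, "SDS 2": 20, "SDS 3": 45, "SDS 4": 75, "SDS 5": 50, "SDS 6": 200}

# Which sites report to which district
district_sites = {
    "District 1": ["SDS 1", "SDS 2"],
    "District 2": ["SDS 3"],
    "District 3": ["SDS 4"],
    "District 4": ["SDS 5", "SDS 6"],
}

# Totals as reported in the district and national monthly reports
reported_district_totals = {"District 1": 65, "District 2": 45, "District 3": 75, "District 4": 250}
reported_national_total = 435

def verification_factor(verified: int, reported: int) -> float:
    """Ratio of the recounted (verified) number to the reported number (assumed form)."""
    return verified / reported

# Re-aggregate upwards and flag any discrepancy at the district level
for district, sites in district_sites.items():
    recounted = sum(site_counts[s] for s in sites)
    reported = reported_district_totals[district]
    print(f"{district}: recounted={recounted}, reported={reported}, "
          f"verification factor={verification_factor(recounted, reported):.2f}")

# Re-aggregate to the national level and compare with the national report
recounted_national = sum(reported_district_totals.values())
print(f"National: recounted={recounted_national}, reported={reported_national_total}, "
      f"verification factor={verification_factor(recounted_national, reported_national_total):.2f}")
```

In this example every level re-aggregates to the reported total, so each verification factor is 1.00; in practice, differences point to data quality problems to be explained.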
SERVICE DELIVERY POINT: 5 TYPES OF DATA VERIFICATIONS

Verification no. 1, Description (in all cases): Describe the connection between the delivery of services/commodities and the completion of the source document that records that service delivery.
Verification no. 2, Documentation Review (in all cases): Review the availability and completeness of all indicator source documents for the selected reporting period.
Verification no. 3, Trace and Verification (in all cases): Trace and verify reported numbers: (1) recount the reported numbers from available source documents; (2) compare the verified numbers to the site-reported numbers; (3) identify reasons for any differences.
Verification no. 4, Cross-checks (if feasible): Perform cross-checks of the verified report totals with other data sources (e.g. inventory records, laboratory reports).
Verification no. 5, Spot checks (if feasible): Perform spot checks to verify the actual delivery of services or commodities to the target populations.
DQA Summary Statistics
[Figure: four bar charts by reporting level (IAL #1 to IAL #4, plus the total / M&E Unit): Total and Adjusted District Verification Factors from the DQA (0.95 to 1.22); % Available Reports from the DQA (0.50 to 0.92); % On Time Reports from the DQA (0.47 to 0.88); % Complete Reports from the DQA (0.50 to 0.72).]
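A minimal sketch of how the report-level statistics in these charts can be computed for one aggregation level, given a simple flag per expected report. The data layout and the choice of the number of expected reports as the denominator are assumptions made for illustration, not the official DQA calculation.

```python
# Illustrative sketch with an assumed data layout: compute % available, % on time
# and % complete reports for one aggregation level.
from dataclasses import dataclass

@dataclass
class Report:
    received: bool   # was the report found on file?
    on_time: bool    # submitted by the reporting deadline?
    complete: bool   # includes site name, reporting period, submitter and cumulative data

def summary_statistics(expected_reports: list[Report]) -> dict[str, float]:
    n = len(expected_reports)
    received = [r for r in expected_reports if r.received]
    return {
        "available": len(received) / n,
        "on_time": sum(r.on_time for r in received) / n,
        "complete": sum(r.complete for r in received) / n,
    }

# Example: 4 reports expected from the sites reporting to one district
reports = [
    Report(received=True, on_time=True, complete=True),
    Report(received=True, on_time=False, complete=True),
    Report(received=True, on_time=True, complete=False),
    Report(received=False, on_time=False, complete=False),
]
print(summary_statistics(reports))  # {'available': 0.75, 'on_time': 0.5, 'complete': 0.5}
```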
Illustration 1: Trace and Verification at the M&E Unit (HIV/AIDS)
Number of patients on ARV, as of 31 August 2006

1. Verification factor (% difference between the reported and re-aggregated numbers): 44.7% of the reported results could be recounted (21,449); 55.3% were unaccounted for (26,654).
2. Availability, completeness and timeliness of reports:
Availability: 33% available, 67% missing
Completeness*: 69% complete, 41% incomplete
Timeliness: no tracking of timeliness
* A report has to include (1) the name of the site; (2) the reporting period; (3) the name of the submitting person; (4) cumulative data.
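The sketch below is a minimal illustration of the completeness criteria in the footnote above; the MonthlyReport field names are assumptions for illustration, not part of the DQA tool.

```python
# Illustrative sketch: a report counts as complete only if it includes the site
# name, the reporting period, the submitter's name and cumulative data.
# Field names are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class MonthlyReport:
    site_name: Optional[str]
    reporting_period: Optional[str]
    submitted_by: Optional[str]
    cumulative_total: Optional[int]

def is_complete(report: MonthlyReport) -> bool:
    """Apply the four completeness criteria from the footnote above."""
    return all(value is not None for value in
               (report.site_name, report.reporting_period,
                report.submitted_by, report.cumulative_total))

# Example: missing submitter name -> counted as incomplete in the statistics above
print(is_complete(MonthlyReport("Site A", "2006-08", None, 1204)))  # False
```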
Illustration 3: Systems Findings at the M&E Unit (HIV/AIDS)

Reporting level: National M&E Unit

Finding: No specific documentation specifying data-management roles and responsibilities, reporting timelines, standard forms, or a storage policy.
Recommendation: Develop a data management manual to be distributed to all reporting levels.

Finding: The numbers reported by the M&E Unit could not be verified because too many reports from Service Points are missing (67%).
Recommendations: Systematically file all reports from Service Points. Develop guidelines on how to address missing or incomplete reports.

Finding: Most reports received by the M&E Unit are not signed off by any staff member or manager from the Service Point.
Recommendation: Reinforce the need for documented review of submitted data, for example by not accepting unreviewed reports.
Findings from DQAs
Data not collected routinely
Poor documentation of what was reported (source documents cannot be located; no filing system for easy retrieval)
Issues around double-counting
Integrity: incentives for over-reporting
Effect of staff turnover
Involving staff in M&E: definitions of indicators, value of data, data use
THANK YOU
References:
MEASURE Evaluation (University of North Carolina) / USAID: http://www.cpc.unc.edu/measure/tools/monitoring-evaluation-systems/data-quality-assurance-tools/RDQA%20Guidelines-Draft%207.30.08.pdf
WHO: http://whqlibdoc.who.int/hq/2003/WHO_V&B_03.19.pdf








MEASURE Evaluation is a MEASURE project funded by the
U.S. Agency for International Development and implemented by
the Carolina Population Center at the University of North Carolina
at Chapel Hill in partnership with Futures Group International,
ICF Macro, John Snow, Inc., Management Sciences for Health,
and Tulane University. Views expressed in this presentation do not
necessarily reflect the views of USAID or the U.S. Government.
MEASURE Evaluation is the USAID Global Health Bureau's
primary vehicle for supporting improvements in monitoring and
evaluation in population, health and nutrition worldwide.
