
Interpreting and Using Polls in

2014
Claudia Deane
Director, Research Practices
September 17, 2014 www.pewresearch.org
Value of polls in reporting
Most high-profile use: the election horserace

Much more important at Pew Research Center:
Scientifically represent public attitudes, values,
experiences
Add scope and meaning to anecdotes
Track society's changing attitudes
Give voice to those not always heard
Journalists play a critical role in vetting,
interpreting survey research
Just like any source, polls require vetting
But no peer review for many polls
Especially now, varying methodological
approaches and quality
Journalists continue to be gatekeepers
Basic (but challenging) journalistic question

Is the poll reliable enough to report?
This is getting harder to answer as survey research,
like journalism, is in flux





There's got to be something better??
Continuing proliferation of polls

Continuing decrease in response rates

Advent of the internet and internet-based surveys

Rise of cell phones

Advent of new techniques for interviewing

Changes in legacy media polls

Continuing increase in language diversity of nation
Are they still accurate?

[Chart: Average Candidate Error, Presidential Elections 1936-2012]
1936: 12     1956: 3      1976: 2      1996: 4.1
1940: 6      1960: 2      1980: 6.3    2000: 2.2
1944: 5      1964: 5      1984: 4.3    2004: 0.9
1948: 10     1968: 2.5    1988: 3.2    2008: 0.9
1952: 9      1972: 2.7    1992: 2.2    2012: 1.6

NOTE: Candidate error is the difference between a candidate's final poll standing and the final result
SOURCE: National Council on Public Polls
Questions to ask when writing about polls
AAPOR
1. Who paid for the poll and why
2. Who conducted the poll
3. How were interviews conducted
4. Number of interviews/margin of sampling error
5. How were people chosen (probability sample?)
6. What population is the poll trying to represent
7. Timing of poll
8. Question wording/order
9. Results based on full sample or subset
10. Were the data weighted, and if so, to what

To evaluate the quality of a poll, you need two things:

1. A copy of the topline/trend document: the actual questionnaire, including responses

2. The method box or survey methodology, with details on how the poll was conducted
Practically speaking
1. Who paid for the poll and why
2. Who conducted the poll

Questions to ask when writing about
polls
Usually easy to find

Washington Post-ABC News poll

This poll was conducted for The Washington Post and ABC
News by telephone April 17 to 21, 2013 among a random
national sample of 1,000 adults, including users of both
conventional and cellular phones. The results from the full
survey have a margin of sampling error of plus or minus
3.5 percentage points. Sampling, data collection and
tabulation by SSRS of Media, Pa. Produced for the
Washington Post by Capital Insight.
*= less than 0.5 percent
SOURCE: http://www.washingtonpost.com/blogs/fact-checker/post/a-misleading-obamacare-poll-courtesy-of-the-chamber-of-commerce-and-
harris-interactive/2013/07/30/26e5f51c-f94a-11e2-8e84-c56731a202fb_blog.html
Here, the polling company, Harris Interactive, and the sponsor, the U.S.
Chamber of Commerce, presented the data in a highly misleading way and
then made false claims about the type of poll that had been conducted.

The Chamber has been a fierce opponent of the health-care law, a.k.a.
Obamacare, and we frequently warn readers they should always be skeptical
of polls peddled by partisan organizations. Perhaps it should be no surprise
that this poll was released just as the GOP-led House of Representatives
scheduled a vote to repeal the law.

Given the way the data was presented, Republican lawmakers thought they
had been handed a gift and ended up with egg on their faces.

Kessler, The Washington Post

Possible Sponsors
Federal, state and local governments
Media organizations
Academic institutions
Non-profit groups or foundations
Special interest groups
Businesses and corporations
Political campaigns, consultants and candidates
Who paid for the poll and why?
Who conducted the poll?
1. Who paid for the poll and why
2. Who conducted the poll
3. How were interviews conducted
4. Number of interviews/margin of sampling error
Questions to ask when writing about
polls
Also usually easy to find

Washington Post-ABC News poll

This poll was conducted for The Washington Post and ABC
News by telephone April 17 to 21, 2013 among a random
national sample of 1,000 adults, including users of both
conventional and cellular phones. The results from the full
survey have a margin of sampling error of plus or minus 3.5
percentage points. Sampling, data collection and tabulation by
SSRS of Media, Pa. Produced for the Washington Post by
Capital Insight.
*= less than 0.5 percent
Relationship between sample size and margin of sampling error

[Chart: Margin of Sampling Error (+/-) by Sample Size]
N=50: MOSE = +/- 14 pts
N=1,000: MOSE = +/- 3 pts
N=2,500: MOSE = +/- 2 pts
N=10,000: MOSE = +/- 1 pt
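The shape of that curve comes from the standard margin-of-error formula at 95% confidence. A minimal sketch, assuming a simple random sample and worst-case variance at a 50/50 split (the `mose` helper is illustrative, not a Pew Research function):

```python
import math

def mose(n, p=0.5, z=1.96):
    """Margin of sampling error in percentage points at 95% confidence,
    for a simple random sample of size n, using worst-case p=0.5."""
    return z * math.sqrt(p * (1 - p) / n) * 100

for n in (50, 1000, 2500, 10000):
    print(f"N={n:>6,}: +/- {mose(n):.1f} pts")
```

Because the error shrinks with the square root of N, quadrupling the sample only halves the margin, which is why gains flatten out past a few thousand interviews. Real polls also carry a design effect from weighting that widens these idealized figures somewhat.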
1. Who paid for the poll and why
2. Who conducted the poll
3. How were interviews conducted
Questions to ask when writing about
polls
Modes of administration
Telephone
RDD = Random Digit Dial of landline and (often) cell
phones
Voter list-based phone surveys
In-person
Mail
The Internet?!


The Internet is the edge
Challenges
Obvious coverage issues
Less obvious, but bigger issue: How to get a random
sample?
Wide variety of methodologies being tried
Internet-based samples are often not
probability samples
Suggestion: this requires an extra look to understand
and report on what you're dealing with

1. Who paid for the poll and why
2. Who conducted the poll
3. How were interviews conducted
4. Number of interviews/margin of sampling error
5. How were people chosen (probability sample?)
Questions to ask when writing about
polls
Probability or non-probability?

Probability sample:
CAN extrapolate from your sample to a larger population ("Most Americans...")
CAN apply a margin of sampling error
E.g., a randomly selected sample of the national public

Non-probability sample:
CANNOT extrapolate to a larger population (yet)
Margin of sampling error cannot be computed
E.g., self-selected samples, click-through polls, most internet panels
Easy: Online click-in polls
SOURCE: From www.nationalreview.com, accessed 1/10/12.
Trickier: Hybrid Internet panels
The sample itself is random
But the population from which the
sample is drawn is made up of people
who have signed up to be members of
the panel

Question: In what cases can this represent
the full U.S. population?
Answer: Unresolved





Is Pew Research ever going to use the kind of online non-probability
panel that the Times and CBS are using?

Scott Keeter: Yes, we will but the real question is what we will use it for. Our
current standards permit the use of non-probability samples for certain
purposes, such as conducting experiments or doing in-depth interviews. In
addition, we have embarked on a program of research to help us better
understand the conditions under which non-probability samples can provide
scientifically valid data. We also are exploring how to utilize non-survey data
sources, which by their very nature tend to come from samples that are not
random. But until we understand the pros and cons of those methods a lot
better, we're going to be very cautious about incorporating them into our
research.

1. Who paid for the poll and why
2. Who conducted the poll
3. How were interviews conducted
4. Number of interviews/margin of sampling error
5. How were people chosen (probability sample?)
6. What population is the poll trying to represent
Questions to ask when writing about
polls
What population is the poll trying to represent?
All voters?
All conservatives?
All residents of Pakistan?

A thinking question, mainly, requiring 90% common
sense and 10% methodological savvy
So if you want to represent young people

[Chart: phone status by age; blue = cell, green = landline]
SOURCE: Washington Post-ABC News analysis; data from 2009-2012
1. Who paid for the poll and why
2. Who conducted the poll
3. How were interviews conducted
4. Number of interviews/margin of sampling error
5. How were people chosen (probability sample?)
6. What population is the poll trying to represent
7. Timing of poll
8. Question wording/order
Questions to ask when writing about
polls
Bad questions:
1. Are complex or presume information
2. Are leading, or unbalanced
3. Are double-barreled or double negative
4. Are loaded with emotional or red flag words
5. Give biasing or unequal information in the lead-in
to the question
Wording DOES Matter: Gov't Surveillance
1. Who paid for the poll and why
2. Who conducted the poll
3. How were interviews conducted
4. Number of interviews/margin of sampling error
5. How were people chosen (probability sample?)
6. What population is the poll trying to represent
7. Timing of poll
8. Question wording/order
9. Results based on full sample or subset
Questions to ask when writing about
polls
ASK IF HEARD A LOT OR A LITTLE AND INTERNET USER ((INT1=1 OR
INT2=1 OR INT3=1) AND (Q.HB1=1,2)) [N=897]:
Q.HB2 Do you think your own online personal information was put at
risk by the Heartbleed bug, or do you think your information was not
put at risk?

                                     Apr 23-27, 2014   All internet users
Own information put at risk                45                 29
Own information not put at risk            47                 30
Don't know/Refused (VOL.)                   8                  5
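Because Q.HB2 was asked only of the N=897 respondents who had heard of the bug, those figures carry a wider margin of error than the survey's full sample would. A quick illustration using the standard 95%-confidence formula (the 1,500-person full-sample size below is a stand-in for illustration, not this survey's actual N):

```python
import math

def mose(n, p=0.5, z=1.96):
    """95%-confidence margin of sampling error, in percentage points,
    for a simple random sample of size n (worst-case p=0.5)."""
    return z * math.sqrt(p * (1 - p) / n) * 100

# The Heartbleed question went only to the subset who had heard of it,
# so its margin of error is wider than the full sample's.
print(f"N=897 subset:      +/- {mose(897):.1f} pts")
print(f"N=1,500 full poll: +/- {mose(1500):.1f} pts")
```

This is why subgroup results, common in topline documents, deserve an extra sentence of caution in a story.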


1. Who paid for the poll and why
2. Who conducted the poll
3. How were interviews conducted
4. Number of interviews/margin of sampling error
5. How were people chosen (probability sample?)
6. What population is the poll trying to represent
7. Timing of poll
8. Question wording/order
9. Results based on full sample or subset
10. Were the data weighted, and if so, to what

Questions to ask when writing about
polls
Were the data weighted, and if so, to what?
CBS News/New York Times

The combined results have been weighted to adjust
for variation in the sample relating to geographic
region, sex, race, Hispanic origin, age, education and
number of adults in the household. Respondents in the
landline sample were also weighted to take account of
the number of telephone lines into the residence.
Weighting of poll data
It's necessary in almost all surveys
(Nearly) everyone does it
It corrects for the problem of interviewing some groups out of
proportion to their share of the population
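As a toy illustration of that correction: post-stratification weighting divides each group's population share by its share of the achieved sample, so underrepresented groups count for more. The shares below are invented for the example; real polls like the CBS News/New York Times survey rake across several variables at once (region, sex, race, age, education, phone lines).

```python
# Hypothetical one-variable example of post-stratification weighting.
population = {"men": 0.49, "women": 0.51}  # known population shares
sample = {"men": 0.44, "women": 0.56}      # shares achieved in the sample

# Weight for each group = population share / sample share
weights = {g: population[g] / sample[g] for g in population}

for g, w in weights.items():
    print(f"{g}: weight = {w:.2f}")
# Men are underrepresented here, so each male respondent gets a weight
# above 1; after weighting, the sample matches the population split.
```

A reporter rarely needs to compute weights, but knowing the mechanism makes the "weighted to adjust for..." boilerplate in a method box easy to parse.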
Good practices when writing about public opinion
Report the topline story
Then break it down: identify key subgroups, look for
interesting differences
Are the numbers changing over time?
Be cautious with causality
Are the results confirmed by other recent polls? If
not, write through it!
Use poll archives to validate conflicting claims about
public opinion
Use personal quotes/interviews to animate raw
numbers

Thank you
cdeane@pewresearch.org
