
11 questions journalists should ask about public opinion polls

Our new tip sheet outlines 11 questions journalists should ask to help them decide how to frame the findings of a public opinion poll, or whether to cover them at all.

Regardless of beat, journalists often write about public opinion polls, which are designed to measure the public’s attitudes about an issue or idea. Some of the most high-profile polls center on elections and politics. Newsrooms tend to follow these polls closely to see which candidates are ahead, who’s most likely to win and what issues voters feel most strongly about.

Other polls also offer insights into how people think. For example, a government agency might commission a poll to get a sense of whether local voters would support a sales tax increase to help fund school construction. Researchers frequently conduct national polls to better understand how Americans feel about public policy topics such as gun control, immigration reform and decriminalizing drug use.

When covering polls, it’s important for journalists to try to gauge the quality of a poll and make sure claims made about the results actually match the data collected. Sometimes, pollsters overgeneralize or exaggerate their findings. Sometimes, flaws in the way they choose participants or collect data make it tough to tell what the results really mean.

Below are 11 questions we suggest journalists ask before reporting on poll results. While most of this information probably won’t make it into a story or broadcast, the answers will help journalists decide how to frame a poll’s findings — or whether to cover them at all.

1. Who conducted the poll?

It’s important to know whether it was conducted by a polling organization, researcher, non-expert, political campaign or advocacy group.

2. Who paid for it?

Was the poll funded by an individual or organization that stands to gain or lose something based on the results?

3. How were people chosen to participate?

The best polls rely on randomly selected participants. Keep in mind that if participants were targeted in some way — for example, if pollsters went to a shopping mall and questioned people they encountered there — the results may be very different than if pollsters posed questions to a random sample of the population they’re interested in studying.
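For data-minded journalists, here’s a minimal Python sketch of the difference; the voter list and sample size are hypothetical, and real polls draw from carefully constructed sampling frames.

```python
import random

# Hypothetical sampling frame: every registered voter in the population
# of interest, identified here by placeholder IDs.
population = [f"voter_{i}" for i in range(100_000)]

# A simple random sample gives every voter the same chance of selection.
sample = random.sample(population, k=1000)

# By contrast, a convenience sample -- say, the first 1,000 people a
# pollster meets at a mall -- over-represents whoever happens to be there.
```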

4. How was the poll conducted? 

It’s important to find out whether participants filled out an online form, answered questions over the phone or spoke with an interviewer face to face. The method of collecting information can influence who participates and how people respond to questions. For instance, it’s easier for people to misrepresent themselves in online polls than in person: a teenager could claim to be a retiree.

5. What’s the margin of error? 

Be sure to ask about the margin of error, an estimate of how much the views of the sample may differ from the views of the population as a whole. When pollsters ask a sample of people which presidential candidate they prefer, they know the responses likely won’t exactly match the ones they’d get if they interviewed every voter in the United States. The margin of error is reported as plus or minus a certain number of percentage points.

Journalists covering tight political races should pay close attention to the margin of error in election polls. If a poll shows one candidate 2 percentage points ahead of another but the margin of error is plus or minus 3 percentage points, the trailing candidate could actually be in the lead. The Pew Research Center offers a helpful explainer on the margin of error in election polls.
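For a simple random sample, the margin of error at 95% confidence is commonly approximated as 1.96 × √(p(1 − p)/n), where p is the reported proportion and n is the sample size. Here’s a short Python sketch with hypothetical poll numbers; real pollsters adjust for weighting and survey design, so treat this as a back-of-the-envelope check.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion p
    from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical poll: 1,000 respondents, candidate A at 51%, candidate B at 49%.
moe = margin_of_error(0.51, 1000)
print(f"Margin of error: +/- {moe * 100:.1f} points")  # +/- 3.1 points

# A's support plausibly falls anywhere in 51 +/- 3.1 and B's in 49 +/- 3.1,
# so a 2-point lead is within the margin of error -- a statistical tie.
```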

6. Were participants compensated?

Offering people money or another form of compensation can affect both who participates and how they respond. Such incentives might encourage more lower-income individuals to agree to weigh in. Paid participants also may feel compelled to answer every question, even ones they aren’t sure about.

7. Who answered questions?

Were most participants white? Or female? A sample of primarily white, elderly, high-income women is likely to provide very different results than a sample that closely resembles the general population.

8. How many people responded to the poll? 

While there isn’t a perfect number of participants, larger samples generally yield more precise estimates. If pollsters want to know whether the American public supports an increase in military funding, interviewing 2,000 adults will likely provide a more accurate measurement of public sentiment than interviewing 200.
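Running the same back-of-the-envelope margin-of-error calculation from question 5 on these two hypothetical sample sizes shows why size matters (p = 0.5 is the worst case, giving the widest margin):

```python
import math

# 95% margin of error at p = 0.5, the worst case, for two sample sizes.
for n in (200, 2000):
    moe = 1.96 * math.sqrt(0.5 * 0.5 / n)
    print(f"n = {n:>4}: +/- {moe * 100:.1f} points")

# Output:
# n =  200: +/- 6.9 points
# n = 2000: +/- 2.2 points
```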

9. Can results be generalized to the entire public?

Journalists should be clear in their coverage whether the results of a poll apply only to a segment of the population or can be generalized to the population as a whole.

10. What did pollsters ask?

Knowing which questions were asked can help journalists check whether claims made about poll results are accurate. It also can help journalists spot potential problems, including vague terms, words with multiple meanings and loaded questions, which are biased toward a candidate or issue. Cornell University’s Roper Center for Public Opinion Research offers an example of a loaded question.

Request a copy of the questions in the order they were asked. Participants’ answers also can differ according to question order.

11. What might have gone wrong with this poll? 

Get pollsters to talk about possible biases and shortcomings that could influence results.

Want more info on public opinion polls? Check out these other resources:

  • The Journalist’s Resource has written an explainer on polls.
  • The Poynter Institute offers a free online course on understanding and interpreting polls.
  • FiveThirtyEight, a news site that focuses on statistical analysis, has updated its pollster rankings in time for the 2018 midterms. It gave six election pollsters a grade of A-plus: Monmouth University, Selzer & Co., Elway Research, ABC News/Washington Post, Ciruli Associates and Field Research Corp.
  • The American Association for Public Opinion Research provides numerous resources, including information on poll and survey response rates, random sampling and why election polls sometimes get different results.

For more, see our research chat with political scientist Michael Traugott about primary polls.
