Expert Commentary

5 basic things journalists need to know about polls and surveys

Want to learn about the benefits and pitfalls of poll and survey research? Read on for insights from a political scientist, a social psychologist, a statistician and an investigative journalist.

(Mohamed Hassan/Pixabay)

Political scientist Mike Binder has likened conducting polls and surveys to home-brewing beer. Both might seem relatively easy to do, but both require ample thought and planning if you want high-quality data — and beer.

A key piece of a survey or poll is its study sample. This is a subset of a population that researchers want to know more about, such as registered voters, parents who homeschool their children or older adults who avoid medical care. If the sample’s composition doesn’t closely resemble the larger group in terms of gender, race, education level and other factors, the answers that people in the sample give won’t represent the attitudes, opinions or experiences of the larger group.

“Much like [with] beer, things can go wrong,” Binder, faculty director of the Public Opinion Research Lab at the University of North Florida, told people gathered at a local brewery for a video magazine story about political polls a few years ago. “If you have a poor combination of ingredients, like too many Democrats or too many Republicans in a poll, it’s not going to be right.”

Researchers have warned the public not to assume all polls and surveys are conducted the same way, or to accept their results without some level of scrutiny. Because journalists use poll and survey data often in stories about economics, politics, public health and other areas of public policy, they should familiarize themselves with the basics of this type of research. They can start with these five things.

1. Survey and poll results are estimates, and sometimes rough estimates.

Even perfectly executed surveys and polls provide only estimates. The results are based on answers collected from a small segment of a population, so they will almost never be the same as the results researchers would have gotten had they asked every single person in the population the same set of questions.

Many other factors can further impact the accuracy of results, including leading questions, confusing questions and the order in which questions are asked. It’s important to keep in mind, too, that people may not answer every question truthfully. Some may simply “agree” with questions asking them to agree or disagree with a statement, regardless of how they actually feel about it. Researchers have a name for this problem: acquiescence bias.

People also tend to respond to questions about sensitive, controversial or intrusive subjects — for instance, alcohol consumption, illegal activity and how often they floss their teeth — in a way they consider to be socially acceptable. That sometimes means being dishonest. Researchers refer to this as social desirability bias.

The gold standard for conducting surveys and polls in the U.S.: In-person interviews with a random sample of people identified using the U.S. Postal Service’s master address file.

“That method yields the highest response rates, and people are remarkably thoughtful and honest when talking to interviewers face-to-face,” social psychologist Jon Krosnick, who studies survey research methods at Stanford University, told Knowable Magazine last month. “The government still does face-to-face interviewing for its most important surveys that produce numbers used widely by businesses, scholars studying the economy, economists and investors, and government agencies planning their actions.”

2. The best surveys and polls are transparent about their methodology and results.

Several national and international groups are pushing researchers, research universities and polling firms to share key details about their work at the same time they release their results.

“The best polls are transparent about their methods,” Kristen Olson, director of the University of Nebraska–Lincoln’s Bureau of Sociological Research, recently told Nebraska Today.

In 2014, the American Association for Public Opinion Research launched its Transparency Initiative. Members must commit to publicly reporting disclosure elements such as who sponsored the research, who conducted it, how the sample was created, how data were weighted and what procedures were used to ensure data quality.

The World Association for Public Opinion Research’s code of professional ethics and practices is broadly consistent with the Transparency Initiative’s disclosure standards.

538, a website that focuses on opinion poll analysis, ranks pollsters according to both their track record and methodological transparency. The four it has ranked highest are The New York Times/Siena College, ABC News/The Washington Post, Marquette University Law School and YouGov.

3. The margin of error is important.

Researchers note that journalists often make mistakes when reporting on poll and survey data because they misunderstand or ignore the data’s margin of error. The margin of error is a range of values that indicates the amount of uncertainty in a poll or survey result. It is an estimate of how much the result may differ from the result researchers would have gotten had every person in the population being studied participated.

For more details, read our tip sheet “The Margin of Error: 7 Tips for Journalists Covering Polls and Surveys.”

The U.S. Census Bureau, which conducts dozens of national surveys each year, publishes the margin of error with its survey results. In detailed reports, it also points out the margin of error for each data point.

Census Bureau statistician Lacy Loftin demonstrates how to interpret margin of error in a short video the agency posted to its website earlier this year. She explains that per capita income in the U.S. is an estimated $37,638, according to the 2021 American Community Survey. The margin of error is ±$102, so census officials are very confident that the true amount of the nation’s per capita income falls between $37,536 and $37,740.

Loftin notes that the larger the margin of error, the less precise the estimate is. For example, the Census Bureau estimates there were 13,620 kindergarten students in Montana in 2021. With a margin of error of ±840, the actual number of kindergarteners in Montana that year is very likely to be between 12,780 and 14,460, she adds.
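The arithmetic behind these ranges is simple: subtract the margin of error from the estimate for the low end and add it for the high end. A minimal sketch, using the two Census Bureau figures cited above:

```python
def interval(estimate, margin_of_error):
    """Return the (low, high) range implied by a point estimate
    and its margin of error."""
    return estimate - margin_of_error, estimate + margin_of_error

# 2021 ACS per capita income: $37,638 with a margin of error of ±$102
print(interval(37_638, 102))   # (37536, 37740)

# Montana kindergarten enrollment, 2021: 13,620 with a margin of error of ±840
print(interval(13_620, 840))   # (12780, 14460)
```

Reporting the full range, rather than the point estimate alone, makes the uncertainty in the figure visible to readers.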

Another thing worth noting: If a survey estimate involves a proportion, such as the proportion of voters who have a college degree, the margin of error is expressed in percentage points — not percent. For example, if a survey finds that 36% of registered voters have college degrees and the margin of error is ±3, that means plus or minus 3 percentage points, so the true figure very likely falls between 33% and 39%.

4. Researchers often use the terms “poll” and “survey” interchangeably, although they are technically different.

Although polls and surveys are different methods of collecting information, researchers, polling organizations and government officials often use the terms “poll” and “survey” interchangeably. Journalists should keep this in mind when reading reports and academic papers and when interviewing people who study or work in this field.

Polls are generally short, offer limited choices for answers and focus on a specific topic. Surveys, on the other hand, tend to be longer and ask more detailed questions, including open-ended questions requiring a written response. Surveys often ask about a variety of topics.

5. Knowing why people have a certain opinion can be more useful than knowing the percentage of people who hold that opinion at a single point in time.  

Investigative journalist Noah Pransky says surveys and polls are an underutilized tool for finding story ideas. This type of data, he notes, can reveal storylines a reporter might not have considered.

“It can open our eyes to silent majorities and underrepresented groups that may not typically see their voices amplified by the media — or social media,” Pransky, a former national political correspondent for NBC News, told The Journalist’s Resource in an email.

Survey data tipped him off to a story he reported in 2022 about an unusual U.S. Senate race in Utah. Democrats had endorsed a politically independent candidate running against a Republican incumbent.

Pransky added that knowing the reasons people hold certain views can be more helpful to a journalist than the share of the population that felt that way at the time of the poll or survey.

“A poll’s topline can be great at providing a snapshot of how a population feels at a specific moment in time,” he wrote. “But the big picture, including questions that delve deeper into why people feel a certain way about an issue or a candidate, can provide insight into why opinions may be shifting — and with as volatile as the toplines can be, I find the ‘why’ questions more valuable to gauging actual sentiment.”

Our other tip sheets on this topic

About The Author