How to be an informed consumer of polling and survey data

We live in a world awash with polling and survey data. Any time you read or watch the news, you are likely to encounter a story reporting the results of a public-opinion poll. That is especially true in 2020, being both an election year and a period of intense opinion-research activity related to the coronavirus.

The need to understand the world around us has never been greater. Survey data can provide some insights, but it’s important to be an informed consumer of such data. When you encounter polling and survey data in the news, here are a few questions you can apply to help determine its relevance and to enhance your understanding of what it might be saying. 

Who conducted the research? Most research reported in the media is conducted by specialized political-polling organizations, academic institutions, or market-research agencies. Some of these, like Gallup, Nielsen, or the Siena College Research Institute, may be familiar to you. If you are unfamiliar with the research firm, it’s worth looking it up online to see what kind of work it has done in the past, its areas of specialty, and the organizations with which it has partnered. This information, or the lack thereof, can give you a sense of the firm’s credibility.

Who sponsored it? Was the research sponsored or commissioned by an organization with strong partisan-political affiliations or a company with a vested financial interest in influencing public opinion on the subject of the survey? If so, that in itself doesn’t make the findings invalid, but it should prompt you to apply extra scrutiny to them. 

Who is reporting it? The credibility of the source reporting on polling data matters as much as the credibility of the pollster. A careless or inexperienced journalist can misinterpret findings or relate them in a misleading manner. Media outlets with an agenda might spin research findings or leave out important details to encourage conclusions that the research doesn’t support. A news article summarizing research findings should provide a link or the contact information of the research organization so you can go straight to the source. That is a good check on the accuracy of media reports, not to mention a way to access deeper data beyond the topline news summaries.

When was the data collected? Reports of survey research should include the dates the data was collected. In some cases, that won’t matter much. But in a fluid, dynamic situation like a tight political race or the current coronavirus pandemic, knowing when data collection occurred and understanding what might have happened to move public opinion since then is vital. Just a week or two could mean the difference between meaningful data and stale information that no longer applies. 

Who was interviewed? All polls and survey research set parameters for who is selected to participate. It might be broad, like “adults.” It could be narrow, like “company CEOs in the manufacturing sector.” You cannot fully appreciate data unless you know who provided it and interpret it in that context. For example, if a poll says the president has a 45-percent approval rating, it is important to determine if the poll was taken among registered voters, likely voters, or members of his own party.

What was the sampling process? Sampling refers to the process by which people are selected to participate in research, and it falls into two broad categories. A probability sample is one in which every member of the population being studied has a known chance of being chosen (in the simplest case, an equal chance). Non-probability sampling chooses participants based on convenience, the judgment of the researcher, participant self-selection, referral from other participants, or just haphazardly.

The methodology section of a research summary will document the sampling method used. Most major political polls use probability samples. Market-research studies might use either method. Research done with non-probability samples can be useful, but it is important to understand that you cannot statistically infer that the results from such studies apply to a broader population the way you can with a probability sample. If the analysis of a study conducted with a non-probability sample appears to be making statistical inferences, especially without a caveat about the sampling method, that is a red flag.
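
To make that distinction concrete, here is a minimal sketch in Python; the population of 10,000 hypothetical residents and the sample size of 500 are illustrative assumptions, not drawn from any actual study.

    import random

    # Hypothetical population of survey-eligible adults (illustrative only).
    population = [f"resident_{i}" for i in range(10_000)]

    # Probability sample: every member has a known, equal chance of selection.
    probability_sample = random.sample(population, k=500)

    # Non-probability (convenience) sample: whoever is easiest to reach --
    # here, simply the first 500 names on the list.
    convenience_sample = population[:500]

The contrast is the point: the second list can systematically over-represent whoever happens to be easiest to reach, which is why its results cannot be projected onto the full population with a stated margin of error.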

What is the margin of error? Survey research done with probability sampling will have a margin of error, usually expressed as “+/- X percentage points at the 95 percent confidence level.” That means the true figure in the broader population is likely to fall within the given number of points of the reported result. So, if 55 percent of poll respondents say they will vote for a certain candidate and the margin of error is +/- 5 points, the actual figure could be anywhere from 50 to 60 percent among the broader population. (The 95 percent confidence level means that if the same survey were conducted many times, the resulting ranges could be expected to contain the true population figure about 95 percent of the time.) A major national political poll will usually have a margin of error of about +/- 3 points, while many market-research studies will be about +/- 5 points. The important thing to keep in mind is that reported survey percentages are midpoints within a statistically likely range.
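
For readers who want to see where those figures come from, the sketch below applies the standard normal-approximation formula for a proportion estimated from a simple random sample; the sample sizes shown are illustrative assumptions.

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        """Approximate 95% margin of error, in percentage points, for a
        proportion p estimated from a simple random sample of size n."""
        return 100 * z * math.sqrt(p * (1 - p) / n)

    print(round(margin_of_error(1_000), 1))  # ~3.1 points, typical national poll
    print(round(margin_of_error(400), 1))    # ~4.9 points, smaller market study

Because the error shrinks with the square root of the sample size, quadrupling the number of respondents only cuts the margin of error roughly in half.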

How were the questions worded? The wording of questions matters a lot in survey research. Poorly worded questions can confuse respondents and make results meaningless. The phrasing of questions can also introduce bias that will skew results. Consider these three versions of a survey question:

a. Do you support New York State’s social-distancing restrictions related to the COVID-19 pandemic?

b. Do you support New York State’s efforts to save lives during the COVID-19 pandemic?

c. Do you support New York State’s forcible closure of businesses and much of the economy during the COVID-19 pandemic?

Technically, all three of those questions are asking about the same issue, but the latter two are clearly nudging the respondent to answer in certain ways. When possible, look at the verbatim wording of the questions that were asked of respondents and check for potential sources of confusion or bias.

How do the results track with other recent research done on the same topic? If something about the findings seems questionable, search other research on the topic that was conducted recently. Are the findings consistent? If not, look for differences in methodology or the timing of the study that might account for them. 

Are your own biases affecting your interpretation of the data? Sometimes the biggest obstacle to learning valuable new information is ourselves. “Confirmation bias” refers to the human tendency to give credence to information that supports our pre-existing beliefs and to disregard information that contradicts them. We all exhibit confirmation bias sometimes, but if we are aware of it and keep an open mind, we can overcome it. Resist the temptation to reflexively dismiss data that makes you uncomfortable. Instead, apply the questions above to the findings and consider the new possibilities and perspectives open to you if the insights are indeed accurate.

Vance Marriner is research director at the Central New York Business Journal and a part-time instructor of marketing at SUNY Oswego’s School of Business.
