The New York Times is doing something truly groundbreaking this election cycle: publishing live polling data in the run-up to the American midterms. As a statistician, I am very excited about this. It's super-cool and will provide some really interesting insights, both into the elections themselves and into political polling.
But maybe you’re not a stats nerd. And maybe you’re not even especially invested in the US elections. This is still a cool story. Because it demonstrates that data collection and analysis can be incredibly challenging. Even for a massive organization with tons of money and highly skilled scientists.
So maybe it’s ok that your organization sometimes struggles with data. Maybe you can give yourself a break next time your data wrangling doesn’t go quite to plan. Statistical analysis can be incredibly valuable. But it’s not easy, and we all need to remember that.
So… Live Polling? Really?
Yup. You can follow the action on the New York Times live polling site.
Not clear on why this is such a big deal?
Traditionally, polls are conducted, data analyzed, and then the results are presented. Publishing live results is basically pulling back the curtain and letting the audience in on what’s happening:
- Who is being polled
- Who is/isn’t responding
- What assumptions are being made
That’s all really interesting from a transparency perspective. You get to see exactly how the analysts there arrive at the conclusions they do. But another thing live polling shows is just how much work goes into polling — and how hard it can be to get the data you need.
Response Rates Are Not What You Think
Do you remember the show The West Wing? There was an episode (Lies, Damn Lies, and Statistics) where the team was waiting for the results of a poll on the president’s popularity. In one scene, Sam enters his office, and his assistants ask why polling takes so long. Sam replies that it takes about 6,000 calls to get 1,500 responses. Ah, if only.
It’s a lot harder than many of us realize to actually get answers to a telephone poll. Response rates can be incredibly low. Check out the NYT poll of Pennsylvania’s 7th Congressional District.
Pollsters talked to 539 voters after making more than 23,000 phone calls.
Check out the map. The white circles represent people who didn’t answer. (They either didn’t pick up the phone or declined to respond to the questions.) Getting 539 responses from 23,562 phone calls means that only 2.3 percent of calls resulted in someone participating. Broken down further:
- Only 1 in 20 landline calls was successful, compared to 1 in 36 cell phone calls
- Just 1 in 29 calls to white voters yielded a response, compared to 1 in 44 for non-white voters
- And 1 in 24 calls to seniors (65+) yielded a response, compared to 1 in 36 for 18-29-year-olds
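The arithmetic behind those figures is simple enough to check yourself. Here's a quick sketch using the call and response counts from the article (the "1 in N" framing is just the reciprocal of the response rate, rounded):

```python
# Sanity-checking the response-rate arithmetic from the
# Pennsylvania 7th District poll (counts taken from the article).
calls = 23_562
responses = 539

overall_rate = responses / calls
print(f"Overall response rate: {overall_rate:.1%}")  # → 2.3%

def one_in(rate):
    """Convert a response rate to the 'roughly 1 in N calls' form."""
    return round(1 / rate)

print(f"That is roughly 1 in {one_in(overall_rate)} calls")  # → 1 in 44
```

Compare that to Sam Seaborn's 1-in-4: the real-world rate here is more than ten times worse.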
Not Everyone Who Responds Has an Answer
The grey circles on the NYT live polling maps are also interesting. They represent the undecideds. And a significant number of people (nearly half) had no opinion when asked about the candidates.
Both nominees to replace longtime Republican Congressman Charlie Dent are still unknown to a large swath of voters in this district.
Democrat Susan Wild
- 34% favorable
- 19% unfavorable
- 47% don't know

Republican Marty Nothstein
- 30% favorable
- 22% unfavorable
- 48% don't know
The Next Step Isn't Much Easier
Polling isn’t easy. (Live polling really isn’t!)
I haven’t even gotten into the complexities that come with conducting a survey. How do you decide whom to contact? How do responses correlate with turnout? What about weighting the results? Should they be weighted, and if so, how? How do you convey your level of confidence in your conclusions?
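To give a flavor of what weighting involves: one common approach is post-stratification, where each respondent is weighted so that the sample's demographic mix matches the electorate's. This is a minimal, illustrative sketch only; the group names and shares below are made-up numbers, and the Times describes its own, more sophisticated scheme in its methodology write-up.

```python
# Illustrative post-stratification weights (hypothetical numbers,
# not the Times's actual method or data).
sample_share = {"18-29": 0.10, "30-64": 0.55, "65+": 0.35}      # share of respondents
population_share = {"18-29": 0.20, "30-64": 0.55, "65+": 0.25}  # share of electorate

# Each respondent in a group gets weight = population share / sample share,
# so under-represented groups (here, young voters) count for more.
weights = {g: population_share[g] / sample_share[g] for g in sample_share}
for group, w in weights.items():
    print(f"{group}: weight {w:.2f}")
```

In this made-up example, each 18-29-year-old respondent would count twice, while each senior would count for about 0.71 of a response, to compensate for who actually picked up the phone.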
The Times offers some in-depth insight into how the polls are being conducted and how their experts chose the methods they did. (It’s definitely worth reading if you want to gain more understanding about the intricacies of polling.) I’ve also talked a few times before about communicating uncertainty in your results — and why that matters.
Have an opinion on live polling? We’d love to hear it. Want to learn more about collecting and analyzing survey data? We can help. At Datassist, our goal is to help nonprofits and journalists alike tell compelling, honest data stories. Get in touch to discuss your project and how we can help.