In a media landscape filled with data, it remains a struggle to accurately understand and effectively communicate uncertainty in journalism.
Journalism both consumes and produces uncertainty — this was one of the major themes at the 2017 NICAR (National Institute for Computer-Assisted Reporting) conference that I was lucky enough to attend a few weeks ago. The conference is a calendar highlight for data journalists across the US.
The event was dominated by journalists, and, as a P.Stat., I was struck by how closely their conversations mirrored those in my own field. There is obvious potential for partnership between the statistical and journalism communities in understanding and communicating uncertainty in journalism.
We Can’t Eliminate Uncertainty
Statisticians and data journalists face a common frustration: people's desire for certainty. We long for a "sure thing." As a statistical consultant, I'm frequently approached by clients hoping for bulletproof data; they want to rid themselves of uncertainty. Likewise, journalists must walk the line between million-click headlines that proclaim certainty and more nuanced (and more accurate) headlines that moderate expectations of certainty.
When clients come to me hoping to eliminate the uncertainty in their numbers, I have to explain that statistical analysis is not about removing uncertainty, but rather, about understanding it.
As the public’s hunger for statistics increases, a variety of new ways to communicate uncertainty to general audiences are being developed and tried:
- Showing ranges of possible options
- Including confidence intervals in published charts and tables
- Showing a variety of results based on random simulations
Unfortunately, research indicates that most of these methods aren’t successfully conveying uncertainty concepts to readers in the pieces they’re consuming.
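As a minimal sketch of the third approach, here is how a newsroom might simulate repeated polls of the same population to show readers how much results vary from sampling alone (the 52% support figure, sample size, and simulation count are illustrative assumptions, not from any real poll):

```python
import random

def simulate_poll(true_support=0.52, n_respondents=1000, n_simulations=1000, seed=42):
    """Simulate repeated polls of the same population to show
    how much results vary purely from random sampling."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_simulations):
        # Each respondent independently supports the candidate
        # with probability `true_support`.
        support = sum(rng.random() < true_support for _ in range(n_respondents))
        results.append(support / n_respondents)
    return results

results = sorted(simulate_poll())
# The middle 95% of simulated results approximates the interval
# readers usually see reported as a "margin of error".
low, high = results[24], results[-25]
print(f"Simulated polls ranged from {min(results):.1%} to {max(results):.1%}")
print(f"95% of simulations fell between {low:.1%} and {high:.1%}")
```

Showing readers a spread of plausible outcomes like this, rather than a single headline number, is exactly the kind of framing the list above describes.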
A 20th Century Message for Prehistoric Brains
My friend and colleague, Alberto Cairo, suggested at the NICAR conference that our struggle to successfully communicate uncertainty in journalism might stem from disconnects between the way we think and the tools we use. He says:
“We are trying to communicate a 20th-century message with 19th-century tools to prehistoric brains that like binary things.”
Cairo is still developing his theory, but the premise builds on Michael Friendly's idea of the Golden Age of Statistical Graphics: many core data visualization techniques were developed during a "golden age" in the second half of the 19th century, when innovation in statistical graphics was at its peak.
That golden age was followed by a peak of a different kind: inference. Fisher's methods, sampling theory, p-values, and many other valuable inference tools were developed during this period, which coincided with a "dark age" for visualization. The earlier visualization techniques, focused on counting, were left behind as statisticians embraced the age of inference, the age of uncertainty.
When a second golden age of data viz followed and data experts began working out how to visualize for the age of inference, Cairo suggests, news and journalism professionals joined the data world on a parallel track. They relied on methods developed in the first visualization age. Those established (and, in certain circumstances, useful) tools were inadequate for conveying the results of inference, and as a result journalists struggled to convey uncertainty in their work.
New Approaches for a New Audience
Often, the best practices and accepted methods for conveying uncertainty that we use in the statistics profession don't translate well to wider audiences. Another, possibly more effective, option is to communicate uncertainty in terms people are already familiar with: rolls of a die rather than odds ratios, for example.
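One concrete version of this idea is "natural frequency" framing: re-expressing a probability as "about X in Y" instead of a percentage or odds ratio. A minimal sketch (the function name and the default denominator of 20 are my own illustrative choices):

```python
def as_natural_frequency(probability, out_of=20):
    """Re-express a probability as 'about X in Y', a framing
    readers tend to grasp more readily than odds ratios."""
    count = round(probability * out_of)
    return f"about {count} in {out_of}"

# A 15% chance, framed as a frequency:
print(as_natural_frequency(0.15))           # about 3 in 20
# A 1-in-6 chance reads like a die roll:
print(as_natural_frequency(1/6, out_of=6))  # about 1 in 6
```

Choosing a small, familiar denominator (6 for dice, 20, 100) keeps the framing concrete without implying false precision.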
But even journalists who understand and avoid pitfalls like the ecological fallacy or a misread margin of error are fighting an uphill battle when they try to convey uncertainty in their work, because people don't like uncertainty.
The New York Times piece, How Not to Be Misled by the Jobs Report, is a great example of handling uncertainty in journalism. The article demonstrates how statistical noise, combined with our inability to perceive randomness, means the Bureau of Labor Statistics' monthly jobs report can leave us all elated or defeated for very little reason at all.
Whatever methods we apply, one thing is certain: understanding and communicating uncertainty in journalism is a pressing issue, and one that would benefit from collaboration with the statistical community.
Need help better understanding (or communicating) the uncertainty in your data journalism? The data analysts and viz experts at Datassist are at your service. Get in touch now to see what we can do for you.