December 14, 2017
Advances in survey research make reporting polls a challenge for newsrooms, research shows
By Jessica Mahone, Democracy Fund Public Square Research Associate
Conversations between academic researchers and local newsrooms are often siloed. Interesting and useful findings are trapped in long papers that people in local newsrooms may not know about or have time to sift through. This series, Research Roundup, highlights key findings from recent research and what they could mean for those working in local news.
This month on Research Roundup, we’re featuring research on best practices for reporting on public opinion polls and how accurate infographics benefit the credibility of news organizations.
Data and charts and numbers, oh my!
Coming out of two high-profile elections in Virginia and Alabama and heading into what is shaping up to be an eventful 2018 midterm season, we look at two pieces of research about data reporting and visualization from the most recent issues of the Newspaper Research Journal and Journalism. Both papers are behind a paywall but can often be accessed by contacting the authors directly.
An important note for those who want more specific details: some identifying information, such as the names of news organizations or locations, is not included in the papers in order to keep the peer review process blind. The goal is to ensure that reviewers cannot identify the authors and instead judge the work on its quality rather than on who wrote it.
Survey Says: Newsroom Practices in Reporting Public Opinion Polls
Through a series of 41 in-depth interviews with political journalists, media analysts, and public opinion pollsters, the author finds that journalists increasingly rely on poll aggregator sites such as FiveThirtyEight for precise predictions of elections. Building on this, the author also found:
- Newsrooms increasingly lack the ability and in-house expertise to assess the quality of polls and to enact gatekeeping standards around polls.
- Those working in political reporting say that overemphasis on pre-election forecasts has decreased trust not only in public opinion research but also in the journalism that draws on it, something reflected, somewhat ironically, in polling earlier this year.
As discussed at the IRE and NICAR conference this year, newsrooms face a difficult challenge in improving their understanding and reporting of public opinion data. However, this paper points to two things even small newsrooms can do to improve their reporting of polls:
1) Newsrooms can avoid overemphasizing any one poll or change in the polls, particularly when results are within the margin of error. For example, in the days just before Virginia’s gubernatorial election, Monmouth University released a poll showing a two-point lead for Democrat Ralph Northam. However, the margin of error, which indicates how far a survey estimate is likely to be from the true value, was 3.7 percentage points. Because the lead was smaller than the margin of error, the race was best understood as a statistical tie, despite the poll’s own headline of a two-point lead (a short sketch of this comparison follows the list).
2) Newsrooms should never publish results from a poll if there is no information about who paid for it, regardless of who conducted it. This information is typically included in the methodology section of a survey report, usually at the end.
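To make the margin-of-error arithmetic concrete, here is a minimal Python sketch of the comparison described in the first point. The helper function and the sample size of 700 are illustrative assumptions (a simple random sample with the conservative p = 0.5), not details taken from the Monmouth poll itself; the takeaway is simply that a 2-point lead inside a roughly 3.7-point margin is best read as a tie.

```python
import math

def margin_of_error(sample_size, p=0.5, z=1.96):
    """Approximate 95% margin of error (in percentage points) for a simple
    random sample, using the conservative p = 0.5 assumption."""
    return z * math.sqrt(p * (1 - p) / sample_size) * 100

# Numbers roughly matching the example in the article:
lead = 2.0           # reported lead, in percentage points
reported_moe = 3.7   # margin of error reported with the poll

# If the lead is smaller than the margin of error, the race is better
# described as statistically tied than as a clear lead.
if lead < reported_moe:
    print(f"A {lead}-point lead is within the ±{reported_moe}-point margin: treat as a tie.")
else:
    print(f"A {lead}-point lead exceeds the ±{reported_moe}-point margin.")

# For reference, the margin implied by a hypothetical sample of 700 respondents:
print(f"MoE for n=700: ±{margin_of_error(700):.1f} points")  # about ±3.7 points
```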
For additional pointers on reporting public opinion data, Harvard’s Shorenstein Center offers this guide on polling basics, and Nieman Reports has put together a toolbox for political reporting that goes beyond just reporting facts and figures.
Seeing Isn’t Always Believing: Misleading Infographics and Credibility
Infographics play an important role in conveying complex data to audiences. Sometimes, however, infographics can be misleading, depicting information in a way that is inconsistent with the underlying numbers. To understand how audiences make sense of misleading graphics, the authors conducted an experiment with 239 students to find out how much individuals remember from such graphics and how credible they find them. They found:
- Audiences spend more time reading misleading infographics than they do other graphics, and they are more likely to remember the information from misleading graphics.
- However, audiences also rate the credibility of misleading infographics as lower than other graphics.
It’s important to note that misleading graphics are not graphics that present false information; the numbers are accurate but displayed in a distorted way, and more often than not the distortion is unintentional. Still, even though information from misleading graphics is more memorable, care should be taken to present data transparently. Three easy ways to make sure graphics add up for readers (a short sketch illustrating them follows the list):
1) Check the math. When a pie, bar, or area chart shows parts of a whole, make sure the percentages add up to 100.
2) Label axes clearly, and never truncate them; cutting an axis short can make differences between groups look larger than they really are.
3) Make notes that help readers make sense of the data.
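For newsrooms that build charts in code, a minimal Python sketch of these three checks might look like the following. The data, labels, sample-size note, and file name are hypothetical and exist only to illustrate the points above.

```python
import matplotlib.pyplot as plt

# Hypothetical survey results (percentages), used purely for illustration.
labels = ["Support", "Oppose", "Unsure"]
values = [52.0, 41.0, 7.0]

# 1) Check the math: parts of a whole should sum to 100 (allow rounding slack).
total = sum(values)
assert abs(total - 100) < 0.5, f"Percentages sum to {total}, not 100"

# 2) Label axes clearly and keep the baseline at zero so differences
#    between groups are not visually exaggerated.
fig, ax = plt.subplots()
ax.bar(labels, values)
ax.set_ylim(0, 100)  # no truncated axis
ax.set_ylabel("Share of respondents (%)")
ax.set_title("Hypothetical poll results")

# 3) A note helps readers make sense of the data.
ax.annotate("n = 700 registered voters (illustrative)", xy=(0.02, 0.95),
            xycoords="axes fraction", fontsize=8)

plt.savefig("poll_chart.png", dpi=150)
```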
There are a number of tools that make creating compelling infographics simple and straightforward. Poynter has a list of tools for data journalists that includes visualization and mapping tools, such as Google Fusion Tables and Tableau Public, that are easy to use for those new to infographics and data visualization, and the Journalist’s Toolbox has a comprehensive list of data visualization tools.
Of course, not all election reporting is (or should be) polls, and data visualization isn’t just for political reporting. For examples of creative, collaborative election storytelling, check out Voting Block NJ, and for examples of storytelling with data, take a look at the stories included alongside API’s list of topics that benefit from data reporting and visualization.
Do you have a project or paper that the audience of the Local News Lab would be interested in? Is there something you want to know more about from research related to local news? Let us know at localnewslab@democracyfund.org.
Jessica Mahone is the Research Associate for the Public Square program at Democracy Fund. Previously, she was a researcher with the News Measures Research Project at Duke University and a temporary Research Associate in journalism at Pew Research Center. Her research interests are varied but center on local news, civic engagement, and diversity in media.