Facing Facts (and Avoiding Alarmism)

This morning, I saw a Reuters story with an alarming lede: “Only a quarter of U.S. adults in a recent survey could fully identify factual statements—as opposed to opinion—in news stories, the Pew Research Center found in a study released on Monday.”

Well, that doesn’t sound good, I thought.

This seems like important territory for history instruction to address. It might also provide a useful reading for students. So I pulled up the Pew report in question: “Distinguishing Between Factual and Opinion Statements in the News” (by Amy Mitchell, Jeffrey Gottfried, Michael Barthel and Nami Sumida, dated June 18, 2018).

There’s a lot to like about the way this study was conducted. But I have concerns.

The researchers surveyed 5,035 U.S. adults over a two-week period. In the most important part of the study, they presented each participant with twelve statements and asked them to classify each one as “a factual statement (whether you think it is accurate or not)” or “an opinion statement (whether you agree with it or not).”

Five were statements the researchers considered clearly factual, albeit controversial, like “Health care costs per person in the U.S. are the highest in the developed world.” Five were statements they considered clearly opinion, whether true or false, like “Democracy is the greatest form of government.” And two were statements they considered “borderline”: “Applying additional scrutiny to Muslim Americans would not reduce terrorism in the U.S.” and “Voter fraud across the U.S. has undermined the results of our elections.” (The report simply ignores these statements in the main results.)

This isn’t terrible methodology, overall, and the study has some interesting findings. For example, respondents were much more likely to get the answers right if they showed “a lot of trust in news media” or were “very digitally savvy” (based mostly on their frequency of Internet use). And if you look further into the details of the report, there are a lot of other cool things happening. I recommend reading it.

My concerns about the researchers’ approach are significant, however.

First, the word “factual” is ambiguous in ordinary usage. It can mean “pertaining to claims of fact” or “factually accurate,” and people are far more likely to use the term in the latter sense. So when the researchers asked participants to identify a statement as factual “whether you think it is accurate or not,” they were asking them to do something counter-intuitive. I suspect this skewed the results: some people probably got an answer wrong because of the code-switching involved, not because they really could not distinguish between fact and opinion.

The only way I see to avoid this problem is to use different wording—instead of asking whether a statement is “factual,” researchers might ask whether it is a “claim of fact,” or something like that.

My second concern is about the statements themselves. Some could reasonably be interpreted differently from the way the researchers see them. For example, I object to calling this a “clearly” factual statement: “Immigrants who are in the U.S. illegally have some rights under the Constitution.” Yes, it can be a factual statement, and perhaps that is how it should be seen. But when non-lawyerly Americans talk like that in ordinary life, they typically mean something normative or aspirational—something more like “To follow the spirit of the Constitution, we should recognize certain rights” or “The government should recognize that the Constitution confers certain rights, whether or not it currently does.”

(It would be easy enough to fix this statement. “Federal courts have ruled that immigrants who are in the U.S. illegally have some rights under the Constitution” would clearly be a claim of fact.)

Overall, the ambiguity of statements like this should not be an enormous problem. Respondents who understand the difference between fact and opinion should still mostly get the answers right. But this brings me to my third concern.

The study’s headline-grabbing top paragraphs and prepared graphics emphasize one measure in particular: the percentage of respondents who correctly classified all five of the “clearly factual” or “clearly opinion” statements. The report disparages any other level of success. The key passage: “The main portion of the study … found that a majority of Americans correctly identified at least three of the five statements in each set. But this result is only a little better than random guesses. Far fewer Americans got all five correct, and roughly a quarter got most or all wrong.”

Not surprising, then, that Reuters misleadingly says the study found that “only a quarter of U.S. adults” can “fully” tell fact from opinion.

In fact, 50% of respondents got at least four of the five factual statements right, and 59% got at least four of the five opinion statements right. Those are not bad results, considering how easy it would be to interpret at least one of the statements differently from the way the researchers intended.

And although getting three of five statements right may technically be “a little better than random guesses,” the same numbers could just as fairly be framed the other way: with about half of respondents getting at least four of the five correct, typical performance was only a little worse than 80% accuracy.
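It also helps to make the “random guesses” baseline concrete. Here is a minimal sketch, assuming a pure guesser independently flips a fair coin on each of the five statements (a simplification, since real guessing need not be fifty-fifty):

```python
from math import comb

def guess_at_least(k, n=5, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p): the chance that someone
    classifying each statement by coin flip gets at least k of n right."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

for k in (3, 4, 5):
    print(f"at least {k} of 5 by guessing: {guess_at_least(k):.1%}")
# at least 3 of 5 by guessing: 50.0%
# at least 4 of 5 by guessing: 18.8%
# at least 5 of 5 by guessing: 3.1%
```

By that baseline, a coin-flipper gets at least three of five right fully half the time, so clearing that bar is indeed weak evidence. But the 50% and 59% of respondents who got at least four right far exceed the roughly 19% a guesser would manage, which the report’s framing obscures.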

Indeed, in the appendices, the report makes clear that most statements were classified correctly by about three-quarters of respondents. Of the ten main statements, eight were classified correctly by 68-80% of participants. (The lowest correct response rate—for the ambiguous statement about illegal immigrants’ constitutional rights—was 54%.)

This study is legitimate cause for concern. But we should be careful not to overstate what its results mean.