
Academic studies of political fact-checking


Fact Check was Ballotpedia's fact-checking project that ran from October 2015 through October 2018. These pages have been archived.


This article provides brief summaries of academic studies of political fact-checking written between 2010 and 2015. It includes both peer-reviewed and popular publications.

Articles are arranged chronologically. To skip to a specific year, click one of the following links:
2015 | 2014 | 2013 | 2012 | 2011 | 2010

In the articles below, fact-checking researchers examine whether fact checkers are subjective in selecting claims to verify or partisan in their investigations and conclusions. Some research indicates that prestige, rather than public demand, is the primary factor motivating media outlets to begin fact-checking. Other topics of interest to researchers include elected officials' responses to fact-checking, how preconceived views affect a reader's conclusions about fact checks, and whether fact-checking corrects public misperceptions. One study found that fact checks that attempt to correct public misperceptions may instead solidify a previously held conviction that the statement was true.

2015

"Revisiting the Epistemology of Fact-Checking"

Jan. 2, 2015
Michelle A. Amazeen, Rider University

Amazeen, M.A. (2015). "Revisiting the Epistemology of Fact-Checking." Critical Review, 27(1), 1-22.

Michelle A. Amazeen, assistant professor of advertising at Rider University, challenged "The Epistemology of Fact Checking," a paper by Joseph E. Uscinski and Ryden W. Butler that viewed journalistic fact-checking as naïve and unscientific.

Amazeen argued that Uscinski and Butler drew their anecdotes from an unrepresentative sample of fact checkers. She contended that dedicated fact checkers like PolitiFact or FactCheck.org should represent journalistic fact-checking in research on fact checkers' epistemology, and she offered anecdotes of her own to illustrate epistemologically sound fact-checking.

To bolster her argument, Amazeen turned to an empirical study she conducted of America's three major dedicated fact checkers: FactCheck.org, PolitiFact and The Washington Post Fact Checker. She examined ads from the 2008 and 2012 election cycles for cases in which two or more of the fact checkers examined the same claim.

Sorting claims into two groups, those found true and those found false to some degree, Amazeen noted that the fact checkers overwhelmingly agreed when claims were deceptive. The weakest pairwise agreement, between The Washington Post Fact Checker and PolitiFact, was still 95 percent. To establish that the results were not the consequence of chance, Amazeen calculated Krippendorff's alpha for her results. She said the result, 0.66, was above the minimum required.

From those results, Amazeen inferred that fact checkers rated claims accurately, reasoning that reaching the same conclusion from different vantage points suggests the fact checkers are right.
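
Krippendorff's alpha measures inter-rater agreement while correcting for the agreement expected by chance, which is why Amazeen reported it alongside raw percentages. As an illustrative sketch only (not Amazeen's code), the nominal-data version of the statistic can be computed in Python as follows:

    from collections import Counter
    from itertools import permutations

    def krippendorff_alpha_nominal(units):
        """Krippendorff's alpha for nominal codes.

        `units` is a list of units (e.g., claims), each a list of the
        codes assigned to that unit by the raters who rated it.
        """
        # Coincidence matrix: every ordered pair of codes within a unit
        # contributes 1/(m - 1), where m is the number of raters for it.
        coincidences = Counter()
        for unit in units:
            m = len(unit)
            if m < 2:
                continue  # units with fewer than two raters are unpairable
            for a, b in permutations(unit, 2):
                coincidences[(a, b)] += 1.0 / (m - 1)

        n = sum(coincidences.values())  # total number of pairable values
        marginals = Counter()
        for (a, _b), weight in coincidences.items():
            marginals[a] += weight

        # Observed vs. chance-expected disagreement (nominal level)
        d_observed = sum(w for (a, b), w in coincidences.items() if a != b) / n
        d_expected = sum(marginals[a] * marginals[b]
                         for a in marginals for b in marginals
                         if a != b) / (n * (n - 1))
        return 1.0 - d_observed / d_expected

    # Toy data: two fact checkers rating the same four claims
    ratings = [["false", "false"], ["false", "false"],
               ["true", "true"], ["true", "false"]]
    print(round(krippendorff_alpha_nominal(ratings), 2))  # 0.53

Note that raw percent agreement on the toy data is 75 percent, while alpha is only 0.53: agreement alone overstates reliability when one rating category dominates, a point that resurfaces in Uscinski's rejoinder below.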


"Agent-based modeling (ABM) and the Dissemination of Erroneous Information: A viral explanation of rumor propagation"

April 2015
Leslie Caughell, Virginia Wesleyan University
Wenshuo Zhang, University of Illinois
Amanda Cronkite, University of Illinois

Caughell, L., Zhang, W., & Cronkite, A. (n.d.). "Agent-based modeling (ABM) and the Dissemination of Erroneous Information: A viral explanation of rumor propagation."

Researchers led by Leslie Caughell used a computer model developed for studying contagion to study the spread of rumors. They started with a set of foundational assumptions, including the idea that people accept rumors more readily from trusted sources within their social groups. Experiments covered interactions in small-world social networks, where friends of friends are likely to know each other, and scale-free social networks, where friends of friends are unlikely to know each other.

The researchers found rumors spread best in the small-world social networks, especially when agents resistant to the virus (the rumor) were not spread evenly throughout the population. Experiments also showed rumors spread more readily when transmitted by trusted agents with many social connections.
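
The paper's actual model, parameters and trust weighting are not reproduced here, but a minimal sketch of this kind of agent-based rumor simulation, assuming Python with the networkx library, might look like the following:

    import random
    import networkx as nx

    def rumor_reach(graph, p_accept=0.3, immune_frac=0.1, steps=50, seed=0):
        """SI-style cascade: each step, every believer tries to pass the
        rumor to each susceptible neighbor with probability p_accept.
        Returns the fraction of the population that ends up believing."""
        rng = random.Random(seed)
        nodes = list(graph.nodes)
        # Agents "resistant to the virus" never accept the rumor
        immune = set(rng.sample(nodes, int(immune_frac * len(nodes))))
        believers = {rng.choice([n for n in nodes if n not in immune])}
        for _ in range(steps):
            converts = set()
            for node in believers:
                for neighbor in graph.neighbors(node):
                    if (neighbor not in believers and neighbor not in immune
                            and rng.random() < p_accept):
                        converts.add(neighbor)
            if not converts:
                break  # the cascade has died out
            believers |= converts
        return len(believers) / len(nodes)

    # The two topologies discussed in the paper (illustrative parameters)
    small_world = nx.watts_strogatz_graph(1000, k=6, p=0.05, seed=1)
    scale_free = nx.barabasi_albert_graph(1000, m=3, seed=1)
    print("small-world reach:", rumor_reach(small_world))
    print("scale-free reach:", rumor_reach(scale_free))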


"The Diffusion of Fact-checking: Understanding the growth of a journalistic innovation"

April 22, 2015
Lucas Graves, University of Wisconsin
Brendan Nyhan, Dartmouth College
Jason Reifler, University of Exeter

Graves, L., Nyhan, B., & Reifler, J. (2015, April 22). "The Diffusion of Fact-checking: Understanding the growth of a journalistic innovation."

Political science researchers Lucas Graves, Brendan Nyhan and Jason Reifler conducted a two-part study to understand the reasons behind the growth of fact-checking. The research tested whether news outlets took up fact-checking to compete with other media outlets in their state or whether journalists gravitated toward fact-checking as a means of emulating respected peers.

In the first study, the researchers compared fact check coverage between 2008 and 2012, examining whether the presence of a PolitiFact state affiliate made a difference in a state's coverage. They found no significant difference in fact check coverage for states with a PolitiFact affiliate.

In the second study, the researchers explored which motivations encourage fact-checking. They sent out two messages: one emphasized the prestige enjoyed by fact checkers, the other emphasized the public demand for fact-checking. The first set of media outlets, treatment group 1, received the prestige message. A second set of media outlets received the public demand message. A third set received both messages. The control set received no messages from the researchers.

To measure the results, the researchers conducted a media survey of fact check coverage like the one from the first study. Outlets that received the prestige-based encouragement showed an increase in fact check coverage.

The researchers concluded that the journalists' drive for prestige encourages the spread of fact-checking more than the drive to satisfy public demand.
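
To make the design concrete, here is a hedged sketch, with hypothetical names and data rather than the authors' materials, of the four-way random assignment and the comparison of post-treatment coverage:

    import random
    from statistics import mean

    CONDITIONS = ["prestige", "public_demand", "both", "control"]

    def assign_conditions(outlets, seed=7):
        """Randomly assign each outlet to one of the four conditions."""
        rng = random.Random(seed)
        shuffled = outlets[:]
        rng.shuffle(shuffled)
        return {outlet: CONDITIONS[i % len(CONDITIONS)]
                for i, outlet in enumerate(shuffled)}

    def coverage_by_condition(assignment, coverage):
        """Average post-treatment fact check story counts per condition.
        `coverage` maps outlet -> number of fact checks published after
        the messages were sent (hypothetical measurements)."""
        groups = {c: [] for c in CONDITIONS}
        for outlet, condition in assignment.items():
            groups[condition].append(coverage[outlet])
        return {c: mean(v) for c, v in groups.items() if v}

    # The paper's finding corresponds to the "prestige" group's mean
    # coverage exceeding the "control" group's mean.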


"Identifying and Correcting Policy Misperceptions"

April 23, 2015
Emily Thorson, George Washington University

Thorson, E. (2015, April 23). "Identifying and Correcting Policy Misperceptions."

Emily Thorson conducted two studies: one identified three commonly held political misperceptions, and the other estimated the scope of those misperceptions along with the effectiveness of a correction.

In her first experiment, Thorson identified three common American misperceptions using an open-ended interview process: thinking China owns over 50 percent of U.S. debt, believing the U.S. welfare program TANF has no time limit on benefits, and believing Social Security operates like a retirement account.

Thorson's second experiment tested how her test subjects responded to corrections of their misperceptions. Thorson presented subjects with a forced choice between two plausible options, also allowing them to express their degree of confidence in their answers on a three-point scale. She found each of the three misperceptions was common among both Democrats and Republicans.

When experimenters later corrected those misperceptions with a simple method similar to returning a graded school paper, test subjects often accepted the correction. Democrats and Republicans accepted the corrections at about the same rate. Moreover, test subjects expressing high confidence in their misperceptions often readily accepted corrections.

Thorson concluded that the combination of an open-ended interview process coupled with her method of questioning may serve researchers well in studying misperceptions. She said her process found misperceptions less reinforced by partisanship than those typically studied and suggested a resulting decrease in motivated reasoning may help account for her results.


"Estimating Fact-Checking’s Effect: Evidence from a Long-Term Experiment during Campaign 2014"

April 28, 2015
Brendan Nyhan, Dartmouth College
Jason Reifler, University of Exeter

Nyhan, B., & Reifler, J. (2015, April 28). "Estimating Fact-Checking's Effects: Evidence from a long-term experiment during campaign 2014."

Political science researchers Brendan Nyhan and Jason Reifler explored three research questions with an experiment based on a multi-wave survey.

  • Research question 1: Will exposure to fact-checking polarize people’s views of the practice?
  • Research question 2: Does fact-checking affect levels of trust in politicians? Will these effects be strongest among people who are already highly distrustful?
  • Research question 3: How does fact-checking affect political efficacy? Do these effects vary by prior political knowledge?[1]

The researchers used the YouGov survey service to simulate a representative experimental population. Between Sept. 21 and Nov. 18, 2014, the researchers gave their treatment group a condensed PolitiFact fact check to read. The control group received a news article. Both groups were asked questions about the content of the fact check and their attitudes toward fact-checking.

The researchers reported that Democrats and Republicans in the treatment group were more likely to show improved knowledge of the fact check topic than those in the control group. In both groups, more educated and informed people showed more interest in fact-checking than less educated and informed people, and Republicans showed less favorable attitudes toward fact-checking than Democrats.

The researchers found that attitudes toward fact-checking tended to improve in the treatment group, but they found no effect for the other two research questions: the treatment condition had no significant effect on attitudes toward politicians or on perceived political efficacy. The researchers concluded that exposure to fact-checking did not polarize people's views of the practice.


"'Fact-Check This': How U.S. politics adapts to media scrutiny"

May 13, 2015
Mark Stencel, Duke University

Stencel, M. (2015, May 13). "'Fact Check This': How U.S. politics adapts to media scrutiny."

Former Washington Post and NPR journalist Mark Stencel reviewed the way politicians respond to fact-checking journalism. Using a survey of fact checks and over a dozen interviews, Stencel looked for evidence of fact-checking's successes and lingering challenges.

Stencel found evidence that politicians sometimes stop using claims criticized by fact checkers. But he also found that political campaigns develop strategies to turn fact-checking to their own advantage. Some cite fact checks to support their own claims; some use fact check findings against opposing candidates. Others insist the fact check is wrong, an open confrontation with the fact checker that Stencel called "going nuclear." Still other campaigns stop talking to fact check journalists altogether, and many carefully prepare their messages in anticipation of being fact checked.


"The Epistemology of Fact Checking (Is Still Naìve)"

August 3, 2015
Joseph E. Uscinski, University of Miami

Uscinski, J. E. (2015). "The Epistemology of Fact Checking (Is Still Naïve): Rejoinder to Amazeen." Critical Review, 27(2), 1-10.

In a reply to Michelle A. Amazeen's "Revisiting the Epistemology of Fact Checking," University of Miami political scientist Joseph E. Uscinski defended his 2013 paper with Ryden W. Butler, "The Epistemology of Fact Checking."

Amazeen had argued that Uscinski and Butler overgeneralized and used inconsistent methods in their criticism of journalistic fact-checking. She defended mainstream fact checkers by showing they reach similar conclusions when checking the same claims, suggesting that agreement helps show the fact checkers are right.

Uscinski reviewed the arguments from "The Epistemology of Fact Checking" and "Revisiting the Epistemology of Fact-Checking." He then argued that the low bar Amazeen set for agreement between fact checkers, among other problems, negated her criticisms. Because "true" ratings were so scarce in Amazeen's sample, he said, the fact checkers would almost invariably agree regardless of their reasoning, invalidating her results.

Uscinski argued that none of Amazeen's criticisms affect the arguments from "The Epistemology of Fact Checking," but he allowed that fact-checking could benefit society if journalists would stick to verifiable claims and use consistent scientific methods.

2014

"The Effect of Fact‐Checking on Elites: A Field Experiment on US State Legislators"

July 30, 2014
Brendan Nyhan, Dartmouth College
Jason Reifler, University of Exeter

Nyhan, B., & Reifler, J. (2014). "The Effect of Fact‐Checking on Elites: A Field Experiment on US State Legislators." American Journal of Political Science, 628-640.

Political scientists Brendan Nyhan and Jason Reifler conducted an experiment to measure the effects of fact-checking on state legislators.

The researchers sent letters to state-level politicians in their treatment group warning them of potential adverse publicity from negative fact checks. The researchers sent the letters in states where PolitiFact affiliates were located, reasoning that the local presence of a fact checker would lend credibility to the threat of political harm from a negative fact check. Letter recipients were instructed to mail a return envelope to confirm receipt of the message. Twenty-one percent of the envelopes sent to the treatment group were returned, making it difficult to confirm the treatment condition.

A control group did not receive letters.

The researchers then looked for evidence of differences between the control and treatment groups as measured by the number of politicians’ claims questioned in the media, including PolitiFact affiliates. The treatment group had fewer claims questioned than the control group.

The researchers concluded that the threat of fact-checking had a deterrent effect on state politicians.


"Checking the fact-checkers in 2008: Predicting political ad scrutiny and assessing consistency"

October 14, 2014
Michelle A. Amazeen, Rider University

Amazeen, M.A. (2014). "Checking the fact-checkers in 2008: Predicting political ad scrutiny and assessing consistency." Journal of Political Marketing, 1-32.

Michelle Amazeen examined the factors that result in ads receiving fact check treatment and looked for evidence of fact checker consistency.

Amazeen assigned coders to evaluate political claims from the 2008 campaign season and then correlated the coders' findings with the evaluations of fact checkers in 2008. The fact checkers gravitated toward ads attacking other candidates, suggesting such ads were more likely to draw fact checkers' attention than other political ads.

Amazeen's paper also surveyed instances where different fact check services evaluated the same political claim. Amazeen concluded that the high level of agreement in the small set of data helped provide evidence of the accuracy of fact-checking by FactCheck.org, PolitiFact and The Washington Post Fact Checker.

2013

"Making a difference? A critical assessment of fact-checking in 2012"

October 2013
Michelle A. Amazeen, Rider University

Amazeen, M.A. (2013). "Making a difference: A critical assessment of fact-checking in 2012." New America Foundation Media Policy Initiative Research Paper.

In this study, Michelle Amazeen surveyed the landscape of fact-checking in 2012, looking at its methods, goals and influence. Amazeen included a discussion of the pros and cons of using a rating system and recommended a set of best practices for fact checkers.


"The Epistemology of Fact Checking"

October 30, 2013
Joseph E. Uscinski, University of Miami
Ryden W. Butler, University of Miami

Uscinski, J.E. and R.W. Butler (2013). "The Epistemology of Fact-Checking." Critical Review, 25(2), 162-180.

Political scientists Joe Uscinski and Ryden Butler surveyed the truth-finding efforts of journalistic fact checkers and found them lacking.

Uscinski and Butler noted that displaying a politician's average fact check rating, as The Washington Post does, creates a misleading impression if the selection process lacks neutrality or the fact checker uses an inconsistent method for rating claims. The researchers also criticized fact checkers for weighing in on unverifiable claims, such as simplistic claims of causation or predictions about the future.

Uscinski and Butler likewise deplored the inconsistency of the fact checkers' methods, selecting four examples from PolitiFact to illustrate their point.

The researchers concluded that fact checkers' inconsistent and unscientific methods betray a naive approach to their work.


"Deciding What's True: Fact-Checking Journalism and the New Ecology of News"

2013
Lucas Graves, Columbia University

Graves, L. (2013). "Deciding What’s True: Fact-Checking Journalism and the New Ecology of News" (Doctoral dissertation, Columbia University).

Lucas Graves' dissertation took an extensive look at the history and current state of fact-checking and the reasons behind the rise of modern fact-checking.

Graves drew on his experience working for and observing the fact-checking services FactCheck.org and PolitiFact to provide a detailed view of the modern fact-checking process. Graves supplemented those observations with interviews and historical research.

Graves described fact-checking journalism as a media movement that developed in opposition to a threat to journalistic authority posed by Internet blogging culture. Fact-checking, according to Graves, walks a blurry line between objective reporting and opinion journalism.


"Study: Media Fact-Checker Says Republicans Lie More"

May 28, 2013
The Center for Media and Public Affairs, George Mason University
Press Release regarding the study

The Center for Media and Public Affairs. (2013, May 28). "Study: Media Fact-Checker Says Republicans Lie More."

The Center for Media and Public Affairs released a study of ratings by PolitiFact. The Center examined 100 statements by Republicans and Democrats from Jan. 20 to May 22, 2013, a time frame the authors noted coincided with a news cycle that included several Obama administration controversies. PolitiFact determined Republican statements were false at three times the rate of Democratic statements; only 16 percent of Republican statements were deemed true, compared to 54 percent of Democratic statements.


"Politics and Facts in PolitiFact Ratings: A Reply to Vanity Fair"

June 7, 2013
The Center for Media and Public Affairs, George Mason University
Press Release regarding the study

The Center for Media and Public Affairs. (2013, June 7). "Politics and Facts in PolitiFact Ratings: A Reply to Vanity Fair."

The Center for Media and Public Affairs responded to a May 29, 2013, Vanity Fair article by Kurt Eichenwald. In a press release, the Center defended the methodology and sampling of its recent study, "Media Fact-Checker Says Republicans Lie More." The Center argued that the study, which examined how the fact-checking rating system was applied to Republicans and Democrats, was not an assessment of which political party is telling the truth.

2012

"Assessing truth in the information age: Evidence from Politifact"

June 22, 2012
Michael Nash, Oregon State University

Nash, M. (2012). "Assessing truth in the information age: Evidence from Politifact." (Doctoral dissertation, Oregon State University).

Graduate student Michael Nash studied PolitiFact’s ratings of politicians, assuming the ratings were accurate for the sake of the study. Using PolitiFact’s coverage of the 2010 election, Nash concluded that PolitiFact was somewhat more likely to fact check statements from Republicans and women. Nash also concluded that PolitiFact finds Republicans more likely than Democrats to transmit misinformation.

Nash blamed the media's "horse race frame" for much public misinformation and expressed hope that fact-checking's abandonment of that frame might help decrease misinformation.


"Study: PolitiFact Rates GOP As Biggest Liar"

Sept. 21, 2012
The Center for Media and Public Affairs, George Mason University

The Center for Media and Public Affairs. (2012, September 21). "Study: PolitiFact Rates GOP As Biggest Liar."

The Center for Media and Public Affairs released a study of PolitiFact's coverage of the 2012 campaign. The study analyzed PolitiFact's ratings of 98 statements by candidates or surrogates, as well as campaign ads, from June 1 to Sept. 11, 2012. Democrats received "true" or "mostly true" designations at twice the rate of Republicans. "Conversely, statements by Republicans were rated as entirely false about twice as often as Democratic statements – 29 percent false ratings for GOP statements vs. 15 percent false ratings for Democrats," the study said. The study pointed out other rating distinctions in statements made by presidential candidates and their campaigns: Romney's campaign earned 26 percent false ratings, while Obama's campaign received 5 percent false ratings.


"Study: Fact-Checkers Disagree on Who Lies Most"

October 22, 2012
The Center for Media and Public Affairs, George Mason University

The Center for Media and Public Affairs. (2012, October 22). "Study: Fact-Checkers Disagree on Who Lies Most."

The Center for Media and Public Affairs issued a study comparing the ratings of PolitiFact and The Washington Post Fact Checker for the 2012 presidential candidates and their campaigns. The study examined 152 statements by presidential candidates from July 1 to September 11, 2012. PolitiFact rated Republican statements as false at almost twice the rate of Democratic statements. The Washington Post Fact Checker appeared more balanced for this time frame in assessing the veracity of the two candidates’ statements.

The Center's President, Robert Lichter, noted, "This study shows that media fact-checking involves subjective judgments, just like any other form of journalism. Voters must still decide for themselves which 'facts' they trust."

2011

"Selection Bias? PolitiFact Rates Republican Statements as False at 3 Times the Rate of Democrats"

Feb. 10, 2011
Eric Ostermeier, University of Minnesota

Ostermeier, E. (2011, February 10). "Selection Bias? PolitiFact Rates Republican Statements as False at 3 Times the Rate of Democrats."

Political scientist Eric Ostermeier surveyed PolitiFact's ratings of former or current officeholders during 2010, finding Republicans received substantially more negative ratings.

"Republican statements were graded in the dreaded 'false" and "pants on fire" categories 39 percent of the time, compared to just 12 percent for statements made by Democrats," Ostermeier wrote.

Under the assumption the ratings were done fairly, Ostermeier considered possible explanations. Did Republicans simply lie more than Democrats? While allowing for that possibility, Ostermeier focused on the additional possibility of selection bias. Ostermeier noted that PolitiFact reveals little about its selection process, essentially using an undescribed system of editorial discretion. Ostermeier called for greater transparency in the selection process to reduce the perception PolitiFact targets Republicans.

2010

"Why the 'Death Panel' Myth Wouldn't Die: Misinformation in the Health Care Reform Debate"

January 2010
Brendan Nyhan, University of Michigan

Nyhan, B. (2010, January). "Why the 'Death Panel' Myth Wouldn't Die: Misinformation in the Health Care Reform Debate." The Forum (Vol. 8, No. 1).

Political science researcher Brendan Nyhan examined misperceptions in this paper, focusing on people's beliefs about health care reform. Nyhan defined misperceptions as "demonstrably false claims and unsubstantiated beliefs about the world that are contradicted by the best available evidence and expert opinion."

Nyhan described survey data measuring misperceptions about the health care reform sought by President Bill Clinton in the 1990s and the health care reform proposed by President Barack Obama. Using statistical modeling, Nyhan concluded Republicans were more likely than non-Republicans to believe misperceptions about the Democratic health care reform proposals. He linked his conclusion to studies of motivated reasoning, which suggest people are more likely to accept ideas in accord with their ideology.

Nyhan partly blamed a media climate that allows partisans to consume news that fits their tastes. He also charged the news media with partial responsibility for allowing misperceptions to take hold and suggested steps media outlets might take to prevent the future spread of misperceptions.


"When Corrections Fail: The persistence of political misperceptions"

March 30, 2010
Brendan Nyhan, University of Michigan
Jason Reifler, Georgia Southern University

Nyhan, B., & Reifler, J. (2010). "When corrections fail: The persistence of political misperceptions." Political Behavior, 32(2), 303-330.

Brendan Nyhan and Jason Reifler examined the effect of corrective information on misperceptions, defining "misperceptions" as "cases in which people's beliefs about factual matters are not supported by clear evidence and expert opinion." They had subjects read mock news stories with or without a correction notice. The researchers found that corrective information did little to reduce the misperceptions encouraged by the uncorrected version of the mock news story. In fact, Nyhan and Reifler described a "backfire effect" in their conclusion: corrective information in the mock news stories sometimes increased misperceptions compared to the uncorrected stories.


Footnotes

  1. Note: This text is quoted verbatim from the original source. Any inconsistencies are attributable to the original source.