
The methodologies of fact-checking


Fact Check was Ballotpedia's fact-checking project that ran from October 2015 through October 2018. These pages have been archived.


Fact checkers use a range of methodologies that shape their approach to fact-checking. These methodologies govern how they select topics, conduct research, structure articles, evaluate claims and—most importantly—render verdicts.

Below, we break down the methodologies of three major fact-checking organizations in the United States: PolitiFact, FactCheck.org and The Washington Post Fact Checker. We focus on the following areas:

  1. Selection process: the processes by which fact checkers choose what claims to evaluate.
  2. Research methods: the basic techniques and types of sources that fact checkers most commonly use when conducting research on claims, as well as the official rules and editorial policies that govern their approaches.
  3. Claim evaluations: the systems and processes by which fact checkers establish the veracity of a claim.

PolitiFact

See also: PolitiFact

PolitiFact began in 2007 as a project of the St. Petersburg Times (now known as the Tampa Bay Times), calling itself "an independent, nonpartisan news organization." Bill Adair, the St. Petersburg Times' Washington bureau chief, and news technologist Matthew Waite founded PolitiFact with backing from the newspaper.

PolitiFact explains some of its methodology on its website, in an "About PolitiFact" section and in an article ("Principles of PolitiFact and the Truth-O-Meter") written by PolitiFact creator Bill Adair and published February 21, 2011.[1][2] Other sources of information on PolitiFact's methodology include interviews with staff and outside observers.

Selection process

PolitiFact fact checks a wide range of political actors and groups, including both elected and non-elected government officials, political candidates, media pundits, celebrities, and special interest groups. They also examine claims made in social media circles in the form of memes and viral images.[1]

But how do they go about selecting claims to fact check?

In his article on PolitiFact's principles, Adair says the selection process begins with a preliminary sweep of news stories, political ads and speeches, campaign websites, social media, and press releases. PolitiFact also accepts suggestions from readers.[1][2]

Adair says that PolitiFact asks these critical questions when determining what to fact check:[2]

  • Is the statement rooted in a fact that is verifiable? We don’t check opinions, and we recognize that in the world of speechmaking and political rhetoric, there is license for hyperbole.
  • Is the statement leaving a particular impression that may be misleading?
  • Is the statement significant? We avoid minor 'gotchas' on claims that obviously represent a slip of the tongue.
  • Is the statement likely to be passed on and repeated by others?
  • Would a typical person hear or read the statement and wonder: Is that true?[3]

Another key factor in PolitiFact's selection process is newsworthiness. Adair wrote that the site selects "the most newsworthy and significant" claims. He is reported to have told journalism students in 2011, "We're guided by news judgement. And we are all journalists, we're not social scientists."[2][4]

Adair has cited curiosity as an additional factor that shapes PolitiFact's selection process. In an August 2009 interview for C-SPAN's "Washington Journal," he said, "We choose to check things we are curious about. If we look at something and we think that an elected official or talk show host is wrong, then we will fact check it."[5]

Research methods

PolitiFact's research generally begins with contacting the source of a claim, according to Adair and researcher Lucas Graves.[2] If an elected official, for example, makes a claim that attracts PolitiFact's attention, they reach out to that individual's office for clarification, data or a source to back up that claim. Graves noted that this step is required of PolitiFact writers.[2][6]

With or without data from the original source, PolitiFact then turns to news articles, free and subscription-based sources on the Internet, and on-the-record interviews with reporters and subject-matter experts. On-the-record interviews help writers and researchers interpret data.[2] PolitiFact prohibits the use of off-the-record sources.[7]

According to Graves, PolitiFact writers are encouraged to seek out nonpartisan data sources whenever possible. Such sources often include government agencies like the Bureau of Labor Statistics or the Congressional Budget Office.[8]

PolitiFact publishes a list of its sources along with each article.[2]

Claim evaluation

PolitiFact uses a group approach to determine the veracity of a claim.

The first step involves the lead writer submitting an article with a recommended rating to a panel of at least three editors, according to Adair's "Principles of PolitiFact" article as well as researchers and reporters who have observed the PolitiFact evaluation process.[2][9][10]

PolitiFact's rating system is the "Truth-O-Meter." The meter has six ratings:[1]

  • TRUE – The statement is accurate and there’s nothing significant missing.
  • MOSTLY TRUE – The statement is accurate but needs clarification or additional information.
  • HALF TRUE – The statement is partially accurate but leaves out important details or takes things out of context.
  • MOSTLY FALSE – The statement contains an element of truth but ignores critical facts that would give a different impression.
  • FALSE – The statement is not accurate.
  • PANTS ON FIRE – The statement is not accurate and makes a ridiculous claim.[3]

The panel of editors evaluates the article and the author's recommended rating. The panel then discusses whether PolitiFact should follow the author's recommendation or assign a new rating.[10] To make this decision, Adair writes, the panel of editors relies on five principles:[2]

  • Words matter -- We pay close attention to the specific wording of a claim. Is it a precise statement? Does it contain mitigating words or phrases?
  • Context matters -- We examine the claim in the full context, the comments made before and after it, the question that prompted it, and the point the person was trying to make.
  • Burden of proof -- People who make factual claims are accountable for their words and should be able to provide evidence to back them up. We will try to verify their statements, but we believe the burden of proof is on the person making the statement.
  • Statements can be right and wrong -- We sometimes rate compound statements that contain two or more factual assertions. In these cases, we rate the overall accuracy after looking at the individual pieces.
  • Timing – Our rulings are based on when a statement was made and on the information available at that time.[3]

Graves, who sat in on 25 editor evaluation meetings, said that "two of three panelists must agree on the final ruling, though unanimity is strongly preferred and usually achieved." He also said that meetings often last 10 to 15 minutes, though disagreements among the editors or between the editors and the author occasionally extend meetings to multiple hours and sometimes require follow-up meetings. In one instance, Graves reported, an "indecisive panelist had to be replaced" before the panel could make a ruling.[9]
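
Graves's description of the decision rule is simple enough to state precisely. Below is a minimal illustrative sketch in Python of the rule that two of three panelists must agree, with unanimity preferred; the rating labels and the `panel_ruling` function are hypothetical stand-ins of our own invention, not any actual PolitiFact tooling.

```python
from collections import Counter

# Hypothetical labels mirroring the six Truth-O-Meter ratings listed above.
RATINGS = ["True", "Mostly True", "Half True",
           "Mostly False", "False", "Pants on Fire"]

def panel_ruling(votes):
    """Return the rating at least two of the three editors agree on,
    or None when the panel deadlocks and must keep deliberating."""
    assert len(votes) == 3 and all(v in RATINGS for v in votes)
    rating, count = Counter(votes).most_common(1)[0]
    return rating if count >= 2 else None

# Unanimity (the preferred outcome) and a 2-1 split both yield a ruling;
# a three-way split models the rare deadlock that, per Graves, led to
# follow-up meetings or even a replaced panelist.
print(panel_ruling(["False", "False", "False"]))          # -> "False"
print(panel_ruling(["Half True", "Half True", "False"]))  # -> "Half True"
print(panel_ruling(["True", "Half True", "False"]))       # -> None
```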

FactCheck.org

See also: FactCheck.org

FactCheck.org was founded in 2003 by former AP, Wall Street Journal and CNN reporter Brooks Jackson and academic Kathleen Hall Jamieson with backing from the Annenberg Public Policy Center of the University of Pennsylvania.

FactCheck.org provides explanations of its methodology in a brief mission statement on its website and in two articles written by former Director Brooks Jackson.[11] Additional information on its methodology, policies and approach to fact-checking is derived from outside sources.

Selection process

FactCheck.org targets "major U.S. political players in the form of TV ads, debates, speeches, interviews and news releases," according to a page titled "Our Mission" on its website. Like PolitiFact, FactCheck.org also fact checks social media claims and chain emails.[11] Unlike PolitiFact, though, it tends to avoid fact-checking media figures and pundits.[12]

Lucas Graves, who sat in on a FactCheck.org training session for his dissertation on fact-checking, notes that the organization emphasizes verifiable facts over opinions in its selection process. He says its staff typically pursue only claims they suspect are false; when it becomes apparent that a claim is true, the research often ends there and a new claim is taken up. Moreover, FactCheck.org seeks out claims with "national significance" that most readers would find interesting.[12]

Research methods

FactCheck.org, like PolitiFact and The Washington Post Fact Checker, contacts the individual or organization responsible for a claim to request backup data and/or original sources. Writers and researchers also often try to trace claims back to original sources by tracking news stories that mention a specific claim or statistic.[13]

Graves notes that FactCheck.org writers and researchers are taught to rely heavily on federal and state government agencies for raw data.[14] A list of resources compiled by FactCheck.org Director Eugene Kiely shows that they make use of nonprofit groups and think tanks as well, including the Tax Policy Center, the Tax Foundation and the Center for Effective Government.[15]

The organization also consults experts to help interpret data, though Graves notes that it does so far less often than PolitiFact. FactCheck.org avoids using anonymous sources. Brooks Jackson, the former director and now director emeritus, has been quoted as saying, "[W]e don’t cite anonymous sources as proof of anything factual. Why would anyone believe it if we did ... We think of our pieces as meeting the high standards of academic scholarship."[12]

FactCheck.org includes a list of sources at the end of each article.

Claim evaluation

FactCheck.org does not use an official rating system, which makes it unique among the three major fact-checking organizations in the United States. Jackson addressed the issue in a December 2012 article titled "Firefighters, Fact-Checking and American Journalism," in which he called rating systems "inflexible":[16]

Rating statements with devices such as 'truth-o-meters' or 'Pinocchios' are popular with readers, and successful attention-grabbers. But such ratings are by their nature subjective — the difference between one or two 'Pinocchios' is a matter of personal judgment, and debatable. Some statements are clearly true, and some provably false, but there’s no agreed method for determining the precise degree of mendacity in any statement that falls somewhere in between.[3]

Jackson echoed this in an April 2015 article, saying, "We consider [rating systems] inherently subjective, and find that many political claims don't fit neatly into inflexible categories."[17]

While FactCheck.org writers usually assert whether they found a statement to be true, false or somewhere in between, they often qualify the ruling with additional context. Jackson cited an example from PolitiFact and compared it to how FactCheck.org would have handled the claim:[16]

A senator who said a 'majority' of Americans are conservative was rated 'mostly true' (and later 'half true') even though the statement was false. The story cited a poll showing only 40 percent of Americans rated themselves conservative. That’s more than said they were moderate (35 percent) or liberal (21 percent) but still far from a majority. The senator had a point, but stated it incorrectly, thereby exaggerating. A simple 'truth-o-meter' had no suitable category for that. Our approach would have been to say that it was false. But we would also note that the senator would have been correct to say Americans are more likely to call themselves conservative than moderate, or liberal, when given those three choices.[3]

In an American Press Institute "Fact-Checking Project" study, this kind of approach to claim evaluation is described as "a nuanced contextual analysis of the contested claim." The study added, "These fact checks may refute egregious claims in clear, decisive language, announcing in the headline or the first sentence that a speaker has distorted the truth. But they stop short of assessing statements in any systematic fashion that would allow different claims or speakers to be compared."[18] FactCheck.org's method of claim evaluation is, therefore, fluid and encourages readers to examine claims and those who make them on a case-by-case basis.

The Washington Post Fact Checker

See also: The Washington Post Fact Checker

The Washington Post Fact Checker started in 2007 as a temporary project aimed at fact-checking the 2008 presidential campaign. The Washington Post made the Fact Checker a permanent feature in 2011 under reporter Glenn Kessler.[19]

Kessler provided a brief overview of the Fact Checker's mission, methodology, principles and scoring system in an "About the Fact Checker" article published in September 2013.[19] Outside sources and interviews with Kessler provide additional information.

Selection process

The purpose of The Washington Post's Fact Checker column, Kessler wrote, is "to 'truth squad' the statements of political figures regarding issues of great importance, be they national, international or local." Political figures, in this case, include elected and non-elected government officials and "political candidates, interest groups, and the media."

The column places a heavy emphasis on the role of readers in selecting claims to fact check. Kessler said, "It's a big world out there, and so we will rely on readers to ask questions and point out statements that need to be checked." He added, "the success of this project depends, to a great extent, on the involvement of you—the reader. We will rely on our readers to send us suggestions on topics to fact check and tips on erroneous claims."

He also included a series of principles for the Post's Fact Checker. The principles that shed light on his claim-selection process are included below:[19]

  • This is a fact-checking operation, not an opinion-checking operation. We are interested only in verifiable facts, though on occasion we may examine the roots of political rhetoric.
  • We will focus our attention and resources on the issues that are most important to voters. We cannot nitpick every detail of every speech.
  • We will strive to be dispassionate and non-partisan, drawing attention to inaccurate statements on both left and right.[3]

Research methods

Similar to its counterparts, PolitiFact and FactCheck.org, the Post's Fact Checker reaches out to the individual or organization responsible for a claim and uses raw data and original sources to examine it. Kessler said that Google searches can be effective tools for investigation, telling Lucas Graves in an interview in 2012, "When I encounter a weird [or] strange fact, one of the first things I do is google the figures, which is an amazingly efficient way to figure out where it comes from."[20]

Unlike his counterparts, however, Kessler said he prefers not to consult subject experts, even for assistance with interpreting complex data. "I'm the kind of reporter who is reasonably confident in his judgments. I like to speak with my voice, because I have actually covered just about everything in Washington... You can see my bio, you can see what I’ve covered. I bring a unique perspective of having listened to bull***t in Washington for 30 years," he told Graves in an interview.[21]

In contrast to PolitiFact and FactCheck.org, the Post's Fact Checker will, in some cases, cite anonymous sources.[22]

Claim evaluation

The Post's Fact Checker's approach to evaluating the veracity of a claim, Kessler says, centers on a "'reasonable man' standard for reaching conclusions. We do not demand 100 percent proof."[19]

Like PolitiFact, the Post's Fact Checker uses a rating system. It is based on Pinocchio, the title character of the children's story, whose nose grew longer with each lie he told. For false claims, it assigns a statement up to four Pinocchios, depending on how untrue the claim is determined to be. Kessler's guide to the Pinocchio system, from his "About the Fact Checker" article, appears below.[19]

  • One Pinocchio: "Some shading of the facts. Selective telling of the truth. Some omissions and exaggerations, but no outright falsehoods. (You could view this as 'mostly true.')"
  • Two Pinocchios: "Significant omissions and/or exaggerations. Some factual error may be involved but not necessarily. A politician can create a false, misleading impression by playing with words and using legalistic language that means little to ordinary people."
  • Three Pinocchios: "Significant factual error and/or obvious contradictions."
  • Four Pinocchios: "Whoppers."

An additional factor that the Post's Fact Checker uses for assigning Pinocchios is whether or not a claim has been previously debunked. Kessler writes, "Repeated misstatements of previously debunked statistics can, over time, result in higher Pinocchio ratings for a particular claim. In other words, we may hold a politician to a higher standard if he or she already has been put on notice that a certain 'fact' is dubious."[19]
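
Kessler gives no formula for this escalation, but the idea can be sketched as follows. This is an assumption-laden illustration, not the Fact Checker's actual process; in particular, the one-Pinocchio bump and the function name are our own inventions.

```python
def pinocchio_rating(base_rating: int, previously_debunked: bool) -> int:
    """Hypothetical model: repeating an already-debunked claim earns one
    extra Pinocchio, capped at the scale's maximum of four."""
    assert 1 <= base_rating <= 4  # the published scale runs one to four
    return min(4, base_rating + 1) if previously_debunked else base_rating

print(pinocchio_rating(2, previously_debunked=False))  # -> 2
print(pinocchio_rating(2, previously_debunked=True))   # -> 3 (speaker was on notice)
```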

For claims that turn out to be true, the Post's Fact Checker awards a Geppetto check mark, a reference to Pinocchio's father and creator, who had a reputation for telling the truth. They also use an upside-down Pinocchio head for "flip-flops" and a scales-of-justice symbol for claims that are too difficult to verify or require more time and/or data.[19]

Footnotes

  1. PolitiFact.com, "About PolitiFact," accessed September 10, 2015
  2. PolitiFact.com, "Principles of PolitiFact and the Truth-O-Meter," February 21, 2011
  3. Note: This text is quoted verbatim from the original source. Any inconsistencies are attributable to the original source.
  4. Graves, L. (2013) Deciding What’s True: Fact-Checking Journalism and the New Ecology of News (Doctoral Dissertation). Retrieved from ProQuest (UMI 3549415), p. 143
  5. Smart Politics, "Selection Bias? PolitiFact Rates Republican Statements as False at 3 Times the Rate of Democrats," February 10, 2011
  6. Graves, L. (2013) Deciding What’s True: Fact-Checking Journalism and the New Ecology of News (Doctoral Dissertation). Retrieved from ProQuest (UMI 3549415), p. 170
  7. Graves, L. (2013) Deciding What’s True: Fact-Checking Journalism and the New Ecology of News (Doctoral Dissertation). Retrieved from ProQuest (UMI 3549415), p. 180
  8. Graves, L. (2013) Deciding What’s True: Fact-Checking Journalism and the New Ecology of News (Doctoral Dissertation). Retrieved from ProQuest (UMI 3549415), p. 178
  9. Graves, L. (2013) Deciding What’s True: Fact-Checking Journalism and the New Ecology of News (Doctoral Dissertation). Retrieved from ProQuest (UMI 3549415), pp. 202-203
  10. NiemanLab.org, "Inside the Star Chamber: How PolitiFact tries to find truth in a world of make-believe," August 21, 2012
  11. FactCheck.org, "Our Mission," accessed September 10, 2015
  12. Graves, L. (2013) Deciding What’s True: Fact-Checking Journalism and the New Ecology of News (Doctoral Dissertation). Retrieved from ProQuest (UMI 3549415), pp. 140-145
  13. Graves, L. (2013) Deciding What’s True: Fact-Checking Journalism and the New Ecology of News (Doctoral Dissertation). Retrieved from ProQuest (UMI 3549415), p. 170
  14. Graves, L. (2013) Deciding What’s True: Fact-Checking Journalism and the New Ecology of News (Doctoral Dissertation). Retrieved from ProQuest (UMI 3549415), p. 178
  15. ODU Reporting and Newswriting II, "FactCheck.org’s favorite fact-checking resources," September 14, 2014
  16. FactCheck.org, "Firefighters, Fact-Checking and American Journalism," December 21, 2012
  17. FactCheck.org, "Fact-Checking Is More Popular than Politicians," April 22, 2015
  18. American Press Institute, "A Comparison of Correction Formats: The Effectiveness and Effects of Rating Scale versus Contextual Corrections on Misinformation," February 2015
  19. The Washington Post Fact Checker, "About the Fact Checker," September 11, 2013
  20. Graves, L. (2013) Deciding What’s True: Fact-Checking Journalism and the New Ecology of News (Doctoral Dissertation). Retrieved from ProQuest (UMI 3549415), p. 176
  21. Graves, L. (2013) Deciding What’s True: Fact-Checking Journalism and the New Ecology of News (Doctoral Dissertation). Retrieved from ProQuest (UMI 3549415), p. 184
  22. Graves, L. (2013) Deciding What’s True: Fact-Checking Journalism and the New Ecology of News (Doctoral Dissertation). Retrieved from ProQuest (UMI 3549415), p. 180