Pliny's Point polling methodology
For Ballotpedia's presidential approval, congressional approval, direction of the country, and generic congressional ballot polling results, we take an average of the most recent polls (from the last 30 days, with some exceptions for major news events) on one or more of these topics conducted by the following sources:
- CBS News (or CBS News and The New York Times)
- The Economist / YouGov
- Fox News
- Gallup
- Pew Research
- Quinnipiac University
- Rasmussen Reports
- Reuters / Ipsos
- USA Today / Suffolk University
- The Wall Street Journal / NBC
- The Washington Post / ABC
- Politico/Morning Consult (added March 6, 2017)
- Investor's Business Daily/TechnoMetrica (IBD/TIPP) (added April 4, 2017)
- Public Policy Polling (added April 10, 2017)
- The Marist Poll (added September 19, 2017)
- Monmouth University (added September 21, 2017)
- CNN (added September 22, 2017)
- Emerson College (added January 12, 2018)
- Harvard-Harris Poll (added February 22, 2018)
- Grinnell College (added September 12, 2018)
- ScottRasmussen.com (added September 14, 2018)
- NewsNation (added June 7, 2023)
- Civiqs (added June 15, 2023)
- HarrisX (added June 16, 2023)
- Yahoo! News / YouGov (added July 21, 2023)
- The New York Times / Siena (added March 4, 2024)
Ballotpedia chose to include polls from these organizations in our averages because we regard their methodologies as broadly trustworthy and reliable. If you know of other outlets that conduct regular polling on these topics, email us. We average the results and show all polling results side by side because we believe that paints a clearer picture of public opinion than any individual poll can provide. The data are updated daily as new polling results from the above sources are published.
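The 30-day rolling average described above can be sketched in a few lines of Python. The poll records, dates, and values here are invented for illustration, and the real pipeline's exceptions for major news events are not modeled:

```python
from datetime import date, timedelta

# Hypothetical poll records: (source, end date of fieldwork, approval percentage).
polls = [
    ("Gallup", date(2024, 3, 1), 42.0),
    ("Fox News", date(2024, 2, 20), 45.0),
    ("Quinnipiac University", date(2024, 1, 5), 40.0),  # outside the 30-day window
]

def rolling_average(polls, as_of, window_days=30):
    """Average all polls whose fieldwork ended within the last window_days."""
    cutoff = as_of - timedelta(days=window_days)
    recent = [pct for _, end, pct in polls if end >= cutoff]
    return sum(recent) / len(recent) if recent else None

print(rolling_average(polls, as_of=date(2024, 3, 5)))  # 43.5 (only the two recent polls count)
```

Because each source polls on its own schedule, a rolling window like this keeps the average anchored to current opinion rather than letting stale results linger.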
Understanding polling
Below we briefly highlight three aspects of public polling that illustrate both the complexity of polling and how polls tend to differ from one another. Understanding these concepts is key to interpreting what polls mean and underscores the value of aggregating polling results.
Featured polls methodologies
Source | Sample size | Percent margin of error (+/-) | Contact method | FiveThirtyEight Grade* |
---|---|---|---|---|
CBS News | 1,111 adults | 4 | Live phone | N/A |
The Economist / YouGov | 1,327 registered voters | 3.2 | Online | B |
Fox News | 1,020 registered voters | 3 | Live phone | B |
Gallup | 1,500 adults | 3 | Live phone | B- |
Pew Research | 2,504 adults | 2.3 | Live phone | B+ |
Quinnipiac University | 1,514 registered voters | 3.1 | Live phone | A- |
Rasmussen Reports (Pres. Approval) | 1,500 likely voters | 2.5 | Automated phone + online | C+ |
Rasmussen Reports (Direction of Country) | 2,500 likely voters | 2 | Automated phone | |
Reuters / Ipsos | 2,744 adults | 2.1 | Online | A- |
USA Today / Suffolk University | 1,000 adults who identified as registered voters | 3 | Live phone | B+ |
The Wall Street Journal / NBC | 900 adults | 3.27 | Live phone | A- |
The Washington Post / ABC | 1,014 adults | 3.5 | Live phone | A+ |
Politico/Morning Consult | 1,987 registered voters | 2 | Online | N/A |
Investor's Business Daily/TechnoMetrica (IBD/TIPP) | 904 adults | 3.3 | Phone | A- |
Public Policy Polling | 887 registered voters | 3.3 | Phone + online | B+ |
Last updated August 24, 2017
*FiveThirtyEight pollster ratings are calculated based on historical accuracy in predicting elections, sample sizes, methodology, etc. Find out more here.
Contact method
Pollsters use a variety of methods to contact potential survey participants. From the 1930s to the 1980s, pollsters generally did their work through direct contact, going door-to-door, a remarkably expensive and time-consuming method.[1] Nowadays, pollsters rely on telephones and the internet. Neither approach is without challenges. Fewer Americans today, for example, live in households with landlines than they did 20 or even 10 years ago. On the other hand, not every American, particularly in older generations, has a cell phone. To get around this, many pollsters call a combination of landlines and cell phones for a survey. An additional problem is that, with the rise of caller ID, fewer people pick up the phone to participate in surveys, part of a systemic problem in the modern polling industry known as declining response rates. Some pollsters have looked to the internet as a workaround for this issue, but analysts continue to debate the accuracy and dependability of online polls.[2][3]
A study by FiveThirtyEight found that variances in polls about President Trump's favorability stemmed primarily from the collection method. Polls of registered or likely voters tended to be more favorable to Trump than those that polled adults generally. Automated or online polls also resulted in more favorable rankings than those conducted with live phone calls. The data for these findings was taken from polls conducted between Feb. 1 and Feb. 19, 2017.[4]
There are also differences among polling firms in who contacts the participants. Some phone-based surveys use live interviewers, while others use automated interactive voice response systems.[3]
The sample and margin of error
Pollsters can’t realistically contact every American adult and ask their opinion on a given issue. Instead, they try to contact a representative sample (usually between 1,000 and 2,000 individuals) that accurately reflects the country’s population as a whole. Pollsters, with the help of statisticians, demographers, and data experts, use a variety of techniques to build such a sample. This typically involves probability formulas and algorithms that ensure random sampling and increase the likelihood of reaching an accurate cross-section of the U.S. adult population. Some pollsters also create panels of respondents that they believe reflect the actual population and poll them repeatedly over a span of time; these are usually called tracking polls. Oftentimes, pollsters also weight their respondents to account for various demographic measurements. For example, a pollster might give greater weight to responses from a specific demographic group if that group is underrepresented in the random sample relative to the country’s estimated demographic composition, and less weight if a group is overrepresented.
Samples are also where margins of error (MoE) come into play. The MoE describes the potential range of variation between a poll’s results, based on its representative sample, and the views of the actual population. For example, if a poll with a margin of error of 3 percentage points shows that 47 percent of respondents approve of candidate X, the pollster believes, based on the representative sample, that anywhere between 44 and 50 percent of the actual population approves of candidate X. Generally speaking, a larger sample size means a smaller MoE, while a smaller sample size means a larger MoE. Other factors, such as the poll’s design, probability formulas, and weighting methods, can also affect the MoE.[5][6]
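The relationship between sample size and margin of error follows from the standard formula for a proportion at 95 percent confidence. The formula itself is textbook statistics; the sample sizes below are illustrative:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error, in percentage points, for a proportion p
    estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n) * 100

# Larger samples shrink the margin of error; p = 0.5 is the worst case.
print(round(margin_of_error(0.5, 1000), 1))  # 3.1 points
print(round(margin_of_error(0.5, 2500), 1))  # 2.0 points
```

This is why most national polls settle on roughly 1,000 to 2,500 respondents: quadrupling the sample only halves the margin of error, so the cost of additional interviews quickly outpaces the gain in precision.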
Question framing
Though all polling firms, in general, are after the same goal—to find out what the public thinks about a given topic or issue—they don’t always ask their questions the same way. Studies have found that differences in how questions are worded—even subtle differences—can lead to a range of results. In 2003, for example, Pew Research found that when they asked respondents if they “favor or oppose taking military action in Iraq to end Saddam Hussein’s rule,” a total of 68 percent responded that they favor military action. But when Pew added to the end of that question, “... even if it meant that U.S. forces might suffer thousands of casualties,” 43 percent responded in favor of military action.[7]
The number of possible answers that pollsters provide to respondents has also been known to produce different results. With questions about presidential approval and disapproval, for instance, some firms only give respondents the options of saying approve or disapprove. Other firms, however, give respondents more flexibility by allowing them to respond with answers such as “strongly approve” or “somewhat disapprove.” Again, these slight differences have historically led to differing results among polling firms.[8]
Trust in sources
Public perception of the various sources cited here varies. Pew Research published a study on this topic in 2014, detailing how members of various ideological groups (conservatives and liberals) trusted or distrusted popular media organizations. The results from this study for the news organizations included in Ballotpedia's polling data are listed below. By providing a variety of sources of polling results side-by-side, we hope to mitigate the influence of potential bias. All of the major news sources selected for Ballotpedia's polling index were rated as more trusted than distrusted in the overall results from all respondents.[9]
Trust levels in polling sources by ideology
The following table includes broad summaries of the data found in a survey by Pew Research. Blue indicates more distrust of the news source by conservatives, while red indicates more distrust by liberals. Full results are available here.
Source | Trust by ideology |
---|---|
The Economist | Mostly trusted by all groups (except those identified as consistently conservative, where trust/distrust were about equal) |
ABC News | Somewhat not trusted by conservatives |
CBS News | Somewhat not trusted by conservatives |
Fox News | Mostly not trusted by liberals |
NBC News | Somewhat not trusted by conservatives |
The New York Times (sometimes polls in conjunction with CBS) | Mostly not trusted by conservatives |
Politico | Mostly not trusted by conservatives |
USA Today | Somewhat not trusted by conservatives |
Wall Street Journal | Mostly trusted by all groups |
The Washington Post | Mostly not trusted by conservatives |
See also
- Pliny's Point
- Ballotpedia's Polling Index: Presidential approval rating
- Ballotpedia's Polling Index: Congressional approval rating
- Ballotpedia's Polling Index: Direction of country rating
Footnotes
- ↑ Gallup, "How does Gallup polling work?" accessed January 12, 2017
- ↑ The New York Times, "Online Polls Are Rising. So Are Concerns About Their Results," November 27, 2015
- ↑ 3.0 3.1 FiveThirtyEight, "Live Polls And Online Polls Tell Different Stories About The Election," August 31, 2016
- ↑ FiveThirtyEight, "Why Polls Differ On Trump’s Popularity," February 20, 2017
- ↑ Pew Research Center, "5 key things to know about the margin of error in election polls," September 8, 2016
- ↑ MIT News, "Explained: Margin of error," October 31, 2012
- ↑ Pew Research Center, "Questionnaire design," accessed January 12, 2017
- ↑ The Wall Street Journal, "When Wording Skews Results in Polls," September 25, 2010
- ↑ Pew Research Center, "Trust Levels of News Sources by Ideological Group," October 20, 2014