
Polling indices of the first year of the Trump administration


From the presidential tweets that reacted to them to the media outlets that analyzed every approval rating, the polls released in 2017 showed a strong desire to treat polling as a barometer for what was happening at the federal level. This page provides a summary of what happened that year, looking specifically through the prism of presidential approval polls, direction-of-country polls, and congressional approval polls.

For Ballotpedia's presidential approval, congressional approval, and direction of the country polling results, we took an average of the most recent polls available at the time. For details on how the averages were calculated, see the methodology section below.

Polling averages every Friday (January 2017 - January 2018)

The chart below shows the average polling for three metrics—presidential approval ratings, direction of country ratings, and congressional approval ratings—pulled each Friday from January 2017 until January 2018. The highest spikes for all three metrics came in March 2017, when Congress was in the midst of major healthcare debates. Polling dips occurred in the weeks that followed, likely as a result of the unsuccessful healthcare overhaul that dominated the news at the time.

Presidential approval by week and source

The chart below breaks down each weekly average beginning with the last week of January 2017 and ending with the week of January 8, 2018. Breaks in the lines mean that the polling company represented by the line did not publish a poll that week.

This chart makes it possible to gauge how much the data differ from one company to the next. While in many instances the ratings are tightly grouped, there are considerable discrepancies between the companies in some weekly averages.


Direction of country rating by week and source

This chart, like the one above, also shows how the data differ from one company to the next. While the shapes of the trend lines are broadly similar in many instances, there are noticeable gaps in the numbers in any given week and significant divergence in the earliest weeks.

Methodology

Sources

For Ballotpedia's presidential approval, congressional approval, direction of the country, and generic congressional ballot polling results, we take an average of the most recent polls (from the last 30 days, with some exceptions for major news events) on one or more of these topics conducted by the sources listed in the contact methods table below.

Ballotpedia chose to include polls from these organizations in our averages because we regard their methodologies as broadly trustworthy and reliable. If you know of other outlets that conduct polling on these topics, email us. We average the results and show all polling results side-by-side because we believe that paints a clearer picture of public opinion than any individual poll can provide. The data are updated daily as new polling results from the above sources are published.
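
As a rough, hypothetical sketch of the approach described above (not Ballotpedia's actual pipeline), the snippet below shows one way to average the most recent in-window poll from each source. The poll values, dates, and function names are illustrative assumptions.

```python
from datetime import date, timedelta

# Hypothetical poll data: (pollster, poll end date, presidential approval %).
# These numbers are illustrative, not actual published results.
polls = [
    ("Gallup", date(2017, 3, 10), 42.0),
    ("Quinnipiac University", date(2017, 3, 8), 41.0),
    ("Rasmussen Reports", date(2017, 3, 12), 45.0),
    ("Gallup", date(2017, 2, 1), 40.0),  # falls outside the 30-day window below
]

def polling_average(polls, as_of, window_days=30):
    """Average the most recent in-window poll from each source."""
    cutoff = as_of - timedelta(days=window_days)
    latest = {}
    for pollster, end_date, value in polls:
        if not (cutoff <= end_date <= as_of):
            continue  # ignore polls outside the 30-day window
        # keep only the most recent poll per pollster
        if pollster not in latest or end_date > latest[pollster][0]:
            latest[pollster] = (end_date, value)
    values = [value for _, value in latest.values()]
    return sum(values) / len(values) if values else None

print(round(polling_average(polls, as_of=date(2017, 3, 17)), 1))  # -> 42.7
```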


Questions

Typical poll questions asked either online or by phone include:

Presidential approval

  • "Do you approve or disapprove of the way Donald Trump is handling his job as President?"[1]
  • "Do you approve or disapprove of the way Barack Obama has handled his job as president?"[2]
  • Some polls also offer respondents more than two options, such as "Strongly Approve," "Somewhat Approve," "Somewhat Disapprove," or "Strongly Disapprove."[3]

Congressional approval

  • "Overall, do you approve or disapprove of the way that the United States Congress is handling its job?"[4]
  • "Do you approve or disapprove of the way Congress is handling its job?"[5][6]

Direction of country

  • "Do you feel things in this country are generally going in the right direction or do you feel things have pretty seriously gotten off on the wrong track?"[7]
  • "All in all, do you think things in the nation are generally headed in the right direction, or do you feel things are off on the wrong track?"[8]
  • "Would you say things in this country today are generally headed in the right direction or off on the wrong track?"[9]

Generic congressional ballot

  • "If an election for U.S. Congress were being held today, who would you vote for in the district where you live?"[10]
  • "Thinking about the elections in 2018, if the election for U.S. Congress were held today, would you vote for the Democratic candidate or the Republican candidate in the district where you live?"[11]
  • "What is your preference for the outcome of this November’s congressional elections—a Congress controlled by Republicans or a Congress controlled by Democrats?"[12]

Understanding polling

Below we briefly highlight three aspects of public polling that illustrate both the complexity of polling and how polls tend to differ from one another. Understanding these concepts is key to interpreting what polls mean and underscores the value of aggregating polling results.

Contact method

Pollsters use a variety of methods to contact potential survey participants. From the 1930s to the 1980s, pollsters generally did their work through direct contact: going door-to-door, a remarkably expensive and time-consuming method.[13] Nowadays, pollsters rely upon telephones and the internet. Neither of these approaches comes without challenges. Fewer Americans today, for example, live in households with landlines than they did 20 or even 10 years ago. On the other hand, not every American—particularly in older generations—has a cell phone. To get around this, many pollsters call a combination of landlines and cell phones for a survey. An additional problem is that, with the rise of caller ID, fewer people pick up the phone to participate in surveys—part of a systemic problem in the modern polling industry known as declining response rates. Some pollsters have looked to the internet as a workaround for this issue, but analysts continue to debate the accuracy and dependability of online polls.[14][15]

A study by FiveThirtyEight found that variances in polls about President Trump's favorability stemmed primarily from the collection method. Polls of registered or likely voters tended to be more favorable to Trump than those that polled adults generally. Automated or online polls also produced more favorable ratings than those conducted with live phone calls. The data for these findings were taken from polls conducted between February 1 and February 19, 2017.[16]

There are also differences among polling firms in who contacts the participants. Some phone-based surveys use live interviewers, while others use automated interactive voice response systems.[15]

Contact methods of featured polls

Source | Contact method | FiveThirtyEight Grade*
CBS News | Live phone | N/A
The Economist / YouGov | Online | 3 stars
Fox News | Live phone | N/A
Gallup | Live phone | 2.5 stars
Pew Research | Live phone | 2.5 stars
Quinnipiac University | Live phone | 2.8 stars
Rasmussen Reports (Pres. Approval) | Automated phone + online | N/A
Rasmussen Reports (Direction of Country) | Automated phone | N/A
Reuters / Ipsos | Online | 2.8 stars
USA Today / Suffolk University | Live phone | 2.9 stars
The Wall Street Journal / NBC | Live phone | N/A
The Washington Post / ABC | Live phone | 3 stars
Morning Consult | Online | 1.9 stars
TIPP Insights | Phone | 1.8 stars
Public Policy Polling | Phone + online | 1.4 stars
The Marist Poll | Live phone | 2.9 stars
Monmouth University | Live phone | 2.9 stars
CNN | Live phone | 2.8 stars
Harris Insights & Analytics | Online | 1.5 stars
Emerson College | Phone | 2.9 stars

*Last updated January 2025. FiveThirtyEight pollster ratings were calculated based on historical accuracy in predicting elections, sample sizes, methodology, and other factors.

The sample and margin of error

Pollsters can’t realistically contact every American adult throughout the country and ask their opinion on a given issue. Instead, they try to contact a representative sample—usually anywhere between 1,000 and 2,000 individuals—that accurately reflects the country’s population as a whole. Pollsters, with the help of statisticians, demographers, and data experts, use a variety of techniques to create a representative sample. This typically involves using probability formulas and algorithms to ensure random sampling and to increase the likelihood of contacting an accurate cross-section of the U.S. adult population. Some pollsters also create panels of respondents that they believe reflect the actual population and poll them repeatedly over a span of time; these surveys are usually called tracking polls. Oftentimes, pollsters weight their respondents’ answers to account for various demographic characteristics. For example, a pollster might weight more heavily the responses from a specific demographic group if that group was underrepresented in the random sample relative to the country’s estimated demographic composition, and weight down the responses from a group that appears to be overrepresented.
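
As a minimal sketch of the weighting idea described above (using assumed population shares, sample shares, and approval rates rather than real census or survey data), the snippet below scales each group's responses by the ratio of its target share to its share of the achieved sample.

```python
# Hypothetical weighting example: all shares and approval rates below are
# assumptions for illustration, not real census or survey figures.
population_share = {"18-34": 0.30, "35-64": 0.50, "65+": 0.20}  # assumed target shares
sample_share = {"18-34": 0.20, "35-64": 0.55, "65+": 0.25}      # assumed respondent mix
approve_rate = {"18-34": 0.35, "35-64": 0.45, "65+": 0.55}      # assumed approval by group

# Each group's responses are scaled by (target share / achieved share), so an
# underrepresented group counts more and an overrepresented group counts less.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

unweighted = sum(sample_share[g] * approve_rate[g] for g in sample_share)
weighted = sum(sample_share[g] * weights[g] * approve_rate[g] for g in sample_share)

print(f"unweighted approval: {unweighted:.1%}")  # 45.5% (skewed by too many 65+ respondents)
print(f"weighted approval:   {weighted:.1%}")    # 44.0% (adjusted toward the target mix)
```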

Samples are also where margins of error (MoE) come into play. The MoE describes the potential range of variation for a poll’s results in the context of its representative sample and the actual population. For example, if a poll with a margin of error of 3 percentage points showed that 47 percent of respondents approve of candidate X, that means the pollster believes that, based on the representative sample in the poll, anywhere between 44 and 50 percent of the actual population approves of candidate X. Generally speaking, a larger sample size means a smaller MoE, while a smaller sample size means a larger MoE. Other factors, such as the poll’s design, probability formulas, and weighting methods, can also affect the MoE.[17][18]
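
To make the relationship between sample size and margin of error concrete, here is a simplified calculation under common textbook assumptions (simple random sampling, a 95 percent confidence level, and the worst-case proportion of 0.5); actual pollsters adjust for design effects and weighting, so their reported MoE can differ.

```python
import math

def margin_of_error(sample_size, p=0.5, z=1.96):
    """Textbook margin of error at a 95% confidence level (z ~ 1.96)."""
    return z * math.sqrt(p * (1 - p) / sample_size)

for n in (500, 1000, 2000):
    print(f"n = {n}: +/- {margin_of_error(n) * 100:.1f} points")

# n = 500: +/- 4.4 points
# n = 1000: +/- 3.1 points
# n = 2000: +/- 2.2 points
```

Under these assumptions, a sample of roughly 1,000 respondents produces a margin of error near the 3-point figure used in the example above, and doubling the sample size shrinks the MoE only modestly.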

Question framing

Though all polling firms, in general, are after the same goal—to find out what the public thinks about a given topic or issue—they don’t always ask their questions the same way. Studies have found that differences in how questions are worded—even subtle differences—can lead to a range of results. In 2003, for example, Pew Research found that when they asked respondents if they “favor or oppose taking military action in Iraq to end Saddam Hussein’s rule,” a total of 68 percent responded that they favor military action. But when Pew added to the end of that question, “... even if it meant that U.S. forces might suffer thousands of casualties,” 43 percent responded in favor of military action.[19]

The number of possible answers that pollsters provide to respondents has also been known to produce different results. With questions about presidential approval and disapproval, for instance, some firms only give respondents the options of saying approve or disapprove. Other firms, however, give respondents more flexibility by allowing them to respond with answers such as “strongly approve” or “somewhat disapprove.” Again, these slight differences have historically led to differing results among polling firms.[20]

Trust in sources

Public perception of the various sources cited here varies. Pew Research published a study on this topic in 2020, detailing how members of different ideological groups (conservatives and liberals) trusted or distrusted popular media organizations. The results from this study for the news organizations included in Ballotpedia's polling data are listed below. By providing a variety of sources of polling results side-by-side, we hope to mitigate the influence of potential bias. All of the major news sources selected for Ballotpedia's polling index were rated as more trusted than distrusted in the overall results from all respondents.[21]

Trust levels in polling sources by ideology

The following chart includes data from a 2024 survey by YouGov.

For questions on polls and methodology, email: editor@ballotpedia.org.



Footnotes

  1. YouGov, "The Economist/YouGov Poll," August 27-29, 2017
  2. The Washington Post, "Washington Post-ABC News Poll," January 12-15, 2017
  3. Rasmussen Reports, "Obama Approval: Comparing the Numbers," November 25, 2013
  4. YouGov, "The Economist/YouGov Poll," January 14-17, 2017
  5. CBS News, "CBS News Poll: Expectations for the Trump Presidency," January 13-16, 2017
  6. Gallup, "Gallup Poll Social Series: Mood of the Nation," January 4-8, 2017
  7. CBS News, "CBS News Poll: Expectations for the Trump Presidency," January 13-16, 2017
  8. Hart Research Associates/Public Opinion Strategies, "NBC News/Wall Street Journal Survey," January 12-15, 2017
  9. YouGov, "The Economist/YouGov Poll," January 14-17, 2017
  10. YouGov, "The Economist/YouGov Poll," August 12-14, 2018
  11. Reuters/Ipsos, "Core Political," August 15, 2018
  12. Hart Research Associates/Public Opinion Strategies, "NBC News/Wall Street Journal Survey," July 15-18, 2018
  13. Gallup, "How does Gallup polling work?" accessed January 12, 2017
  14. The New York Times, "Online Polls Are Rising. So Are Concerns About Their Results," November 27, 2015
  15. FiveThirtyEight, "Live Polls And Online Polls Tell Different Stories About The Election," August 31, 2016
  16. FiveThirtyEight, "Why Polls Differ On Trump’s Popularity," February 20, 2017
  17. Pew Research Center, "5 key things to know about the margin of error in election polls," September 8, 2016
  18. MIT News, "Explained: Margin of error," October 31, 2012
  19. Pew Research Center, "Questionnaire design," accessed January 12, 2017
  20. The Wall Street Journal, "When Wording Skews Results in Polls," September 25, 2010
  21. Pew Research Center, "Ideology reveals largest gaps in trust occur between conservatives and liberals," January 24, 2020