COVID-19 vaccination polls align closely with CDC data
The COVID-19 vaccination rate offers a rare opportunity for investigators to compare their data to a high-profile outcome other than an election – in this case, the share of adults who have received at least one dose of a vaccine against the coronavirus as documented by the Centers for Disease Control and Prevention (CDC). This type of benchmarking helps determine whether polls continue to provide reasonably accurate information about the American public on this matter.
Pew Research Center analysis reveals that public polls on COVID-19 vaccination have tracked CDC levels fairly closely. Survey estimates of the adult vaccination rate fall within about 2.8 percentage points, on average, of the rate calculated by the CDC. About one-in-five polls (22%) differed by less than 1 percentage point from the CDC’s estimate.
This analysis includes 98 public surveys conducted by 19 different survey organizations from December 29, 2020, to June 30, 2021. Researchers assessed the accuracy of each survey by calculating the difference between the published survey estimate and the vaccination rate reported by the Centers for Disease Control and Prevention (CDC) on the day the survey’s field period ended. CDC figures were collected from the agency’s publicly available API. For a full list of surveys included in this analysis, click here.
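As a rough sketch, the per-poll comparison described above could be computed along the following lines. The pollster names, dates, and rates here are illustrative placeholders, not the actual dataset of 98 polls or real CDC figures:

```python
from datetime import date

# Hypothetical poll records: (pollster, field-period end date,
# estimated % of adults with at least one dose). Values are made up.
polls = [
    ("Pollster A", date(2021, 3, 15), 31.0),
    ("Pollster B", date(2021, 4, 20), 52.5),
]

# CDC share of adults with at least one dose, keyed by report date
# (illustrative values, standing in for the CDC's API data).
cdc_rate_by_date = {
    date(2021, 3, 15): 29.0,
    date(2021, 4, 20): 51.0,
}

# Error for each poll: published estimate minus the CDC rate on the
# day the poll ended (positive = the poll ran higher than the CDC).
errors = [est - cdc_rate_by_date[end] for _, end, est in polls]
```

Keying the CDC rate by date reflects the study design: each poll is judged against the benchmark as it stood when that poll closed, not against a single fixed figure.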
When polls differed from the CDC rate, they were more often higher than lower – but not always. A notable difference between polling on the 2020 presidential election and polling on the COVID-19 vaccination rate is that the errors on vaccination have been less systematic (i.e., less consistently in the same direction). One way to measure systematic error, also known as bias, is to let surveys that underestimated the CDC benchmark cancel out surveys that overestimated it. (Researchers call this calculation the “signed” mean error in polls.)
In this analysis, once the overestimates and underestimates of the CDC benchmark were allowed to cancel each other out, the polls differed from the CDC rate by only +0.3 percentage points on average – a net result indicating that the surveys matched the vaccination rate almost exactly. By comparison, according to an American Association for Public Opinion Research analysis of national polls conducted in the last two weeks of the 2020 presidential election, the polls “underestimated Trump’s share of the certified vote by 3.3 percentage points and overestimated Biden’s share of the certified vote by 1.0 percentage point.”
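The distinction between the signed mean error (where overestimates and underestimates cancel) and the mean absolute error (where every miss counts) can be illustrated with a small sketch. The error values below are made up for illustration, not the study’s actual 98-poll data:

```python
# Per-poll errors (poll estimate minus CDC rate), in percentage points.
# Illustrative values only.
errors = [2.8, -2.2, 0.5, -0.7, 1.1]

# Signed mean error: positive and negative misses cancel, so this
# captures systematic bias in one direction.
signed_mean_error = sum(errors) / len(errors)

# Mean absolute error: every miss counts regardless of direction,
# so this captures overall accuracy.
mean_absolute_error = sum(abs(e) for e in errors) / len(errors)
```

Note how the two summaries diverge: a set of polls can have a signed mean error near zero (little bias) while still missing the benchmark by a couple of points on average.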
Admittedly, the vaccination rate among Americans was rising while each poll was in the field, making it difficult to determine the exact difference between the vaccination rates reported in survey data and the official CDC rates. The median field period of the surveys in this analysis was six days. If the accuracy of the polls is judged against the CDC rate on the midpoint of data collection, rather than the end date, the average absolute difference is 3.1 percentage points instead of 2.8. Comparisons are further complicated by the fact that CDC rates themselves are not necessarily error-free, due to issues such as delays in jurisdictions reporting vaccinations.
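Judging a poll against the CDC rate at the midpoint of its field period, rather than the end date, only requires shifting the comparison date. A minimal helper for that – hypothetical, not the researchers’ actual code – might look like:

```python
from datetime import date, timedelta

def field_midpoint(start: date, end: date) -> date:
    """Middle day of a poll's field period (rounded down for even spans)."""
    return start + timedelta(days=(end - start).days // 2)

# A six-day field period, matching the median duration in the analysis.
mid = field_midpoint(date(2021, 6, 1), date(2021, 6, 6))
```

The midpoint date would then be used in place of the end date when looking up the CDC benchmark for each poll.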
As other research has suggested, poll results may differ from the CDC’s vaccination rate (and from each other’s) due to differences in how pollsters asked about vaccination status. The most common type of question in this analysis asked something like “Have you been vaccinated against the coronavirus?” with answer options of “Yes” or “No.” Other questions asked respondents whether they knew anyone who had been vaccinated and included a response option allowing respondents to say they had received the vaccine themselves. Some pollsters asked whether respondents planned to be vaccinated, with an option to indicate that they had already received the vaccine. The mean absolute difference for the 76 questions using a yes/no format was 2.8 percentage points, compared with 3.0 points for the 22 questions using a different format.
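Grouping the per-question errors by question format, as in the comparison above, could be sketched as follows. The formats and error values here are illustrative placeholders, not the study’s actual 98 questions:

```python
# (question_format, absolute error in percentage points) per question.
# Illustrative values only.
question_errors = [
    ("yes_no", 2.5),
    ("yes_no", 3.1),
    ("other", 3.2),
    ("other", 2.8),
]

# Collect errors by format, then average within each group.
by_format = {}
for fmt, err in question_errors:
    by_format.setdefault(fmt, []).append(err)

mean_abs_by_format = {
    fmt: sum(errs) / len(errs) for fmt, errs in by_format.items()
}
```

The same grouping pattern extends naturally to any question attribute a researcher might want to test, such as survey mode or sample source.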
In some ways, the fact that the polling industry has done a better job of estimating vaccinations than voting is not surprising. Election polls face challenges that do not exist for non-election polls measuring public opinion on issues such as abortion or immigration. Election polls focus on future behavior (Will you vote? For whom?) and must try to identify respondents who are likely to be voters.
Another type of benchmarking involves pollsters asking questions that also appear on high-response-rate government surveys and comparing their results against the government’s. While this is an important and useful exercise, these analyses do not compare a survey to an objective outcome, but rather one survey to another.
Coronavirus vaccination rates offer pollsters a rare opportunity to benchmark their results against a high-profile outcome that both applies to the full U.S. adult population (rather than only registered or likely voters) and has a known ground truth for comparison. Unlike the less-than-stellar polling performance in the 2020 election, the results of this analysis suggest that the polls have accurately tracked growth in the share of adults receiving vaccines.
Nick Hatley is a research analyst specializing in survey methodology.