Why Jon Stewart and Politifact are Wrong about Public Misinformation
The recent “debate” between Comedy Central’s Jon Stewart and the non-partisan Politifact group is largely missing the point with regard to the current state of public ignorance. Stewart recently argued in an interview on Fox News that public opinion polls show Fox viewers are consistently more ignorant with regard to basic political facts. Politifact counters, contending that national polls fail to consistently document a pattern between consumption of Fox and political ignorance. Neither side is presenting a completely accurate picture of reality.
Politifact, a non-partisan project of the St. Petersburg Times, was originally created to research the truthfulness and accuracy of statements emanating from Washington. Sadly, it seems to be falling victim to the same weaknesses as “objective” journalists, who parrot a “one side is as good as another” framework with regard to Washington officialdom. This problem is rather stark in Politifact’s recent claims about Fox News consumption and misinformation. In criticizing Jon Stewart’s claim that Fox viewers “are the most consistently misinformed media viewers” in “every poll” undertaken, Politifact muddies the waters in relation to the polls it examines. It claims that Fox News consumption is only inconsistently associated with being misinformed, citing a number of Pew Research Center polls which find that Fox News viewers are no less likely than other media consumers to accurately answer civic I.Q. questions such as: which party controls the House of Representatives, who is the Prime Minister of Britain, or who is the Secretary of State? In contrast, the group discounts studies done by the Program on International Policy Attitudes (PIPA), which find that Fox News viewers were consistently more misinformed with regard to foreign policy issues related to Iraq and to domestic, electoral topics.
As someone who has spent years intimately analyzing the raw data from these specific polls, I feel I’m in a relevant position to offer some insights.
A brief look at the PIPA surveys of news consumption does suggest that Fox News viewers are systematically misinformed. PIPA’s 2003 Iraq survey found that Fox News viewers were far more likely than other news viewers to hold a variety of misperceptions regarding the war, including beliefs that Iraq possessed WMD, that Iraq had ties to al Qaeda, that WMD had been found in Iraq following the invasion, and that the international community strongly supported the U.S. invasion. The 2010 PIPA survey examined a number of misperceptions, including beliefs that Obama’s stimulus did not include tax cuts for most Americans, that scientists do not agree that climate change is occurring, that health care reform will increase the deficit, and that Obama was not born in the U.S., among other false attitudes. Of the eleven questions PIPA surveyed in 2010, Fox News viewers were more likely to hold misinformed opinions on nine.
Politifact wishes away the authoritative PIPA findings by suggesting that they are problematic, in part because some pundits in the mass media disagree with them. Politifact prefers the Pew Center’s civics I.Q. questions over the more in-depth measures from PIPA, since questions about who is the Vice President are difficult to politicize, whereas questions about Obama’s tax agenda have long been politicized and the subject of heated partisan “discussion” in the mass media.
I won’t spend much time refuting Politifact’s misguided dismissal of the PIPA surveys in favor of Pew’s extremely low (arguably non-existent) standard for what constitutes an “informed” citizen. Jane Hamsher does a splendid job in exploding Politifact’s inaccuracies in a recent piece on Firedoglake (see Hamsher’s: “Fact Checking Politifact: Wrong about Jon Stewart’s Use of the Word ‘Misinformed’”). Regarding the three Pew surveys, Hamsher argues:
“The three polls measure how informed viewers are. They don’t even belong in the discussion, because they don’t go to Stewart’s point. “Do you know who the Secretary of State is” or “what is the name of the Vice President” are questions that you can answer if you’re paying attention. There is no shortage of people who go glassy-eyed and stupid while staring at cable news, and I’m proud to be one of them. I can feel the lull of Kathleen Parker’s voice shaving points off my IQ. I might be able to tell you who the Secretary of Education is under ordinary circumstances, but freely admit that listening to Bill Bennett drone on about anything is enough to flip the switch on enough synapses that answering any question becomes a challenge. It’s a guilty pleasure for people who don’t smoke pot.”
Hamsher continues: “On the other hand, the two PIPA studies measure how misinformed viewers are. That’s a very different yardstick. Listening to Dana Bash may freeze a few neurons in the “off” position, and I may not get the news value that I should out of the segment, but unless she says something that is manifestly untrue I can’t claim to have been misinformed.”
In short, Hamsher is arguing that placing the Pew and PIPA surveys alongside each other is tantamount to comparing apples and oranges. One set of surveys measures basic civics literacy through uncontroversial textbook questions about government. The stakes attached to these uncontroversial questions are nil. The other surveys, from PIPA, measure how misinformed the public is by the official manipulation and propaganda that dominate discourse in Washington, D.C. and in the national media (and are most extreme in the right-most elements of the mass media).
If Politifact is interested in assessing the seriousness of various polls in measuring public misinformation, it would do well to consult scholarly experts in the field of public opinion, rather than partisan pundits in the mass media. The latter group shares no real understanding of the complexities of public opinion research, while the former have spent their entire careers closely studying these issues. If Politifact had consulted enough experts in this area, it would find that many scholars reject civics I.Q. questions (of the Pew variety) altogether due to their superficiality in measuring public information levels. These questions have little to do with the processes by which Americans interact on a day-to-day basis with the political system. They aren’t representative of the grand questions that the public faces with regard to the policy process – questions which PIPA comes much closer to measuring in its surveys of domestic and foreign policy issues. In short, they aren’t of much value in measuring the public’s exposure to misinformation, propaganda, and indoctrination.
And yet, in highlighting Politifact’s shortcomings here, I should note that I do not fully agree with Jon Stewart either. Stewart is a liberal pundit operating in the mass media, and his program tends to privilege Democratic and mainstream liberal points of view, while skewering conservative ones. As a result, his limited insights are unlikely to uncover the larger problem in the American political-media system. That larger problem, simply stated, is the consistent correlation, across all media outlets, between attentiveness to political media and the political misinformation that results from such exposure.
As an expert in the study of mass media and public opinion, I’ve devoted my academic career to studying questions of media exposure and public ignorance. After closely tracking media content and public opinion surveys over the last ten years, I can confidently conclude that there is much truth in the saying: “the more you watch, the less you know.” I’ve tracked the relationship between increased political-media attentiveness and political ignorance on a handful of issues, including foreign policy subjects – the wars in Libya, Iraq, and Afghanistan, the rhetorical war with Iran, and opinions of Islam as a “terrorist threat” – and domestic issues, including the national health care debate, the Tea Party, the 2001 tax cuts, and the 2009 stimulus. On every single one of these issues, I find a positive correlation between increasingly reactionary media coverage and growing public embrace of right-wing political views. This relationship, in and of itself, does not suggest widespread public ignorance (among those consuming such right-wing news and rhetoric), although a closer examination of the issues suggests that misinformation and confusion are rampant among those paying closest attention to national reporting and political debate.
A brief review of my findings puts the state of the problem in better perspective:
- With regard to Libya and Afghanistan, those paying closest attention to the national reporting and to political debate in Washington on the U.S. (2011) Libya bombing and the (2009) Afghan “surge” were consistently more likely to support the interventions. Sympathetic media coverage was enough to convince a majority of Americans – most of whom were at first opposed to the interventions (in 2009 and 2011, respectively) – to shift their opinions and grant majority support. In short, an increasingly reactionary political-media debate convinced most Americans to support policies that went against their previously expressed opinions and wishes.
- With respect to Iraq and Iran, those paying closest attention to national reporting and political debate with regard to non-existent WMD “threats” (faithfully disseminated in media propaganda) were consistently more likely to think (erroneously) that these countries represented a serious danger to U.S. national security.
- Concerning attitudes toward Islam, those getting most of their information about the religion and its adherents from the mass media were most likely to hold a variety of misperceptions regarding Islamic beliefs and culture. Those most reliant on the mass media were the most likely to embrace racist anti-Muslim views.
- In relation to health care reform, those paying closest attention to the national reporting and debate were consistently more likely to be confused about such reforms at a time (in 2009 and 2010) when coverage was dominated by propagandistic discourse over non-existent Obama “socialism” and “death panels,” and fictitious “government takeovers” of private care. The most attentive were also the most likely to oppose reform in light of the increasingly reactionary state of media coverage.
- With regard to the 2001 tax cuts and the 2009 stimulus, those paying closest attention to the national reporting and debate on these issues were consistently more likely to support the tax cuts and oppose the stimulus during periods when my analysis of media found that support for the tax cuts was high and criticism of the stimulus was growing.
- Concerning the Tea Party, those paying closest attention to national reporting and debate were consistently more likely to hold favorable opinions of the group at a time when the mass media was celebrating it as a “mass movement.” Increasing public support was driven by inaccurate assumptions that the Tea Party was a grassroots rebellion, despite my extensive, year-long national and local research project, which suggested that the group is largely a top-down, elite outgrowth of corporate and Republican interests.
The problem of mass-mediated ignorance is encouraged by journalists’ over-reliance on increasingly reactionary bipartisan political elites in Washington. These officials set the agenda with regard to what issues will be talked about and how they will be discussed. Public ignorance across the “mainstream” parts of the political spectrum is ignored in the “debate” over whether Fox News viewers are misinformed. Of course Fox viewers display a staggering ignorance; but at the end of the day, that ignorance is not substantively different from that seen among most media consumers.
A more constructive and engaging research agenda would analyze the ways in which the entire media-political system fosters ignorance and misinformation. At a time when both parties are embracing right-wing views, media coverage is increasingly moving to the right and having predictable effects on public attitudes. Mediated ignorance is a cancer afflicting the American populace. That cancer has its roots in the entire political-economic-media system, rather than in its most extreme sectors. We would do well not to exaggerate Fox viewers’ ignorance in light of these findings.
Anthony DiMaggio is the co-author with Paul Street of the newly released Crashing the Tea Party (Paradigm Publishers, 2011). He is also the author of When Media Goes to War (2010) and Mass Media, Mass Propaganda (2008). He has taught U.S. and Global Politics at Illinois State University, and can be reached at: firstname.lastname@example.org.