Fact Checking... Doesn't Matter?

When the Web started to rise as a source of news and information, I mistakenly assumed we might witness a change in public discourse. My assumption was that the rise of competing "fact checkers" would force political leaders and pundits to refrain from flawed rhetorical devices.

My optimism was based on the rise of websites including the award-winning PolitiFact (http://www.politifact.com/) and the competing partisan sites Media Matters (http://mediamatters.org/) and NewsBusters (http://newsbusters.org/). The media- and politics-focused sites Mediaite (http://www.mediaite.com/), Politico (http://www.politico.com/), and The Daily Caller (http://dailycaller.com/) also offered hope that politicians and pundits would be held to account.

I tell students I care about facts, statistics, and lab results. Since many, if not most, of the students with whom I work are business, technology, and science majors, they appreciate this emphasis on the empirical. But, it turns out even some of the most intellectually gifted students have limited interest in the fact checking of political claims.

When I asked a student whether she cared that PolitiFact had ranked a statement by a particular politician a lie, she responded, "PolitiFact is worthless, according to DailyKos." Of course, I've heard similar responses from other students, who simply substitute different fact-checking and partisan sources. Left and right, fact checking is selective when it is pursued at all.

Logical, scientific students don't want to believe factual refutations of political statements that support their own biases. I am surrounded by students with the highest SAT/ACT scores. These young men and women are more engaged politically and socially than most people. And yet, they are as resistant as, if not more resistant than, other media consumers to challenges to their deeply held assumptions.

I dedicate much class time to our flawed ability to consider competing facts. Humans are not reasonable, logical computing devices. We tend to reach conclusions and then justify those conclusions by selectively accepting and rejecting information. The research on this problem is extensive. I provide various journal articles to students and almost universally they claim to be above such human weaknesses. But, we all filter media and must struggle to get beyond that instinct.

But, I had hoped that the explosion in fact-checking sites and easy access to a myriad of opinions would lead people to be curious. I imagined more people reading sources from across the political spectrum. The Web was going to elevate public discourse because someone was always going to check the statements of political figures.

I was wrong.

Instead, it turns out people stick to like-minded, biased, and often uncivil sources. Website readers are often profiled statistically for advertisers and political organizations. The reports reveal that we have become less interested in other points of view.

I'm a cynical skeptic and don't trust any sources. Even the best news sources make mistakes, and all reporters have biases. PolitiFact, Politico, and Mediaite are part of my daily reading rituals. When I ask students, those on the right accuse Politico of left-leaning bias. The students on the left accuse Mediaite of right-leaning bias. Rarely does a student express enthusiasm for PolitiFact, either, a site I consider among the best destinations on the Web.

Why doesn't fact checking matter? For a rhetoric instructor, it is disappointing.


  1. PolitiFact makes some absolutely stunning blunders (mostly harming conservative ideology). Their "Truth-O-Meter" grading system is ultimately subjective. It is the latter that will increasingly reveal PolitiFact's ideological bias.

    And if you're basing your notions of people's rejection of information that conflicts with their beliefs on studies involving Brendan Nyhan, then I have to ask why. His failure to directly equate "misperceptions" with falsehoods in his studies ought to give anyone pause. He's okay at equating the two when he's drawing conclusions, but that's not the same thing (nor is it a good idea in terms of logic and science).

  2. I fall into the "trust no one" school of analysis. Begin skeptical and you won't be disappointed.

    I am not familiar with Nyhan's work. I have about three dozen journal articles I use in courses, including research from de Waal, Haidt, and Hauser (sadly, he damaged his credibility). One of the most telling quotes, from 2005:

    "I suspected that many media outlets would tilt to the left because surveys have shown that reporters tend to vote more Democrat than Republican," said Tim Groseclose, a UCLA political scientist and the study's lead author. "But I was surprised at just how pronounced the distinctions are."

    Media Bias Is Real, UCLA, December 14, 2005

    Econ Journal Watch has done a fantastic job tracing the politicization of economics reporting, including the reversals of Krugman's positions in the NYT editorial pages (Econ Journal Watch Volume 7, Number 2 May 2010, pp 119-156).

    If Paul Krugman, supposed "scientific" economist, isn't consistent, but instead rationalizes the actions of politicians with whom he agrees, then there's little chance of students resisting the same pattern.

    The problem is that students, like most people, don't have time to read everything and analyze presentation of "findings" so they stick to what is comfortable. I had hoped the rise of the Web might change this, but instead it has reinforced "tribalism."

    Good overviews of research:



    As for PolitiFact, because they "fact check" more Republican politicians, they end up finding more errors among the GOP than Democrats. The lopsided checking pattern is problematic. Still, PolitiFact is one of the best sources for the background information on the Web.

  3. Nyhan's studies are at least more directly related to the issue of whether fact checking works (he attempted to test whether partisans changed their minds in response to information contradicting ideas consonant with their partisan notions).

    As for Peterson, I'm not sure his hypothesis ought to make us fret overmuch (and he appears to offer reasonable cautions to that effect). Yes, reality is highly complex. The strategies we use to navigate reality will vary, and Peterson's ideas seem to agree with the standard notion that most people follow the ideas of "elites" of one type or another. The Internet confronts individuals with a whole new dimension of that complexity, along with more questions as to which sources to trust.

    You're sensible to agree that PolitiFact has a problem with selection bias. But the problem is deeper than that. PolitiFact is produced by journalists (though I know of at least one librarian on the staff). Let's face it: Journalism doesn't tend to attract our best and brightest. The research is done by people often barely better than laymen at sifting through some of the problems they try to tackle. And even the problems that ought to be easy (such as detecting hyperbole) often give them fits.

    For my money, Annenberg Fact Check easily does the better job.

  4. I would agree that FactCheck.org, though not as prolific, often does a better job with the deep analyses required of some topics.

    I'm not sure I would generalize about journalists to the same extent. Maybe they are similar to most liberal arts graduates? I know the lowest test scores are within education, not journalism. I'm sure someone could make a snarky comment or two about my degrees including print journalism and English education.

    Liberal arts education is not welcoming of people with differing points of view. The lack of diversity of opinions at conferences and within academic journals should bother us, but instead the homogeneity is often celebrated as proof everyone in the group is wise.

    Journalism is constrained by the same social pressures. No one wants to be the outlier.

  5. "The lack of diversity of opinions at conferences and within academic journals should bother us, but instead the homogeneity is often celebrated as proof everyone in the group is wise."


    That's a *second* thing you've written in this thread that I'm more than half inclined to record and quote later on down the line. Delightfully phrased, and I happen to agree.

    I'm generalizing about journalists to the extent I do because I know them well (I have a degree in journalism). Journalism could draw better than education about four rungs up the ladder and still not be said to attract the best and the brightest. The evidence occurs in the news every day, so there's not even any need to refer to the metrics of education.

    I've got a classic example with respect to PolitiFact that helps illustrate the disdain for rigorous methodology. What's the difference between "False" and "Pants on Fire" on the "Truth-O-Meter"? Depending on which set of definitions PolitiFact has offered, both refer to false statements, with the latter reserved for "ridiculous" ones.

    Makes me wonder what objective criteria one uses to determine ridiculousness. You?

    The PolitiFact folks spent real time developing their system. The result is a sufficient indictment.

    It seems that we are substantially in agreement, so your next reply will stand as the last word unless I feel compelled to offer another compliment on your phrasing. Cheers.

  6. I often disagree with the "barely true" designation PolitiFact awards some statements while describing an equally incorrect statement as "false." I have to agree -- the standard should be True or False, without the qualifications. I suppose "Pants on Fire!" was meant as a polite way of calling the source a liar.

    Thank you for the kind words. I hope future posts are of some interest. Or maybe some past ones? I will locate your blogs, as well.

