Rebuttals and Refutations, English homework help


Rebuttals and Refutations (graded)

Anticipating readers’ objections is one way to determine what other sections to include and support in your paper. Practice writing a rebuttal or a refutation by taking your thesis and considering the point of view of someone who believes differently or even the opposite of the argument you are making. To do this, review Chapter 10, pp. 449–452 and post a paragraph that summarizes an oppositional point of view to your thesis and then refutes it. As peers, reply to one another explaining whether or not your classmates are presenting the opposition objectively and whether the refutation is logical. Give one another ideas or suggestions for points that may be left out or might need to be further developed. The paragraph you draft here can be used in a section of your Second Draft this week.

The samples in Chapter 42 can help you with the rebuttal. Review the sample research paper, “The Public Overwhelmingly Wants It: Why Is Taxing the Rich so Hard?” paragraph 10 for a sample rebuttal. What do you notice about how the author acknowledges the opposing viewpoint? By using statistics, what kind of argumentative strategy is she applying? Is it effective?

The Public Overwhelmingly Wants It: Why Is Taxing the Rich So Hard? (Chapter 42)


Alyssa Battistoni takes on a question that has occurred to many people: Why not ask more from people who already have more than enough money? After all, the gap between the wealthy and the middle class has grown dramatically in the past two decades. By doing research on this question, the author explores the power of the wealthy over politicians and how the rich, as a minority, have more influence on government than others. Look at the ways Battistoni supports her arguments with a balance of facts and emotional appeals.

When even the New York Times, the supposed bleeding heart of the liberal media, is asking whether it’s more “perilous politically” to accept tax increases for 3 percent of households or benefit cuts for everyone, you’d assume that even Americans who aren’t rich are opposed to raising taxes on those who are (“Will Voters,” par. 4). But you’d be wrong: nearly three-quarters of Americans support raising taxes on the wealthy (“Washington Post-ABC News Poll,” item 14). So why is raising taxes on the wealthy so hard—or why do we think it is?

The obvious answer is that rich people have political clout—but can it really be so simple? A growing mound of evidence suggests that while wealthy people’s preferences may not be the only factor in political decision-making, it’s a worrisomely important one. In a recent study, Princeton political scientist Larry Bartels found that senators outright ignored the views of their least advantaged constituents while catering to the preferences of the wealthy (4). Princeton’s Martin Gilens has also found that policy changes reflect the preferences of the most affluent, while the preferences of poor and middle-income Americans have almost no bearing (794).

Political scientists Lawrence Jacobs and Benjamin Page have found that the preferences of foreign policymakers correspond more to the preferences of executives of multinational companies than to the general public (115). Page and Jeffrey Winters estimate that the top 10 percent of income earners hold about 90 percent of materially based political power, and that “each member of the top 1 percent averaged more than 100 times the power of a member of the bottom 90 percent; about 200 times if the index is calculated in terms of the more politically relevant non-home wealth” (736). These numbers are staggering, and should be seriously troubling to anyone who thinks political equality worth defending. Indeed, by Page and Winters’s definition of oligarchy as “the extreme political inequalities that necessarily accompany extreme material inequalities,” it’s pretty hard to argue that the United States isn’t an oligarchic society (732).

The simple fact of the matter is that the people who can afford to fund and engage in Beltway politics, from idea-generating to legislation-drafting, are disproportionately wealthy. That makes it difficult to suss out just how much of politicians’ deference to the preferences of the wealthy is responsiveness to the wealthy themselves, as opposed to the general alignment of rich people’s interests with those of influential elites, organized special interest groups, business lobbies, and policymakers themselves.


Because of course, plenty of politicians are themselves wealthy—the median net worth of members of Congress is just under a million dollars (Gilson and Perot, sec. 4). Being wealthy doesn’t necessarily mean you’re a shill for lower taxes—indeed, John Kerry and Jay Rockefeller, two of the richest senators, have advocated more progressive tax rates—but it certainly means that most representatives have a different perspective on economic matters than the average American. Indeed, the 10 richest members of Congress—supporters of progressive taxation or no—all voted to extend the Bush tax cuts (Gilson and Perot, sec. 4).

Of course, it’s no secret that as political campaigns have grown increasingly expensive, campaign contributions have grown increasingly important: the 2010 midterms cost $4 billion (Klein, par. 1), and President Obama is already planning to spend a billion dollars on his bid for reelection (Overby, par. 1). Meanwhile, citizens in the top income quartile provide nearly three-quarters of campaign contributions, while those in the lowest quintile account for just 2 percent. But as Bartels notes, campaign contributions don’t explain the whole story.

If anything, we probably understate the political influence of the rich. In part, that’s because we can’t quite comprehend the magnitude of economic inequality and the extent to which political power is correlated with it. The real numbers—like that the wealthiest 300,000 Americans received as much income as the bottom 150 million—sound too crazy to be true (Johnston, par. 5). As a result, proposals to raise taxes on the wealthy are so often dismissed as wild-eyed populist rhetoric—“soaking the rich”—rather than legitimate, reasonable policy prescriptions.

Furthermore, while we take for granted that the wealthy have more political power than the average citizen, we figure that the sheer numbers of middle class and low-income voters can outweigh the preferences of the rich in swaying public officials. Robert Reich, for example, has argued that the rich have the political power to block higher tax rates “only if we let them,” saying “here’s the issue around which Progressives, populists on the right and left, unionized workers, and all other working people who are just plain fed up ought to be able to unite” (par. 14). And indeed, that kind of coalition-building is the basis for much progressive politics. But as the wealth and power of the most privileged Americans increases, it’s becoming harder and harder for the rest of us to keep up even in the aggregate.

So instead we’re getting caught in a self-reinforcing cycle: as the rich get richer and more powerful, policies are increasingly aligned with their interests, which increases inequality still further. Meanwhile, the middle and working classes are left with shrinking incomes and correspondingly less and less power to demand investment in a more equitable economy—and a broader tax base. Unions used to be able to counter the power of the wealthy, but their decline has left the average worker with little recourse. Instead of presenting an organized alternative to the views endorsed by the rich, average Americans are left to voice their political preferences through the vague format of an opinion poll. It’s no wonder that, as political scientists Jacob Hacker and Paul Pierson write, “America’s public officials have rewritten the rules of American politics and the American economy in ways that have benefitted the few at the expense of the many” (6).


So while it’s absolutely true that the rich pay far more in income taxes than the rest of us (Robyn and Prante, par. 3)—the wealthiest 1 percent of Americans pay 38 percent of income tax—that tiny fraction of the population also receives about 24 percent of income, accounts for about 34 percent of net worth, and holds 42.7 percent of financial wealth (net worth minus the value of one’s home) (Domhoff, par. 16). And those are statistics from before the crash—though there are only tentative estimates of current wealth distribution, many economists actually think it’s gotten more unequal.

Since average people’s wealth is largely tied up in their homes, the wealth of the median household has dropped an estimated 36 percent since the housing bubble popped, while the wealth of the top 1 percent has fallen a comparatively small 11 percent (Wolff 33). As Bartels concludes, “the economic order of the contemporary United States poses a clear and profound obstacle to realizing the democratic value of political equality” (32). In other words, as long as economic inequality is as extreme as it is now, political equality will remain an ideal rather than a reality. We need to make this case over and over again—right now we’re in danger of drawing exactly the wrong lessons from the economic nightmare of the past few years.

In a Wall Street Journal piece a couple of weeks ago, former California economic forecaster Brad Williams states “We created a revenue cliff…. We built a large part of our government on the state’s most unstable income group” (Frank, par. 5). The people protesting with signs reading “We Love Jobs,” he suggested, were “missing the real point.” But it’s Williams who’s missing the point: what we really did was build a large part of our economy around an unstable income group and industry, and what we need to do is build an economy with a broader base and more evenly distributed resources.

Indeed, many arguments given against raising taxes are in fact reasons for decreasing the financial and political power of the wealthy. Worried that rich people will leave the state, or even the country, to avoid property or income tax? Don’t build an economy that depends on a small number of people who have the resources to leave sticking around. Worried that rich people won’t invest in their businesses or create new jobs if we tax them? Don’t build an economy that depends on a few wealthy people hiring the rest of us. Worried that rich people’s incomes are too volatile? Don’t build an economy so heavily dependent on financial markets. And make no mistake: although the Journal would have you believe the distribution of wealth is a naturally occurring phenomenon, state investment and regulation play an essential role in the structure of the economy. If we want a more equal playing field, we can have it—but we need to start now.

Works Cited

Bartels, Larry. “Economic Inequality and Political Representation.” Aug 2005. Web. 25 Apr 2011.

Domhoff, G. William. “Who Rules America: Wealth, Income, and Power.” Who Rules America? Jan 2011. Web. 25 Apr 2011.

Frank, Robert. “The Price of Taxing the Rich.” Wall Street Journal 26 Mar 2011. Web. 25 Apr 2011.

Gilens, Martin. “Inequality and Democratic Responsiveness.” Public Opinion Quarterly 69.5 (2005): 778–796. Print.

Gilson, Dave, and Carolyn Perot. “It’s the Inequality, Stupid.” Mother Jones. Web. 25 Apr 2011.

Hacker, Jacob, and Paul Pierson. Winner-Take-All Politics. New York, NY: Simon & Schuster, 2010. Print.

Jacobs, Lawrence, and Benjamin Page. “Who Influences U.S. Foreign Policy?” American Political Science Review 99.1 (2005): 107–123. Print.

Johnston, David Cay. “Income Gap Is Widening, Data Shows.” New York Times 29 Mar 2007. Web. 25 Apr 2011.

Klein, Ezra. “More Money, More Problems.” Newsweek 31 Oct 2010. Web. 25 Apr 2011.

Overby, Peter. “2012: The Year Of The Billion-Dollar Campaigns?” National Public Radio 18 Feb 2011. Web. 25 Apr 2011.

Reich, Robert. “Why We Must Raise Taxes on the Rich, ASAP!” 4 Apr 2011. Web. 25 Apr 2011.

Robyn, Mark, and Gerald Prante. “Summary of Latest Federal Individual Income Tax Data.” Tax Foundation. Web. 25 Apr 2011.

“Washington Post-ABC News Poll.” Web. 25 Apr 2011.

“Will Voters Accept Tax Increases?—Room for Debate.” 13 Apr 2011. Web. 25 Apr 2011.

Winters, Jeffrey, and Benjamin Page. “Oligarchy in the United States.” Perspectives on Politics 7.4 (2009): 731–751. Print.

Wolff, Edward. “Recent Trends in Household Wealth in the United States: Rising Debt and the Middle-Class Squeeze—an Update to 2007.” Mar 2010. Print.


Johnson-Sheehan, Richard, and Charles Paine. Writing Today. 2nd ed. Pearson Learning Solutions, Jan. 2012. VitalBook file.


A CLOSER LOOK AT The Public Overwhelmingly Wants It: Why Is Taxing the Rich So Hard?

1. One of Battistoni’s main arguments is that politicians often overlook the needs of the public, because they share the same views as wealthy and powerful business executives. Why, according to Battistoni, do politicians think and behave this way? According to Battistoni’s argument, what are some of the effects of this kind of thinking by politicians?

2. The author sounds angry and frustrated. Identify five places in this research paper where she creates this style or tone. In what ways does Battistoni express her anger and frustration at the power of the rich over politicians?

3. The author also argues that the United States is actually too dependent on the rich to pay taxes. She writes, “What we really did was build a large part of our economy around an unstable income group and industry, and what we need to do is build an economy with a broader base and more evenly distributed resources.” Do you think she is contradicting herself at this point? In the end, is she really arguing for more taxes on the rich?


1. Write a commentary in which you express your own views on taxes. Do you think some people are overburdened with taxes? Do you think some people don’t pay enough taxes? Use research to support and explain your views.

2. Write an elevator pitch in which you briefly explain how you would make politicians more responsive to the general public. (The pitch is a microgenre of the proposal, described in Chapter 11.) Right now, most people agree that politicians are overly influenced by special interest groups, wealthy donors, labor unions, and corporate lobbyists, while they are not listening to regular people. In your elevator pitch, explain how you would change the political system so politicians hear about the needs and concerns of all.

Rapture Ready: The Science of Self-Delusion


In this research paper, Chris Mooney explores why people often believe things that can be proven wrong. Reviewing psychology and political-science research on reasoning and denial, he discusses studies that show how the reasoning processes of conservatives and progressives often depend more on their prior beliefs and expectations than on factual evidence and solid reasoning. Pay attention to the way that Mooney does more than just report his sources’ findings; he uses them to inform his own argument and back up his claims.

“A man with a conviction is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point” (Festinger, Riecken, & Schachter, 1956, p. 3). So wrote the celebrated Stanford University psychologist Leon Festinger and his coauthors, in a passage that might have been referring to climate change denial—the persistent rejection, on the part of so many Americans today, of what we know about global warming and its human causes. But it was too early for that—this was the 1950s—and Festinger and his coauthors were actually describing a famous case study in psychology.

Festinger and several of his colleagues had infiltrated the Seekers, a small Chicago-area cult whose members thought they were communicating with aliens—including one, “Sananda,” who they believed was the astral incarnation of Jesus Christ. The group was led by Dorothy Martin, a Dianetics devotee who transcribed the interstellar messages through automatic writing.

Through her, the aliens had given the precise date of an Earth-rending cataclysm: December 21, 1954. Some of Martin’s followers quit their jobs and sold their property, expecting to be rescued by a flying saucer when the continent split asunder and a new sea swallowed much of the United States. The disciples even went so far as to remove brassieres and rip zippers out of their trousers—the metal, they believed, would pose a danger on the spacecraft.

Festinger and his team were with the cult when the prophecy failed. First, the “boys upstairs” (as the aliens were sometimes called) did not show up and rescue the Seekers. Then December 21 arrived without incident. It was the moment they had been waiting for: How would people so emotionally invested in a belief system react, now that it had been soundly refuted?


At first, the group struggled for an explanation. But then rationalization set in. A new message arrived, announcing that they’d all been spared at the last minute. Festinger summarized the extraterrestrials’ new pronouncement: “The little group, sitting all night long, had spread so much light that God had saved the world from destruction” (p. 171). Their willingness to believe in the prophecy had saved Earth from the prophecy!

From that day forward, the Seekers, previously shy of the press and indifferent toward evangelizing, began to proselytize. “Their sense of urgency was enormous” (p. 171), wrote the researchers. The devastation of all they had believed had made them even more certain of their beliefs.

In the annals of denial, it doesn’t get much more extreme than the Seekers. They lost their jobs, the press mocked them, and there were efforts to keep them away from impressionable young minds. But while Martin’s space cult might lie at the far end of the spectrum of human self-delusion, there’s plenty to go around. And since Festinger’s day, an array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called “motivated reasoning” helps explain why we find groups so polarized over matters where the evidence is so unequivocal: climate change, vaccines, “death panels,” the birthplace and religion of the president, and much else. (For an overview of the phenomenon of motivated reasoning, see Kunda, 1990.) It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.

The theory of motivated reasoning builds on a key insight of modern neuroscience: As Damasio explains, reasoning is actually suffused with emotion (or what researchers often call “affect”). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we’re aware of it. That shouldn’t be surprising: Evolution required us to react very quickly to stimuli in our environment (p. 144). It’s a “basic human survival skill,” explains political scientist Arthur Lupia of the University of Michigan (personal communication). We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.

We’re not driven only by emotions, of course—we also reason, deliberate. But reasoning comes later, works slower—and even then, it doesn’t take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that’s highly biased, especially on topics we care a great deal about.

Consider a person who has heard about a scientific discovery that deeply challenges her belief in divine creation—a new hominid, say, that confirms our evolutionary origins. What happens next, explains political scientist Charles Taber of Stony Brook University, is a subconscious negative response to the new information—and that response, in turn, guides the type of memories and associations formed in the conscious mind (personal communication). “They retrieve thoughts that are consistent with their previous beliefs,” says Taber, “and that will lead them to build an argument and challenge what they’re hearing.”

In other words, when we think we’re reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we’re being scientists, but we’re actually being lawyers (2000, p. 10). Our “reasoning” is a means to a predetermined end—winning our “case”—and is shot through with biases. They include “confirmation bias,” in which we give greater heed to evidence and arguments that bolster our beliefs, and “disconfirmation bias,” in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.

That’s a lot of jargon, but we all understand these mechanisms when it comes to interpersonal relationships. If I don’t want to believe that my spouse is being unfaithful, or that my child is a bully, I can go to great lengths to explain away behavior that seems obvious to everybody else—everybody who isn’t too emotionally invested to accept it, anyway. That’s not to suggest that we aren’t also motivated to perceive the world accurately—we are. Or that we never change our minds—we do. It’s just that we have other important goals besides accuracy—including identity affirmation and protecting one’s sense of self—and often those make us highly resistant to changing our beliefs when the facts say we should.

Modern science originated from an attempt to weed out such subjective lapses—what that great 17th century theorist of the scientific method, Francis Bacon, dubbed the “idols of the mind.” Even if individual researchers are prone to falling in love with their own theories, the broader processes of peer review and institutionalized skepticism are designed to ensure that, eventually, the best ideas prevail.


Our individual responses to the conclusions that science reaches, however, are quite another matter. Ironically, in part because researchers employ so much nuance and strive to disclose all remaining sources of uncertainty, scientific evidence is highly susceptible to selective reading and misinterpretation. Giving ideologues or partisans scientific data that’s relevant to their beliefs is like unleashing them in the motivated-reasoning equivalent of a candy store.

Sure enough, a large number of psychological studies have shown that people respond to scientific or technical evidence in ways that justify their preexisting beliefs. In a classic 1979 experiment (Lord, Ross, & Lepper), pro- and anti-death penalty advocates were exposed to descriptions of two fake scientific studies: one supporting and one undermining the notion that capital punishment deters violent crime and, in particular, murder. They were also shown detailed methodological critiques of the fake studies—and in a scientific sense, neither study was stronger than the other. Yet in each case, advocates more heavily criticized the study whose conclusions disagreed with their own, while describing the study that was more ideologically congenial as more “convincing.”

Since then, similar results have been found for how people respond to “evidence” about affirmative action, gun control, the accuracy of gay stereotypes (Munro & Ditto, 2010), and much else. Even when study subjects are explicitly instructed to be unbiased and even-handed about the evidence, they often fail. And it’s not just that people twist or selectively read scientific evidence to support their preexisting views. According to research by Yale Law School professor Dan Kahan and his colleagues, people’s deep-seated views about morality, and about the way society should be ordered, strongly predict whom they consider to be a legitimate scientific expert in the first place—and thus where they consider “scientific consensus” to lie on contested issues.

In Kahan’s research, individuals are classified, based on their cultural values, as either “individualists” or “communitarians,” and as either “hierarchical” or “egalitarian” in outlook (Kahan, Jenkins-Smith, & Braman, 2011, p. 148). (Somewhat oversimplifying, you can think of hierarchical individualists as akin to conservative Republicans, and egalitarian communitarians as liberal Democrats.) In this study, subjects in the different groups were asked to help a close friend determine the risks associated with climate change, sequestering nuclear waste, or concealed carry laws: “The friend tells you that he or she is planning to read a book about the issue but would like to get your opinion on whether the author seems like a knowledgeable and trustworthy expert” (p. 153). A subject was then presented with the résumé of a fake expert “depicted as a member of the National Academy of Sciences who had earned a Ph.D. in a pertinent field from one elite university and who was now on the faculty of another” (p. 153). The subject was then shown a book excerpt by that “expert,” in which the risk of the issue at hand was portrayed as high or low, well-founded or speculative. The results were stark: When the scientist’s position stated that global warming is real and human-caused, for instance, only 23 percent of hierarchical individualists agreed the person was a “trustworthy and knowledgeable expert” (p. 163). Yet 88 percent of egalitarian communitarians accepted the same scientist’s expertise. Similar divides were observed on whether nuclear waste can be safely stored underground and whether letting people carry guns deters crime. The alliances did not always hold. In another study (Kahan, Braman, Monahan, Callahan, & Peters, 2010), hierarchs and communitarians were in favor of laws that would compel the mentally ill to accept treatment, whereas individualists and egalitarians were opposed.

In other words, people rejected the validity of a scientific source because its conclusion contradicted their deeply held views—and thus the relative risks inherent in each scenario. A hierarchical individualist finds it difficult to believe that the things he prizes (commerce, industry, a man’s freedom to possess a gun to defend his family) could lead to outcomes deleterious to society (Kahan et al., 2010). Whereas egalitarian communitarians tend to think that the free market causes harm, that patriarchal families mess up kids, and that people can’t handle their guns. The study subjects weren’t “antiscience”—not in their own minds, anyway. It’s just that “science” was whatever they wanted it to be. “We’ve come to a misadventure, a bad situation where diverse citizens, who rely on diverse systems of cultural certification, are in conflict,” says Kahan (Cone & Kahan, 2010).


And that undercuts the standard notion that the way to persuade people is via evidence and argument. In fact, head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts—they may hold their wrong views more tenaciously than ever.

Take, for instance, the question of whether Saddam Hussein possessed hidden weapons of mass destruction just before the US invasion of Iraq in 2003. When political scientists Brendan Nyhan and Jason Reifler (2010) showed subjects fake newspaper articles in which this was first suggested (in a 2004 quote from President Bush) and then refuted (with the findings of the Bush-commissioned Iraq Survey Group report, which found no evidence of active WMD programs in pre-invasion Iraq), they found that conservatives were more likely than before to believe the claim. (The researchers also tested how liberals responded when shown that Bush did not actually “ban” embryonic stem-cell research. Liberals weren’t particularly amenable to persuasion, either, but no backfire effect was observed.)

Another study gives some inkling of what may be going through people’s minds when they resist persuasion. Northwestern University sociologist Monica Prasad and her colleagues wanted to test whether they could dislodge the notion that Saddam Hussein and Al Qaeda were secretly collaborating among those most likely to believe it—Republican partisans from highly GOP-friendly counties. So the researchers set up a study in which they discussed the topic with some of these Republicans in person (Prasad, Perrin, Bezila, Hoffman, Kindleberger, Manturuk, & Powers, 2009). They would cite the findings of the 9/11 Commission, as well as a statement in which George W. Bush himself denied his administration had “said the 9/11 attacks were orchestrated between Saddam and Al Qaeda.”

As it turned out, not even Bush’s own words could change the minds of these Bush voters—just 1 of the 49 partisans who originally believed the Iraq-Al Qaeda claim changed his or her mind. Far more common was resisting the correction in a variety of ways, either by coming up with counterarguments or by simply being unmovable:

Interviewer: [T]he September 11 Commission found no link between Saddam and 9/11, and this is what President Bush said. Do you have any comments on either of those?

Respondent: Well, I bet they say that the Commission didn’t have any proof of it but I guess we still can have our opinions and feel that way even though they say that. (Prasad et al., 2009, p. 154)

The same types of responses are already being documented on divisive topics facing the current administration. Take the “Ground Zero mosque.” Using information from the political myth-busting site, a team at Ohio State (Nisbet & Garrett, 2010) presented subjects with a detailed rebuttal to the claim that “Feisal Abdul Rauf, the Imam backing the proposed Islamic cultural center and mosque, is a terrorist-sympathizer.” Yet among those who were aware of the rumor and believed it, fewer than a third changed their minds.


A key question—and one that’s difficult to answer—is how “irrational” all this is. On the one hand, it doesn’t make sense to discard an entire belief system, built up over a lifetime, because of some new snippet of information. “It is quite possible to say, ‘I reached this pro-capital-punishment decision based on real information that I arrived at over my life,’” explains Stanford social psychologist Jon Krosnick (personal communication). Indeed, there’s a sense in which science denial could be considered keenly “rational.” In certain conservative communities, explains Yale’s Kahan, “People who say, ‘I think there’s something to climate change,’ that’s going to mark them out as a certain kind of person, and their life is going to go less well” (Cone & Kahan, 2010).

This may help explain a curious pattern Nyhan and Reifler found when they tried to test the fallacy that President Obama is a Muslim (2010). When a nonwhite researcher was administering their study, research subjects were amenable to changing their minds about the president’s religion and updating incorrect views. But when only white researchers were present, GOP survey subjects in particular were more likely to believe the Obama Muslim myth than before. The subjects were using “social desirability” to tailor their beliefs (or stated beliefs, anyway) to whoever was listening.

Which leads us to the media. When people grow polarized over a body of evidence, or a resolvable matter of fact, the cause may be some form of biased reasoning, but they could also be receiving skewed information to begin with—or a complicated combination of both. In the Ground Zero mosque case, for instance, a separate study (Nisbet & Garrett, 2010) showed that survey respondents who watched Fox News were more likely to believe the Rauf rumor and three related ones—and they believed them more strongly than non-Fox watchers.

Okay, so people gravitate toward information that confirms what they believe, and they select sources that deliver it. Same as it ever was, right? Maybe, but the problem is arguably growing more acute, given the way we now consume information—through the Facebook links of friends, or tweets that lack nuance or context, or “narrowcast” and often highly ideological media that have relatively small, like-minded audiences. Those basic human survival skills of ours, says Michigan’s Arthur Lupia, are “not well-adapted to our information age” (personal communication).

If you wanted to show how and why fact is ditched in favor of motivated reasoning, you could find no better test case than climate change. After all, it’s an issue where you have highly technical information on one hand and very strong beliefs on the other. And sure enough, one key predictor of whether you accept the science of global warming is whether you’re a Republican or a Democrat. The two groups have been growing more divided in their views about the topic, even as the science becomes more unequivocal.


So perhaps it should come as no surprise that more education doesn’t budge Republican views. On the contrary: In a 2008 Pew survey, for instance, only 19 percent of college-educated Republicans agreed that the planet is warming due to human actions, versus 31 percent of noncollege educated Republicans. In other words, a higher education correlated with an increased likelihood of denying the science on the issue. Meanwhile, among Democrats and independents, more education correlated with greater acceptance of the science.

Other studies have shown a similar effect: Republicans who think they understand the global warming issue best are least concerned about it; and among Republicans and those with higher levels of distrust of science in general, learning more about the issue doesn’t increase one’s concern about it. What’s going on here? Well, according to Charles Taber and Milton Lodge of Stony Brook, one insidious aspect of motivated reasoning is that political sophisticates are prone to be more biased than those who know less about the issues. “People who have a dislike of some policy—for example, abortion—if they’re unsophisticated they can just reject it out of hand,” says Lodge. “But if they’re sophisticated, they can go one step further and start coming up with counterarguments” (personal communication, April 12, 2011). These individuals are just as emotionally driven and biased as the rest of us, but they’re able to generate more and better reasons to explain why they’re right—and so their minds become harder to change.

That may be why the selectively quoted emails of Climategate were so quickly and easily seized upon by partisans as evidence of scandal. Cherry-picking is precisely the sort of behavior you would expect motivated reasoners to engage in to bolster their views—and whatever you may think about Climategate, the emails were a rich trove of new information upon which to impose one’s ideology.

Climategate had a substantial impact on public opinion, according to Anthony Leiserowitz, director of the Yale Project on Climate Change Communication. It contributed to an overall drop in public concern about climate change and a significant loss of trust in scientists (personal communication, April 5, 2011). But—as we should expect by now—these declines were concentrated among particular groups of Americans: Republicans, conservatives, and those with “individualistic” values. Liberals and those with “egalitarian” values didn’t lose much trust in climate science or scientists at all. “In some ways, Climategate was like a Rorschach test,” Leiserowitz says, “with different groups interpreting ambiguous facts in very different ways.”

So is there a case study of science denial that largely occupies the political left? Yes: the claim that childhood vaccines are causing an epidemic of autism. Its most famous proponents are an environmentalist (Robert F. Kennedy Jr., 2009) and numerous Hollywood celebrities (most notably Jenny McCarthy [2011] and Jim Carrey). The Huffington Post gives a very large megaphone to denialists. And Seth Mnookin, author of the new book The Panic Virus (2011), notes that if you want to find vaccine deniers, all you need to do is go hang out at Whole Foods.


Vaccine denial has all the hallmarks of a belief system that’s not amenable to refutation. Over the past decade, the assertion that childhood vaccines are driving autism rates has been undermined by multiple epidemiological studies (see Mooney, 2009). It has been undermined as well by the simple fact that autism rates continue to rise, even though the alleged offending agent in vaccines (a mercury-based preservative called thimerosal) has long since been removed.

Yet the true believers persist—critiquing each new study that challenges their views, and even rallying to the defense of vaccine-autism researcher Andrew Wakefield, after his 1998 Lancet paper—which originated the current vaccine scare—was retracted and he subsequently lost his license (General Medical Council, 2010, p. 9) to practice medicine. But then, why should we be surprised? Vaccine deniers created their own partisan media, such as the website Age of Autism, that instantly blast out critiques and counterarguments whenever any new development casts further doubt on anti-vaccine views.

It all raises the question: Do left and right differ in any meaningful way when it comes to biases in processing information, or are we all equally susceptible?

There are some clear differences. Science denial today is considerably more prominent on the political right—once you survey climate and related environmental issues, anti-evolutionism, attacks on reproductive health science by the Christian right, and stem-cell and biomedical matters. More tellingly, anti-vaccine positions are virtually nonexistent among Democratic officeholders today—whereas anti-climate-science views are becoming monolithic among Republican elected officials.

Some researchers have suggested that there are psychological differences between the left and the right that might impact responses to new information—that conservatives are more rigid and authoritarian, and liberals more tolerant of ambiguity. Psychologist John Jost of New York University has further argued that conservatives are “system justifiers”: They engage in motivated reasoning to defend the status quo.


This is a contested area, however, because as soon as one tries to psychoanalyze inherent political differences, a battery of counterarguments emerges: What about dogmatic and militant communists? What about how the parties have differed through history? After all, the most canonical case of ideologically driven science denial is probably the rejection of genetics in the Soviet Union, where researchers disagreeing with the anti-Mendelian scientist (and Stalin stooge) Trofim Lysenko were executed, and genetics itself was denounced as a “bourgeois” science and officially banned.

The upshot: All we can currently bank on is the fact that we all have blinders in some situations. The question then becomes: What can be done to counteract human nature itself?

Given the power of our prior beliefs to skew how we respond to new information, one thing is becoming clear: If you want someone to accept new evidence, make sure to present it to them in a context that doesn’t trigger a defensive, emotional reaction.

This theory is gaining traction in part because of Kahan’s work at Yale. In one study (Kahan, Braman, Slovic, Gastil, & Cohen, 2007), he and his colleagues packaged the basic science of climate change into fake newspaper articles bearing two very different headlines—“Scientific Panel Recommends Anti-Pollution Solution to Global Warming” and “Scientific Panel Recommends Nuclear Solution to Global Warming” (p. 5)—and then tested how citizens with different values responded. Sure enough, the latter framing made hierarchical individualists much more open to accepting the fact that humans are causing global warming. Kahan and his colleagues infer that the effect occurred because the science had been written into an alternative narrative that appealed to their pro-industry worldview.

You can follow the logic to its conclusion: Conservatives are more likely to embrace climate science if it comes to them via a business or religious leader, who can set the issue in the context of different values than those from which environmentalists or scientists often argue. Doing so is, effectively, to signal a détente in what Kahan has called a “culture war of fact” (Kahan et al., 2007). In other words, paradoxically, you don’t lead with the facts in order to convince. You lead with the values—so as to give the facts a fighting chance.


Cone, J. (Interviewer), & Kahan, D. (Interviewee). (2010, June 10). Cultural Cognition Project, Part 2. [Interview transcript]. Retrieved from…

Damasio, A. R. (1994, October). Descartes’ error and the future of human life. Scientific American, 271(4), 144.

Festinger, L., Riecken, H. W., & Schacter, S. (1956). When prophecy fails: A social and psychological study of a modern group that predicted the destruction of the world. Minneapolis: University of Minnesota Press.

General Medical Council. (2010, May 24). Determination on serious professional misconduct (SPM) and sanction [Letter to Jeremy Wakefield]. Retrieved from… (Paine 768-775)

Johnson-Sheehan, Richard, and Charles Paine. Writing Today, 2nd Edition. Pearson Learning Solutions, 01/2012. VitalBook file.

