NIH Public Access: Author Manuscript

Cognition. Author manuscript; available in PMC 2009 June 1.
Published in final edited form as:
Cognition. 2008 June; 107(3): 1144–1154.

Cognitive Load Selectively Interferes with Utilitarian Moral Judgment

Joshua D. Greene1, Sylvia A. Morelli2, Kelly Lowenberg3, Leigh E. Nystrom4, and Jonathan D. Cohen4
1Department of Psychology, Harvard University
2Department of Psychology, Stanford University
3Stanford Law School
4Department of Psychology, Center for the Study of Brain, Mind, and Behavior, Princeton University

Correspondence to: Joshua D. Greene, Department of Psychology, 33 Kirkland St., Cambridge, MA 02138, jdgreene@wjh.harvard.edu, (617) 495-3898.

Publisher's Disclaimer: This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final citable form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.

Abstract

Traditional theories of moral development emphasize the role of controlled cognition in mature moral
judgment, while a more recent trend emphasizes intuitive and emotional processes. Here we test a
dual-process theory synthesizing these perspectives. More specifically, our theory associates
utilitarian moral judgment (approving of harmful actions that maximize good consequences) with
controlled cognitive processes and associates non-utilitarian moral judgment with automatic
emotional responses. Consistent with this theory, we find that a cognitive load manipulation
selectively interferes with utilitarian judgment. This interference effect provides direct evidence for
the influence of controlled cognitive processes in moral judgment, and utilitarian moral judgment
more specifically.

Keywords
moral judgment; morality; utilitarian; cognitive control

1. Introduction
Traditional theories of moral development emphasize the role of controlled cognition in mature
moral judgment (Kohlberg, 1969; Turiel, 1983), while a more recent trend emphasizes the role
of intuitive or automatic emotional processes (Blair, 1995; Haidt, 2001; Mikhail, 2000;
Nichols, 2002, 2004; Pizarro & Salovey, 2002; Rozin, Lowery, Imada, & Haidt, 1999; Van
den Bos, 2003). Our previous work (Greene, Sommerville, Nystrom, Darley, & Cohen,
2001; Greene, Nystrom, Engell, Darley, & Cohen, 2004) suggests a synthesis of these two
perspectives in the form of a “dual-process” theory (Chaiken & Trope, 1999; Kahneman,
2003; Lieberman, Gaunt, Gilbert, & Trope, 2002; Posner & Snyder, 1975) according to which
both automatic emotional responses and more controlled cognitive responses play crucial and,
in some cases, mutually competitive roles. More specifically, we have argued that utilitarian
moral judgments are driven by controlled cognitive processes while non-utilitarian
(characteristically deontological) judgments are driven by automatic emotional responses
(Greene, in press).1 Although non-utilitarian judgments do not typically involve the application
of stereotypes, we propose that their dynamics may be similar to those observed in the
application of stereotypes, with utilitarian judgments requiring additional cognitive resources
(Devine, 1989; Gilbert & Hixon, 1991; Wegener & Petty, 1997) and with individuals
varying in their response to cognitive demands depending on their affinities for (non-)utilitarian
judgment (Devine, 1989; Cunningham et al., 2004).

Utilitarian (or, more broadly, consequentialist) judgments are aimed at maximizing benefits
and minimizing costs across affected individuals (Mill, 1861/1998). The utilitarian perspective
contrasts with the deontological perspective (Kant, 1785/1959), according to which rights and
duties often trump utilitarian considerations.2 The tension between these two perspectives is
nicely captured by the well-known footbridge dilemma (Thomson, 1986), in which a runaway
trolley is about to run over and kill five people. One can save them by pushing a different
person off of a footbridge and into the trolley’s path, stopping the trolley but killing the person
pushed. A prototypical utilitarian would (if all else is equal) favor performing this action in the
name of the greater good, while a prototypical deontologist would regard this as an
unacceptable violation of rights, duties, etc.3 With respect to this case, our dual-process theory
specifies that automatic emotional responses incline people to disapprove of pushing the man
off of the footbridge, while controlled cognitive processes incline people to approve of this
action.

The evidence in support of this theory is compelling but limited. Previous work has
demonstrated that “personal” moral dilemmas4 like the footbridge dilemma, as compared to
similar “impersonal” moral dilemmas, elicit increased activity in brain regions associated with
emotion and social cognition (Greene et al., 2001, 2004). These data, however, are correlational
and do not demonstrate a causal relationship between emotional responses and moral
judgments. Three more recent studies, however, provide evidence for such a causal
relationship. Mendez, Anderson, & Shapira (2005) found that patients with frontotemporal
dementia, who are known for their “emotional blunting,” were disproportionately likely to
approve of the action in the footbridge dilemma (the utilitarian response). Koenigs et al.
(2007) generated similar results testing patients with emotional deficits due to ventromedial
prefrontal lesions. Finally, Valdesolo & DeSteno (2006) found that normal participants were
more likely to approve of the action in the footbridge dilemma following positive emotion
induction, a manipulation aimed at counteracting negative emotional responses. Together,
these three experiments provide strong evidence for our claim that non-utilitarian judgments
in cases such as these are driven by emotional responses. These experiments do not, however,
demonstrate the involvement of opposing cognitive control processes. As Haidt’s (2001) Social
Intuitionist Model might suggest, these could be cases in which two equally automatic and
emotional processes are competing, with one process compromised by brain damage or induced
countervailing emotion.

1We emphasize that this is an empirical hypothesis concerning a general trend rather than a conceptual claim. For a discussion of likely
exceptions see Greene (in press).
2The utilitarian perspective also contrasts with the Aristotelian virtue-based tradition, which we discuss elsewhere (Greene, in press).
3Deontological judgments in our sense need not be driven by the conscious application of deontological principles. See Cushman et al.
(2006) and Greene (in press).
4The present experiment focuses exclusively on “personal” moral dilemmas, and “high-conflict” personal dilemmas (Koenigs et al.,
2007) more specifically. These are the dilemmas that, according to our theory, involve a tension between automatic emotional processes
and controlled cognitive processes. Thus, we would not expect to see the effects reported here in “impersonal” dilemmas.
In our first study (Greene et al., 2001) we distinguished between “personal” and “impersonal” moral dilemmas/violations using three
criteria. “Personal” moral dilemmas/violations are those involving (a) serious bodily harm (b) to one or more particular individuals, where
(c) this harm is not the result of deflecting an existing threat. The latter criterion is aimed at capturing a sense of “moral agency.” Recent
work suggests that this criterion requires revision (Greene et al., submitted).


Previous reaction time (RT) data (Greene et al., 2001) suggest that controlled cognitive
processes drive utilitarian judgments, but these data are inconclusive.5 Alternative evidence
comes from a subsequent neuroimaging study (Greene et al., 2004) in which brain regions
associated with cognitive control exhibited increased activity preceding utilitarian moral
judgments, made in response to difficult personal moral dilemmas. Nevertheless, as before,
these data are correlational and thus insufficient to establish a firm causal relationship between
cognitive control processes and utilitarian moral judgment. Several recent studies suggest a
role for controlled cognitive processes in moral judgment (Pizarro, Uhlmann, & Bloom,
2003; Cushman, Young, & Hauser, 2006; Valdesolo & DeSteno, in press), but none establish
a causal relationship between controlled cognitive processes and utilitarian moral judgment.
The primary aim of the present study is to do this.

2. Experiment
We presented participants with “high-conflict” (Koenigs et al., 2007) personal moral dilemmas
(Greene et al., 2001, 2004) in which one can kill one person in order to save several others.
These included the footbridge dilemma, as well as other more difficult dilemmas in which the
non-utilitarian option involves the death of all concerned. For example, in the “crying baby”
dilemma one must decide whether to smother one’s own baby in order to prevent enemy
soldiers from finding and killing oneself, one’s baby, and several others. Participants responded
under cognitive load (a concurrent digit-search task) and in a control condition. According to
our theory, utilitarian moral judgments (favoring the sacrifice of one life to save several others)
are supported by cognitive control processes, and therefore we predicted that increasing
cognitive load by imposing another control-demanding task would interfere with utilitarian
moral judgments, yielding increased RT and/or decreased frequency for utilitarian moral
judgment. Crucially, our theory predicts that RT increases will be selective for utilitarian
judgments, with no comparable increase in RT for non-utilitarian judgments.

3. Method
3.1 Participants and procedure
Eighty-two undergraduates (52 females, 30 males) participated in return for course credit with
the approval of Princeton University's IRB. After giving signed informed consent, participants
responded to forty “personal” and “impersonal” moral dilemmas (Greene et al., 2001, 2004)
presented on a computer, including twelve personal dilemmas designated as “high-conflict”
by Koenigs et al. (2007). The crying baby dilemma is an example of a high-conflict dilemma:
Enemy soldiers have taken over your village. They have orders to kill all remaining
civilians. You and some of your townspeople have sought refuge in the cellar of a large
house. Outside you hear the voices of soldiers who have come to search the house for
valuables.
Your baby begins to cry loudly. You cover his mouth to block the sound. If you remove
your hand from his mouth his crying will summon the attention of the soldiers who will
kill you, your child, and the others hiding out in the cellar. To save yourself and the others
you must smother your child to death.

Is it appropriate for you to smother your child in order to save yourself and the other
townspeople?

5The influence of cognitive control is suggested by increased RT for judgments in favor of (as opposed to against) personal moral
violations (e.g. pushing the man off of the footbridge), consistent with the extra time needed for cognitive processes to compete with a
countervailing emotional response (akin to the competition between color naming and word reading in the Stroop task). (No comparable
effect was found for impersonal moral violations.) However, many of the dilemmas contributing to this effect have no clear utilitarian
solution or are cases in which utilitarian considerations count against the action in question (e.g. killing someone because you don’t like
him). A closer examination of the subset of cases in which utilitarian and non-utilitarian considerations clearly compete revealed no
reliable differences in RT between utilitarian and non-utilitarian judgments, providing further motivation for the present study. (Thanks
to Liane Young on this point.)

In all of the high-conflict dilemmas, the agent must decide whether to harm one person in order
to save the lives of several people. Within this constraint, the structure of these dilemmas varies.
Notably, the high-conflict dilemmas vary in terms of whether the potential victim’s death is
inevitable and whether the agent is among those who will be saved by the action. Only high-
conflict dilemmas are suitable for examining the conflict between utilitarian and non-utilitarian
judgment processes. However, because these dilemmas share a common structure, we
diminished repetition by presenting them along with the remaining dilemmas in our standard
battery. (Testing materials available online at [insert url].) We note that in each of the high-
conflict dilemmas, the utilitarian response is also the affirmative (“Yes”) response. However,
an examination of results from the “impersonal” dilemmas (See online supplementary
materials) indicates that there is no general effect of affirmative vs. negative responses on RT.
Dilemmas were presented as horizontally streaming text (left to right, 36-point Courier font,
approximately 16 characters per second). Participants indicated their judgments by pressing
one of two buttons. There was no time limit. Dilemmas were presented in pseudorandom order
in two blocks of twenty dilemmas each (control block and load block), subject to the constraint
that there be five personal dilemmas in each block expected to be difficult (“high conflict”)
based on previous work. Order of conditions/blocks was counter-balanced across participants.
In the load condition, adapted from Gilbert, Tafarodi, & Malone (1993), a stream of numbers
scrolled across the screen beneath the text and continued during the deliberation period.
appeared at a rate of approximately 3.5 per second. Participants were instructed to hit a button
each time they detected the number “5” (20% of digits) and were told that they would be
checked for accuracy. To counteract practice effects (observed in pilot testing), the speed of
the number stream increased to 7 numbers per second halfway through the load block.
Participants were instructed to perform the main task and the digit-search task simultaneously.
In both the load and no-load (control) conditions, participants were instructed to read aloud
and were made aware of their being recorded by a nearby microphone.
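
A minimal sketch of this digit stream, under the parameters just described, appears below. It draws targets probabilistically at a 20% rate, whereas the actual stimulus lists may have fixed the target proportion exactly; the durations in the usage line are placeholders.

```python
import random

def digit_stream(duration_s, rate_hz, p_target=0.20, target="5", seed=None):
    """Yield (onset_s, digit, is_target) tuples for the concurrent
    digit-search task. A sketch of the manipulation, not the original
    presentation code; the participant responds whenever digit == target."""
    rng = random.Random(seed)
    nontargets = [d for d in "0123456789" if d != target]
    for i in range(int(duration_s * rate_hz)):
        digit = target if rng.random() < p_target else rng.choice(nontargets)
        yield i / rate_hz, digit, digit == target

# First half of the load block at ~3.5 digits/s; second half at 7 digits/s
# to counteract practice effects. Durations here are placeholders.
stream = list(digit_stream(120, 3.5)) + list(digit_stream(120, 7.0))
```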

3.2 Analysis
Our analysis here focuses exclusively on dilemmas identified as “high-conflict” by Koenigs
et al. (2007). (See online supplementary materials for results from other dilemmas.) This set
of dilemmas is consistent with those observed to be difficult in our previous work (Greene et
al., 2004). Data were trimmed based on RT to within two SDs of the group mean. RT data were
analyzed using a mixed effects model and the restricted maximum likelihood (REML) fitting
method. This model included participant as a random effect and load and judgment as fixed
effects. Judgment data were analyzed using a likelihood ratio χ2 test for the effect of load.

4. Results
There was no main effect of load (F(1, 83.2) = 2.29, p = .13). There was a marginally significant
main effect of judgment (F(1, 71.7) = 3.9, p = .052), with longer RT for utilitarian judgments
(LS Means (SEM) ms: utilitarian = 6130 (207), non-utilitarian = 5736 (221)). Critically, we
observed the predicted interaction between load and judgment (F(1, 62.9) = 8.5, p = .005). (See
Figure 1.) Planned contrasts revealed the predicted increase in RT for utilitarian
judgment under load (F(1, 106.3) = 9.8, p = .002; LS Means (SEM) ms: load = 6506 (238), no
load = 5754 (241)), but no difference in RT for non-utilitarian judgment resulting from load
(F(1, 169.6) = .10, p = .75; LS Means: load = 5691 (264), no load = 5781 (261)). Utilitarian
judgments were slower than non-utilitarian judgments under load (p = .001), but there was no
such effect in the absence of load (p = .91). This general pattern also held when item, rather
than participant, was modeled as a random effect, although the results in this analysis were not
as strong. There was no effect of load on judgment (χ2(1, N = 82) = .24, p = .62), with 61%
utilitarian judgments under load (95% CI: 57%-66%) and 60% (95% CI: 55%-64%) in the
absence of load.

We conducted a follow-up analysis to explore the possibility that patterns of RT vary
systematically among participants based on their tendency to make utilitarian vs. non-utilitarian
judgments. We ranked participants based on the percentage of utilitarian judgments made in
response to high-conflict dilemmas and divided participants into equal high-utilitarian and low-
utilitarian groups based on these rankings. The high-utilitarian group averaged 80% utilitarian
judgments, the low-utilitarian group 42%. Both groups exhibited the predicted interaction
between load and judgment (high-utilitarian: F(1, 39.8) = 3.0, p = .046, one-tailed; low-
utilitarian: F(1, 30.8) = 4.4, p = .02, one-tailed). More specifically, both groups exhibited
increased RT for utilitarian judgment under load (high-utilitarian: F(1, 43.3) = 6.0, p = .01,
one-tailed, LS Means (SEM) ms: load = 6247 (339), no load = 5371 (340); low-utilitarian:
F(1, 75.8) = 3.3, p = .04, one-tailed, LS Means (SEM) ms: load = 6841 (331), no load = 6258
(337)), and neither group exhibited an effect of load on non-utilitarian judgment (p > .7). (See
Figure 2.) The high-utilitarian group was generally faster than the low-utilitarian group to make
utilitarian judgments (F(1, 107.3) = 3.5, p = .06), but RT did not differ significantly between
groups for non-utilitarian judgments (p = .38). Load did not have a significant effect on
judgment in either group (p > .6). Low-utilitarian participants made 43% utilitarian judgments
under load (95% CI: 37%-50%) and 41% utilitarian judgments in the absence of load (95%
CI: 35%-48%). High-utilitarian participants made 79% utilitarian judgments under load
(95% CI: 73%-84%) and 78% utilitarian judgments in the absence of load (95% CI: 72%-83%).
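
The median-split procedure described at the start of this analysis can be sketched as follows; the column names and "util" coding are assumptions carried over from the analysis sketch above.

```python
import pandas as pd

def split_by_utilitarian_tendency(trials: pd.DataFrame):
    """Sketch of the median split: rank participants by their percentage of
    utilitarian judgments on high-conflict dilemmas, then divide them into
    equal low- and high-utilitarian halves."""
    pct_util = (trials.groupby("participant")["judgment"]
                      .apply(lambda j: (j == "util").mean())
                      .sort_values())
    half = len(pct_util) // 2
    low_group = set(pct_util.index[:half])   # reported to average ~42% utilitarian
    high_group = set(pct_util.index[half:])  # reported to average ~80% utilitarian
    return high_group, low_group
```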

5. Discussion
Cognitive load selectively increased RT for utilitarian judgment, yielding the predicted
interaction between load and judgment type. In the full sample, load increased the average RT
for utilitarian judgments by three quarters of a second, but did not increase average RT for non-
utilitarian judgments at all. The predicted RT effects were observed in participants who tend
toward utilitarian judgment as well as those who do not. These results provide direct evidence for
the hypothesized asymmetry between utilitarian and non-utilitarian judgments, with the former
driven by controlled cognitive processes and the latter driven by more automatic processes.
While load impacted RT, it did not reduce the proportion of utilitarian judgments, as one might
have expected based on our theory. We will return to this observation below.

These RT data have broader significance because the evidence implicating controlled cognitive
processes in moral judgment has been limited. Haidt’s (2001) Social Intuitionist Model allows
that some moral judgments may be driven by controlled cognitive processes, but this aspect of
the model is not supported by positive evidence. As noted earlier, our prior RT data (Greene
et al., 2001) are inconclusive and our prior neuroimaging data (Greene et al., 2004) are
correlational. Pizarro et al. (2003) altered participants’ judgments of moral responsibility by
instructing them to make either “rational, objective” judgments or “intuitive” ones. These
results implicate controlled processes, but, as the authors note, the use of explicit participant
instructions may artificially induce participants to engage controlled processes and to rely on
naïve theories concerning which judgments are more “rational” than others. Cushman et al.’s
(2006) results suggest that people may consciously deploy some moral principles in making
moral judgments, but conscious reasoning is not conclusively implicated. A recent study by
Valdesolo & DeSteno (in press) used a cognitive load paradigm to demonstrate that controlled
cognitive processes are involved in rationalizing one’s own unfair behavior. Here, controlled
cognitive processes are clearly implicated in people’s moral judgments, but these judgments
are, in a sense, post-hoc (Haidt, 2001) since these participants are evaluating actions
immediately after having chosen to perform them. Thus, the present data may provide the
strongest evidence yet that controlled cognitive processes play a causal role in ex ante moral
judgment.

As noted above, the cognitive load manipulation did not reduce the proportion of utilitarian
judgments. One explanation for this is that participants were keenly aware of the interference
created by the load manipulation and were determined to push through it. Like motorists facing
highway construction, they may have been delayed, but not ultimately prevented from reaching
their destinations. If this is the case, then other manipulations (e.g. transcranial magnetic
stimulation applied to the dorsolateral prefrontal cortex) may be more successful in altering
judgment. We leave this for future research, as our primary concern here is with the hypothesis
that controlled cognitive processes play a special role in utilitarian judgments, as demonstrated
by the RT data.

In light of this hypothesis, one might expect utilitarian judgments to be slower than non-
utilitarian judgments in the absence of load. This effect was not observed in our sample as a
whole, but was observed in low-utilitarian participants. (Figure 2, right.) Why didn’t the high-
utilitarian participants exhibit this effect? One possibility is that there are competing effects at
work in these participants. On the one hand, making a counter-intuitive judgment requires
additional cognitive resources, implying increased RT (as seen in the low-utilitarian
participants). On the other hand, high-utilitarian participants exhibit a general bias toward
utilitarian judgment, which appears to involve decreased RT for utilitarian judgment. In the
absence of load, the latter effect may dominate. Consistent with this idea, we found a robust
correlation (r = -.43, p < .0001) between a participant’s tendency toward utilitarianism (i.e.
percent utilitarian judgments made) and that participant’s average RT for utilitarian judgments
in the absence of load.6 Interestingly, we found that utilitarian tendency bore no relationship
to RT for utilitarian judgments under load (r = .08, p = .47) and no relationship to RT for non-
utilitarian judgments (r = -.16 (load), r = .04 (no load), p > .1). This suggests that there is an
additional process that drives down RT in high-utilitarians in the absence of load, although this
process still remains susceptible to cognitive interference. To account for this process will
require a significant expansion and/or modification of our dual-process theory. One possibility
is that utilitarian normative principles are more consciously accessible than competing
deontological principles (Cushman et al., 2006), and that they are therefore more easily
routinized into a decision procedure. This hypothesis may be tested via an experiment in which
one “evens the playing field” by making a competing deontological principle (e.g. “It’s wrong
to harm someone as a means to an end”) more accessible.
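
The correlation reported above can be sketched as follows, incorporating the per-participant z-transform noted in footnote 6; the column names and coding are illustrative assumptions consistent with the earlier sketches.

```python
import pandas as pd
from scipy.stats import pearsonr

def tendency_rt_correlation(trials: pd.DataFrame):
    """Sketch of the reported correlation: each participant's utilitarian
    tendency vs. his or her mean RT for utilitarian judgments in the absence
    of load, with RTs z-transformed separately for each participant."""
    trials = trials.copy()
    trials["z_rt"] = (trials.groupby("participant")["rt"]
                            .transform(lambda x: (x - x.mean()) / x.std()))
    tendency = trials.groupby("participant")["judgment"].apply(
        lambda j: (j == "util").mean())
    util_no_load = trials[(trials["judgment"] == "util") & (trials["load"] == 0)]
    mean_z = util_no_load.groupby("participant")["z_rt"].mean()
    shared = tendency.index.intersection(mean_z.index)
    return pearsonr(tendency[shared], mean_z[shared])  # reported: r = -.43
```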

Several other issues deserve attention: First, the present results do not address the appraisal
mechanisms that govern the emotional responses that, according to our theory, support non-
utilitarian judgments. These mechanisms may be sensitive to familiar moral distinctions, such
as the distinction between harmful actions and omissions (Haidt and Baron, 1996; Cushman
et al., 2006; Schaich Borg et al., 2006) and the distinction between harms that are intended and
those that are merely foreseen (Aquinas, n.d./2006; Mikhail, 2000; Schaich Borg et al.,
2006; Cushman et al., 2006). Other distinctions may be operative here as well (Royzman and
Baron, 2002; Waldmann and Dieterich, 2007; Greene et al., submitted). For present purposes
we are agnostic as to which non-utilitarian principles are operative in these judgments. We are
likewise agnostic as to whether these principles are suitable normative moral rules (Nichols &
Mallon, 2006). We note that neither our dual-process theory nor the present results implies that
the human brain houses systems specifically dedicated to utilitarian and deontological
judgment. On the contrary, we have argued that, at least in the case of utilitarian judgment, the
relevant cognitive systems are somewhat domain-general (Cohen, 2005). Finally, while the
present results, bolstered by previous neuroimaging data (Greene et al., 2004), indicate that
controlled cognitive processes play a special role in utilitarian judgments, these results leave
open many further details concerning the nature (e.g. reasoning vs. inhibitory control),
sequencing (e.g. parallel vs. serial), or timing of these processes. These issues remain to be
explored in future research.

6Data were z-transformed separately for each participant.

Supplementary Material
Refer to Web version on PubMed Central for supplementary material.

Acknowledgements
We thank Jonathan Haidt, whose comments and suggestions prompted this research. We also thank Andrew Conway,
two anonymous reviewers, and the NIH (MH067410, award to JDG).

References
Aquinas, T. Summa Theologiae. Cambridge University Press; n.d./2006.
Blair RJ. A cognitive developmental approach to morality: investigating the psychopath. Cognition
1995;57(1):1–29. [PubMed: 7587017]
Chaiken, S.; Trope, Y., editors. Dual-Process Theories in Social Psychology. New York: Guilford Press;
1999.
Cohen JD. The vulcanization of the human brain: A neural perspective on interactions between cognition
and emotion. Journal of Economic Perspectives 2005;19:3–24.


Cunningham WA, Johnson MK, Raye CL, Gatenby JC, Gore JC, Banaji MR. Separable neural
components in the processing of black and white faces. Psychol Sci 2004;15(12):806–813. [PubMed:
15563325]
Cushman F, Young L, Hauser M. The role of conscious reasoning and intuition in moral judgment: testing
three principles of harm. Psychol Sci 2006;17(12):1082–1089. [PubMed: 17201791]
Devine PG. Stereotypes and prejudice: Their automatic and controlled components. Journal of Personality
and Social Psychology 1989;56:5–18.
Gilbert DT, Hixon JG. The trouble with thinking: Activation and application of stereotypic beliefs.
Journal of Personality and Social Psychology 1991;60(4):309–317.
Gilbert DT, Tafarodi RW, Malone PS. You can’t not believe everything you read. J Pers Soc Psychol
1993;65(2):221–233. [PubMed: 8366418]
Greene, JD. The Secret Joke of Kant’s Soul. In: Sinnott-Armstrong, W., editor. Moral Psychology:
Morality in the Brain. 3. Cambridge, MA: MIT Press; in press
Greene JD, Nystrom LE, Engell AD, Darley JM, Cohen JD. The neural bases of cognitive conflict and
control in moral judgment. Neuron 2004;44(2):389–400. [PubMed: 15473975]
Greene J, Lindsell D, Clarke A, Lowenberg K, Nystrom L, Cohen J. Pushing moral buttons: The
interaction between personal force and intention in moral judgment. submitted
Greene JD, Sommerville RB, Nystrom LE, Darley JM, Cohen JD. An fMRI investigation of emotional
engagement in moral judgment. Science 2001;293(5537):2105–2108. [PubMed: 11557895]
Haidt J. The emotional dog and its rational tail: A social intuitionist approach to moral judgment.
Psychological Review 2001;108:814–834. [PubMed: 11699120]
Haidt J, Baron J. Social roles and the moral judgment of acts and omissions. European Journal of Social
Psychology 1996;26:201–218.
Kahneman D. A perspective on judgment and choice: mapping bounded rationality. Am Psychol 2003;58
(9):697–720. [PubMed: 14584987]
Kant, I. Foundations of the metaphysics of morals. Beck, LW., translator. Indianapolis: Bobbs-Merrill;
1785/1959.
Koenigs M, Young L, Adolphs R, Tranel D, Cushman F, Hauser M, et al. Damage to the prefrontal cortex
increases utilitarian moral judgments. Nature 2007;446(7138):908–911. [PubMed: 17377536]


Knobe J. Theory of mind and moral cognition: exploring the connections. Trends Cogn Sci 2005;9(8):
357–359. [PubMed: 16006176]
Kohlberg, L. Stage and sequence: The cognitive-developmental approach to socialization. In: Goslin,
DA., editor. Handbook of socialization theory and research. Chicago: Rand McNally; 1969. p.
347-480.
Lieberman MD, Gaunt R, Gilbert DT, Trope Y. Reflection and reflexion: A social cognitive neuroscience
approach to attributional inference. Advances in Experimental Social Psychology 2002;34:199–249.
Mendez MF, Anderson E, Shapira JS. An investigation of moral judgement in frontotemporal dementia.
Cogn Behav Neurol 2005;18(4):193–197. [PubMed: 16340391]
Mikhail, J. Rawls’ Linguistic Analogy: A Study of the “Generative Grammar” Model of Moral Theory
Described by John Rawls in A Theory of Justice. Cornell University; 2000. Unpublished doctoral
dissertation
Mill, JS. Utilitarianism. Crisp, R., editor. New York: Oxford University Press; 1861/1998.
Nichols S. Norms with feeling: towards a psychological account of moral judgment. Cognition 2002;84
(2):221–236. [PubMed: 12175573]
Nichols, S. Sentimental Rules: On the Natural Foundations of Moral Judgment. New York: Oxford
University Press; 2004.
Nichols S, Mallon R. Moral dilemmas and moral rules. Cognition 2006;100(3):530–542. [PubMed:
16157325]
Pizarro, DA.; Salovey, P. On being and becoming a good person: the role of emotional intelligence in
moral development and behavior. In: Aronson, J., editor. Improving academic achievement: Impact
of psychological factors on education. San Diego: Academic Press; 2002. p. 247-266.
Pizarro D, Uhlmann E, Bloom P. Causal deviance and the attribution of moral responsibility. Journal of
Experimental Social Psychology 2003;39:653–660.
Posner, MI.; Snyder, CRR. Attention and cognitive control. In: Solso, RL., editor. Information processing
and cognition. Hillsdale, NJ: Erlbaum; 1975. p. 55-85.
Royzman EB, Baron J. The preference for indirect harm. Social Justice Research 2002;15:165–184.
Rozin P, Lowery L, Imada S, Haidt J. The CAD triad hypothesis: a mapping between three moral emotions
(contempt, anger, disgust) and three moral codes (community, autonomy, divinity). J Pers Soc
Psychol 1999;76(4):574–586. [PubMed: 10234846]
Thomson, JJ. Rights, restitution, and risk: Essays in moral theory. Cambridge, Mass: Harvard University
Press; 1986.
Turiel, E. The development of social knowledge: Morality and convention. Cambridge, England:
Cambridge University Press; 1983.
Valdesolo P, DeSteno D. Manipulations of emotional context shape moral judgment. Psychol Sci 2006;17
(6):476–477. [PubMed: 16771796]
Valdesolo P, DeSteno D. Moral hypocrisy: the flexibility of virtue. Psychological Science. in press
Van den Bos K. On the subjective quality of social justice: the role of affect as information in the
psychology of justice judgments. Journal of Personality and Social Psychology 2003;85:482–498.
[PubMed: 14498784]
Waldmann MR, Dieterich JH. Throwing a bomb on a person versus throwing a person on a bomb:
intervention myopia in moral intuitions. Psychol Sci 2007;18(3):247–253. [PubMed: 17444922]
Wegener, D.; Petty, R. The flexible correction model: The role of naive theories of bias in bias correction.
In: Zanna, M., editor. Advances in Experimental Social Psychology. Mahwah, NJ: Erlbaum; 1997.


Figure 1.
The effect of cognitive load on RT for utilitarian (black) and non-utilitarian (gray) moral
judgment. Data shown for the entire group (n = 82). Error bars indicate standard error of the
mean.

Figure 2.
Effects of load on RT for high-utilitarian (n = 41) and low-utilitarian (n = 41) groups.