Embracing the Complexity of Moral Action

A man was walking from Jerusalem down to Jericho when he fell among thieves, who beat him and left him by the side of the road, naked and to all appearances dead. A priest was also passing that way and saw the man, but walked to the other side of the road to avoid him. Likewise, a Levite (a religious assistant at the Temple) passed by, moving to the other side of the road. But a Samaritan (a member of a despised group) was moved with compassion for him, bandaged his wounds, took him to an inn, and paid for his care (Luke 10:25-37).

This well-known story presents us with a time-honored image of doing good for others. Those who volunteer are often called Good Samaritans in praise; hospitals, charitable societies, and volunteer organizations use the name to indicate their approach to helping. But considered from the perspective of Lawrence Kohlberg’s moral psychology, the Samaritan’s action does not count as “moral,” because the Samaritan does not make a conscious choice “preceded by a judgment of right or wrong”.1 Instead, as the story reads, he is moved by compassion. This means he is swayed by his emotion2 and is therefore acting from influences that Kohlberg, following Kant, called heteronomous, since they are not based in the autonomy of reasoned action alone. Thus, though the Samaritan binds the victim’s wounds and brings him to the inn, he is not, from this perspective, acting morally. This odd conclusion is based in a commonsense idea about what it means to be moral: that ethics cannot just be doing the right thing but must also be doing it for the right reason. Kohlberg, in an effort to rescue the psychology of morality from relativism, moved away from the deterministic approaches that were broadly accepted in his day.
Instead, taking his sources from Kant and other philosophers, he emphasized the importance of conscious moral judgment and decision making: “An action … is neither good nor bad unless it has been preceded by a judgment of right or wrong [by the actor]”.3 In later work he would call this the assumption of phenomenalism, opening up, but at the same time restricting, the domain of morality to conscious, explicit moral reasoning.4 In doing so, Kohlberg also posited a developmental hierarchy of moral reasons, with reasons associated with self-enhancement or social influence counting as less developed than reasons of concern for justice for others.5 These assumptions were widely shared in early moral psychology.6 Though the Kohlbergian approach to morality acts as a helpful corrective to earlier mechanistic approaches, its insistence on the sole legitimacy of autonomous cognitive reasoning disallows other sociological and psychological aspects of the ethical situation, such as personality, non-conscious influence, emotion, and even skilled action. It treats them as at best irrelevant and at worst a biasing influence on moral action.7

But moral action like that illustrated in the story of the good Samaritan is far more complex than moral reasoning alone. Indeed, one might think moral reasoning would be more the expert domain of the other actors in the story, the priest and the Levite, who by profession are steeped in, and practiced at, processes of moral judgment. But they, confronted with the same ethical situation at the roadside, rush by. As the story goes, the Samaritan is first moved by an emotion that in turn seems to be crucial in shaping his response to the ethical situation. He appears to act immediately, without extended deliberation, and in doing so reaches across social divides to offer needed help. He may even have simply felt compelled to respond, without much moral reasoning at all, in order to “do what one has to do”.8

The rich metaphor of moral action we find in the ancient story of the good Samaritan is open to a multitude of interpretations of the actors’ motives. We use it here as a reminder that moral action cannot be reduced to one-dimensional explanations but must be understood in the complexity of the whole narrative. To try to pull a single thread of rational deliberation, or of emotional response, out of this dense weave of influences and processes would reduce the richness of this image of good acting to a thin caricature. The recent flowering of research in moral psychology is likewise transcending the narrow confines of the Kohlbergian approach and attempting to better grasp the complexity of moral action. It allows us to do justice to the richness of the ethical situation and to the variety of individual responses to it. We introduce here six aspects that capture the recent work on the complexity of moral action: moral identity, moral reasoning, skill and habits, personality and character, moral emotion, and Bildung.9 We give a suggestion of the importance of each influence on moral action and some idea of the variety of their interactions, and discuss them in the light of the heteronomy debate. We build on this framework to suggest that almost all moral action has influences that Kohlberg would count as heteronomous. The processes of moral action are so intricately interwoven that it would be rare indeed to find pure rational deliberation as a singular, isolated influence. We conclude that psychology and philosophy will need to embrace the complex, heteronomous weave of moral action in order to better understand its philosophical and psychological underpinnings.

1. Moral Identity

Even when people are capable of making reasoned moral judgments, this ability often does not result in moral action.9 Blasi catalogued the consistently small correlations10 that obtained between increases in ability at moral reasoning in Kohlberg’s sense and the taking of corresponding moral action. He labeled this disappointing state of affairs the judgment-action gap, and proposed moral identity as the motivational factor needed to bridge it. Moral identity is the possession of a particular kind of self-concept11: a self-concept with regard to issues of moral obligation, responsibility, and the good. The judgment-action gap is most problematic for those approaches, like Kohlberg’s, that make conscious moral judgment the central aspect of the model.12 These models use psychological consistency pressures as the main engine to bridge the gap.13 If one judges something to be the moral thing to do, it is often quite uncomfortable to think that one has not done it. For Kohlberg, this consistency pressure to act morally comes from holding a principle (e.g., justice) and is thus about consistency with the principle.14 Moral identity approaches are also powered by cognitive consistency, but here it is consistency with one’s moral identity and associated moral goals.15 Thus, for Kohlberg, immoral action is a betrayal of a principle, but for Blasi, it is a betrayal of the self, with presumably more motivating power.16 With the addition of moral identity as an important motivator of moral action, we have our first heteronomous aspect of moral action. If individuals are acting because of their moral identity as, e.g., a caring person, then they are being influenced by some good for the self. Walker and colleagues17 have shown that for some of the morally exemplary actors they study, there is a fusion of the sense of self with particular moral goals.
Thus in acting in service of a moral goal (an autonomous action description), they are also acting in service of the self, which is identified with the goal (a heteronomous action description).18 But the heteronomy of moral identity does not stop here. Many researchers have shown how one’s identification with groups can be a source of moral motivation.19 Hart20 argues that “If the notion of identity is to contribute to an understanding of moral functioning, then it must be a construct with deep roots in a social world”. Thus, moral identity itself is heteronomous. In research on rescuers in the Holocaust, the Oliners21 show that the largest category of these extraordinary helpers consists of those who helped because of their identification with certain groups that were committed to helping. Others helped because of moral identity commitments to abstract ethical principles, and still others helped because of the compassion they felt for individual victims. The theme again is one of the complexity of motives and influences in moral action. Here, social roles might come to the rescue of the priest and Levite in our story, with the hypothesis that they might be motivated to maintain purity by avoiding a person who is likely dead.22 This social embeddedness of moral identity is only one aspect of the multidimensional nature of the moral self. There is a voluminous literature showing that self-concept (and thus the moral identity that is a part of self-concept) can have different facets that are relevant in different domains (e.g., religion, work, tribal affiliation, obligations to guests and travelers), with often only a modest drive toward unity.23 Thus, moral identity, a central motivating influence on moral action, is shot through with heteronomy.

2. Moral Reasoning

There is now a growing literature of qualitative studies of morally exemplary individuals.24 One of the central findings across these reports is that none of the individuals studied spends much time doing reason-based evaluations of the right thing to do. For instance, none of the 24 computer scientists and engineers interviewed in Huff and Barnard25 ever mentions using the codes of ethics of their professional societies.26 This was not for lack of time — the interviews lasted from three to five hours over two days, and the individual transcripts run from 12,000 to 21,000 words.27 Nor was it for lack of familiarity — some of the interviewees had written the ethics codes for their national organization. But morally exemplary individuals do share a common use of moral reasoning: they are constantly engaged in the instrumental use of reason to help them achieve, and to argue for, their moral goals. Recent work on moral cognition can help us to understand this odd disjunction of moral reason and moral action.

2.1. Conscious and non-conscious reason

Some cognitive processing requires effort — it takes concentration and working memory. But at other times, and for other kinds of stimuli, processing is relatively effortless, non-conscious, requires little in the way of working memory, and gives us access only to the outcome rather than the process (e.g. insight, intuition, highly practiced routines). This latter kind of process has been called “system 1” processing, in partial reference to its earlier evolution. Controlled, working-memory-intensive “system 2” processing gives us access both to outcomes (e.g. decisions, judgments) and to the inputs and processes (e.g. assumptions, values, goals, evaluations, etc.). People may well have individual differences in the extent to which they use and develop one system or the other,28 but everyone uses both. In addition, by practice one can move routines from the effortful system 2 to the practiced system 1.29

Like Kohlberg,30 when we think of cognition we normally think of conscious, deliberate, guided processing — system 2 processing. Here, one has conscious access to both the processes and their outcomes. That is, we are aware of, and deliberately guide, our selection of the things we consider, how and how long we consider them, and the conclusions we reach. This deliberation takes effort, is associated with intelligence, and can be abstract and hypothetical (e.g. What if the man at the roadside were not dead? Do obligations of compassion trump obligations of purity?). System 1 processes are rapid, more hidden from our awareness, and usually proceed without our conscious guidance, giving us access only to their seemingly self-evident conclusions. For this reason, system 1 processes are often called “intuitive,” but might better be called highly practiced.

2.2. Contents of non-conscious reason

So far we have talked only about characteristics of system 1 processes (e.g. automatic, fast, non-conscious). But it is, after all, the contents of these processes that might allow us to decide whether the processes are autonomous or not. What are people (not) thinking about when engaging in system 1 and system 2 processes in the moral domain? We will mention two approaches here.31 Work by Haidt and colleagues32 suggests that a relatively small number of moral foundations underlie moral judgment and action (e.g. in recent work,33 six are listed: harm, fairness, ingroup, authority, purity, and liberty). These foundational moral evaluation schemas most often work in system 1, automatically, and give rise to simple intuitions about whether something is morally good or bad, along with a characteristic emotional response (e.g. disgust for purity violations). Thus, coming back to the image, the Levite might, for non-consciously processed reasons, automatically cross the road to avoid impurity. He would not be consciously deciding to do so, and might not even remember that he had. Another approach to the contents of moral (non)cognition is a massively cross-cultural research program on values, which concludes that cultures do in fact share underlying values, though they differ in the emphasis they place on them. Depending on how finely one slices it, the program identifies between 10 and 19 values, always arranged in a circle within a two-dimensional space: 1) openness to change vs. conservation, 2) self-transcendence vs. self-enhancement.34 Some of these values seem to fall easily into autonomous sources of motivation (e.g. universalism, benevolence) while others do not (conformity, tradition).

2.3. Naturalistic Decision Making

Many models of reason (and most published research on reason) presume a correct way to reason,35 and this benchmark allows researchers to document the ways people depart from the model, thus also achieving the goal of description. A glaring shortcoming of this approach is that the models implicitly restrict their reach to those places where the “correct” answer can be calculated, and thus ignore those domains of life in which people do important reasoning, but not of the kind the model can track. In response, many researchers are now adopting purely descriptive approaches to rationality. These approaches have variously been called naturalistic decision making,36 real-life decision structuring,37 grounded cognition,38 and autobiographical reasoning.39 Here we refer to them all as naturalistic decision making. Naturalistic decision-making research finds that perceiving situations, and matching roles or solutions to them, is more important than calculating outcomes of courses of action. For instance, chess experts match possible solutions to situations,40 and managers (one can imagine the Levite here) match situations to roles in a “logic of obligation”.41 This research also finds that naturalistic decision making is best modeled within domains, based on the kinds of decisions made in those domains, rather than with generic, one-size-fits-all reasoning models like those of Kohlberg.42 The logic and function of autobiographical narrative is, for instance, much more complicated than that of simple historical truth-discovery.43 Narratives about the self do have a directive function that involves understanding the past and predicting the future, but they also have important functions in self-definition.44 Finally, in the spirit of naturalistic decision making, one must note the research suggesting that experts in a moral domain (e.g. science ethics, environmental ethics, business ethics, etc.)
do their reasoning using concepts that are at the middle level.45 For example, when asked to comment on cases, philosophers who are expert in computer ethics use concepts like informed consent and privacy, while novice undergraduates tend to use higher-level concepts from consequentialist frameworks.46 Thus, like moral identity, actual moral reasoning, as it occurs in the real world, is embedded in the domains and tasks where it occurs, is often automatic in its action, is consistently (for both good and bad) intermixed with emotion, and only in part resembles the independent, generic, logical form imagined by theorists of autonomy. Reason does have a significant guiding role to play (see the section on Bildung below). But most forms of reasoning that guide moral action are blended with goal seeking based on (often laudable) desires. And these desires are rarely limited to abstract goals such as general justice or respect for the moral law.

3. Skills, Habit, and Non-conscious Moral Action

Ethical education guidelines for middle school47 and college,48 and guidelines for ethics instruction in computing,49 dentistry,50 psychology,51 science,52 and many other areas, make explicit links between skill and ethical competence. In the same way that skill is central to professional ethics, it is also a centerpiece of most Aristotelian approaches to virtue ethics.53 We have already mentioned that most moral exemplars do not ponder ethical difficulties as much as they ponder how to achieve ethical goals. Much of the reason they have become exemplary is that they have invested large amounts of time and effort in pursuing a particular moral goal. This continual practice helps their actions become more skillful, often to the point where they feel effortless and unplanned, even automatic. Work on expertise54 and habit55 suggests that it is this kind of practice effect that can turn consciously guided action (playing scales, doing surgery, listening carefully) into automatic or semi-automatic routines — in the language of the previous section, moving from conscious system 2 to automatic system 1. At this point, many things that once required attention or the conscious weighing of alternatives become automatic and, under normal circumstances, can happen without further conscious guidance. This is helpful because it frees the actor to concentrate on higher-level goals or decisions. Of course, when circumstances are not normal, the most highly skilled actor can recognize the difficulty and “slow down when you should,” switching to more controlled processes to guide action.56 Thus these skills, and the decisions that underlie and compose them, are automatic but still capable of following a moral goal (e.g. good surgery, thoughtful listening). To claim that these skills are not really moral is to miss the embeddedness of real moral choices within them.
The doctor, business manager, or soldier doing triage under severe time pressure uses moral decision procedures, trades off values, chooses goals, and evaluates outcomes and reasons, though with the speed and grace of expert automaticity. To ask that this trading, choosing, and evaluating be done consciously at all times would be to risk failure of the entire enterprise. Because these procedures have become automatized, the skilled moral actor can make decisions under pressure, and even note when it is necessary to slow down and consciously evaluate things. But the morality must be trained into the system through extensive conscious practice, which includes the appropriation of moral goals or even of the highest good. The Danish thinker Søren Kierkegaard (1813-1855) revived this Aristotelian idea of ethical practice and developed a psychology of appropriation (understood as an internal process of embodying and incorporating the moral goal deeply into the self-concept) that is still waiting to be unpacked for the context of moral psychology.57

It is important to note that this sort of automatized, skilled moral action is not confined to the professions. It is part of the skills we all learn (more or less) as we mature from children to adults. As children we learn self-regulation, the process by which we manage goals and standards, selecting and ordering them and then monitoring them to make sure they are being met.58 We also learn emotion regulation: how to calm ourselves, recognize pain in others, share emotion with others, and plan for stressful times.59 And we learn moral attentiveness: recognizing situations as containing moral import, evaluating that import, and spotting stakeholders, perpetrators, and victims.60 All of these are skills that can become highly automatized, with real moral decisions being pushed below conscious awareness in the service of more effective interaction with the world and others. Much of the good work of highly skilled actors (care-givers, health professionals, engineers, rescue workers, social reformers) would be impossible without the skilled, automatized action that belongs to those professions. Some scholars suggest that the priest and Levite were professionally skilled actors, with a morality embedded in their profession that led them to prioritize purity over compassion.61 But they might simply have been in a hurry, or not have known what to do. Once we accept this embeddedness of morality in skilled behavior, it seems reasonable to say that the more general skills we must all learn (self-regulation, emotion regulation, and moral attentiveness) are also crucial to moral action. It is an open research question how much of our moral life is guided by this sort of automatic moral expertise, but to the extent that it is, its very automaticity counts against it as real morality in the eyes of autonomy theory.

4. Personality and Character

Often what we mean by personality is the continuity of an individual’s behavior across situations, a continuity that is “characteristic” of the way that individual interacts with the world — or, character. There has already been a great deal of ink spilled by psychologists and philosophers in describing the influences of cultural and situational pressures on moral action.62 Here we suggest the relation of moral action to three levels63 of personality: dispositions (general personality traits), characteristic adaptations (smaller scale attitudes and commitments), and narrative identity (our story of who we are).

4.1. Dispositions

Despite some claims to the contrary, recent work has found reliable effects of personality dispositions in areas such as helping, cooperation, criminal behavior, and espoused moral values. Thus, “despite the pessimism of earlier reviews in this area, a growing body of literature suggests the importance of individual differences in helping”.64 The Big Five is a widely accepted approach to personality traits,65 consisting of five dimensions (extraversion, agreeableness, conscientiousness, neuroticism, and openness), with each dimension being a broad-level summary of more specific traits (e.g. extraversion has the somewhat independent sub-traits of sociability and dominance). In their study of Canadians who won prizes for bravery or social service, Walker and colleagues66 found that brave prize winners were more likely to score highly on dominance, while social service prize winners scored more highly on nurturance (an aspect of Big Five agreeableness). Huff and Barnard found that, among the moral exemplars in computing they studied, the more extraverted exemplars tended to be involved in social change movements, while some of the more introverted exemplars tended to use their craft to help individuals.67 These examples with extraversion suggest the complexity of the likely relationships between dispositional personality traits and moral action. Morality is related in varying and complicated ways to each of the Big Five dispositional traits, and it might be best to say that these dispositions describe the different ways that individuals are moral: individuals at different ends of a dispositional dimension can be differently moral rather than more or less moral.68

4.2. Characteristic Adaptations

This level of personality consists of the characteristic ways that an individual adapts to his or her environment(s). To say “his or her” already suggests that there is likely more than one way of adapting, and more than one kind of environment. These aspects of personality differ from the broad trait aspects mentioned above in that they are more closely linked to particular motivations and cognitions and are more likely to change over time (through therapy, Bildung, or environmental influence). Two examples of this level are prosocial personality and cynicism. Other characteristic adaptations relevant to moral action are optimism,69 internal locus of control and efficacy,70 and generativity.71 The prosocial personality is closely linked to beliefs about social responsibility and to motivations driven by other-oriented empathy.72 It strongly predicts long-term volunteering and spontaneous helping during accidents. And though it is enduring, the nature and extent of its commitments can change over time.73 It is not surprising that someone who has a negative attitude towards others, and suspects them of dealing selfishly and dishonestly, will be unlikely to help those individuals. Cynicism has been most extensively studied in the organizational literature,74 but has recently made an appearance in work on ethics in science.75 In these and other domains, cynicism is a reaction to perceived inequity and unfairness, and the source of a disposition not to help others.

4.3. Narrative Identity

In one of the most careful studies of moral exemplars, Walker and colleagues76 did extensive interviews with people who had been nominated in Canada for national prizes for heroic rescue or for sustained social service. They also interviewed a matched sample of “normal” individuals. They coded the interviews for how the interviewees structured their life narratives.77 Prize winners did not differ very much from non-prize winners in Big Five trait characteristics or lower-level characteristic adaptations, but they did differ markedly in how they told the stories of their lives. Exemplars’ stories emphasized early secure attachment to parental figures, a lack of early enemies, and the presence of early support. Their narratives had a more positive affective tone, spoke more positively about communication, and were more likely to emphasize the needs of others. Finally, the stories they told also tended to see good coming out of bad occurrences, something McAdams calls a “redemptive” theme.78 The conclusion from this short review of personality research is that at every level of personality one can find influences that shape moral action but are somewhat separate from considered moral reasoning. We can also conclude that there is no single moral personality, and that personality characteristics and their expression dramatically shape moral behavior.79 The priest, Levite, and Samaritan can surely be thought of as differing in the way they would tell their story about the ethical situation on the road to Jericho.

5. Moral Emotion

Philosophers often make a distinction between emotion-based and reason-based influences on moral action. Almost all current approaches to moral cognition or emotion reject the necessary opposition of emotion and reason, and often reject even the distinction between them, seeing reason, values, emotion, etc. bundled together in structured ways that support action.80 Most models of cognition and of emotion see the two working in complex interaction rather than simply in opposition.81 One makes a judgment (conscious or not) of unfairness and is angry. One feels disgust at terrorist beheadings or bombings and condemns them. One sees a suffering victim and is moved with compassion. Reason can lead us to fruitful questioning of our values and of our emotional commitment to them, but our empathic response to suffering can lead us to question the reasons we give for our non-involvement. Reason, too, can serve as a biasing influence, allowing us to rationalize bad behavior, just as the priest and Levite might excuse their non-helping with obligations of purity. Bandura82 provides a model of the many different ways that reasoning can help to distance actors from the harm they cause. Thus both reason and emotion can support or undermine moral behavior. In the same way that a once monolithic reason has now been shown to be multifaceted,83 the notion that “emotion” is a single thing opposed to reason has been found to be much more complicated. Theories of emotion range from evolutionary models that identify discrete states,84 to appraisal theories85 that emphasize the structure of a limited set of emotions, to cultural models86 that see infinite variety. All these approaches recognize the complexity of emotions, yet they share common errors in the way they sometimes treat emotion.87 For instance, cognition, emotion, and motivation have often been approached as separate systems, while emotion, mood, and affect are often treated as synonyms.
Emotion is reduced to a feeling without regard to its other components, and is often treated as a single state (one is angry) instead of as a process. As an alternative, Scherer and Peper88 propose a list of components in the range of things we call emotion, including cognitive processes, physiological system regulation, motor expression, action tendencies, and associated subjective feelings. Each of these can change within any “one” emotion, like anger, and give it a different aspect or profile. They also propose a set of processes that humans (and some other animals) go through that involve or shape the emotion: initial appraisal, priority setting, action selection, behavior preparation, and behavior execution. Other processes, like self-regulation, might be added to these.

One final note to this review of the complexity of emotion: the way we respond emotionally can change over time, both as a result of circumstances89 and as a result of learning. Compassion, for instance, can be cultivated,90 as can empathy.91 One can use reason to reappraise the basis for an emotional reaction and so moderate or intensify it.92 And one can learn long-term patterns of reappraisal or of emotion regulation as part of a program of self-education or therapy.93 What qualifies an emotion as “moral” is disputed,94 but emotions that involve social evaluations are the most likely candidates. These include disgust, anger, contempt, guilt, shame, compassion, pride, awe or elevation, and gratitude.95 In the voluminous literature on each of these emotions, they are usually treated as though each were discrete. But there is clear evidence that they at least overlap in any real occurrence of an emotion.96 Thus the moral emotions are emotions, and like other emotions they have intimate links to appraisal and reason, and direct and indirect influences on action. They cannot easily be separated from reason or from reasoned evaluation. Nor can reason be easily abstracted from emotion. Emotions, and thus moral emotions, are a mixture of cognition and affect. Thus the Samaritan’s acting “in compassion” can encompass the moral emotion of compassion, a compassionate cognitive judgment that this was a person in need, and the compelling motivation to help the other.97

The complexity of emotion, and also of reason, should give us pause before making any simple statements about the relation between the two, or about the supposed autonomy of reason from the tangled weave. This is not bad news, for it allows there to be people who are passionately committed to principles of justice. It allows us to be moved by compassion and to ponder the parable as an impulse to reassess our values.

6. Bildung

We have mentioned in earlier sections that moral action can be supported by a variety of learned cognitive, emotional, and perceptual skills. But why would individuals want to learn these skills? Certainly, having a basic level of skill in emotion regulation helps one be less distressed in everyday life. One can say the same for many other skills associated with moral action — they help one live in reasonable cooperation with others. So most people98 will feel the need to develop these skills to some minimal level. There is no good psychological terminology for this intentional attempt to become better at things moral. Colby and Damon document the pattern among the moral exemplars they interview, and speak of it as a “transformation of goals through social influence”.99 Others call it “control processes”100 or self-regulation.101 We propose a term from German that sometimes appears in English: Bildung, which literally means building up knowledge and character.102 This is a kind of encounter with the world that serves the purpose of self-development. With its overtones of the intentional construction of the self, we think it a useful term for this process. Education in the sense of Bildung should also be “upbuilding” in the sense of facilitating the development of the whole person — including moral identity and virtue. To engage in Bildung toward a more moral self, one needs not only to identify the goal but also to incorporate it into one’s personal identity and narrative, and to embody it not only in one’s life philosophy but also in one’s daily practice. Thus, moral education must include self-formation.103 There clearly are individuals whose life trajectory arcs toward justice, or compassion for others, or concern for specific moral goals.104 These exemplars of moral focus may not show up (and perhaps should not show up) in large samples of regular people. But studying them helps us to understand how extraordinary moral commitment develops.
An important thing to note in this arc of development is that it is a transformation of moral goals through social influence.105 Some, but not all, of this social influence comes in reasoned (and sometimes passionate) discussion with others. The work of many researchers on moral exemplars also documents how exemplars are embedded in social networks that help them achieve their goals but that also shape those goals. Thus even among those whom we are most likely to praise for leading lives of moral commitment, we find crucial influence from sources not based in conscious moral reason.

7. Conclusion: The Ubiquity of Heteronomy

This short survey of influences on, and processes of, moral action makes clear that within the broad range of action directed toward good, only a quite small space might be called truly autonomous action. There are indeed times when even experts or moral exemplars pause and consciously reflect. But these do not seem likely to be a significant fraction of people’s moral acting. What is the fraction? In his Nicomachean Ethics, Aristotle suggests that the virtuous person is prepared (or “cultivated”) for virtue by learning good habits.106 The importance of habits for moral behavior can be seen in how much of our behavior they constitute: in experience sampling studies, habitual actions comprise 45% of everyday activity.107 So a strict criterion of autonomy following Kohlberg’s definition would eliminate a significantly large fraction of what we would like to call moral action. Still, one has to be willing to learn such habits; one has to decide to begin incorporating them into one’s lifelong practice. Thus, as Aristotle says, good habits start with practicing the good, and this is something the individual must decide to do. This brings us back, through Bildung, to the realm of the reasoning-action gap. In order to act on one’s moral judgment, reasoning, or philosophy, one must begin to cultivate what one considers the good in one’s own life. Such cultivation could in turn help explain why some people seem more ready or better equipped to respond to an ethical situation. We should understand this Bildung as an active and in part autonomous process, in which one seeks out teachers but also teaches oneself. This autonomy stands outside Kohlberg’s conception of autonomy, which looks only at reasoning in the ethical situation itself and not at the processes of Bildung and appropriation as lifelong developmental processes. But none of this supplants the need for consciously guided reflective reasoning in those situations that call for it.
Even though much moral action is influenced by social, cultural, personality, self-identity, and emotional factors, there are still times when conscious moral reflection is helpful and even necessary.108 Indeed, the work of many moral exemplars consists of constant moral critique of social systems that lead to injustice.109 The work of personal moral development can be guided by reasoned argument. The parable of the Good Samaritan is a reasoned attempt to move people to reconsider their values. Parables like the one we have investigated may be among the most appropriate media for Bildung, because they facilitate both reasoned reflection and emotional engagement and appropriation. The ubiquity of heteronomy suggests that the story of how moral action is best supported is more complicated than isolated, consciously guided, reflective reasoning. The multiple influences on moral action, and their interactions, are so diverse, layered, multidimensional, and extended over time that considering them as a whole should change our evaluation of the role of conscious moral deliberation in moral action. If we want to learn how people do good and become good, or to evaluate whether they or their actions are good, we will need to embrace this complexity.


  • –. (1989), The Epic of Gilgamesh (M.G.Kovacs, Trans.), Stanford, CA: Stanford University Press.
  • Aldao, A. (2013), “The Future of Emotion Regulation Research: Capturing Context”, Perspectives on Psychological Science, 8(2), 155-172. doi:10.1177/1745691612459518.
  • Aristotle, (1941), Nicomachean Ethics (W.D.Ross, Trans.). In R.McKeon, ed., The Basic Works of Aristotle (pp. 927-1112), New York: Random House.
  • Badhwar, N.K. (1993), “Altruism versus self-interest: sometimes a false dichotomy”, Social Philosophy and Policy: Special Issue on Altruism, 10(1), 90-117.
  • Bandura, A. (1999), “Moral disengagement in the perpetration of inhumanities”, Personality and Social Psychology Review, 3(3).
  • Bandura, A. (2002), “Selective moral disengagement in the exercise of moral agency”, Journal of Moral Education, 31(2), 101-119.
  • Barsalou, L.W., Simmons, W.K., Barbey, A.K., & Wilson, C.D. (2003), “Grounding conceptual knowledge in modality-specific systems”, Trends in Cognitive Sciences, 7(2), 84-91.
  • Bebeau, M.J. (1994), “Influencing the moral dimensions of dental practice”, in J.R.Rest & D.Narvaez, eds., Moral development in the professions: Psychology and applied ethics, Hillsdale, NJ: Lawrence Erlbaum, pp. 121-146.
  • Bebeau, M.J., & Thoma, S.J. (1999), “Intermediate Concepts and the Connection to Moral Education”, Educational Psychology Review, 11(4), 343-360.
  • Berntsen, D., Johannessen, K.B., Thomsen, Y.D., Bertelsen, M., Hoyle, R.H., & Rubin, D.C. (2012), “Peace and war: trajectories of posttraumatic stress disorder symptoms before, during, and after military deployment in Afghanistan”, Psychological Science, 23(12), 1557-1565, doi:10.1177/0956797612457389.
  • Besser-Jones, L. (2011), “The Role of Practical Reason in an Empirically Informed Moral Theory”, Ethical Theory and Moral Practice, 15(2), 203-220, doi:10.1007/s10677-011-9284-9.
  • Blasi, A. (1980), “Bridging moral cognition and moral action: A critical review of the literature”, Psychological Bulletin, 88(1), 1-45.
  • Blasi, A. (1984), “Moral Identity: Its role in moral functioning”, in W.M.Kurtines & J.L.Gewirtz, eds., Morality, moral behavior, and moral development, New York: Wiley, pp. 129-139.
  • Bluck, S., Alea, N., Habermas, T., & Rubin, D.C. (2005), “A tale of three functions: The self-reported uses of autobiographical memory”, Social Cognition, 23(1), 91-117.
  • Boekaerts, M., Pintrich, P.R., & Zeidner, M. (2005), eds., Handbook of Self-Regulation, San Diego: Elsevier.
  • Bronk, K.C. (2012), “The exemplar methodology: An approach to studying the leading edge of development”, Psychology of Well Being: Theory, Research and Practice, 2(5).
  • Bronk, K.C., King, P.E., & Matsuba, M.K. (2013), “An introduction to exemplar research: a definition, rationale, and conceptual issues”, New Directions for Child and Adolescent Development, 142, 1-12, doi:10.1002/cad.20045.
  • Callahan, D. (1980), “Goals in the Teaching of Ethics”, in D.Callahan & S.Bok, eds., Teaching ethics in Higher Education, New York: Plenum, pp. 61-74.
  • Cameron, C.D., Lindquist, K.A., & Gray, K. (2015), “A constructionist review of morality and emotions: No evidence for specific links between moral content and discrete emotions”, Personality and Social Psychology Review, 1-24. doi:10.1177/1088868314566683.
  • Carver, C.S., & Connor-Smith, J. (2010), “Personality and Coping”, Annual Review of Psychology, 61, 679-704. doi:10.1146/annurev.psych.093008.100352.
  • Carver, C.S., & Scheier, M.F. (2002), “Control processes and self-organization as complementary principles underlying behavior”, Personality and Social Psychology Review, 6(4), 304-315.
  • Cervone, D., Shadel, W.G., Smith, R.E., & Fiori, M. (2006), “Self-regulation: Reminders and suggestions from personality science”, Applied Psychology, 55(3), 333-385.
  • Cieciuch, J., Schwartz, S.H., & Davidov, E. (2015), “Values”, Social Psychology, 41-46, doi:10.1016/b978-0-08-097086-8.25098-8.
  • Colby, A., & Damon, W. (1992), Some do care: contemporary lives of moral commitment, New York: Free Press.
  • Cosmides, L., & Tooby, J. (2008), “Can a general deontic logic capture the facts of human moral reasoning? How the mind interprets social exchange rules and detects cheaters”, in W.Sinnott-Armstrong, ed., Moral psychology (Vol. 1, pp. 53-119), Cambridge, MA: MIT Press.
  • De las Fuentes, C., Willmuth, M.E., & Yarrow, C. (2005), “Competency training in ethics education and practice”, Professional Psychology: Research and Practice, 36, 362-366.
  • de St. Aubin, E., McAdams, D.P., & Kim, T.-C. (2004), eds., The generative society: caring for future generations. Washington, DC: American Psychological Association.
  • De Vries, R., Anderson, M.S., & Martinson, B.C. (2006), “Normal misbehavior: scientists talk about the ethics of research”, Journal of Empirical Research on Human Research Ethics, 1(1), 43-50.
  • Dean, J.W., Brandes, P., & Dharwadkar, R. (1998), “Organizational Cynicism”, Academy of Management Review, 23(2), 341-352.
  • Doris, J.M. (2002), Lack of character: Personality and moral behavior, Cambridge, UK: Cambridge University Press.
  • Dovidio, J.F., Piliavin, J.A., Gaertner, S.L., Schroeder, D.A., & Clark, R.D. (1991), “The arousal: Cost-reward model and the process of intervention: A review of the evidence”, in M.S.Clark, ed., Prosocial behavior: Review of personality and social psychology, Vol. 12, Thousand Oaks, CA: Sage Publications, pp. 86-118.
  • Dreyfus, H.L., & Dreyfus, S.E. (2004), “The Ethical Implications of the Five-Stage Skill-Acquisition Model”, Bulletin of Science, Technology, & Society, 24, 251-264, doi:10.1177/0270467604265023.
  • Eisenberg, N. (2005), “The development of empathy-related responding”, in C.Pope-Edwards & G.Carlo, eds., Nebraska symposium on motivation, Vol 51: Moral motivation through the life span, Lincoln, NE: University of Nebraska Press, pp. 72-117.
  • Ekman, P. (1999), “Basic Emotions”, in T.Dalgleish & M.Power, eds., Handbook of cognition and emotion, New York: Wiley & Sons, pp. 45-60.
  • Ellsworth, P.C. (2013), “Appraisal theory: Old and new questions”, Emotion Review, 5(2), 125-131, doi:10.1177/1754073912463617.
  • Evans, J.S.B.T., & Elqayam, S. (2011), “Towards a descriptivist psychology of reasoning and decision making”, Behavioral and Brain Sciences, 34(5), 275-290, doi:10.1017/S0140525X1100001X.
  • Evans, J.S.B.T., & Stanovich, K.E. (2013), “Dual-process theories of higher cognition: Advancing the debate”, Perspectives on Psychological Science. doi:10.1177/1745691612460685.
  • Feinberg, M., Willer, R., Antonenko, O., & John, O.P. (2012), “Liberating Reason From the Passions: Overriding Intuitionist Moral Judgments Through Emotion Reappraisal”, Psychological science. doi:10.1177/0956797611434747.
  • Festinger, L. (1957), A theory of cognitive dissonance, Stanford, CA: Stanford University Press.
  • Fiske, A.P., & Haslam, N. (2005), “The four basic social bonds: Structures for coordinating interaction”, in M.Baldwin, ed., Interpersonal cognition. New York: Guilford Press.
  • Fowles, D.C. (2011), “Current Scientific Views of Psychopathy”, Psychological Science in the Public Interest, 12(3), 93-94, doi:10.1177/1529100611429679.
  • Frimer, J.A., & Walker, L.J. (2008), “Towards a new paradigm of moral personhood”, Journal of Moral Education. Special Issue: Towards an integrated model of moral functioning, 37(3), 333-356. doi:10.1080/03057240802227494.
  • Frimer, J.A., & Walker, L.J. (2009), “Reconciling the self and morality: an empirical model of moral centrality development”, Developmental Psychology, 45(6), 1669-1681. doi:10.1037/a0017418.
  • Furchert, A. (2012), Das Leiden fassen: Zur Leidensdialektik Søren Kierkegaards, Freiburg, Germany: Verlag Karl Alber.
  • Galotti, K.M. (2007), “Decision structuring in important real-life choices”, Psychological science, 18(4), 320-325.
  • Gergen, K. (2000), The saturated self: dilemmas of identity in contemporary life. New York: Basic Books.
  • Graham, J., & Haidt, J. (2012), “Sacred values and evil adversaries: A moral foundations approach”, in P.R.Shaver & M.Mikulincer, eds., The social psychology of morality: Exploring the causes of good and evil, New York: American Psychological Association Books, pp. 11-31.
  • Graham, J., Nosek, B.A., Haidt, J., Iyer, R., Koleva, S., & Ditto, P.H. (2011), “Mapping the Moral Domain”, Journal of Personality and Social Psychology, 101, 366-385.
  • Habermas, T. (2011), “Identität und Lebensgeschichte heute. Die form autobiographischen Erzahlens”, Psyche, 65(7), 646-667.
  • Haidt, J. (2001), “The emotional dog and its rational tail: A social intuitionist approach to moral judgment”, Psychological Review, 108(4), 814-834.
  • Haidt, J. (2003), “The moral emotions”, in R.J.Davidson, K.R.Scherer, & H.H.Goldsmith, eds., Handbook of affective sciences, Oxford: Oxford University Press, pp. 852-870.
  • Haidt, J., & Joseph, C. (2004), “Intuitive ethics: How innately prepared intuitions generate culturally variable virtues”, Daedalus, 133(4), 55-66.
  • Haidt, J., & Joseph, C. (2007), “The moral mind: How 5 sets of innate moral intuitions guide the development of many culture-specific virtues, and perhaps even modules”, in P.Carruthers, S.Laurence, & S.Stitch, eds., The Innate Mind, New York: Oxford University Press, Vol. 3, pp. 367-391.
  • Hanson, K.C., & Oakman, D.E. (2008), Palestine in the time of Jesus: Social structures and social conflicts. Minneapolis: Fortress Press.
  • Hart, D. (2005). “Adding Identity to the Moral Domain”, Human Development, 48, 257-261.
  • Hart, D., Murzyn, T., & Archibald, L. (2013), “Informative and inspirational contributions of exemplar studies”, in M.K.Matsuba, P.E.King, & K.C.Bronk, eds., Exemplar methods and research: Strategies for investigation: New directions for child and adolescent development, New York: Wiley, pp. 75-84.
  • Hill, P.L., & Roberts, B.W. (2010), “Propositions for the Study of Moral Personality Development”, Current Directions in Psychological Science, 19(6), 380-383. doi:10.1177/0963721410389168.
  • Huff, C.W. (in preparation). Taking moral action: influence, process, and context in moral psychology. New York: Blackwell.
  • Huff, C.W., & Barnard, L.K. (2009), “Good computing: moral exemplars in the computing profession”, IEEE TEchnology and Society Magazine(Fall), 47-54.
  • Huff, C.W., Barnard, L.K., & Frey, W. (2008), “Good Computing: a pedagogically focused model of virtue in the practice of computing (part 2)”, Journal of Information, Communication & Ethics in Society, 6, 284-316.
  • Huff, C.W., & Furchert, A. (2014), “Toward a pedagogy of ethical practice”, Communications of the ACM, 27(7), 25-27.
  • Huff, C.W., & Martin, C.D. (1995), “Computing consequences: A framework for teaching ethical computing”, Communications of the ACM, 38(12), 75-84.
  • Hutcherson, C.A., & Gross, J.J. (2011), “The moral emotions: A social-functionalist account of anger, disgust, and contempt”, Journal of Personality and Social Psychology, 100(4), 719-737.
  • John, O.P., & Srivastava, S. (1999), “The Big-Five trait taxonomy: History, measurement, and theoretical perspectives”, in L.A.Pervin & O.P.John, eds., Handbook of personality: Theory and research, New York: Guilford, pp. 139-153.
  • Kant, I. (1785/2011), Groundwork of the metaphysics of morals: A German–English edition (M. Gregor & J. Timmermann, Trans.), Cambridge, UK: Cambridge University Press.
  • Keane, W. (2015), Ethical Life: Its natural and social histories, Princeton, NJ: Princeton University Press.
  • Keefer, M.W., & Ashley, K.D. (2001), “Case-based Approaches to Professional Ethics: a systematic comparison of students’ and ethicists’ moral reasoning”, Journal of Moral Education, 30(4), 377-398.
  • Kohlberg, L. (1958), The development of modes of moral thinking and choice in the years 10 to 16 (Ph.D. Dissertation), Chicago: University of Chicago.
  • Kohlberg, L. (1963), “The development of children’s orientations toward a moral order: I. Sequence in the development of moral thought”, Vita Humana, 6(1-2), 11-33.
  • Kohlberg, L. (1964), “The development of moral character and moral ideology”, in M.L.Hoffman & L.W.Hoffman, eds., Review of child development research, New York: Russel Sage Foundation, Vol. 1, pp. 383-431.
  • Kohlberg, L., Levine, C., & Hewer, A. (1983), Moral stages: A current formulation and a response to critics. Basel, Switzerland: S. Karger.
  • Koole, S.L., & Aldao, A. (in press), “The self-regulation of emotion: Theoretical and empirical answers”, in K.D.Vohs & R.F.Baumeister, eds., Handbook of Self-Regulation, 3rd Edition. New York: Guilford.
  • Kuhl, J., & Koole, S.L. (2004), “Workings of the will: A functional approach”, in J.Greenberg, S.L.Koole, & T.Pyszczynski, eds., Handbook of Experimental Existential Psychology, New York: Guilford Press, pp. 411-430.
  • Lapsley, D.K. (2008), “Moral Self-Identity as the aim of education”, in L.P.Nucci & D.Narvaez, eds., Handbook of Moral and Character Education, New York: Routledge, pp. 30-52.
  • Lapsley, D.K., & Narvaez, D. (2005), “Moral psychology at the crossroads”, in D.K.Lapsley & F.C.Power, eds., Character psychology and character education, South Bend, IN: University of Notre Dame Press, pp. 18–35.
  • Lipschitz, R., Klein, G., Orasanu, J., & Salas, E. (2001), “Focus article: Taking stock of naturalistic decision making”, Journal of Behavioral Decision Making, 14, 331-352.
  • March, J.G. (1982), “Theories of choice and the making of decisions”, Society, 20, 29-39.
  • McAdams, D.P. (2009), “The moral personality”, in D.Narvaez & D.K.Lapsley, eds., Personality, identity, and character: Explorations in moral psychology, New York, NY, US: Cambridge University Press, pp. 11-29.
  • McAdams, D.P., & Pals, J.L. (2006), “A new Big Five: Fundamental principles for an integrative science of personality”, American Psychologist, 61(3), 204-217.
  • McAdams, D.P., Reynolds, J., Lewis, M., Patten, A.H., & Bowman, P.J. (2001), “When bad things turn good and good things turn bad: Sequences of redemption and contamination in life narrative and their relation to psychosocial adaptation in midlife adults and in students”, Personality and Social Psychology Bulletin, 27(4), 474-485. doi:10.1177/0146167201274008.
  • McGregor, I., & Little, B.R. (1998), “Personal projects, happiness, and meaning: On doing well and being yourself”, Journal of Personality and Social Psychology, 74, 494-512.
  • Midlarsky, E., Fagin Jones, S., & Corley, R.P. (2005), “Personality Correlates of Heroic Rescue During the Holocaust”, Journal of Personality, 73(4), 907-934, doi:10.1111/j.1467-6494.2005.00333.x.
  • Mischel, W. (2004), “Toward an integrative science of the person”, Annual Review of Psychology, 55, 1-22.
  • Moll, J., Krueger, F., Zahn, R., Pardini, M., De Oliveira-Souza, R., & Grafman, J. (2006), “Human fronto-mesolimbic networks guide decisions about charitable donation”, Proceedings of the National Academy of Sciences, 103(42), 15623-15628.
  • Moulton, C.A.E., Regehr, G., Mylopoulos, M., & MacRae, H.M. (2007), “Slowing down when you should: A new model of expert judgment”, Academic Medicine, 82(10), S109-S116.
  • Mumford, M.D., Connelly, S., Brown, R.P., Murphy, S.T., Hill, J.H., Antes, A.L., … Devenport, L.D. (2008), “A Sensemaking Approach to Ethics Training for Scientists: Preliminary Evidence of Training Effectiveness”, Ethics & Behavior, 18(4). doi:10.1080/10508420802487815.
  • Mumford, M.D., Murphy, S.T., Connelly, S., Hill, J.H., Antes, A.L., Brown, R.P., & Devenport, L.D. (2007), “Environmental Influences on Ethical Decision Making: Climate and Environmental Predictors of Research Integrity”, Ethics & Behavior, 17(4). doi:10.1080/10508420701519510.
  • Narvaez, D. (2005), “The neo-Kohlbergian tradition and beyond: Schemas, expertise, and character”, in C.Pope-Edwards & G.Carlo, eds., Nebraska symposium on motivation, Vol 51: Moral motivation through the life span, Lincoln, NE: University of Nebraska Press, Vol. 51, pp. 119-163.
  • Narvaez, D. (2006), “Integrative Ethical Education”, in M.Killen & J.Smetana, eds., Handbook of moral development, Mahwah, NJ: Erlbaum, pp. 703-733.
  • Narvaez, D. (2010), “Moral Complexity: The fatal attraction of truthiness and the importance of mature moral functioning”, Perspectives on Psychological Science, 5(2), 163-181.
  • Narvaez, D., & Bock, T. (2014), “Developing ethical expertise and moral personalities”, in L.P.Nucci, D.Narvaez, & T.Krettenauer, eds., Handbook of moral and character education, New York: Routledge, pp. 140-158.
  • Neal, D.T., Wood, W., & Quinn, J.M. (2006), “Habits-A repeat performance”, Current Directions in Psychological Science, 15(4), 198-202.
  • Newell, A., & Simon, H.A. (1972), Human Problem Solving, Englewood Cliffs, NJ: Prentice Hall.
  • Nisbett, R.E., & Ross, L. (1980), Human inference: Strategies and shortcomings of social judgment, Mahwah, NJ: Prentice Hall.
  • Oatley, K., Keltner, D., & Jenkins, J.M. (2006), Understanding emotions. New York: Wiley-Blackwell.
  • Oliner, S.P., & Oliner, P.M. (1988), The altruistic personality: rescuers of Jews in Nazi Europe. New York: Free Press.
  • Penner, L.A., & Orom, H. (2010), “Enduring goodness: A person-by-situation perspective on prosocial behavior”, in M.Mikulincer & P.Shaver, eds., Prosocial motives, emotions, and behavior: The better angels of our nature, Washington, DC: American Psychological Association, pp. 55-72.
  • Pessoa, L. (2008), “On the relationship between emotion and cognition”, Nature Reviews Neuroscience, 9(2), 148-158.
  • Plaisance, P.L. (2014), “Virtue in Media: The Moral Psychology of U.S. Exemplars in News and Public Relations”, Journalism & Mass Communication Quarterly, 91(2), 308-325. doi:10.1177/1077699014527460.
  • Reynolds, S.J. (2008), “Moral Attentiveness: Who Pays Attention to the Moral Aspects of Life?”, Journal of Applied Psychology, 93(5), 1027-1041.
  • Roccas, S., Sagiv, L., Schwartz, S.H., & Knafo, A. (2002), “The big five personality factors and personal values”, Personality and Social Psychology Bulletin, 28(6), 789-801.
  • Rothman, A.J., Sheeran, P., & Wood, W. (2009), “Reflective and Automatic Processes in the Initiation and Maintenance of Dietary Change”, Annals of Behavioral Medicine, 38, S4-S17. doi:10.1007/s12160-009-9118-3.
  • Scherer, K.R., & Peper, M. (2001), “Psychological theories of emotion and neurological research”, in F.Boller & J.Grafman, eds., Handbook of neuropsychology, Amsterdam: Elsevier, Vol. 5, pp. 17-48.
  • Schön, D.A. (1984), The reflective practitioner: How professionals think in action. New York: Basic Books.
  • Shoda, Y., Mischel, W., & Wright, J.C. (1994), “Intraindividual stability in the organization and patterning of behavior: Incorporating psychological situations into the idiographic analysis of personality”, Journal of Personality and Social Psychology, 67, 674-687.
  • Snow, N.E. (2010), Virtue as social intelligence: An empirically grounded theory. New York: Routledge.
  • Snyder, M., & Omoto, A.M. (2008), “Volunteerism: Social Issues Perspectives and Social policy implications”, Social Issues and Policy Review, 2(1), 1-36.
  • Stojanov, K. (2012), The concept of Bildung and its moral implications, Paper presented at the Philosophy of Education Society of Great Britain, New College, Oxford UK, http://www.philosophy-of-education.org/uploads/papers2012/Stojanov.pdf.
  • Stürmer, S., & Snyder, M. (2010). “Helping us versus them: Towards a group-level theory of helping and altruism within and across group boundaries”, in S.Stürmer & M.Snyder, eds., The Psychology of prosocial behavior, Malden, MA: Blackwell, pp. 33-58.
  • Stürmer, S., Snyder, M., Kropp, A., & Siem, B. (2006), “Empathy-motivated helping: The moderating role of group membership”, Personality and Social Psychology Bulletin, 32(7), 943.
  • Tamir, M. (2015), “Why Do People Regulate Their Emotions? A Taxonomy of Motives in Emotion Regulation”, Personality and Social Psychology Review. doi:10.1177/1088868315586325.
  • Tangney, J.P., Stuewig, J., & Mashek, D.J. (2007), “Moral emotions and moral behavior”, Annual Review of Psychology, 58, 345-372.
  • Treviño, L.K., Weaver, G.R., & Reynolds, S.J. (2006), “Behavioral ethics in organizations: A review”, Journal of Management, 32(6), 951-990.
  • Walker, L.J. (2004a), “Gus in the gap: Bridging the judgment-action gap in moral functioning”, in D.Lapsley & D.Narvaez, eds., Moral development, self, and identity, Mahwah, NJ: Lawrence Erlbaum, pp. 1-20.
  • Walker, L.J. (2004b), “Progress and prospects in the psychology of moral development”, Merrill-Palmer Quarterly, 50(4), 546-557.
  • Walker, L.J. (2013), “Exemplars’ moral behavior is self-regarding”, New Directions for Child and Adolescent Development, 142, 27-40.
  • Walker, L.J., & Frimer, J.A. (2007), “Moral personality of brave and caring exemplars”, Journal of Personality and Social Psychology, 93(5), 845.
  • Walker, L.J., Frimer, J.A., & Dunlop, W.L. (2010), “Varieties of Moral Personality: Beyond the Banality of Heroism”, Journal of Personality, 78(3), 907-942. doi:10.1111/j.1467-6494.2010.00637.x.
  • Weng, H.Y., Fox, A.S., Shackman, A.J., Stodola, D.E., Caldwell, J.Z., Olson, M.C., … Davidson, R.J. (2013), “Compassion Training Alters Altruism and Neural Responses to Suffering”, Psychological Science. doi:10.1177/0956797612469537.
  • Zsambok, C.E., & Klein, G. (2014), eds., Naturalistic decision making. New York: Psychology Press.

  1. Kohlberg (1958), p. 5. ↩︎

  2. See the section on moral emotion to appreciate the complexity of claiming that something is “an emotion.” ↩︎

  3. Kohlberg (1958), p. 5. ↩︎

  4. Kohlberg, Levine, & Hewer (1983), p. 69. ↩︎

  5. Kohlberg (1963), Kohlberg (1964). ↩︎

  6. Lapsley and Narvaez (2005). ↩︎

  7. Walker (2004b). ↩︎

  8. It is a widely shared finding in the study of moral exemplars that they often feel like they “must” do the moral thing. See the section on moral identity to understand this claim. ↩︎

  9. This difficulty is recognized in foundational documents in all major religions and in documents as old as the Epic of Gilgamesh, from the 18th century BC. ↩︎

  10. Both words in “consistently small” are important to note [Blasi (1980)]. There is a very consistent correlation (about .33) between moral cognition and moral action, but this consistency is at best “somewhat modest” [see Walker (2004a), p. 2]. ↩︎

  11. Blasi (1984). ↩︎

  12. Frimer and Walker (2008). ↩︎

  13. Festinger (1957). ↩︎

  14. Kohlberg’s approach was philosophically based on Kant’s [Frimer & Walker (2008); Lapsley & Narvaez (2005)] and borrows its motivational structure from it. Indeed, in a well-known footnote in the Groundwork of the Metaphysics of Morals, Kant (1785/2011, pp. 30-31) bridges the gap by treating the feeling of respect or reverence (Achtung) for a principle as both a cognitive and a motivational construct: a feeling “self-wrought through a rational concept” (durch einen Vernunftbegriff selbstgewirktes Gefühl). ↩︎

  15. Blasi (1980). ↩︎

  16. See Lapsley (2008), for a review. ↩︎

  17. Frimer and Walker (2009); Walker (2013). ↩︎

  18. Badhwar (1993) argues that this fusion of self and moral goals helps to resolve the philosophical puzzle of altruism. ↩︎

  19. Snyder and Omoto (2008); Stürmer and Snyder (2010); Stürmer, Snyder, Kropp, and Siem (2006). ↩︎

  20. Hart (2005), p. 260. ↩︎

  21. Oliner and Oliner (1988). ↩︎

  22. Hanson and Oakman (2008). ↩︎

  23. For reviews, see Frimer and Walker (2008); Gergen (2000); Mischel (2004). ↩︎

  24. Bronk (2012); Bronk, King, and Matsuba (2013); Hart, Murzyn, and Archibald (2013). ↩︎

  25. Huff and Barnard (2009). ↩︎

  26. Huff and Furchert (2014). ↩︎

  27. Each interview transcript is 2 to 3 times longer than this chapter. ↩︎

  28. Evans and Stanovich (2013). ↩︎

  29. Theorists disagree on a range of issues about this approach, including whether there are two or even more systems, whether the difference is one of type (with a clear distinction) or of mode (with a range of intermediate positions), and the extent to which the systems can be consciously controlled, among other disagreements [Evans & Stanovich (2013)]. Any generalization at this early stage of theorizing should be made with caution. ↩︎

  30. Kohlberg et al. (1983). ↩︎

  31. For two other approaches, see Cosmides and Tooby (2008) and Fiske and Haslam (2005). ↩︎

  32. Graham and Haidt (2012); Graham et al. (2011); Haidt (2001); Haidt and Joseph (2004); Haidt and Joseph (2007). ↩︎

  33. Graham et al. (2011). ↩︎

  34. See Cieciuch, Schwartz, and Davidov (2015) for several versions of this approach, and its history. ↩︎

  35. Evans and Elqayam (2011). ↩︎

  36. Lipschitz, Klein, Orasanu, and Salas (2001); Zsambok and Klein (2014). ↩︎

  37. Galotti (2007). ↩︎

  38. Barsalou, Simmons, Barbey, and Wilson (2003). ↩︎

  39. Bluck, Alea, Habermas, and Rubin (2005); Habermas (2011). ↩︎

  40. Newell and Simon (1972). ↩︎

  41. March (1982). ↩︎

  42. Evans and Elqayam (2011); Lipschitz et al. (2001). ↩︎

  43. Habermas (2011). ↩︎

  44. Bluck et al. (2005). ↩︎

  45. Bebeau and Thoma (1999). ↩︎

  46. Keefer and Ashley (2001). ↩︎

  47. Narvaez (2006). ↩︎

  48. Callahan (1980). ↩︎

  49. Huff and Martin (1995). ↩︎

  50. Bebeau (1994). ↩︎

  51. De las Fuentes, Willmuth, and Yarrow (2005). ↩︎

  52. Mumford et al. (2008). ↩︎

  53. Snow (2010). ↩︎

  54. Dreyfus and Dreyfus (2004); Narvaez and Bock (2014). ↩︎

  55. Rothman, Sheeran, and Wood (2009). ↩︎

  56. Moulton, Regehr, Mylopoulos, and MacRae (2007); Schön (1984). ↩︎

  57. Furchert (2012). ↩︎

  58. Kuhl and Koole (2004). ↩︎

  59. Aldao (2013); Tamir (2015). ↩︎

  60. Reynolds (2008). ↩︎

  61. Hanson and Oakman (2008). ↩︎

  62. Some conclude that there is no such thing as personality or character [Doris (2002)]. Most of these critiques have missed much of the recent history of personality theory, and seem to be refighting the skirmishes of the situation vs. person debates of the 1980s [Nisbett & Ross (1980)]. There is indeed continuity across situations, if one looks for consistency in the pattern (or “behavioral signature”) of how individuals respond to situations they construe as relevant [Mischel (2004); Shoda, Mischel, & Wright (1994)]. ↩︎

  63. McAdams and Pals (2006). ↩︎

  64. Dovidio, Piliavin, Gaertner, Schroeder, and Clark (1991), p. 101. ↩︎

  65. John and Srivastava (1999); Roccas, Sagiv, Schwartz, and Knafo (2002). ↩︎

  66. Walker, Frimer, and Dunlop (2010). ↩︎

  67. Huff and Barnard (2009). ↩︎

  68. See Hill and Roberts (2010); McAdams (2009) for a larger review. ↩︎

  69. Oliner and Oliner (1988). ↩︎

  70. Midlarsky, Fagin Jones, and Corley (2005); Penner and Orom (2010); Treviño, Weaver, and Reynolds (2006). ↩︎

  71. de St. Aubin, McAdams, and Kim (2004). ↩︎

  72. Penner and Orom (2010). ↩︎

  73. Colby and Damon (1992). ↩︎

  74. Dean, Brandes, and Dharwadkar (1998). ↩︎

  75. De Vries, Anderson, and Martinson (2006); Mumford et al. (2007). ↩︎

  76. Walker and Frimer (2007). ↩︎

  77. McAdams, Reynolds, Lewis, Patten, and Bowman (2001). ↩︎

  78. McAdams et al. (2001). ↩︎

  79. Hill and Roberts (2010). ↩︎

  80. The massive interconnectivity of emotional and judgment areas in the brain [Pessoa (2008)] attests to this blending of cognition, emotion, and action, and Moll et al. (2006) refer to (and also track over time) “cognitive-emotional” complexes in moral judgments. ↩︎

  81. Evans and Stanovich (2013). ↩︎

  82. Bandura (1999); Bandura (2002). ↩︎

  83. Evans and Stanovich (2013). ↩︎

  84. Ekman (1999). ↩︎

  85. Ellsworth (2013). ↩︎

  86. Cameron, Lindquist, and Gray (2015). ↩︎

  87. Scherer and Peper (2001). ↩︎

  88. Scherer and Peper (2001). ↩︎

  89. E.g., trauma; see Berntsen et al. (2012). ↩︎

  90. Weng et al. (2013). ↩︎

  91. Eisenberg (2005). ↩︎

  92. Feinberg, Willer, Antonenko, and John (2012). ↩︎

  93. Koole and Aldao (in press). ↩︎

  94. Oatley, Keltner, and Jenkins (2006). ↩︎

  95. see Haidt (2003); Tangney, Stuewig, and Mashek (2007) for a review. ↩︎

  96. Cameron et al. (2015); Ellsworth (2013); Hutcherson and Gross (2011). ↩︎

  97. Indeed, compassion might be an excellent candidate for a central commitment that serves as a cognitive, emotional, and motivational construct, much like respect in the Kantian sense, which Kant describes precisely as a self-wrought feeling embedded in reasoning (durch einen Vernunftbegriff selbstgewirktes Gefühl). ↩︎

  98. Not psychopaths, perhaps. But it is likely the development of these skills that allows some psychopaths to “pass” and to take advantage of others [Fowles (2011)]. ↩︎

  99. Colby and Damon (1992), p. 169. ↩︎

  100. Carver and Connor-Smith (2010); Carver and Scheier (2002). ↩︎

  101. Boekaerts, Pintrich, and Zeidner (2005); Cervone, Shadel, Smith, and Fiori (2006); Koole and Aldao (in press). ↩︎

  102. Stojanov (2012). ↩︎

  103. Huff and Furchert (2014). But note that self-development is not always in the service of moral goals. McGregor and Little (1998) have documented the primary life goals of a sample of Western students, and identify agentic, communal, and hedonistic goals. These are not necessarily moral goals, which suggests caution in expecting all self-development to be moral Bildung. ↩︎

  104. Colby and Damon (1992); Huff and Furchert (2014); Plaisance (2014). ↩︎

  105. Colby and Damon (1992), p. 169. ↩︎

  106. Aristotle (1941), NE 10.9 1179b4-31. ↩︎

  107. Neal, Wood, and Quinn (2006). ↩︎

  108. Besser-Jones (2011). ↩︎

  109. Keane (2015). ↩︎