This module is a resource for lecturers
Mainstream approaches to ethics education often ask students to reflect on ethical matters in the hope that they will thus learn to live more ethically. This Module offers an alternative approach by focusing on the close relationship between ethical living and living without self-deception. The approach of this Module is based on the observation that a mere intellectual commitment to being ethical does not have a measurable impact on ethical conduct. A study by the philosophers Eric Schwitzgebel and Joshua Rust (2013), for example, has shown that moral philosophers are on average no more ethical than anyone else. This suggests that something beyond an intellectual understanding of ethics is required to translate a commitment to ethics into action. In this light, we may wish to reconsider the standard way in which we teach ethics and move beyond discussing ethics as an intellectual exercise. This Module aims to unsettle students' understanding of what they should be looking for when seeking to improve themselves from an ethical point of view.
The approach of this Module draws inspiration from diverse thinkers from around the world who do not necessarily fit comfortably into any of the standard ethical theories discussed in Integrity and Ethics Module 1 (Introduction and Conceptual Framework), namely utilitarianism, deontology and virtue ethics. One philosopher who has influenced this Module's approach is Albert Camus (1913-1960). For him, ethical living amounts to living lucidly, that is, without self-deception. Camus has little interest in finding theoretical foundations or ultimate justifications for ethics. Rather, his aim is to invite us to see and feel how ethics is part of the human condition. He shares this approach with philosophers as diverse as Ludwig Wittgenstein (1889-1951), Mary Midgley (1919-2018) and Philip Hallie (1922-1994).
Steve Biko (1946-1977) and Frantz Omar Fanon (1925-1961) are also significant influences, given the central role they attribute to social conditions in forming minds and their concern with what could be described as self-ascribed bigotry (an inferiority complex, as they call it). Related to Biko's and Fanon's concerns are those of social psychology and behavioural economics, two empirical disciplines that have played significant roles in inspiring the approach to ethics informing this Module. These thinkers are listed here to invite lecturers to engage with them and thereby deepen their understanding of the material covered in this Module. However, one can teach the course without having engaged directly with the work of the above philosophers and social scientists.
This Module examines some of the internal and external forces that can threaten our autonomy as agents and undermine our ability to drive our lives as ethical beings. It shows that these forces, while typically playing very positive roles in our lives, can lead us to act unethically if we are not attentive and if we cannot resist becoming passive followers of the norms of our times, places and natural inclinations. The Module aims to inspire students to become aware of these pitfalls, become committed to avoiding them, and live ethically as responsible agents. It will give students a taste of the complexity of living ethically and show them the extent to which taking responsibility for our lives is a central aspect not only of living ethically, but also, more broadly, living lives that we will deem worthwhile.
The challenge of living ethically
We are ethical creatures by nature, guided through life by normative considerations. As shown in this video, research suggests that even pre-linguistic infants exhibit signs of possessing ethical prototypes that become ethical in the full sense after a long process of socialization (see also Bloom, 2013). Another example illustrating the claim that at a basic level we all strive to be ethical is that people almost always rationalize (i.e. use reasons to trick themselves into believing what is not the case) in the direction of making themselves seem better from a moral point of view than they actually are (Ariely, 2012; Tavris and Aronson, 2015). This is not simply because we want to be acknowledged by others; it is also a matter of self-esteem, of avoiding painful inner conflicts.
Take the following example: when some accountants adjust the accounts to deceive, they seldom - if ever - do so out of ignorance, in the sense of failing to understand that this is unethical. Trying to enlighten such accountants by informing them that they violated the moral law is not typically an effective strategy for behavioural modification. At some level, they realize that they are doing wrong, but they tell themselves dissonance-reducing stories, or rationalizations that make it seem as if their behaviour is not only acceptable, but even perhaps heroic.
We tell ourselves these sorts of stories all the time. Typically, we rationalize when trying to convince ourselves that we are more moral than we actually are (Ariely, 2012). Perpetrators of atrocities typically describe themselves as freedom fighters or something very similar to this from their perspective (Sereny, 1995). Everyday criminals tend to find attenuating circumstances, that is, excuses, for their crimes (Baumeister, 1999). They might say things like: "I did it, but that is because forces that I have little or no control over, such as upbringing and bad company, led me to do it." One thing corrupt accountants, perpetrators of mass atrocities and common-variety criminals have in common is that they rationalize their behaviour, as does everyone else.
It is worth noting that rationalization typically happens in the direction of exculpation (Ariely, 2012; Tavris and Aronson, 2015). We rarely come across morally exemplary individuals who try to convince themselves that they are morally bankrupt. This is further evidence that at a basic level we all seek to be ethical. Related to the concept of rationalization is the 'Fudge Factor', a term referring to the extent to which one can cheat and still feel good about oneself because of the pull of powerful countervailing desires (Ariely, 2012).
If it is true that we are ethical by nature, then why is living ethically a problem for all of us without exception? It is a problem because, among other things, we are not only ethical beings. We are other things as well. We are, for instance, rational, pain-avoiding, pleasure-seeking, creative-storytelling, social, status-concerned, self-loving, and driven by powerful desires. We also live in various contexts that influence how we behave and that can cause us to violate our intrinsic values out of fear. Ethics is largely there to regulate our impulses, dispositions and behaviour. It arguably brings everything together into a semi-coherent tapestry called the self, something that demands ongoing concerted effort (Midgley, 2001). Things can go wrong very easily, and part of the problem is that aspects of ourselves that are typically identified as good can play dirty tricks on us.
Here are some examples: rationality is typically a positive quality, but, as we have seen, it also allows for the possibility of rationalization, that is, reason brought to the service of self-deception aimed at pain avoidance, particularly pain caused by the conflict between the desire to be good and the fact that we have done or want to do wrong (Ariely, 2012). In Benjamin Franklin's words:
"So convenient a thing is it to be a reasonable creature, since it enables one to find or make a reason for everything one has a mind to do." (1962, p. 43)
Creative storytelling - also generally considered a positive quality - can lead us to form fantasies about ourselves that lead to unethical action. We are social beings, indeed, caring beings. But our sociality can lead us to join an unthinking mob. We care about status. This is part of caring for the self and seeking self-improvement. It is also tied up with our social natures; part of being social is that we need affirmation from others. But status concerns can lead to out-of-control materialism and an unhealthy obsession with power. Similarly, self-concern is a condition for caring for the self, for having the motivation to meet our basic needs and flourish as human beings, but it can lead to excessive self-concern, to a form of narcissism that makes us struggle to grasp others as genuine human beings. And, of course, our powerful passions can be both deeply rewarding and deeply destructive.
The remainder of this section explores some of the mechanisms that undermine our ability to drive our lives as ethical beings. It is important to reiterate that these mechanisms also play important positive roles in our lives. This suggests that taking responsibility for our lives requires ongoing vigilance to stop mechanisms that typically serve us well from undermining our ability to act ethically. There are many other mechanisms that affect our ability to act ethically that are outside the scope of this Module, but the discussions will ideally trigger long-term interest in exploring such mechanisms further. Lecturers can encourage students to enhance their understandings by engaging with the readings, documentaries and movies listed in this Module.
Selective attention and psychological distance
When we look at a particular scene, we never grasp everything that is there. Instead, we see some things and not others. Typically, we see what solicits our attention, but what does and does not stand out for us is largely interest-relative. Selective attention plays an important positive role in our lives: it allows us to pay attention to that which interests us. If one is busy studying, then zoning out background distractions may be a very successful learning strategy. However, this ability to zone things out may blind us to other things that demand our immediate attention (such as the presence of someone in need of urgent help). Selective attention establishes a hierarchy of relevance, indeed of value (the belief that this is more important than that), which may not accord with what we genuinely value. Importantly, selective attention is not a mechanism over which we have full control. It operates largely in the background and does its job without our knowledge, unless we make an effort to observe its operation.
In a short video, Daniel Simons explains this mechanism through an experiment that provides a powerful visual representation of selective attention. Simons stresses the positive role of selective attention. He also suggests that we tend to think that we see more than we actually do. Simons observes that we need to focus our attention on something in order to see it. Exercise 1 of this Module allows the students to experience this mechanism first hand.
Sometimes we may see something problematic unfolding right in front of us, but we are unable to fully grasp its significance and therefore do not respond or react properly. This basic feature of our lives, the ability to attend to some things and not to others, may not prima facie seem terribly relevant for understanding ourselves as ethical beings. However, the famous Good Samaritan Experiment shows that we may miss many ethically salient things that present themselves to us because we are in too much of a rush (for example, to get to an appointment) to fully grasp their significance.
In the experiment, which is the focus of Exercise 2 of the Module, a group of theology students see a person posing as someone in need of urgent help, but many of them fail to offer assistance. This case may not, strictly speaking, be a case of selective attention, at least not in the perceptual sense (all students see the person posing as someone in need of urgent help), but it is a case of not being able to properly attend to what is right in front of us. It could be argued that the students who did not aid the person in need failed to grasp salience. The failure here is not a failure of commitment or understanding, but a failure stemming from circumstances, specifically being in a rush.
We may miss many ethically salient things that present themselves to us because our attention is drawn away from our immediate surroundings, impairing our ability to fully grasp what we would want to grasp if we were not in a rush. What does this say, for example, about workaholic professionals and others working under extreme time pressure? As in the case of selective attention, being able to focus on the task at hand is a very useful skill, and it is important that in most instances what enters or leaves our sphere of attention does so automatically, behind our backs, so to speak. Were this not so, the business of living our day-to-day lives would be extremely difficult and time-consuming. In fact, without selective attention we would probably not be able to get on with the actual business of living our lives. Shortcuts are therefore required. In the literature, these shortcuts are known as heuristics: rules of thumb that guide our lives. They normally serve us well, but at times they can be great hindrances. The rule in this case goes something like this: focus on the task at hand and attribute less importance to those things that do not contribute directly to achieving your aims.
Relatedly, we can also miss the importance of something because of a phenomenon known as psychological distance, which is one of the reasons that modern warfare - for example drone warfare - is so pernicious. The physical distance of attacking parties also distances soldiers emotionally from the event, blinding them to the full significance of their actions. Psychological distance can also lead to moral apathy, without us even knowing that this mechanism is largely responsible for the apathy. Students who are interested in exploring these issues further can watch the 2015 film Eye in the Sky that illustrates some of the ethical challenges of drone warfare including issues related to privacy, surveillance and human rights.
Conformity, obedience, and the bystander effect
The influential Solomon Asch experiment vividly shows the extent to which we tend to model our judgments on the judgments of others. One of the reasons it is such a powerful experiment is its simplicity. Asch asks experimental subjects to compare line lengths and to match lines of equal length with one another. In each enactment of the experiment, all but one of those answering questions are confederates of the experimenter (that is, actors who are instructed to deliberately give wrong answers). Only one participant is the subject of the experiment, the person whose reactions are being measured. The subject does not know that all the other participants are confederates. In most cases, subjects repeated the replies of the actors, showing the extent to which peer pressure can affect our ability to see what is right in front of us. Even in basic low-stakes situations, such as those created in Asch's experiment, people tend to follow the lead of the group.

Asch's experiment also shows that we tend to conform either because we do not want to create conflict by disagreeing with others (normative conformity) or because group pressure leads us genuinely to see things in the wrong way (informational conformity). Normative conformity is driven by the explicitly endorsed norm that we should not puncture group consensus. Informational conformity is named as such because the failing happens at the level of perception: the information given to us by the senses is distorted. The experiment further shows how the pull of conformity can be weakened by the presence of a partner (an actor) who is instructed by the experimenter to give the right answers to the questions about line lengths. Another variation of the experiment shows that asking subjects to give their answers in writing rather than orally radically changes the results.
This experiment is the focus of Exercise 3 of the Module. For more information on the experiment, see Asch's "Opinions and Social Pressure".
We move on now from conformity to obedience to authority. In Stanley Milgram's controversial obedience experiment, "teachers" were asked by the "authority figure" to punish "learners" by flicking a switch which they thought produced escalating electrical shocks. This experiment, which is the focus of Exercise 4 of the Module, shows that there is a strong tendency among humans to follow the dictates of authority figures, including when following the instructions of an authority figure can be extremely harmful, even lethal, to others. Milgram's conclusion is not that people tend to be morally bereft. Rather, his conclusion is that obedience can lead good people to do bad things. Obedience, like conformity, plays a very important positive role in society, but we can end up doing terrible things if we blindly succumb to the pull of obedience. This has serious implications for leadership and hierarchy in organizations (Milgram, 1973).
It should be noted that only a minority of experimental subjects unquestioningly flicked the switches. Typically, subjects tried to resist the pull of the authority figure. In the end, however, well over 50% of the subjects (the "teachers", as they are called in the experiment) ended up punishing the learner with what they thought were potentially lethal shocks. Even more staggeringly, most subjects continued punishing the learner with shocks of increasingly higher voltages even after they thought that the learner was unconscious, completely defeating the stated aims of the experiment. The pull of authority figures tends to trump countervailing forces within us, and one sees this clearly when observing the tremendous amount of dissonance typically experienced by participants.
One key factor playing a role in participant behaviour is a common psychological mechanism which could be described as "passing the buck", or deferring responsibility to others. Having a sense that the responsibility is entirely on the shoulders of an authority figure can relieve us from the unpleasantries of guilt, making it easier for us to act in ways that we would regret if we had a chance to sit back and reflect on our actions (for a rich and influential discussion of this topic see Arendt, 2006, particularly where the author addresses the inability of Adolf Eichmann to take responsibility for his actions). Similarly, we often pass on the responsibility to groups, feeling that "if everyone else is doing it, then why can't I?" It should also be stressed that psychological mechanisms such as these are triggered in specific circumstances. In the case of the Milgram experiment, participants were put under considerable pressure by an authority figure. They could, however, only be put under pressure because we are prone to follow the dictates of those we consider to be authority figures. Psychological and environmental factors act together to produce these sorts of results.
If we want to avoid situations such as those created in the Milgram experiment, we need to think both about training ourselves to recognize when and where not to succumb to the pressure of authority figures and about changing environmental circumstances, for instance by considering leadership styles that are less prone to encourage obedience beyond the limits of the acceptable.
A related phenomenon worth discussing is the diffusion of responsibility, whereby subjects tend to feel less responsible for helping someone in need if others are also present. Taking responsibility can be a difficult and sometimes risky affair, so we often prefer to pass the responsibility on to others. However, it is also the case, and this speaks to the issue of conformity, that when others are present we tend to mirror our behaviour on theirs, something that does not happen as readily when there is only one potential helper available. It has also been shown that the diffusion of responsibility is punctured when someone takes the lead and helps. Diffusion of responsibility is one of the principal mechanisms that account for the Bystander Effect (Garcia and others, 2002). A thought-provoking case that triggered bystander research is the murder of Kitty Genovese.
Another feature that can have a deep impact on how we behave, often driving unethical behaviour, is the roles we play in specific environments. This has been illustrated in the Stanford Prison Experiment. In this experiment from 1971, which is the focus of Exercise 5 of the Module, the psychological effects of perceived power and related environmental or situational factors were investigated. The experiment involved volunteer students who assumed the roles of guards and prisoners. While this was one of the most controversial psychological experiments ever conducted, there are many extremely interesting insights that we can draw from it. These reveal the extent to which situational factors can influence behaviour, including the extent to which the roles we play in specific environments can have a deep impact on how we behave. This is known as the problem of situationism.
Although the experiment has recently come under scrutiny in the media, its results are consistent with those of many other experiments that are widely accepted by the scientific community, some of which are included in this Module (selective attention, Solomon Asch's conformity experiment, the Milgram obedience experiment and the bystander effect). Click here for the journalistic piece critiquing the experiment and click here for a reply from Zimbardo. It may be worth discussing this controversy with students. Even Zimbardo agrees that his experiment was unethical, and it is clear that the experiment is, to put it mildly, irregular from a scientific point of view. Yet it has captured the imagination of generations, arguably because it highlights the extent to which acquiring mastery over our lives is always an imperfect achievement, and the consequences of losing control over our lives can be extremely high. Much cutting-edge work in psychology and cognate disciplines points in this direction. So, although Zimbardo's experiment is questionable from both the ethical and scientific points of view, it nevertheless nicely exemplifies features of our lives that may be hard to accept, but which we ought to accept if we are genuinely committed to doing the hard work of bettering ourselves from the moral point of view.
The pull to conform, to defer to authority, to pass the buck, to focus too narrowly on the task at hand, and to lose ourselves in our roles impaired the ability of the participants to distance themselves from the forces pushing them to act as they did, setting them down the path of becoming ruthless guards or humiliated and emotionally broken prisoners. The uniforms (reflective sunglasses, batons, chains and prisoner gowns), the replacement of names with numbers, and the adoption of nicknames such as 'John Wayne' helped participants forget that they were in a mock prison. Some scholars, most notably John M. Doris (2002), defend the view that experiments such as this one show that people do not really have characters. If circumstances play such a decisive role in shaping the ways we behave, Doris argues, then it is not character that motivates people to act, but circumstances. This extreme position, however, can certainly be questioned. After all, not all guards behaved in the same way, and the same can be said about the prisoners. In fact, behaviour patterns varied significantly among participants, although they were all in one way or another deeply influenced by their particular situation.
It should be stressed that conformity plays an extremely important positive social role. The power of situation is also important in a positive way. It allows us to adapt quickly to situations, for instance. The ease with which we adapt, however, has pitfalls that are highlighted by the Stanford Prison Experiment. It should be noted that this discussion is related to debates about the impact of the environment and design of a particular organization on ethical behaviour, which are explored in Integrity and Ethics Module 8 (Behavioural Ethics).
The tendency discussed earlier to pass on the responsibility to groups can also lead to dishonest behaviour. It is easy to steal a little if everyone is doing it, the adverse consequences of stealing are minimal and, crucially, if we are able to tell ourselves stories that make us look like good honest people and steal at the same time. However, as the Fudge Factor tells us, the cost of stealing a little and thinking of ourselves as good honest people is that we end up distorting the lenses through which we see the world and, perhaps most importantly, ourselves.
In his book The (Honest) Truth About Dishonesty, Dan Ariely (2012) identifies a dissonance between wanting to be good and wanting to have the things we desire. This dissonance accounts for the fact that very few people become hardened crooks, but also for the fact that many of us are little cheaters who come to see the world and ourselves through distorted lenses. In other words, dishonesty is everywhere, but it is almost always kept within bounds. Ariely also explains why, in some cases, a series of small temptations leads small cheaters to give in and become big ones. In typical circumstances the pull to look good in our own eyes is not completely defeated by our rationalizing tendencies, but in some cases it can be.
In such cases the "solution" to the dissonance-producing competition between the desire to look ethical in our own eyes and the desire to get what we want is found in the rationalization that the good thing from the moral point of view coincides with our need to satisfy a desire by illicit means. Ariely calls the mechanism involved the "what the hell" effect. Click here for a fun illustration of the effect in action. In the illustration provided, the competition is between a prudential rather than an ethical "ought" (avoid eating cake because it is not yours, because it is not good for you, or for some other reason) and the powerful desire to eat mouth-watering cake in abundance.
Ariely suggests that in order to diminish crime we need to change incentive structures and create social conditions where dissonance-producing conflicts of interest are minimized, thus helping to neutralize the effect of our rationalizing tendencies. Ariely's book and the above issues are the focus of the Pre-class exercise of the Module.
The fact that we like to look good in our own eyes is a positive thing. It highlights just how important ethics is to us, and it tends to limit bad behaviour to some extent. It can also, however, become contaminated by our need to rationalize, which protects us from psychological unease. It is generally a good thing that we have desires whose satisfaction we believe will bring us advantages. However, ethical oughts and wants, in conjunction with the protective work of rationalizations, can also play distorting roles in our lives, as studied by Ariely, among others.
This Module highlights the extent to which taking responsibility for our lives is central to being ethical. Not to take responsibility is to let internal and external mechanisms drive our lives to an unacceptable degree, as when one is led by one's group to commit unspeakable acts, perhaps only later realizing the extent to which one has betrayed one's most deeply held values by letting the natural inclination to conform rule supreme.
One thing that should be stressed is the extent to which ethical failures are common and the extent to which our ability to take responsibility for our lives is diminished by ethical failures of the sort discussed in this Module. This Module could be used to trigger a process of ethical improvement - a process that requires students to commit themselves to working against the corrupting tendencies of many of the mechanisms that typically serve us well.
- Arendt, Hannah (2006). Eichmann in Jerusalem: A Report on the Banality of Evil. London: Penguin. (Originally published in 1963).
- Ariely, Dan (2012). The (Honest) Truth About Dishonesty: How We Lie to Everyone-Especially Ourselves. London: HarperCollins Publishers.
- Baumeister, Roy R. (1999). Evil: Inside Human Violence and Cruelty. New York: Henry Holt and Company.
- Bloom, Paul (2013). Just Babies: The Origins of Good and Evil. London: Random House.
- Biko, Steve (1987). I Write What I Like. Oxford: Heinemann.
- Camus, Albert (2013). The Rebel. London: Penguin. (Originally published in 1951).
- Doris, John M. (2002). Lack of Character: Personality and Moral Behavior. Cambridge: Cambridge University Press.
- Fanon, Frantz (2008). Black Skin, White Masks. London: Pluto. (Originally published in 1952).
- Franklin, Benjamin (1962). Autobiography of Benjamin Franklin. New York: MacMillan. (Originally published in 1791).
- Garcia, Stephen M. and others (2002). Crowded minds: the implicit bystander effect. Journal of Personality and Social Psychology, vol. 83, No. 4.
- Hallie, Philip (1998). Tales of Good and Evil, Help and Harm. New York: Harper Perennial.
- Midgley, Mary (2001). Wickedness: A Philosophical Essay. London: Routledge.
- Milgram, Stanley (1973). The perils of obedience. Harper's, vol. 247, No. 1483.
- Schwitzgebel, Eric and Joshua Rust (2013). The moral behavior of ethics professors: relationships among self-reported behavior, expressed normative attitude, and directly observed behavior. Philosophical Psychology, vol. 27, No. 3.
- Sereny, Gitta (1995). Into That Darkness: From Mercy Killings to Mass Murder. London: Pimlico. (Originally published in 1974).
- Tavris, Carol and Elliot Aronson (2015). Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts. New York: Houghton Mifflin Harcourt.
- Wittgenstein, Ludwig (2014). Lectures on Ethics. Hoboken, NJ: John Wiley & Sons. (The lectures were originally delivered in 1929).