  This module is a resource for lecturers  

 

Key issues

 

People intuitively believe that ethical behaviour is a product of personal beliefs and characteristics, but there is increasing evidence that a person's context exerts a surprisingly powerful influence on behaviour. This Module adopts a psychological approach to understanding ethical behaviour. It addresses one of the most basic problems of ethics: why do ethical people sometimes behave unethically? Answering this question requires an understanding of fundamental psychological processes that can lead anyone down a slippery slope towards unethical behaviour, destroying careers and businesses and bringing shame to individuals and organizations. This Module complements other modules in the E4J Integrity and Ethics Module Series, although it offers a different perspective and works with different assumptions.

First, it is useful to understand the discourse surrounding ethical behaviour. Behavioural science has identified at least four common misunderstandings, or "myths", about ethical behaviour that can impair or bias our ability to manage it effectively. By "myth" we mean a belief that has some element of truth but is generally exaggerated or oversimplified. These four basic myths about ethical behaviour can be summarized as follows:

  • Myth 1: It's the individual: there are good people and bad people
  • Myth 2: It's all about motives
  • Myth 3: It's about ethical principles
  • Myth 4: Everyone is different

The first myth is that ethical behaviour is a property of individual people, such that there are good people who act well and bad people who act badly. And, of course, the presumption is that you can identify these good people and bad people. However, in reality most people behave ethically in some circumstances and unethically in other circumstances. Ethical character is not as stable as one might expect.

The second myth is that behaviour is guided by intention: bad acts are guided by bad intentions, and good acts by good intentions. This view, however, fails to take into account the power of context. Bad things can be done with good intentions, a phenomenon known as "ethical blindness".

The third myth is that it is all about ethical principles: ethical actions are guided by ethical reasoning. In reality, reasoning often follows action, serving to justify, explain, or rationalize it after the fact.

The fourth myth is that everyone is different, and everything is relative. However, most people and societies recognize a basic moral foundation to build upon, even amid differences generated by individual experiences, background, and immediate context.

Behavioural science has demonstrated that there are two often overlooked aspects of decision-making. The first relates to the way in which individuals make moral choices: psychological shortcuts, misperceptions, and temptations can divert even the best intentions. Understanding the dynamics and pitfalls of moral choice can help guide decisions towards ethical ones. The second relates to the ways in which social dynamics affect individual behaviour. Morality is influenced by the context people are in, not just by the type of people they are, and this contextual influence is more powerful than people generally expect. Ethics is not just a question of individual moral choice; it is influenced by society, peers, family, neighbours and colleagues. Ethics can therefore be treated as a design problem, in which social interactions play a critical role, rather than simply as a problem of beliefs or attitudes. When discussing these issues, lecturers can present the results of Pre-class surveys 1 and 2 from the Exercises section, which focus on self-righteousness (Klein and Epley, 2016) and moral foundations (Graham, Haidt and Nosek, 2009).

Current compliance programmes and policies to combat unethical behaviour are often based on the assumption that people will exploit an opportunity for misconduct whenever the expected gain outweighs the risk of negative consequences. In other words, individuals are assumed to weigh the probability of getting caught, and the ensuing sanctions, against the undue gain they could obtain through action or inaction. The policy implications of this view usually involve a high level of monitoring and enforcement. In reality, however, such stringent policies do not always work, particularly where unethical behaviour has become a norm, creating a collective action trap in which moral appeals fall on deaf ears.

Turning ethical principles into practice involves two basic steps. The first step is to understand the internal dynamics of moral choice making and the second step is to create norms that guide ethical action.

Moral choice is a dynamic process. Evidence shows that individuals balance their moral choices by continuously comparing their current moral self-perception with their own moral reference point. The moral reference point represents the level of integrity individuals perceive as morally acceptable for themselves. If they find their own action deviates too much from their personal reference point, they counteract. This is known as 'moral balancing'. This process of moral balancing is often unconscious. People do not like to be confronted with their own unethical behaviour. So they may apply justifications to make the dissonance between their moral standards and their actual behaviour appear less grave. When they do acknowledge the dissonance, they often feel bad about their behaviour, resulting in a desire to compensate or 'balance'.

Another related issue is "ethical unawareness": principles guide behaviour only when thinking about them, and people might not think about the principles when confronted with ethical dilemmas. To illustrate these points, the lecturer can discuss the investment advisor demonstration (Zhang and others, 2015) as an illustration of ethical (un)awareness (see Pre-class survey 3). The demonstration illustrates how people might overlook an ethical goal (recommending an ethical company to invest in) if they are focusing on another goal (in this case, maximizing financial profit). However, this demonstration requires some understanding of financial systems including mutual funds and investment advisors (additional information is provided in the pre-class survey to make concepts clearer). If students are unfamiliar with these concepts, then this demonstration can be omitted. In its place, lecturers can discuss similar examples of cases in which people overlook ethical principles while pursuing another goal, such as unfairly helping a friend to get a job, or taking bribes to benefit oneself in the short run that come at a cost to others in the long run.

Policies can affect the internal dynamics of moral choice making and encourage individuals to follow their moral compass and remain ethically aware by:

  • Raising the moral reference point by inducing people to compare their own actions against higher internal standards. Clearly defining ethical expectations and emphasizing the trust that the organization (and/or the public) has placed in individuals helps adjust the moral reference point upwards. Conversely, discouraging messages, surveillance and expressions of distrust can lower the moral reference point against which a person assesses his or her own behaviour.
  • Emphasizing the moral reference point. Even individuals with very high internal moral standards sometimes fail to follow them. When this happens, a dissonance arises between the (contemplated) behaviour and the moral reference point. Appealing to people's personal morality and encouraging them to reflect on the ethical consequences of their actions can lead to more ethical choices.

The second step in enhancing ethical practices is to understand the context in which decisions take place. Moral choices are usually not taken in isolation. In fact, most human decisions are driven by social motives such as loyalty, trust building, returning favours or helping someone out of a tricky situation. People take decisions in their own best interest, but they also care about what others think or do. Social motives can work in favour of or against ethical decisions. The opinions of outside observers usually matter to a decision maker: people prefer to act in a self-serving manner, but at the same time like to appear moral to others (Batson and others, 1999). Transparency and accountability mechanisms can thus reduce unethical behaviour. The perception that one's behaviour is visible and potentially observed introduces an element of accountability that makes misconduct more difficult to justify, because potential observers could easily see through an excuse.

Transparency could also create a 'social multiplier' effect if it triggers dialogue. For example, a committee whose work is publicly observable, and which occasionally receives comments or complaints from citizens, might feel more accountable to the public. The regular reminder to its members that their decisions affect citizens reduces the perceived distance between action and harm, and thus limits moral wiggle room.

"Reciprocity" plays a key role in most social interactions and it also lies at the core of many corrupt practices. Reciprocity can function as a motivation or excuse for engaging in corruption or unethical behaviour. Hiding behind good intentions can deter people from admitting the actual moral implications of misconduct to themselves or others. Typical justification patterns include:

  • Self-serving altruism: When someone else also benefits from misconduct, the other person's interest is used to justify the action in place of one's own. For example, the fact that a friend is helped to get a job overshadows the corrupt act, in this case nepotism (Ayal, Gino, Barkan and Ariely, 2015).
  • Robin Hood logic: A harm done to a stronger/powerful/richer entity is justified on the basis of a preference for equality.
  • Diffusion of responsibility: When several people engage in misconduct, the chances of one individual speaking up against it are reduced. Each individual feels less responsibility for the action and does not want to limit the other person's freedom of choice or indicate their distrust (Moore and Gino, 2013).

More recent evidence from research into behavioural ethics confirms the relevance of social norms and identities for moral choices. If one person lies or cheats without facing consequences, this behaviour can spread among friends or colleagues (Gino and Bazerman, 2009). In particular, gradual divergences from ethical behaviour tend to be more readily accepted by others, creating a slippery slope towards generalized dishonesty (Gino and Bazerman, 2009). A key factor in creating and strengthening behavioural norms is social identity (Akerlof and Kranton, 2011), defined as the role individuals assign to themselves in a group and to the group in society. It is therefore important to establish an ethical identity in organizations. A code of ethics can be used to emphasize ethical behaviour as a social norm.

In summary, ethical behaviour can be affected by changing the context in which people must make decisions and act: we can design contexts that help people avoid ethical risks, bring ethics to the top of people's minds, and motivate ethical behaviour. Key principles of behaviour design are:

  1. Making desired behaviour easy (remove barriers that make ethical actions harder than they need to be)

  2. Protection from risk (it is easy to underestimate ethical risks)

  3. Design to be better (no system is perfect, and one should not let perfection be the enemy of improvement)

A final note is that too much conversation about ethics focuses on unethical behaviour, rather than on positive examples of ethical conduct. An important component of designing a more ethical organization or society is to identify organizations or societies that seem to be having some success from an ethical perspective. A general overview of good practice in designing ethical public and private organizations can be found in other modules of the present module series, in particular, Module 11 (Business Integrity and Ethics), Module 13 (Public Integrity and Ethics) and Module 14 (Professional Ethics). Given the importance of positive examples, the present Module includes an exercise in which students choose and analyse their own case study of an ethical beacon, i.e. an organization or society that seems most ethical to them and that they might want to emulate (see Case study). Lecturers are also encouraged to discuss concrete examples of organizations designing more ethical systems into everyday practices of hiring, promoting, rewarding, and monitoring. As mentioned above, the Module also includes pre-class surveys (see Exercises section) that students could complete before taking the class, and which the lecturer could discuss during the class to illustrate important concepts of behavioural ethics.

 

References

  • Akerlof, George A. and Rachel E. Kranton (2011). Identity Economics: How Our Identities Shape Our Work, Wages, and Well-Being. Princeton, NJ: Princeton University Press.
  • Ayal, Shahar, Francesca Gino, Rachel Barkan and Dan Ariely (2015). Three Principles to REVISE People's Unethical Behaviour. Perspectives on Psychological Science, vol. 10, pp. 738-741.
  • Batson, Daniel, Elizabeth Thompson, Greg Seuferling, Heather Whitney and Jon A. Strongman (1999). Moral hypocrisy: Appearing moral to oneself without being so. Journal of Personality and Social Psychology, vol. 77, pp. 525-537.
  • Gino, Francesca and Max H. Bazerman (2009). When misconduct goes unnoticed: The acceptability of gradual erosion in others' unethical behavior. Journal of Experimental Social Psychology, vol. 45, pp. 708-719.
  • Graham, Jesse, Jonathan Haidt and Brian A. Nosek (2009). Liberals and conservatives rely on different sets of moral foundations. Journal of Personality and Social Psychology, vol. 96, pp.1029-1046.
  • Klein, Nadav and Nicholas Epley (2016). Maybe holier, but definitely less evil, than you: Bounded self-righteousness in social judgment. Journal of Personality and Social Psychology, vol.110, pp.660-674.
  • Moore, Celia and Francesca Gino (2013). Ethically adrift: How others pull our moral compass from true North, and how we can fix it. Research in Organizational Behaviour, vol. 33, pp. 53-77.
  • OECD (2018). Behavioural Insights for Public Integrity: Harnessing the Human Factor to Counter Corruption. OECD Public Governance Reviews. Paris: OECD Publishing.
  • Zhang, Ting, Pinar O. Fletcher, Francesca Gino and Max H. Bazerman (2015). Reducing bounded ethicality: How to help individuals notice and avoid unethical behaviour. Organizational Dynamics, vol. 44, No. 4, pp. 310-317.
 
