Obstacles to good ethical decision making and behavior, and some things you can do to overcome them

The process that leads to effective moral action can be roughly divided into three components:

1 – Moral awareness: the process of identifying the ethical issues involved, the parties who have a stake in the action, what is at stake, and what the action options are.

2 – Moral judgment: the process of weighing the ethical considerations that bear on the situation and determining the moral course of action.

3 – Acting in accordance with moral judgment: determining the right thing to do is not enough.  One still needs to form the intention to do the moral thing and deal with practical obstacles in order to act effectively.

Below are some of the central obstacles that may be faced along each step of the process, along with suggestions about how they can be overcome.


OBSTACLES TO GOOD MORAL AWARENESS


Low or myopic moral perception: Some people fail to see the moral dimensions of given situations.  Others have distorted moral vision that results largely from rationalization or from an unwillingness to focus on the problem so that it is seen clearly.  The rationalizations contribute to and reinforce the perceptual problem.  (Drumwright & Murphy, 2004)

To overcome this obstacle: Be alert to the impact of your actions on stakeholders and increase your awareness by taking on their perspectives of the situation.

Moral disengagement: A situation that would normally elicit a particular moral response from an observer fails to do so; the agent is “cold” to the situation. (Bandura, 1999)

To overcome this obstacle: Place yourself in the position of the person(s) likely to be harmed by the action and remind yourself of the golden rule: Do unto others as you would have them do unto you.

Morally inattentive informal norms and formal codes:  Such norms and codes create a background of expectations that can distort, obscure, or oversimplify the moral considerations present in a situation.  

To overcome this obstacle:  As members of organizations and communities, we can consciously strive to improve the content and practical applicability of our norms and codes of conduct.

Self-serving bias: People tend to look for information that will confirm their pre-existing views, to interpret information in ways that support their own view, and to selectively remember the information that supports their view.  (Hartman, 2008)

To overcome this obstacle: Actively seek alternative interpretations of the same data and persuasive arguments for them; doing so softens the ground for considering a different, perhaps less self-serving, interpretation.

Low moral intensity: Issues of low “moral intensity” will not be recognized as frequently as issues of high moral intensity because their ethical elements tend to stand out less from the background and to be seen as less emotionally interesting, concrete, and visually provocative.  When this is the case, the moral elements of the situation are likely to be obscured, thus limiting or distorting one’s moral awareness.  (Jones, 1991)

The following are the six components of moral intensity that Jones identifies:

1. Social consensus – the degree of social agreement about the moral value (e.g. evil) of a proposed act.

2. Magnitude of consequences – the sum of the benefits/harms done to victims/beneficiaries of the moral act in question.

3. Concentration of effect – how spread out or concentrated the harms/benefits of the proposed action are.

4. Probability of effect – a joint function of the probability that the act in question will actually take place and the probability that it will actually cause the harms/benefits predicted.

5. Temporal immediacy – the length of time between the present and the onset of consequences of the moral act in question.

6. Proximity – the feeling of nearness (social, cultural, psychological, or physical) that the moral agent has for the victims/beneficiaries of the evil/good act in question.

Stimuli are salient to the extent that they stand out from their backgrounds.  High intensity moral issues are more salient than low intensity issues because their effects are more extreme (magnitude), they stand out in some particular way (higher concentration of effect), or they involve significant others (proximity).

Stimuli are vivid to the extent that they are emotionally interesting, concrete, and proximate in a sensory, temporal, or spatial way.  High intensity moral issues are more vivid than low intensity ones because (a) their effects are emotionally interesting (magnitude or concentration), (b) they are more concrete (social consensus or probability of effect), or (c) they are more proximate.  Higher vividness means greater probability of moral recognition.

To overcome this obstacle: Compensate for the effect of low moral intensity by being mindful of the way the components of moral intensity affect you; consider whether your intuitions about the moral issue in question are being misdirected by the psychological predispositions these situational components trigger.


OBSTACLES TO GOOD ETHICAL DECISION-MAKING

Poor moral awareness: Poor moral awareness can either result in a failure to perceive the problem as an ethical problem at all (in which case one does not go through the steps of good ethical decision making), or present the agent with a distorted or insufficient picture of the problem to be resolved.

To overcome this obstacle: Identify the relevant obstacles to moral awareness and address them as suggested in the section above.

Failure to gather relevant facts: Good practical decisions require that we know the important facts relevant to the decision, such as those that help us determine the likely impact of the action on stakeholders.

To overcome this obstacle:

•    Make sure to take the time to gather whatever facts you need to come to a good decision.  Don’t rush to judgment before all the facts are in.

•    Since in many cases different stakeholders have interests that would be best served by different actions, it is important to carefully vet the facts one gathers to ensure that they provide as unbiased a picture of the situation as possible.

•    Precisely what facts or types of facts are necessary for a good judgment on the issue may not be clear at the beginning of the process.  As one considers the situation from the perspectives of different stakeholders, the need for additional facts (or for clarification of facts already gathered) will probably arise.

Rationalizing ourselves out of good moral decision-making:  It’s easy to convince ourselves that it’s acceptable to do whatever we’d like.  The following are poor, but unfortunately all too common, rationalizations we use to excuse our actions.  (Josephson, 2002)

•    If you have to do it, it’s ethical to do it
•    If it’s legal, then it’s moral
•    It’s just part of the job
•    It’s all for a good cause
•    I was just doing it for someone else’s sake
•    I’m just fighting fire with fire
•    It doesn’t hurt anyone
•    Everyone’s doing it
•    It’s OK if I don’t benefit personally
•    I deserve it

To overcome this obstacle:  Be mindful that these are rationalizations that we all commonly use – and avoid them.  Replace such rationalizations with a substantial and rigorous ethical deliberation process.

Insufficient attention/time given to the ethical decision-making process because the situation has low moral intensity (see the explanation of moral intensity above). (Jones, 1991)

To overcome this obstacle: Compensate for the effect of low moral intensity by being mindful of the way the components of moral intensity affect you; consider whether your intuitions about the moral issue in question are being misdirected by the psychological predispositions these situational components trigger.

Slippery Slope: People are willing to do unethical things because they have already done smaller, less extreme acts that make the bigger choice appear less (or not at all) unethical. (Hartman, 2008)

To overcome this obstacle: Consider the positive power in this tendency! Break down your challenges into smaller, immediately actionable steps and you can tackle larger problems.  You can also zoom out to the bigger picture of your life and consider how you will feel about your decision when the dust settles.

Sunk Costs and Loss Aversion: We tend to continue toward an unethical course of action simply because we are reluctant to accept that our prior choices or investments were wrong or wasted. (Hartman, 2008)

To overcome this obstacle: Focus on what you have already learned from the prior decision or investment, even if it did not produce a financial gain, rather than on recouping what has already been lost.

Common biases can unconsciously influence our decision-making process and result in unintentionally unethical conclusions.  (Messick and Bazerman, 1996)

Biases about the world:
 
1.  We tend to miscalculate the negative consequences of our behavior and the risks involved.

2.  We form inaccurate judgments about causation.

3.  We ignore low-probability events altogether.

4.  We deny uncertainty.

5.  We discount the future, giving disproportionately more weight to present consequences than to anticipated future consequences.

Biases about other people:

1.  Through ethnocentrism and stereotyping, we inaccurately believe that our own values and beliefs are superior to those of a different group.

2.  We can be misguided by our trust in an “authority heuristic” – we often trust in the wisdom, expertise, and experience of authority figures, but occasionally this trust is misplaced and the heuristic becomes a harmful bias.

Biases about ourselves:

1.  We have illusions of superiority (we’re morally better people than others), sometimes because we misremember the past in our favor.

2.  We have self-serving perceptions of fairness.

3.  Overconfidence in our abilities causes us to mispredict our future ethical behavior.

To overcome this obstacle:  Be mindful of these biases and rationalizations that we all commonly use – and avoid them.  Replace such rationalizations with a substantial and rigorous ethical deliberation process.

OBSTACLES TO TAKING MORAL ACTION, ONCE A MORAL JUDGMENT HAS ALREADY BEEN REACHED

Rationalizing immoral action by deciding that morality just isn’t all that important:  Even after one has concluded that an action is immoral, it’s easy to convince oneself that doing the morally right thing isn’t important enough, given other considerations (economic, self-interest, etc.).  This is especially easy to do in situations where acting morally appears to be against one’s interests.

To overcome this obstacle:  

•  Be mindful of this tendency – ask yourself whether it is something you truly believe, or merely an excuse for self-serving action.

•  Remember that you expect others to act morally and criticize those who fail to do so as callous, selfish, or evil.  Keep in mind that your actions reflect upon and determine your own character – especially when faced with decisions such as this one.  Do you want to be the kind of person who chooses self-serving action over morally right action?  Do you want to be a hypocrite who criticizes the immoral actions of others while excusing your own?

Obedience to Authority: We tend to obey those in authority, even when they direct us to perform actions we believe are unethical.  (Hartman, 2008)

To overcome this obstacle:  

•  Keep in mind that you are ultimately responsible for your own actions, and that others will hold you responsible for these actions.

•  Identify alternate authorities to serve as role models.  

Consensus/Peer Pressure:  We have a tendency to succumb to peer pressure, both because we want to “fit in” and succeed within an organization and because our actual thinking is changed. (Hartman, 2008)

To overcome this obstacle:

•  Keep in mind that you are ultimately responsible for your own actions, and that others will hold you responsible for these actions.

•  The more mindful we are of this pressure, the less power it has over us.   

•  Build your own group of like-minded individuals to create peer pressure that is more in line with your personal values.

The Inside/Outside Struggle: We do not want to be cast out for being different.  In many cultures, if you diverge, you are shunned.  The consequences for pushing against the in-group of the organization can include: blackballing, excommunication, disfellowship, discharge, expulsion, and denial. (Hartman, 2008)

To overcome this obstacle:  

•  Consider the impact of inaction.  Will your failure to do what’s right result in continual harm to others and/or the violation of others' rights?

•  Keep in mind that once enough people diverge, a new majority arises.

Perception that we have little influence over events: The less control we believe we have over an event, the less we tend to perceive ourselves as responsible for the outcomes we bring about or allow to happen.  Situations where the consequences of our actions are far removed from us are particularly likely to give us this perception, even when the impact of our actions is considerable, and they typically make it more difficult for us to form the intention to perform moral action and to act upon that intention.  (Jones, 1991)

To overcome this obstacle: Be mindful of the tendency to minimize the importance of your action – particularly in situations where the impact of your actions is not immediately evident.


And a general obstacle, which affects awareness, judgment, and action: lack of experience.  All of the above can be improved with practice and self-reflection.  

To overcome this obstacle:  Ethics education will probably not transform you into an ethical person overnight, but good ethics education that gives you practice dealing with difficult situations and increases your awareness of and self-reflection about how you approach ethical problems can surely help.  Such practice has been shown to make people feel more comfortable when faced with ethical challenges and to increase the sophistication of their moral reasoning.

 
Dr. Shlomo Sher
USC Levan Institute for Humanities and Ethics
 


CITED SOURCES:

Bandura, A. (1999). “Moral Disengagement in the Perpetration of Inhumanities.” Personality and Social Psychology Review, 3(3), 193-209.

Drumwright, M. E., & Murphy, P. E. (2004). “How Advertising Practitioners View Ethics: Moral Muteness, Moral Myopia, and Moral Imagination.” Journal of Advertising, Summer 2004.

Hartman, L. P. (2008). “Ethical Decision-Making: Processes and Frameworks.” Conference presentation.

Jones, T. M. (1991). “Ethical Decision Making by Individuals in Organizations: An Issue-Contingent Model.” The Academy of Management Review, April 1991.

Josephson, M. S. (2002). Making Ethical Decisions. Josephson Institute of Ethics.

Messick, D. M., & Bazerman, M. H. (1996). “Ethical leadership and the psychology of decision making.” Sloan Management Review, 9-22.