
The Argument and Some Objections

Feinberg and Willer’s (2013) brilliant argument that cultural differences in moral psychology drive political conflict over climate change faces some compelling theoretical and empirical objections. If the objections are right, they leave us with a puzzle. Given that the evidence for cultural variation in moral psychology is at best weak, and given that the theoretical argument for moral reframing is flawed, why does moral reframing seem to work?

If the video isn’t working, you could also watch it on youtube. Or you can view just the slides (no audio or video).

This recording is also available on stream (no ads; search enabled).

If the slides are not working, or you prefer them full screen, please try this link.

Notes

We have explored Feinberg and Willer’s argument that cultural differences in moral psychology explain political conflict on climate change.

I broke this into five considerations:

  1. ‘Moral convictions and the emotions they evoke shape political attitudes’ (see Do Ethical Attitudes Shape Political Behaviours?)
  2. Moral Foundations Theory is true (see Moral Pluralism: Beyond Harm; Moral Foundations Theory: An Approach to Cultural Variation; and Operationalising Moral Foundations Theory)
  3. ‘liberals and conservatives possess different moral profiles’ (see Liberals vs Conservatives)
  4. ‘liberals express greater levels of environmental concern than do conservatives in part because liberals are more likely to view environmental issues in moral terms’ (see Moral Psychology Drives Environmental Concern)
  5. ‘exposing conservatives to proenvironmental appeals based on moral concerns that uniquely resonate with them will lead them to view the environment in moral terms and be more supportive of proenvironmental efforts.’ (see Framing Changes Ethical Attitudes)

At this point you should understand the argument. You should also understand how it aims to support the claim that cultural differences in moral psychology explain political conflict on climate change.

What is a philosopher doing here? On the face of it, the argument is simply a (brilliant) piece of social science. No philosopher needed.1

But the argument gives rise to a puzzle. To see the puzzle, first consider some objections.

Objection 1 (weak)

What does the Moral Foundations Questionnaire measure?

On the Social Intuitionist Model of Moral Judgement (which is a part of Moral Foundations Theory; see Moral Foundations Theory: An Approach to Cultural Variation), unreflective ethical judgements are consequences of moral foundations plus cultural learning.

This gives us reason to think that your answers to the questions will reflect your culture.

If moral disengagement is real (see Moral Disengagement: The Evidence), unreflective ethical judgements are in part consequences of reasoning from known principles. (They may also be consequences of moral foundations and cultural learning.)

In this case, your answers may not reflect your culture.

More generally, objections to the Social Intuitionist Model of Moral Judgement are objections to the theoretical justification for supposing that the Moral Foundations Questionnaire can get at cultural differences in moral psychology.

This is an objection to the claim that we know the third of the five points above (‘liberals and conservatives possess different moral profiles’) to be true.

Objection 2

Another, complementary objection to the third of the five points above (‘liberals and conservatives possess different moral profiles’) concerns measurement invariance.

As we have already seen (in Operationalising Moral Foundations Theory), attempts to demonstrate scalar invariance have all, or nearly all, failed; and Iurino & Saucier (2020) even fail to find support for the five-factor model, which casts doubt on whether the Moral Foundations Questionnaire meets requirements for internal validity.

We are therefore not justified in using the Moral Foundations Questionnaire to compare means across different groups. But this is exactly what the claim that ‘liberals and conservatives possess different moral profiles’ requires us to do.

(Note that this objection, like Objection 1, seeks to establish that we do not know Claim 3; it is not an argument that this claim is false.)

Objection 3: Joan-Lars-Joseph

The evidence on cultural variation suggests that socially conservative participants tend to regard all five foundations as roughly equally morally relevant.

This does not generate the prediction that socially conservative participants will be more likely to view climate issues as ethical issues when those issues are linked to one foundation (e.g. purity) than when linked to another (e.g. harm).

Contrast Feinberg & Willer (2019, p. 4):

‘Why does moral reframing work? The primary explanation is that morally reframed messages are influential because targets perceive a “match” between their moral convictions and the argument in favor of the other side’s policy position.’

The Joan-Lars-Joseph objection2 is this: if we take the claims about cultural differences in moral psychology to be true, framing environmental issues in terms of purity should not cause conservatives to perceive more or less of a “match” than framing environmental issues in terms of harm.

This is an objection to the theoretical argument for the fourth claim in the five points above (‘liberals express greater levels of environmental concern than do conservatives in part because liberals are more likely to view environmental issues in moral terms’).

Note that Objections 2 and 3 are complementary: #2 aims to show that we lack evidence that liberals and conservatives differ in their moral psychology; #3 assumes that we have such evidence and aims to show that it does not support the conclusion about moral framing.

A Puzzle

Given that the evidence for cultural variation in moral psychology is at best weak (Objections 1 and 2), and given that the theoretical argument for moral reframing is flawed (Objection 3), why does moral reframing seem to work?

Glossary

moral conviction : ‘Moral conviction refers to a strong and absolute belief that something is right or wrong, moral or immoral’ (Skitka, Bauman, & Sargis, 2005, p. 896).
moral disengagement : Moral disengagement occurs when self-sanctions are disengaged from inhumane conduct. Bandura (2002, p. 103) identifies several mechanisms of moral disengagement: ‘The disengagement may centre on redefining harmful conduct as honourable by moral justification, exonerating social comparison and sanitising language. It may focus on agency of action so that perpetrators can minimise their role in causing harm by diffusion and displacement of responsibility. It may involve minimising or distorting the harm that follows from detrimental actions; and the disengagement may include dehumanising and blaming the victims of the maltreatment.’
Moral Foundations Theory : The theory that moral pluralism is true; moral foundations are innate but also subject to cultural learning, and the Social Intuitionist Model of Moral Judgement is correct (Graham et al., 2019). Proponents often claim, further, that cultural variation in how these innate foundations are woven into ethical abilities can be measured using the Moral Foundations Questionnaire (Graham, Haidt, & Nosek, 2009; Graham et al., 2011). Some empirical objections have been offered (Davis et al., 2016; Davis, Dooley, Hook, Choe, & McElroy, 2017; Doğruyol et al., 2019). See Moral Foundations Theory: An Approach to Cultural Variation.
Social Intuitionist Model of Moral Judgement : A model on which intuitive processes are directly responsible for moral judgements (Haidt & Bjorklund, 2008). One’s own reasoning does not typically affect one’s own moral judgements; instead (outside philosophy, perhaps), it serves mainly to provide post-hoc justification after moral judgements are made. Reasoning does affect others’ moral intuitions, and so provides a mechanism for cultural learning.

References

Bandura, A. (2002). Selective Moral Disengagement in the Exercise of Moral Agency. Journal of Moral Education, 31(2), 101–119. https://doi.org/10.1080/0305724022014322
Davis, D., Dooley, M., Hook, J., Choe, E., & McElroy, S. (2017). The Purity/Sanctity Subscale of the Moral Foundations Questionnaire Does Not Work Similarly for Religious Versus Non-Religious Individuals. Psychology of Religion and Spirituality, 9(1), 124–130. https://doi.org/10.1037/rel0000057
Davis, D., Rice, K., Tongeren, D. V., Hook, J., DeBlaere, C., Worthington, E., & Choe, E. (2016). The Moral Foundations Hypothesis Does Not Replicate Well in Black Samples. Journal of Personality and Social Psychology, 110(4). https://doi.org/10.1037/pspp0000056
Doğruyol, B., Alper, S., & Yilmaz, O. (2019). The five-factor model of the moral foundations theory is stable across WEIRD and non-WEIRD cultures. Personality and Individual Differences, 151, 109547. https://doi.org/10.1016/j.paid.2019.109547
Feinberg, M., & Willer, R. (2013). The Moral Roots of Environmental Attitudes. Psychological Science, 24(1), 56–62. https://doi.org/10.1177/0956797612449177
Feinberg, M., & Willer, R. (2019). Moral reframing: A technique for effective and persuasive communication across political divides. Social and Personality Psychology Compass, 13(12), e12501. https://doi.org/10.1111/spc3.12501
Graham, J., Haidt, J., Motyl, M., Meindl, P., Iskiwitch, C., & Mooijman, M. (2019). Moral Foundations Theory: On the advantages of moral pluralism over moral monism. In K. Gray & J. Graham (Eds.), Atlas of Moral Psychology. New York: Guilford Publications.
Graham, J., Haidt, J., & Nosek, B. A. (2009). Liberals and conservatives rely on different sets of moral foundations. Journal of Personality and Social Psychology, 96(5), 1029–1046. https://doi.org/10.1037/a0015141
Graham, J., Nosek, B. A., Haidt, J., Iyer, R., Koleva, S., & Ditto, P. H. (2011). Mapping the moral domain. Journal of Personality and Social Psychology, 101(2), 366–385. https://doi.org/10.1037/a0021847
Haidt, J., & Bjorklund, F. (2008). Social intuitionists answer six questions about moral psychology. In W. Sinnott-Armstrong (Ed.), Moral psychology, Vol 2: The cognitive science of morality: Intuition and diversity (pp. 181–217). Cambridge, Mass: MIT press.
Iurino, K., & Saucier, G. (2020). Testing Measurement Invariance of the Moral Foundations Questionnaire Across 27 Countries. Assessment, 27(2), 365–372. https://doi.org/10.1177/1073191118817916
Kivikangas, J. M., Fernández-Castilla, B., Järvelä, S., Ravaja, N., & Lönnqvist, J.-E. (2021). Moral foundations and political orientation: Systematic review and meta-analysis. Psychological Bulletin, 147(1), 55–94. https://doi.org/10.1037/bul0000308
Milfont, T. L. L., Davies, C. L., & Wilson, M. S. (2019). The Moral Foundations of Environmentalism. Social Psychological Bulletin, 14(2), 1–25. https://doi.org/10.32872/spb.v14i2.32633
Schein, C., & Gray, K. (2015). The Unifying Moral Dyad: Liberals and Conservatives Share the Same Harm-Based Moral Template. Personality and Social Psychology Bulletin, 41(8), 1147–1163. https://doi.org/10.1177/0146167215591501
Skitka, L. J., Bauman, C., & Sargis, E. (2005). Moral Conviction: Another Contributor to Attitude Strength or Something More? Journal of Personality and Social Psychology, 88(6), 895–917.
  1. This is too quick. Philosophers sometimes act as cheerleaders. Nothing wrong with that, unless you think philosophy is about deriving truths using reason alone. But if you think that, you were very badly informed when you decided to take this module (sorry). 

  2. Thanks to Joan, Lars and Joseph. (I think they each came up with a version of this objection independently.)