
Intention, emotion and perception:

Why people think the way they think about other people's moral judgments
Miguel Garcia Barretto

Department of Psychology, Palma Hall Annex University of the Philippines Diliman 17th of September, 2013

Outline: Introduction, Findings, Conclusion, Directions

Key Questions
Why are we biased in judging other people's decisions in moral dilemmas (intentional judgment)?
What processes are involved: automatic, cognitive, affective, perceptual?
Is there a difference between self-assessed moral judgment and intentional judgment?


Why moral judgment?

Inconsistencies in our moral intuitions, e.g. the Trolley Problem (Thomson, 1986) and the Knobe effect (Knobe, 2003)
Heuristics and biases (Tversky and Kahneman, 1974, 1980)
Affect in decision making (Zajonc, 1980; Slovic et al., 2002; Damasio, 1994)
Adaptive intuitions (Gigerenzer, 2007; Hogarth, 2001)

A starting point for integrating various psychological and brain processes, e.g.
Haidt's (2001) social intuitionist model
Greene et al.'s (2004) dual process model
Cunningham et al.'s (2007) iterative reprocessing model


Example (from Tversky and Kahneman, 1981)


Imagine that the U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimates of the consequences of the programs are as follows:


If Program A is adopted, 200 people will be saved. (chosen by 72%)
If Program B is adopted, there is a 1/3 probability that 600 people will be saved, and a 2/3 probability that no people will be saved. (28%)
If Program C is adopted, 400 people will die. (22%)
If Program D is adopted, there is a 1/3 probability that nobody will die, and a 2/3 probability that 600 people will die. (78%)

Which of the two programs would you favor?
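The framing is purely presentational: in expected lives saved, Program A matches C and Program B matches D. A quick sketch (the helper function is a hypothetical illustration, using only the numbers above) makes the equivalence explicit:

```python
def expected_lives_saved(outcomes):
    """outcomes: list of (probability, lives_saved) pairs out of the 600 at risk."""
    return sum(p * saved for p, saved in outcomes)

program_a = [(1.0, 200)]                        # 200 saved for sure
program_b = [(1/3, 600), (2/3, 0)]              # 1/3 all saved, 2/3 none saved
program_c = [(1.0, 600 - 400)]                  # "400 will die" = 200 saved
program_d = [(1/3, 600 - 0), (2/3, 600 - 600)]  # "1/3 nobody dies, 2/3 all die"

assert expected_lives_saved(program_a) == expected_lives_saved(program_c) == 200
assert expected_lives_saved(program_b) == expected_lives_saved(program_d) == 200
```

Yet 72% favor A over B in the gain frame while 78% favor D over C in the loss frame: preferences reverse even though the gambles are identical.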


Why moral judgment? Its evolution


The developmental perspective on moral cognition (Piaget, 1932; Kohlberg, 1973)

The role of emotions, e.g. disgust (Haidt, 2001; Pizarro, Inbar and Helion, 2011)
Experimental philosophy, i.e. folk psychology (Knobe, 2003; Machery et al., 2008)

Neural mechanisms, e.g. vMPFC, ventral striatum, amygdala (Greene et al., 2001; Greene et al., 2004), RTPJ (Young et al., 2007), right anterior insular cortex (Shenhav and Greene, 2010)


Why moral judgment?

Applications of the integrated model in:
Game theory and neuroeconomics, i.e. expected utility and behavioral foundations (Camerer, Loewenstein and Prelec, 2005; Cohen, 2005; Shenhav and Greene, 2010)
Political and social neuroscience, e.g. beliefs, norms, cultures and ideology (Jost and Amodio, 2012; Amodio et al., 2007), and laws (Nadelhoffer et al., 2010)
Human cooperation (Rand, Greene and Nowak, 2012; Rand et al., 2013; Rand and Nowak, 2013)


Current research
Framing effects of philosophical questions (Garcia Barretto, 2013)
Moral judgment v. intentional judgment (Garcia Barretto and Dulay, 2013)
Affect and automatic processes in moral judgment (Garcia Barretto, Soto and Spath, 2013)
Neural basis of intentional judgment (ongoing)

1. Framing effects of philosophical questions

Thought experiment 1 (from Knobe, 2003)


The vice president of a company went to the Chairman of the Board and said, "We are thinking of starting a new program. It will help us increase profits, but it will also harm the environment." The Chairman of the Board answered, "I don't care about harming the environment. I just want to make as much profit as I can. Let's start the new program." They started the new program. Sure enough, the environment was harmed. Did the Chairman intentionally harm the environment?


Thought experiment 1 (from Knobe, 2003)


The vice president of a company went to the Chairman of the Board and said, "We are thinking of starting a new program. It will help us increase profits, and it will also help the environment." The Chairman of the Board answered, "I don't care about helping the environment. I just want to make as much profit as I can. Let's start the new program." They started the new program. Sure enough, the environment was helped. Did the Chairman intentionally help the environment?


Thought experiment 2 (Foot, 1978; Thomson, 1986)

A trolley has lost its brakes and is about to crash into 5 workers at the end of the track. You happen to be standing next to a lever that diverts the trolley onto a side track. Is it moral to pull the lever to save 5 but kill 1?


Thought experiment 2 (Foot, 1978; Thomson, 1986)

A trolley has lost its brakes and is about to crash into 5 workers at the end of the track. You happen to be standing next to a large man on top of a footbridge. Is it moral to push the large man to save 5 but kill 1?


The Knobe (side-effect) effect

Did the Chairman intentionally harm the environment? 82% Yes
Did the Chairman intentionally help the environment? 23% Yes
Inconsistency: the same non-intention in both cases, i.e. the Chairman did not care about the outcome

The Trolley Problem

Is it moral to pull the lever to save 5, kill 1? Yes
Is it moral to push the large man to save 5, kill 1? No
Inconsistency: the same (utilitarian) outcome in both cases, i.e. 5 saved, 1 killed
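For a sense of scale, the 82% v. 23% gap is far beyond sampling noise for any plausible sample size. A pooled two-proportion z-test sketches the point; the group sizes here are assumed for illustration, not taken from Knobe (2003):

```python
from math import sqrt

def two_prop_z(p1, n1, p2, n2):
    """Pooled two-proportion z statistic for H0: p1 == p2."""
    p = (p1 * n1 + p2 * n2) / (n1 + n2)   # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# 82% judged the harm intentional v. 23% for help; n = 40 per group is assumed.
z = two_prop_z(0.82, 40, 0.23, 40)
assert z > 1.96  # comfortably past the conventional 5% two-sided threshold
```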


Garcia Barretto (2013): Do the same (moral) intuitions operate in these philosophical thought experiments? What psychological processes are associated with these moral judgments?
Experiment: framing the Trolley Problem in terms of the Knobe effect
Eight question versions
N = 130 respondents recruited via a social networking website


Lever Scenario
A runaway trolley is heading down the tracks toward five people who will be killed if it proceeds on its present course. A man is standing beside the lever that could change the tracks. He can save these five people by pulling the lever and diverting the trolley onto a different set of tracks, one that has one person on it, but if he does this, that person will be killed. The man pulled the lever. The five people were saved but the other person was killed.


Lever Scenario
Harm framing: Did he intentionally kill the person on the other side of the tracks? Rate how much regret the man deserves for killing the other person, with 0 as the lowest and 6 as the highest.
Help framing: Did he intentionally help the five people? Rate how much praise the man deserves for saving the five people, with 0 as the lowest and 6 as the highest.


Footbridge Scenario
A runaway trolley is heading down the tracks toward five people who will be killed if it proceeds on its present course. A man is standing beside a fat man on a footbridge. He can save these five people by pushing the fat man to block the tracks, but if he does this, that person will be killed. The man pushed the fat man. The five people were saved but the fat man was killed.


Footbridge Scenario
Harm framing: Did he intentionally kill the fat man? Rate how much regret the man deserves for killing the fat man, with 0 as the lowest and 6 as the highest.
Help framing: Did he intentionally help the five people? Rate how much praise the man deserves for saving the five people, with 0 as the lowest and 6 as the highest.



Key Findings
Intentionality responses were associated with the lever scenario, while mean praise/regret ratings were associated with the footbridge scenario:
Morally acceptable = intention to save
Morally unacceptable = blameworthy
Greene's dual process model, seen through the Knobe effect: intentionality judgments (intentional or not) are linked with cognitive processes, while mean response ratings (praise or blame) are linked with emotional processes
Framing effects as a way of illuminating moral intuitions

2. Moral judgment v. Intentional judgment

What is going on with our moral intuitions?

Garcia Barretto and Dulay (2013): Is there a difference between our own assessments of moral dilemmas (participants as actors) and our perception of the moral intentions of others (participants as observers)?
What variables, i.e. moral reference points, can we use to infer the moral intuitions underlying moral judgment and intentional judgment?


Expected utility theory (EUT)

The core theory in the economics of decision making under risk
Challenged by Kahneman and Tversky's (1979) prospect theory; undergoing experimental and neuroeconomic testing (Knutson et al., 2005; Platt and Huettel, 2008; Tom et al., 2007)
Shenhav and Greene (2010) extended the basic EUT framework to decision making involving moral dilemmas
We extend their work to compare with intentional judgments
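In the basic EUT framework an act is scored as the probability-weighted sum of outcome utilities, EU = sum of p(outcome) x u(outcome). Applied to a moral gamble like the vent dilemma used in this study, the computation looks roughly like this; the utilities and the escape probability are hypothetical, chosen only to illustrate the arithmetic, not taken from the experiment:

```python
def expected_utility(outcomes):
    """outcomes: iterable of (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

# Hypothetical utilities: +1 per life saved, -1 per life lost.
p_escape = 0.3  # assumed chance the five workers escape on their own
block_vent = [(1.0, 5 - 1)]                       # five saved, one killed, for sure
do_nothing = [(p_escape, 5), (1 - p_escape, -5)]  # five escape, or five die

# On these numbers, EU(block) = 4 exceeds EU(do nothing) = -2.
assert expected_utility(block_vent) > expected_utility(do_nothing)
```

As p_escape rises, EU(do nothing) climbs toward +5 and eventually overtakes EU(block), which mirrors the probability manipulation in the design below.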


Experimental Design
N = 113 participants (74 male, 39 female; age 17-32, mean 23 years); 1,767 judgments collected
Task: 14 moral dilemma questions from 5 equivalent trolley scenarios (8 intentional action [treatment], 6 moral choice [control])

[Condition grid: Help/Harm framing x Survive/Death outcome, plus Moral (control) conditions]



Experimental Design
Self-assessed moral dilemma. You are the building manager. You know that the only way to avoid the deaths of the workers in Room A is to block the main vent immediately by hitting a button that will close one of the vent doors. You also know that if you close the vent door the gas will be diverted into a different room, Room B, and will kill one worker there. You are also aware that there is a chance that the workers in Room A will escape before the gas reaches them. This would be impossible for the one in Room B.


Experimental Design
Intentional action dilemma. The building manager is faced with a problem. He knows that the only way to avoid the deaths of the workers in Room A is to block the main vent immediately by hitting a button that will close one of the vent doors. He also knows that if he closes the vent door the gas will be diverted into a different room, Room B, and will kill one worker there. The building manager is also aware that there is a chance that the workers in Room A will escape before the gas reaches them. This would be impossible for the one in Room B. The building manager decided to block the vent. All the workers in Room A were saved but the one other worker in Room B was killed.


Results

[Results figures not preserved in the text version]

Key Findings
Salient pattern in outcome probability: own moral acceptability decreases as the probability that the victims will be saved increases
Intention to save also decreases; relative to our own moral preferences, the intention to save receives higher attribution
Intention to kill increases linearly
Framing effects change the slope: emphasis on the negative outcome makes it steeper, but there is no significant change in trajectory


Key Findings

With respect to our own moral preferences: we are kinder to others when attributing the intention to save victims, but just as hard on them as on ourselves when punishing them


3. Affect and automatic processes in moral judgment

Emotions on self v. other


[Figure: Scenario 1 v. Scenario 2]


Risk as feelings or moral decision?

Garcia Barretto, Soto and Spath (2013): Does more information matter in emotionally driven decisions?

Choice theory currently sees choice as an information set:
Information overload (Malhotra, 1984; Herbig and Kremer, 1994; Lee and Lee, 2004)
Cognitive overload, the paradox of choice (Schwartz, 2004)
Criticism: no emphasis on emotional or automatic responses
Our contention: why not information as choice?


Information as choice v. choice as information


Predictions in information as choice


Experimental Design

Process of Parts 1 to 3


Experimental Design
Stage 1: affective priming and choice justification
Stage 2: choice as information


Experimental Design
Stage 3: information priming, contrasting photo information (affective) with descriptive information (cognitive)



Experimental Design
Stage 4: decision making (acceptability)


Results
Participants' information-demand behavior changes across scenarios

[Figure: percentage of participants wanting to know more information]


Results
The effect of information on acceptability rates differs across scenarios

[Figure: acceptability rates conditional on requesting more information]


Results

[Figure: acceptability rates as a function of time]


Results
40% of participants exhibited both confirmation bias (Russian Roulette) and confirmation avoidance (Organ Donation)

[Figure: information demand across all rounds]


Key Findings
Affect-driven initial decisions are important in determining the final outcome:
Organ Donation: a majority of participants avoided information to maintain their initial decision
Russian Roulette: participants wanted to know more to confirm their initial fear of dying

Regressions show that the information selection process (information as choice) has at least as much explanatory power for participants' characteristics as the acceptability rates (choice as information)


Main Conclusions (so far)

Perception of the intentions of others differs from our own assessment of moral dilemmas
Issue: domain specific or domain general? If domain specific, which brain region? If domain general, how does the overall neural circuitry work?
Framing effects can inform us about our moral intuitions
Issue: what other toy problems beyond the Trolley Problem? How can we show that toy problems capture real conflicts in decision making?
Economic theory needs psychological and neural evidence
Issue: how to make these micro-foundations relevant at the policy level?

4. Neural basis of intentional judgment

Understanding the insula

[Figure from Shenhav and Greene (2010)]


Understanding the RTPJ

[Figure from Young and Tsoi (2013)]
