Dual process theory (moral psychology)


This argument is an improved version of the "Emotions Bad, Reasoning Good" argument. It is claimed that emotion-driven processes tend to involve fast heuristics, which makes them unreliable. It follows that deontological intuitions, being an emotional form of reasoning themselves, should not be trusted. According to Berker, this line of thought is also flawed, because forms of reasoning that consist in heuristics are usually those in which we have a clear notion of what is right and wrong. Hence, in the moral domain, where these notions are highly disputed, "it is question begging to assume that the emotional processes underwriting deontological intuitions consist in heuristics". Berker also challenges the very assumption that heuristics lead to unreliable judgements. Additionally, he argues that, as far as we know, consequentialist judgements may also rely on heuristics, given that it is highly unlikely that they could always be the product of accurate and comprehensive mental calculations of all the possible outcomes.
removing time pressure leads to an increase in consequentialist responses. Being under cognitive load while making a moral judgement decreases consequentialist responses. In contrast, solving a difficult math problem before making a moral judgement (meant to make participants more skeptical of their intuitions) increases the number of consequentialist responses. When asked to explain or justify their responses, subjects preferentially chose consequentialist principles, even for explaining characteristically deontological responses. Further evidence shows that consequentialist responses to trolley-problem-like dilemmas are associated with deficits in emotional awareness in people with alexithymia or psychopathic tendencies. On the other hand, subjects primed to be more emotional or empathetic give more characteristically deontological answers.
is presented to people, but instead of flicking a switch, subjects are asked whether they would push a fat man onto the rails in order to stop the trolley, intuitions usually have it that pushing the fat man is the wrong choice. Given that both actions lead to the saving of five people, why is one judged to be right and the other wrong? According to Greene, there is no moral justification for this difference of intuitions between the 'switch' and the 'fat man' trolley case. Instead, what leads to the difference is the morally irrelevant fact that the 'fat man' case involves the use of personal force (thus leading most people to judge that pushing the fat man is the wrong action), whereas the 'switch' case does not (thus leading most people to judge that flicking the switch is the right action).
605:"push the bystander." On the other hand, System 1 activates an "alarm bell" emotion: "do not harm the bystander." Greene and colleagues claim that the prohibition to harm is "nonnegotiable": It cannot be weighed against other values, including utilitarian considerations. They argue that "ntractable dilemmas arise when psychological systems produce outputs that are... non-negotiable because their outputs are processed as absolute demands, rather than fungible preferences." The dual-process model predicts that subtle changes in context will cause people to flip between extreme judgments: deontic and utilitarian. Moral compromises will be infrequent "trembling hand mistakes." 369:, he was involved in an accident: an "iron rod used to cram down the explosive powder shot into Gage's cheek, went through the front of his brain, and exited via the top of his head". Surprisingly, not only Gage survived, but he also went back to his normal life just in less than two months. Although his physical capacities were restored, however, his personality and his character radically changed. He became vulgar and anti-social: "Where he had once been responsible and self-controlled, now he was impulsive, capricious, and unreliable". Damasio wrote: "Gage was no longer Gage." Moreover, also his moral intuitions were transformed. Further studies by means of 866:
unlike consequentialist intuitions, emotion-based deontological intuitions are the side effects of this evolutionary adaptation to the pre-existing environment. Therefore, "deontological intuitions, unlike consequentialist intuitions, do not have any normative force". Berker states that this is an incorrect conclusion because there is no reason to think that consequentialist intuitions are not also by-products of evolution. Moreover, he argues that the invitation, advanced by Singer, to separate evolutionary-based moral judgements (allegedly unreliable) from those that are based on reason, is misleading because it is based on a
– appears to be activated for characteristically consequentialist judgements. While it is not clear how crucial a role this region plays in moral judgements, one can argue that all moral judgements seem to involve at least some emotional processing. This would disprove the simplest version of the dual-process hypothesis. Greene responded to this argument by proposing that the emotions that drive deontological judgements are "alarmlike", whereas those present during consequentialist judgements are "more like currency", a response Berker regards as lacking empirical backing. It has been argued, in particular, that a multitude of attitudes towards the agents involved is important in evaluating an individual's moral stance, as well as evaluating the motivations that may inform those decisions. Kahane and Shackel scrutinize the questions and dilemmas Greene et al. use, and claim that the methodology used in the neuroscientific study of intuitions needs to be improved. However, after Kahane and colleagues engineered a set of moral dilemmas specifically meant to falsify Greene's theory, their moral dilemmas turned out to confirm it instead.
that this is morally wrong, but Greene suggests that this intuition is the result of incest historically being evolutionarily disadvantageous. However, if the siblings take extreme precautions, such as vasectomy, in order to avoid the risk of genetic mutation in their offspring, the cause of the moral intuition is no longer relevant. In such cases, scientific findings have given us reason to ignore some of our moral intuitions, and in turn to revise the moral judgements based upon them.
satisfying three conditions: (a) the action in question could reasonably be expected to lead to bodily harm, (b) the harm is inflicted on particular persons or members of a particular group, and (c) the harm does not result from diverting a previously existing threat onto another party. All other dilemmas were classed as 'impersonal'. It was observed that when responding to personal dilemmas, the subjects displayed increased activity in regions of the brain associated with emotion (the medial
621:> 1 people are saved for each one sacrificed. Their experimental results show that people make many compromise judgments, which respect the axioms of rational choice. These results contradict the dual-process model, which claims that only deontic judgments are the product of moral rationality. The results indicate the existence of a moral tradeoff system that weighs competing moral considerations and finds a solution that is most right, which can be a compromise judgment. 262: 226: 499:
personal and close interactions with others. In the past century, our social organizations have been altered, and these types of interactions have become less frequent. Therefore, Singer argues that we should rely on more sophisticated consequentialist judgements, which fit our modern times better than the deontological judgements that were useful for more rudimentary interactions.
Susan Case", where the only way to save five people who sit on a Lazy Susan is to push the Lazy Susan into an innocent bystander, killing him; this serves as a counter-example. Although the thought experiment involves personal harm, the philosopher Frances Kamm arrives at an intuitive consequentialist judgement, holding that it is permissible to kill one to save five.
572:"inappropriate" responses. Because of this way of calculating, the differences from question to question significantly skewed the results, Berker points out that some questions involved "easy" cases that should not be classified as dilemmas. This is because of the way these cases were framed, people found one of the choices to be obviously inappropriate. 256:
happens to be very large. The only way to save the lives of the five workmen is to push this stranger off the bridge and onto the tracks below where his large body will stop the trolley. The stranger will die if you do this, but the five workmen will be saved. Is it appropriate for you to push the stranger onto the tracks in order to save the five workmen?
We often rely on our "automatic settings" and allow intuitions to guide our behaviour and judgement. In "manual mode", judgments draw from both general knowledge about "how the world works" and explicit understanding of special situational features. The operations of this "manual mode" system require effortful conscious deliberation. Even if there is no disagreement about the underlying moral principles that govern the disputes. "If indeed we're wired for tribalism," Wright explains, "then maybe much of the problem has less to do with differing moral visions than with the simple fact that my tribe is my tribe and your tribe is your tribe. Both Greene and
causing the deaths of the five workmen. The only way to avoid the deaths of these workmen is to hit a switch on your dashboard that will cause the trolley to proceed to the right, causing the death of the single workman. Is it appropriate for you to hit the switch in order to avoid the deaths of the five workmen?
Berker criticises both premises and the move from C1 to C2. Regarding P1, Berker is not convinced that deontological judgments are correctly characterized as merely appealing to factors that make the dilemma personal. For example, Kamm's 'Lazy Susan' trolley case is an example of a 'personal' dilemma
that differ across the dimension of personal force. When people are asked whether it would be right or wrong to flick a switch in order to divert a trolley from killing five people, their intuitions usually indicate that flicking the switch is the morally right choice. However, when the same scenario
Third, Berker argues that Greene's criteria for classifying moral dilemmas as impersonal or personal do not map onto the distinction between deontological and consequentialist moral judgements. It is not the case that consequentialist judgements only arise in cases involving impersonal factors. Berker highlights the "Lazy
judgements. According to him, moral constructivism searches for reasonable grounds, whereas deontological judgements rely on hasty and emotional responses. Singer argues that our most immediate moral intuitions should be challenged. A normative ethic must not be evaluated by the extent to which it matches
gives a better descriptive account of how these moral norms are derived from evolutionary processes and natural selection. For example, selective pressures favour self-sacrifice for the benefit of the group and punish those who fail to make such sacrifices. This provides a better explanation of the cost-benefit ratio for
research, some scientists have argued that serial and parallel models fail to capture the true nature of the interaction between dual process systems. They contend that some operations that are commonly said to belong to the deliberative system can in fact also be cued by the intuitive system, and we
Serial models assume that there is initially an exclusive focus on the intuitive system to make judgements but that this default processing might be followed by deliberative processing at a later stage. Greene et al.'s model is usually placed within this category. In contrast, in a parallel model, it
There is a lack of agreement on whether and how the two processes interact with one another. It is unclear whether deontological responders, for example, rely blindly on the intuitively cued response without any thought of utilitarian considerations, or whether they recognise the alternative utilitarian
Greene then states that the evidence for dual-process theory might give us reason to question judgements based upon moral intuitions, in cases where those intuitions might rest upon morally irrelevant factors. He gives the example of incestuous siblings. Intuition might tell us
also points to a possible dissociation between emotional and rational decision processes. Damage to this area is typically associated with antisocial personality traits and impairments of moral decision making. Patients with these lesions tend to show a more frequent endorsement of the "utilitarian"
cite studies in which people were randomly divided into two groups and immediately favored members of their own group in allocating resources -- even when they knew the assignment was random." Instead, Wright proposes that "nourishing the seeds of enlightenment indigenous to the world's tribes is a
The dual process model, however, rules out the possibility of moral compromises. According to Greene and colleagues, people experience the footbridge problem as a dilemma because "two processes yield different answers to the same question". On the one hand, System 2 outputs a utilitarian judgment:
brain regions when presented with situations involving the use of personal force (e.g. the 'footbridge' case). The dorsolateral prefrontal cortex and the parietal lobe are 'cognitive' brain regions; subjects show increased activity in these two regions when presented with impersonal moral dilemmas.
In addition, Greene's results show that some brain areas, such as the medial prefrontal cortex, the posterior cingulate/precuneus, the posterior superior temporal sulcus/inferior parietal lobe, and the amygdala, are associated with emotional processes. Subjects exhibited increased activity in these
The Footbridge Case: "A runaway trolley is heading down the tracks toward five workmen who will be killed if the trolley proceeds on its present course. You are on a footbridge over the tracks, in between the approaching trolley and the five workmen. Next to you on this footbridge is a stranger who
Greene takes such observations as a point of departure to argue that judgments produced by automatic-emotional processes lack normative force in comparison to those produced by conscious-controlled processes. Relying on automatic, emotional responses when dealing with unfamiliar moral dilemmas would
utilitarian and deontological inclinations, but only by dissociating these moral inclinations using a more advanced protocol that was not used in early dual-process research. Further, there is evidence that utilitarian decisions are associated with more emotional regret than deontological
ignores the motivational aspect of decision making in human social contexts. A more specific example of this criticism focuses on the ventromedial prefrontal cortex lesion data. Although patients with this damage display characteristically "cold-blooded" behaviour in the trolley problem, they show
Singer relies on evolutionary theories to justify his claim. For most of our evolutionary history, human beings have lived in small groups where violence was ubiquitous. Deontological judgements linked to emotional and intuitive responses were developed by human beings as they were confronted with
Greene points to a large body of evidence from cognitive science suggesting that the inclination towards deontological or consequentialist judgment depends on whether emotional-intuitive reactions or more calculated ones were involved in the judgment-making process. For example, encouraging deliberation or
in order to adapt to, handle, and promptly respond to such situations of violence within their groups. Cases of impersonal violence, by contrast, do not raise the same innate alarm and therefore leave room for a more accurate and analytical judgement of the situation. Thus, according to this argument,
Greene's 2008 article "The Secret Joke of Kant's Soul" argues that Kantian/deontological ethics tends to be driven by emotional responses and is best understood as rationalization rather than rationalism: an attempt to justify intuitive moral judgments post hoc, although the author states that his
Greene and his colleagues carried out fMRI experiments in order to investigate which regions of the brain were activated in subjects while responding to 'personal dilemmas' such as the footbridge dilemma and 'impersonal dilemmas' such as the switch dilemma. 'Personal dilemmas' were defined as any
Models of the former category lend support to the view that humans, in an effort to minimise cognitive effort, will choose to refrain from the more demanding deliberative system where possible. Only utilitarian responders will have opted into it. This further implies that deontological responders
pointed out that moral dilemmas were a recurrent adaptive problem for ancestral humans, whose social life created multiple responsibilities to others (siblings, parents and offspring, cooperative partners, coalitional allies, and so on). Intermediate solutions, ones that strike a balance between
in "personal" dilemmas, while those choosing the "deontological" path remained unaffected. Cognitive load, in general, is also found to increase the likelihood of "deontological" judgment These laboratory findings are supplemented by work that looks at the decision-making processes of real-world
Greene concedes that his analogy has limited force. While a photographer can switch back and forth between automatic and manual mode, the automatic-intuitive processes of human reasoning are always active: conscious deliberation needs to "override" our intuitions. In addition to that, automatic
The Switch Case: "You are at the wheel of a runaway trolley quickly approaching a fork in the tracks. On the tracks extending to the left is a group of five railway workmen. On the tracks extending to the right is a single railway workman. If you do nothing the trolley will proceed to the left,
The appropriateness of applying our intuitive and automatic mode of reasoning to a given moral problem thus hinges on how the process was formed in the first place. Shaped by trial-and-error experience, automatic settings will only function well when one has sufficient experience of the
as deriving from natural phenomena common to all humans. For instance, he mentions the "common or natural cause of our passions" and the generation of love for others represented through self-sacrifice for the greater good of the group. Hume's work is sometimes cited as an inspiration for
This is the belief that a widespread, common negative intuition towards something is evidence that there is something morally wrong with it. It opposes Greene's conclusion that intuitions should not be expected to "perform well" or give us good ethical reasoning for some ethical problems.
argument is speculative and will not be conclusive. Several philosophers have written critical responses, mainly criticising the necessary linking of process (automatic or controlled, intuitive or counterintuitive/rational) with content (respectively, deontological or utilitarian).
that focus on "best results" can be explained by the dual-process organisation of the human mind. Ethical decisions that fall under 'right action' correspond to automatic-emotional (System 1) processing, whereas those aiming at 'best results' correspond to conscious-controlled (System 2) reasoning.
Berker's second methodological worry is that Greene et al. presented the response-time data for moral dilemmas in a statistically invalid way. Rather than calculating the average difference in response time between the "appropriate" and "inappropriate" responses for
involves slow and deliberative reasoning. Moral judgments of this type are less influenced by the immediate emotional features of decision-making. Instead, they may draw from general knowledge and abstract moral conceptions, combined with a more controlled analysis of situational
response. Regarding P2, he argues that the factors that make a dilemma personal or impersonal are not necessarily morally irrelevant. Moreover, he adds, P2 is 'armchair philosophizing': it cannot be deduced from neuroscientific results that the closeness of a dilemma bears on its
path in trolley problem dilemmas. Greene et al. claim that this shows that when emotional information is removed through context or damage to brain regions necessary to render such information, the process associated with rational, controlled reasoning dominates decision making.
camera which operates in two complementary modes: automatic and manual. A photographer can either employ the automatic "point-and-shoot" setting, which is fast and highly efficient, or adjust and refine settings in manual mode, which gives the photographer greater flexibility.
Trolley problems cannot be used to test for non-negotiability, because they force extreme responses (e.g., push or do not push). So, to test the prediction, Guzmán and colleagues designed a sacrificial moral dilemma that permits compromise judgments, of the form "sacrifice
Other criticisms focus on the methodology of using moral dilemmas such as the trolley problem. These criticisms note the lack of affective realism in contrived moral dilemmas and their tendency to use the actions of strangers to offer a view of human moral sentiments.
This has implications for philosophical discussion of what Greene calls "unfamiliar problems", ethical problems with which we have inadequate evolutionary, cultural, or personal experience. We might have to attentively revise our intuitions for topics like
For example, he considers the normative statement "capital juries make good judgements". Scientific findings could lead us to revise this judgement if it were found that capital juries were, in fact, sensitive to race, provided we accept the uncontroversial normative
as morally wrong. However, a consequentialist judgement leads to a different conclusion. As the brother and sister did not tell anyone and used contraceptives, the incest did not have any harmful consequences. Thus, in that case, incest is not necessarily wrong.
"Characteristically deontological judgements are preferentially supported by automatic emotional responses, while characteristically consequentialist judgments are preferentially supported by conscious reasoning and allied processes of cognitive control".
Berker has raised three methodological worries about Greene's empirical findings. First, it is not the case that only deontological judgements are tied to emotional processes. In fact, one region of the brain traditionally associated with the emotions –
for two reasons. First, because there is no support for the claim that emotionally driven intuitions are less reliable than those guided by reason. Secondly, because the argument seems to rely on the assumption that deontological intuitions involve
Another critical piece of evidence supporting the dual process account comes from reaction time data associated with moral dilemma experiments. Subjects who choose the "utilitarian" path in moral dilemmas showed increased reaction times under high
, global terrorism, global poverty, etc. As Greene states, this does not mean that our intuitions will always be wrong, but it means we need to pay attention to where they come from and how they fare compared to more rational argument.
With regard to automatic settings, Greene says we should only rely on these when faced with a moral problem that is sufficiently "familiar" to us. Familiarity, on Greene's conception, can arise from three sources: evolutionary history,
and personal experience. It is possible that fear of snakes, for instance, can be traced to genetic dispositions, whereas a reluctance to place one's hand on a stove is caused by the previous experience of burning one's hand on a hot stove.
It draws on the idea that our different moral responses towards personal and impersonal harms are evolutionarily based. Since personal violence has existed since ancient times, humans developed emotional responses as innate
levels will likely be ineffective in improving their overall moral agency, because such a disposition relies heavily on psychological, social and situational contexts, as well as their deeply held convictions and beliefs. Rather,
Greene subsequently proposes that this vindicates consequentialism. He rejects deontology as a moral framework and holds that deontological theories may be reduced to "post-hoc" rationalisations of arbitrary emotional responses.
More recent methodological concerns stem from new evidence suggesting that deontological inclinations are not necessarily more emotional or less rational than utilitarian inclinations. For example, cognitive reflection predicts
will not experience any conflict from the "utilitarian pull" of the dilemma: they have not engaged in the processing that gives rise to these considerations in the first place. In contrast, in a parallel model both utilitarian
victim, as opposed to the weaker emotional response experienced when responding to the suffering of a large-scale, anonymous group (even though the benefit conferred by the subject would be of equal utility in both cases).
conflicting moral values, would have often promoted fitness better than neglecting one value to fully satisfy others. A capacity to make intermediate or "compromise" judgments would have been favored by natural selection.
Kass attempts to make a case against human cloning on the basis of the widespread strong feelings of repugnance at cloning. He lists examples of the various unpalatable consequences of cloning and appeals to notions of
is responsible for weighing up the consequentialist response against the emotional response. Thus, three brain regions are primarily implicated in the making of moral judgements. This gives way to what Greene calls the
explains justice from the evolutionary perspective by stating that the instinct of reciprocity improved fitness for survival; those who did not reciprocate were therefore considered cheaters and cast off from the group.
When we are dealing with unfamiliar* moral problems, we ought to rely less on automatic settings (automatic emotional responses) and more on manual mode (conscious, controlled reasoning), lest we bank on cognitive
primarily due to the deleterious effects that can arise when entrusting emotional, un-reasoned responses to tackling complex ethical issues, which can only be adequately addressed via rationality and reflection.
Figure 1: Schematic representation of Greene's dual-process model of moral judgement. This figure describes the processes underlying individuals' judgement about (a) the Trolley dilemma and (b) the Footbridge
that is associated with emotional processes. Hence, the claim that deontological judgements are less reliable than consequentialist judgements because they are influenced by emotions cannot be justified.
A diagram depicting how Greene supposes the affective (automatic) response would override the consequentialist (manual) response in the footbridge case (below), but not in the switch case (above).
response but, on consideration, decide against it. These alternative interpretations point to different models of interaction: a serial (or "default-interventionist") model, and a parallel model.
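The contrast between these two interaction patterns can be sketched in code. The following is a hedged toy rendering of the distinction just described; the function names, fields, and numbers are our own illustrative assumptions, not a published model.

```python
# Toy contrast between the serial and parallel interaction models
# described above (illustrative values only).
def serial_judgement(dilemma, override_threshold=0.5):
    # Default-interventionist: the intuitive response comes first, and
    # deliberation intervenes only if the conflict signal is strong enough.
    if dilemma["conflict_signal"] > override_threshold:
        return dilemma["deliberative_response"]
    return dilemma["intuitive_response"]

def parallel_judgement(dilemma):
    # Both systems are engaged from the start; the stronger output wins.
    candidates = [
        (dilemma["intuitive_strength"], dilemma["intuitive_response"]),
        (dilemma["deliberative_strength"], dilemma["deliberative_response"]),
    ]
    return max(candidates)[1]

footbridge = {
    "intuitive_response": "do not push",
    "deliberative_response": "push",
    "conflict_signal": 0.7,
    "intuitive_strength": 0.8,
    "deliberative_strength": 0.4,
}
print(serial_judgement(footbridge))   # deliberation overrides: "push"
print(parallel_judgement(footbridge)) # intuition outweighs it: "do not push"
```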
It is argued that moral decisions are better understood as integrating emotional, rational, and motivational information, the last of which has been shown to involve areas of the brain in the
models in light of this evidence. Hybrid models would lend support to the notion of a "utilitarian intuition": a utilitarian response cued by the automatic, "emotion-driven" cognitive system.
The original fMRI investigation proposing the dual process account has been cited in excess of 2000 scholarly articles, generating extensive use of similar methodology as well as criticism.
Greene is not making the claim that moral judgements based on emotion are categorically bad. His position is that the different "settings" are appropriate for different scenarios.
has called Joshua Greene's proposal for global harmony ambitious, adding, "I like ambition!" But he also claims that people have a tendency to see facts in a way that serves their
that they face conflicting responses, but they do not engage in deliberative processing to a sufficient extent to enable them to override the intuitive (deontological) response.
one's moral responses in a flexible, reason-sensitive, and context-dependent way would be a more reliable, and in most cases more desirable, means to agential moral enhancement.
relevance. Eventually, Berker concludes that even if we accept P1 and P2, C1 doesn't necessarily entail C2. This is because it may be the case that consequentialist intuitions
intuitions imply abstract reasoning. Therefore, deontological intuitions don't have any normative force, whereas consequentialist intuitions do. Berker claims that this is
decisions. Evidence like this complicates dual process theorists' claims that utilitarian thinking is more rational or that deontological thinking is more emotional.
As a matter of fact, Greene's research itself shows that consequentialist responses to personal moral dilemmas involve at least one brain region, the
altruists in life-or-death situations. These heroes overwhelmingly described their actions as fast, intuitive, and virtually never as carefully reasoned.
While the three bad arguments identified by Berker are not explicitly made by Greene and Singer, Berker considers them implicit in their reasoning.
"P1. The emotional processing that gives rise to deontological intuitions responds to factors that make a dilemma personal rather than impersonal.
representation of a two-proposal ultimatum game. Player 1 can offer a fair (F) or unfair (U) proposal; player 2 can accept (A) or reject (R).
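A minimal sketch of the game in the figure follows; the payoffs are illustrative values we chose for the example, not stakes taken from any cited study.

```python
# Two-proposal ultimatum game from the figure, with made-up payoffs.
# Player 1 offers a fair (F) or unfair (U) split; player 2 accepts (A),
# giving each player their share, or rejects (R), leaving both with nothing.
PAYOFFS = {
    ("F", "A"): (5, 5),
    ("F", "R"): (0, 0),
    ("U", "A"): (8, 2),
    ("U", "R"): (0, 0),
}

# A purely payoff-maximising responder accepts any positive offer; rejecting
# an unfair offer therefore counts as "irrational" in this narrow sense,
# which is the behaviour at issue in the lesion findings discussed here.
for (offer, response), (p1, p2) in PAYOFFS.items():
    print(f"offer={offer} response={response} -> player1={p1}, player2={p2}")
```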
Greene ties the two processes to two existing classes of ethical theories in moral philosophy. He argues that the existing tension between
is too quick to conclude utilitarianism specifically from the general goal of constructing an impartial morality; for example, he says,
The dual-process theory of moral judgement asserts that moral decisions are the product of one of two distinct mental processes.
Joshua Greene thought that this could explain the difference in moral intuitions between different versions of the trolley problem: "We
better bet than trying to convert all the tribes to utilitarianism -- both more likely to succeed, and more effective if it does."
critically analyzed four arguments that might be inferred from Greene and Singer's conclusion. He labels three of them as merely
debate that more is required than the amplification of certain emotions. Increasing an agent's empathy by artificially raising
Several scientific criticisms have been leveled against the dual process account. One asserts that the dual emotional/rational
), while when they responded to impersonal dilemmas, they displayed increased activity in regions of the brain associated with
C1. So, the emotional processing that gives rise to deontological intuitions responds to factors that are morally irrelevant.
respond to morally irrelevant factors. Unless we can show that this is not the case, the inference from C1 to C2 is invalid.
showed a correlation between such "moral" and character transformations and injuries to the ventromedial prefrontal cortex.
those moral intuitions. He gives the example of a brother and sister who secretly decide to have sex with each other using
As an illustration of his dual-process theory of moral reasoning, Greene compares the two processes in the human brain to a
There are two versions of the trolley problem, the trolley driver dilemma and the footbridge dilemma, presented as follows.
Drawing on neuroscientific experiments in which subjects were confronted with ethical dilemmas following the logic of
or "bad arguments", and the last one as "the argument from irrelevant factors". According to Berker, all of them are
judgments, on the other hand, seem to be supported by conscious-controlled processes and deliberative reasoning.
C2. So, deontological intuitions, unlike consequentialist intuitions, do not have any genuine normative force."
Arguments for the dual process theory relying on neuroimaging data have been criticized for their reliance on
A diagram showing the three main regions of the brain which Greene thinks are responsible for moral judgments.
In particular, the role of empathy in morality has recently been heavily criticized by commentators such as
behaviours and judgments. The factors affecting moral judgment of this type may be consciously inaccessible.
In a widely cited critique of Greene's work and the philosophical implications of the dual process theory,
to evaluate the brain activities and responses of people confronted with different variants of the famous
This exemplifies the potential for empathy to 'misfire' and motivates the widely shared consensus in the
settings of our brains are not necessarily "hard-wired", but can be changed through (cultural) learning.
to show that our disgust is the emotional expression of deep wisdom that is not fully articulable.
Berker argued that the most promising argument from neural "is" to moral "ought" is the following. One such commentator describes it as "prone to biases that render moral judgment potentially harmful." Similarly
287: 283: 2937: 964:, who describes it as "prone to biases that render moral judgment potentially harmful." Similarly 421: 147:
are preferentially supported by automatic-emotional processes and intuitions. Characteristically
86: 64: 1958:
Greene, J. D. (2001-09-14). "An fMRI Investigation of Emotional Engagement in Moral Judgment".
762: 2864:
Guzmán, Ricardo Andrés; Barbato, María Teresa; Sznycer, Daniel; Cosmides, Leda (2022-10-18).
1489: 196:
deontological responders will have engaged both processing systems. Deontological responders
136:
claims that the two processes can be linked to two classes of ethical theories respectively.
2779:
Greene, Joshua (2007). "The secret joke of Kant's Soul". In Sinnott-Amstrong, Walter (ed.).
2476:"Risking your life without a second thought: intuitive decision-making and extreme altruism" 1325:
Railton P (July 2014). "The Affective Dog and Its Rational Tale: Intuition and Attunement".
885:
P2. The factors that make a dilemma personal rather than impersonal are morally irrelevant.
394:
it better to save five rather than one life. And the feeling and the thought are distinct."
2877: 2703:
2542: 2487: 2204: 1967: 1043: 818: 810: 671: 630: 519: 482: 420:
is often given an evolutionary rationale (in this basic sense, the theory is an example of
144: 8: 3303:
3000:
725: 511: 417: 378: 204: 51: 2881: 2546: 2491: 2208: 1971: 1911: 1884: 1047: 26:
that posits that human beings possess two distinct cognitive subsystems that compete in
3340: 3173: 3148: 3129: 3094: 3059: 2973: 2908: 2865: 2756: 2731: 2685: 2645: 2620: 2601: 2566: 2510: 2475: 2456: 2408: 2383: 2364: 2273: 2225: 2192: 2168: 2141: 2122: 2087: 2037: 1999: 1941: 1865: 1781: 1733: 1685: 1641: 1598: 1555: 1512: 1470: 1400: 1350: 1307: 1209: 1166: 1123: 1067: 831: 779: 3002:. Moral psychology. Vol. 3. Cambridge, Massachusetts: MIT Press. pp. 35–80. 2191:
3463: 3455: 3407: 3357: 3347: 3320: 3316: 3226: 3178: 3133: 3098: 3063: 3011: 3007: 2965: 2957: 2913: 2895: 2761: 2747: 2716: 2650: 2558: 2515: 2448: 2413: 2307: 2300: 2265: 2230: 2173: 2126: 2079: 2041: 1991: 1983: 1916: 1857: 1849: 1812: 1773: 1633: 1602: 1590: 1547: 1543: 1462: 1458: 1442: 1354: 1342: 1299: 1201: 1158: 1115: 1059: 997: 271: 258:(Most people judge that it is not appropriate to push the stranger onto the tracks.) 47: 3248:"You Can't Learn About Morality from Brain Scans: The problem with moral psychology" 2689: 2605: 2570: 2460: 2444: 2399: 2368: 2091: 1869: 1785: 1737: 1689: 1645: 1559: 1516: 1404: 1213: 1170: 1154: 3403: 3312: 3218: 3168: 3160: 3121: 3086: 3051: 3003: 2977: 2949: 2903: 2885: 2751: 2743: 2712: 2677: 2640: 2636: 2632: 2593: 2550: 2505: 2495: 2440: 2403: 2395: 2356: 2277: 2257: 2220: 2212: 2163: 2153: 2114: 2071: 2029: 2003: 1975: 1906: 1896: 1841: 1804: 1765: 1723: 1675: 1625: 1582: 1539: 1504: 1474: 1454: 1392: 1334: 1311: 1291: 1193: 1150: 1127: 1105: 1097: 1071: 1051: 814: 478: 275: 80: 69: 3090: 2821: 1901: 1769: 1629: 1586: 988:, where subjects exhibit a much stronger emotional reaction to the suffering of a 3125: 2953: 2500: 2140:
Boes AD, Grafft AH, Joshi C, Chuang NA, Nopoulos P, Anderson SW (December 2011).
1101: 969: 945: 930: 646: 358: 342: 238: 129: 59: 27: 23: 2584:
and others, the theory can be seen as a domain-specific example of more general
2681: 2261: 2075: 1808: 1234:
721: 636: 564:
moral dilemma, Greene et al. calculated the average response time of the
524: 447: 404: 295: 148: 43: 35: 3459: 2597: 2360: 2142:"Behavioral effects of congenital ventromedial prefrontal cortex malformation" 2118: 1728: 1680: 1663: 1508: 1396: 579:
Notwithstanding the above, Greene has taken this later criticism into consideration.
188:
is assumed that both processes are simultaneously engaged from the beginning.
3481: 3324: 3279:"Why Can't We All Just Get Along? The Uncertain Biological Basis of Morality" 2961: 2899: 2158: 1987: 1853: 1845: 1777: 1637: 1594: 1466: 1346: 1303: 867: 748: 666:
Greene first argues that scientific findings can help us reach interesting
597: 528: 487: 303: 125: 3361: 3222: 2890: 2554: 1979: 1055: 783: 3467: 3182: 2969: 2917: 2847: 2765: 2654: 2562: 2519: 2452: 2417: 2269: 2234: 2177: 2083: 2033: 1995: 1920: 1861: 1551: 1205: 1162: 1119: 1063: 934: 740: 470: 370: 362: 251:(Most people judge that it is appropriate to hit the switch in this case.) 2796:"Notes on 'The Normative Insignificance of Neuroscience' by Selim Berker" 1883:
961: 918: 462: 2216: 2193:"Damage to the prefrontal cortex increases utilitarian moral judgements" 3164: 2384:"Cognitive load selectively interferes with utilitarian moral judgment" 1711: 1110: 752: 455: 432: 55: 2382:
3212: 3039: 922: 848: 667: 532: 450: 428: 365:. On the 13th of September 1848, while working on a railway track in 279: 261: 109: 79:
The dual-process theory has had significant influence on research in
72:
implications of the theory, which has started an extensive debate in
1197: 952:
and their role in philosophy, and intuitions' relationship to them.
944:
There is a widespread debate on the role of moral emotions, such as
225: 3077:
3055: 2938:"Finding faults: How moral dilemmas illuminate cognitive structure" 2834: 1338: 1295: 1001: 787: 523:
more likelihood of endorsement of emotionally laden choices in the
440: 307: 291: 167:
Dual-process moral reasoning is an effective response to a similar
3040:"Moral Implications from Cognitive (Neuro)Science? No Clear Route" 949: 938: 791: 695: 675: 466: 366: 31: 826:
emotional processes whereas consequentialist intuitions involve
357:
A popular medical case, studied in particular by neuroscientist
310:
is primarily responsible for the emotional response, whilst the
34:, the other slow, requiring conscious deliberation and a higher 873: 491: 346: 73: 2732:"Methodological Issues in the Neuroscience of Moral Judgement" 2336:. Princeton, NJ: Princeton University Press. pp. 137–139. 1943:
1033: 917:
Many philosophers appeal to what is colloquially known as the
591: 469:, which is born out of the ability to detect those who cheat. 3149:"On the Wrong Track: Process and Content in Moral Psychology" 2381: 2057:"Can cognitive processes be inferred from neuroimaging data?" 454:
the generation of love for others as originally mentioned by
2863: 929:
presents a prime example of a feelings-based response to an
1664:"An assessment of the temporal dynamics of moral decisions" 830:
abstract reasoning. For Berker, this assumption also lacks
62:"system1"/"system 2" distinction popularised in his book, 22:
within moral psychology is an influential theory of human
2532: 2190: 1087: 706:
In light of these considerations, Greene formulates the "
490:. Our first intuitive reaction is a firm condemnation of 1490:"The Science of Morality and its Normative Implications" 1009:
it is likely that augmenting higher-order capacities to
1882: 755:
offer other impartial approaches to ethical questions.
678:
that capital juries ought not to be sensitive to race.
2430: 2302:
685: 645:
One illustration of this tension is provided by intuitions about
2667: 2139: 3423:"Why Paul Bloom Is Wrong About Empathy and Morality" 1029: 1027: 661: 3374: 3339: 2299: 1940: 1832:De Neys, Wim (January 2012). "Bias and Conflict". 1529: 1383:Singer P (October 2005). "Ethics and Intuitions". 1024: 3479: 2870:Proceedings of the National Academy of Sciences 2822:https://doi.org/10.1016/j.cognition.2019.06.007 2327: 2325: 2323: 2293: 2291: 2289: 2287: 1487: 845:The second bad argument presented by Berker is 624: 306:). In recent work, Greene has stated that the 2936:Cushman, Fiery; Greene, Joshua D. (May 2012). 2015: 2013: 1798: 1443:"The Normative Insignificance of Neuroscience" 1436: 1434: 1277: 1275: 1273: 1271: 1269: 1267: 1265: 1236:Moral Psychology: The Neuroscience of Morality 1083: 1081: 178: 3272: 3270: 3268: 2935: 2729: 2618: 1755: 1709: 1615: 1432: 1430: 1428: 1426: 1424: 1422: 1420: 1418: 1416: 1414: 1378: 1376: 1374: 1372: 1370: 1368: 1366: 1364: 1263: 1261: 1259: 1257: 1255: 1253: 1251: 1249: 1247: 1245: 1229: 1227: 1225: 1223: 461:Another example of an evolutionarily derived 143:Moral judgments that can be characterised as 42:along with Brian Sommerville, Leigh Nystrom, 2612: 2526: 2467: 2424: 2375: 2320: 2284: 2241: 2184: 2133: 2048: 1523: 874:The Argument from Morally Irrelevant Factors 439:' we find speculations about the origins of 108:is fast and unconscious, which gives way to 3198:"The Secret Emptiness of Greene's Argument" 2473: 2010: 1572: 1078: 978:"narrow-minded, parochial, and innumerate", 743:has argued that Joshua Greene, in his book 731: 635:of ethics that focus on "right action" and 592:Failure of the non-negotiability hypothesis 390:that we shouldn't push the fat man. But we 3265: 2535:Annals of the New York Academy of Sciences 2347:Singer P (2005). "Ethics and Intuitions". 1411: 1361: 1242: 1220: 3239: 3210: 3172: 2907: 2889: 2755: 2644: 2509: 2499: 2407: 2224: 2167: 2157: 2098: 1910: 1900: 1727: 1679: 1488:Bruni T, Mameli M, Rini RA (2013-08-25). 1109: 858:"The Argument from Evolutionary History". 813:intuitions are driven by emotions, while 538: 411: 3394:Prinz, Jesse (2011). "Against empathy". 2848:https://doi.org/10.1177/0146167219897662 2054: 506: 502: 349:focusing on patients with damage to the 260: 224: 85: 3420: 3375:Scarantino A, de Sousa R (2018-09-25). 2331: 2297: 1831: 1324: 361:, was that of American railroad worker 3480: 3276: 3146: 3111: 3076: 2998:. In Sinnott-Armstrong, Walter (ed.). 2990: 2793: 2778: 2346: 2247: 2019: 1957: 1938: 1440: 1382: 1281: 1233: 1183: 1140: 955: 912: 797: 773: 215: 3445: 3443: 3393: 3245: 2931: 2929: 2927: 2859: 2857: 2855: 2730:Kahane G, Shackel N (November 2010). 2702: 2104: 1934: 1932: 1930: 1834:Perspectives on Psychological Science 1801:Psychology of Learning and Motivation 1751: 1749: 1747: 1710:BiaĹ‚ek, MichaĹ‚; De Neys, Wim (2017). 1705: 1703: 1701: 1699: 1657: 1655: 477:Peter Singer agrees with Greene that 3449: 3337: 3302: 3037: 2619:Koenigs M, Tranel D (January 2007). 2022:The Law & Ethics of Human Rights 1661: 444:contemporary dual process theories. 3381:Stanford Encyclopedia of Philosophy 2794:Greene, Joshua (14 December 2010). 2583: 896:which elicits a characteristically 95: 30:processes: one fast, intuitive and 13: 3440: 2924: 2852: 1927: 1744: 1696: 1652: 1238:. Cambridge, MA: MIT Press: 35–79. 972:: The Case for Rational Compassion 670:conclusions, without crossing the 481:judgements are to be favored over 68:. Greene has often emphasized the 14: 3504: 3195: 397: 154: 3421:Cummins, Denise (Oct 20, 2013). 
3408:10.1111/j.2041-6962.2011.00069.x 3317:10.1111/j.1088-4963.2009.01165.x 2993:"The secret joke of Kant's soul" 2835:https://doi.org/10.1037/a0031021 2748:10.1111/j.1468-0017.2010.01401.x 2717:10.1111/j.1467-9736.2011.00701.x 2256:(8): 322–3, author reply 323–4. 1544:10.1111/j.1467-9280.2006.01834.x 1459:10.1111/j.1088-4963.2009.01164.x 986:'The Identifiable victim effect' 337: 169:efficiency-flexibility trade-off 3414: 3387: 3368: 3331: 3305:Philosophy & Public Affairs 3296: 3204: 3189: 3140: 3105: 3070: 3031: 2984: 2839: 2826: 2813: 2787: 2772: 2723: 2696: 2661: 2577: 2445:10.1016/j.cognition.2012.05.011 2400:10.1016/j.cognition.2007.11.004 2340: 1951: 1876: 1825: 1792: 1758:Journal of Cognitive Psychology 1618:Journal of Cognitive Psychology 1609: 1566: 1481: 1447:Philosophy & Public Affairs 1155:10.1016/j.cognition.2017.03.004 809:argument. According to it, our 708:No Cognitive Miracles Principle 220: 3396:Southern Journal of Philosophy 3008:10.7551/mitpress/7504.003.0004 2637:10.1523/JNEUROSCI.4606-06.2007 1318: 1177: 1134: 807:"Emotions Bad, Reasoning Good" 596:Guzmán, Barbato, Sznycer, and 351:ventromedial prefrontal cortex 312:Ventromedial prefrontal cortex 300:Dorsolateral prefrontal cortex 16:Theory of human moral judgment 1: 3091:10.1080/09515089.2018.1426100 2055:Poldrack, R (February 2006). 1902:10.3390/jintelligence11040076 1770:10.1080/20445911.2016.1156118 1630:10.1080/20445911.2016.1156118 1587:10.1080/13546783.2016.1216011 1017: 3277:Wright R (23 October 2013). 3126:10.1080/09515089.2013.849381 3038:Lott, Micah (October 2016). 2954:10.1080/17470919.2011.614000 2783:. MIT Press. pp. 35–80. 2670:Trends in Cognitive Sciences 2501:10.1371/journal.pone.0109687 2474:Rand DG, Epstein ZG (2014). 2306:. New York: Grosset/Putnam. 2250:Trends in Cognitive Sciences 2064:Trends in Cognitive Sciences 1803:. Elsevier. pp. 33–58. 1716:Judgment and Decision Making 1668:Judgment and Decision Making 1186:Nature Reviews. Neuroscience 1102:10.1016/j.neuron.2004.09.027 625:Alleged ethical implications 117:conscious-controlled process 7: 3342:The ethics of human cloning 2991:Greene, Joshuea D. (2007). 2625:The Journal of Neuroscience 1441:Berker S (September 2009). 179:Interaction between systems 106:automatic-emotional process 10: 3509: 2682:10.1016/j.tics.2007.06.011 2262:10.1016/j.tics.2007.06.004 2076:10.1016/j.tics.2005.12.004 1809:10.1016/bs.plm.2014.09.002 856:The third bad argument is 3214:Kant on Emotion and Value 2598:10.1007/s12559-012-9181-0 2361:10.1007/s10892-005-3508-y 2119:10.1007/s12152-010-9077-1 1729:10.1017/S1930297500005696 1681:10.1017/S1930297500003636 1662:Koop, Gregory. J (2013). 1509:10.1007/s12152-013-9191-y 1397:10.1007/s10892-005-3508-y 686:Greene's "indirect route" 317:Central Tension Principle 3114:Philosophical Psychology 3079:Philosophical Psychology 2159:10.1186/1471-2377-11-151 1846:10.1177/1745691611429354 1575:Thinking & Reasoning 732:Philosophical criticisms 437:Treatise of Human Nature 288:Inferior parietal lobule 284:Superior temporal sulcus 141:Central Tension Problem: 38:. Initially proposed by 3223:10.1057/9781137276650_8 2891:10.1073/pnas.2214005119 2555:10.1196/annals.1440.005 1980:10.1126/science.1062872 1939:Greene, Joshua (2014). 1889:Journal of Intelligence 1056:10.1126/science.1062872 662:Greene's "direct route" 554:the posterior cingulate 422:evolutionary psychology 65:Thinking, Fast and Slow 3493:Psychological theories 3452:Moral neuroenhancement 3246:Nagel T (2013-11-02). 
2034:10.1515/lehr-2015-0011 1015: 984:An example of this is 893: 717: 632:deontological theories 568:"appropriate" and the 539:Methodological Worries 515: 412:Evolutionary rationale 266: 230: 92: 3379:. In Zalta EN (ed.). 2586:Cognitive Computation 2349:The Journal of Ethics 1532:Psychological Science 1385:The Journal of Ethics 1007: 880: 782:philosophy professor 712: 510: 503:Scientific criticisms 264: 228: 89: 52:dual process accounts 3450:Earp, Brian (2017). 3217:. pp. 146–165. 3147:Kahane, Guy (2012). 2781:Big Moral Psychology 927:Wisdom of Repugnance 638:utilitarian theories 208:need to think about 48:Jonathan David Cohen 3153:Mind & Language 2942:Social Neuroscience 2882:2022PNAS..11914005G 2876:(42): e2214005119. 2801:(Unpublished notes) 2736:Mind & Language 2547:2008NYASA1124..161M 2492:2014PLoSO...9j9687R 2217:10.1038/nature05631 2209:2007Natur.446..908K 1972:2001Sci...293.2105G 1966:(5537): 2105–2108. 1048:2001Sci...293.2105G 956:The role of empathy 913:Intuition as wisdom 847:"The Argument from 837:posterior cingulate 798:Three bad arguments 774:Berker's criticisms 726:genetic engineering 703:situation at hand. 656:cognitive miracles" 451:evolutionary theory 418:dual process theory 216:Scientific evidence 20:Dual process theory 3165:10.1111/mila.12001 2332:Edmonds D (2014). 2298:Damasio A (1994). 976:labels empathy as 832:empirical evidence 516: 431:thinking, such as 343:Neuropsychological 267: 231: 139:He calls this the 93: 32:emotionally-driven 3232:978-1-349-44676-6 3017:978-0-262-19564-5 998:moral enhancement 805:The first is the 654:mean to bank on " 332:reverse inference 272:Prefrontal cortex 60:Daniel Kahneman's 3500: 3488:Moral psychology 3472: 3471: 3447: 3438: 3437: 3435: 3433: 3427:Psychology Today 3418: 3412: 3411: 3391: 3385: 3384: 3372: 3366: 3365: 3345: 3335: 3329: 3328: 3300: 3294: 3293: 3291: 3289: 3274: 3263: 3262: 3260: 3258: 3243: 3237: 3236: 3208: 3202: 3201: 3193: 3187: 3186: 3176: 3144: 3138: 3137: 3109: 3103: 3102: 3074: 3068: 3067: 3035: 3029: 3028: 3026: 3020:. Archived from 2997: 2988: 2982: 2981: 2933: 2922: 2921: 2911: 2893: 2861: 2850: 2843: 2837: 2830: 2824: 2817: 2811: 2810: 2808: 2806: 2800: 2791: 2785: 2784: 2776: 2770: 2769: 2759: 2727: 2721: 2720: 2700: 2694: 2693: 2665: 2659: 2658: 2648: 2616: 2610: 2609: 2581: 2575: 2574: 2530: 2524: 2523: 2513: 2503: 2471: 2465: 2464: 2428: 2422: 2421: 2411: 2379: 2373: 2372: 2355:(3–4): 331–352. 2344: 2338: 2337: 2329: 2318: 2317: 2305: 2295: 2282: 2281: 2245: 2239: 2238: 2228: 2203:(7138): 908–11. 2188: 2182: 2181: 2171: 2161: 2137: 2131: 2130: 2102: 2096: 2095: 2061: 2052: 2046: 2045: 2017: 2008: 2007: 1955: 1949: 1948: 1946: 1936: 1925: 1924: 1914: 1904: 1880: 1874: 1873: 1829: 1823: 1822: 1796: 1790: 1789: 1753: 1742: 1741: 1731: 1707: 1694: 1693: 1683: 1659: 1650: 1649: 1613: 1607: 1606: 1570: 1564: 1563: 1527: 1521: 1520: 1494: 1485: 1479: 1478: 1438: 1409: 1408: 1391:(3–4): 331–352. 1380: 1359: 1358: 1322: 1316: 1315: 1279: 1240: 1239: 1231: 1218: 1217: 1181: 1175: 1174: 1138: 1132: 1131: 1113: 1085: 1076: 1075: 1042:(5537): 2105–8. 1031: 970:'Against Empathy 898:consequentialist 819:question begging 815:consequentialist 479:consequentialist 379:Descartes' Error 282:, the posterior 276:Cingulate cortex 274:, the posterior 132:(see Figure 1), 96:Core commitments 81:moral psychology 3508: 3507: 3503: 3502: 3501: 3499: 3498: 3497: 3478: 3477: 3476: 3475: 3448: 3441: 3431: 3429: 3419: 3415: 3402:(s1): 214–233. 3392: 3388: 3373: 3369: 3354: 3338:Kass L (1998). 
3336: 3332: 3301: 3297: 3287: 3285: 3275: 3266: 3256: 3254: 3244: 3240: 3233: 3209: 3205: 3194: 3190: 3145: 3141: 3110: 3106: 3075: 3071: 3036: 3032: 3024: 3018: 2995: 2989: 2985: 2934: 2925: 2862: 2853: 2844: 2840: 2831: 2827: 2818: 2814: 2804: 2802: 2798: 2792: 2788: 2777: 2773: 2728: 2724: 2705:The Yale Review 2701: 2697: 2666: 2662: 2617: 2613: 2582: 2578: 2531: 2527: 2486:(10): e109687. 2472: 2468: 2429: 2425: 2380: 2376: 2345: 2341: 2330: 2321: 2314: 2296: 2285: 2246: 2242: 2189: 2185: 2138: 2134: 2103: 2099: 2059: 2053: 2049: 2018: 2011: 1956: 1952: 1937: 1928: 1881: 1877: 1830: 1826: 1819: 1797: 1793: 1754: 1745: 1708: 1697: 1660: 1653: 1614: 1610: 1571: 1567: 1528: 1524: 1492: 1486: 1482: 1439: 1412: 1381: 1362: 1323: 1319: 1280: 1243: 1232: 1221: 1198:10.1038/nrn1224 1182: 1178: 1139: 1135: 1086: 1079: 1032: 1025: 1020: 958: 931:ethical dilemma 915: 876: 868:false dichotomy 800: 776: 734: 688: 664: 627: 613:people to save 594: 541: 505: 414: 400: 359:Antonio Damasio 340: 239:Trolley problem 223: 218: 203:Within generic 181: 157: 126:Philippa Foot's 98: 28:moral reasoning 24:moral judgement 17: 12: 11: 5: 3506: 3496: 3495: 3490: 3474: 3473: 3439: 3413: 3386: 3367: 3353:978-0844740508 3352: 3330: 3311:(4): 330–345. 3295: 3264: 3238: 3231: 3203: 3196:Fiala, Brian. 3188: 3159:(5): 519–545. 3139: 3120:(4): 466–486. 3104: 3085:(3): 383–402. 3069: 3056:10.1086/687337 3050:(1): 241–256. 3030: 3027:on 2011-08-18. 3016: 2983: 2948:(3): 269–279. 2923: 2851: 2838: 2825: 2812: 2786: 2771: 2742:(5): 561–582. 2722: 2695: 2660: 2611: 2576: 2525: 2466: 2423: 2394:(3): 1144–54. 2374: 2339: 2319: 2312: 2283: 2240: 2183: 2132: 2113:(2): 143–162. 2097: 2047: 2009: 1950: 1926: 1875: 1824: 1817: 1791: 1764:(5): 631–639. 1743: 1722:(2): 148–167. 1695: 1674:(5): 527–539. 1651: 1624:(5): 631–639. 1608: 1565: 1538:(12): 1082–9. 1522: 1503:(2): 159–172. 1480: 1453:(4): 293–329. 1410: 1360: 1339:10.1086/675876 1333:(4): 813–859. 1317: 1296:10.1086/675875 1290:(4): 695–726. 1241: 1219: 1176: 1133: 1096:(2): 389–400. 1077: 1022: 1021: 1019: 1016: 957: 954: 914: 911: 875: 872: 799: 796: 775: 772: 733: 730: 722:climate change 687: 684: 663: 660: 626: 623: 593: 590: 540: 537: 525:Ultimatum Game 512:Extensive form 504: 501: 488:contraceptives 413: 410: 405:cognitive load 399: 398:Reaction times 396: 347:lesion studies 345:evidence from 339: 336: 296:working memory 222: 219: 217: 214: 180: 177: 156: 155:Camera analogy 153: 122: 121: 113: 97: 94: 36:cognitive load 15: 9: 6: 4: 3: 2: 3505: 3494: 3491: 3489: 3486: 3485: 3483: 3469: 3465: 3461: 3457: 3454:. Routledge. 3453: 3446: 3444: 3428: 3424: 3417: 3409: 3405: 3401: 3397: 3390: 3382: 3378: 3371: 3363: 3359: 3355: 3349: 3346:. AEI Press. 
3344: 3343: 3334: 3326: 3322: 3318: 3314: 3310: 3306: 3299: 3284: 3280: 3273: 3271: 3269: 3253: 3249: 3242: 3234: 3228: 3224: 3220: 3216: 3215: 3207: 3199: 3192: 3184: 3180: 3175: 3170: 3166: 3162: 3158: 3154: 3150: 3143: 3135: 3131: 3127: 3123: 3119: 3115: 3108: 3100: 3096: 3092: 3088: 3084: 3080: 3073: 3065: 3061: 3057: 3053: 3049: 3045: 3041: 3034: 3023: 3019: 3013: 3009: 3005: 3001: 2994: 2987: 2979: 2975: 2971: 2967: 2963: 2959: 2955: 2951: 2947: 2943: 2939: 2932: 2930: 2928: 2919: 2915: 2910: 2905: 2901: 2897: 2892: 2887: 2883: 2879: 2875: 2871: 2867: 2860: 2858: 2856: 2849: 2842: 2836: 2829: 2823: 2816: 2797: 2790: 2782: 2775: 2767: 2763: 2758: 2753: 2749: 2745: 2741: 2737: 2733: 2726: 2718: 2714: 2710: 2706: 2699: 2691: 2687: 2683: 2679: 2675: 2671: 2664: 2656: 2652: 2647: 2642: 2638: 2634: 2630: 2626: 2622: 2615: 2607: 2603: 2599: 2595: 2592:(4): 566–79. 2591: 2587: 2580: 2572: 2568: 2564: 2560: 2556: 2552: 2548: 2544: 2541:(1): 161–80. 2540: 2536: 2529: 2521: 2517: 2512: 2507: 2502: 2497: 2493: 2489: 2485: 2481: 2477: 2470: 2462: 2458: 2454: 2450: 2446: 2442: 2439:(3): 379–84. 2438: 2434: 2427: 2419: 2415: 2410: 2405: 2401: 2397: 2393: 2389: 2385: 2378: 2370: 2366: 2362: 2358: 2354: 2350: 2343: 2335: 2328: 2326: 2324: 2315: 2313:9780399138942 2309: 2304: 2303: 2294: 2292: 2290: 2288: 2279: 2275: 2271: 2267: 2263: 2259: 2255: 2251: 2244: 2236: 2232: 2227: 2222: 2218: 2214: 2210: 2206: 2202: 2198: 2194: 2187: 2179: 2175: 2170: 2165: 2160: 2155: 2151: 2147: 2146:BMC Neurology 2143: 2136: 2128: 2124: 2120: 2116: 2112: 2108: 2101: 2093: 2089: 2085: 2081: 2077: 2073: 2069: 2065: 2058: 2051: 2043: 2039: 2035: 2031: 2028:(2): 141–72. 2027: 2023: 2016: 2014: 2005: 2001: 1997: 1993: 1989: 1985: 1981: 1977: 1973: 1969: 1965: 1961: 1954: 1945: 1944: 1935: 1933: 1931: 1922: 1918: 1913: 1908: 1903: 1898: 1894: 1890: 1886: 1879: 1871: 1867: 1863: 1859: 1855: 1851: 1847: 1843: 1839: 1835: 1828: 1820: 1818:9780128022733 1814: 1810: 1806: 1802: 1795: 1787: 1783: 1779: 1775: 1771: 1767: 1763: 1759: 1752: 1750: 1748: 1739: 1735: 1730: 1725: 1721: 1717: 1713: 1706: 1704: 1702: 1700: 1691: 1687: 1682: 1677: 1673: 1669: 1665: 1658: 1656: 1647: 1643: 1639: 1635: 1631: 1627: 1623: 1619: 1612: 1604: 1600: 1596: 1592: 1588: 1584: 1580: 1576: 1569: 1561: 1557: 1553: 1549: 1545: 1541: 1537: 1533: 1526: 1518: 1514: 1510: 1506: 1502: 1498: 1491: 1484: 1476: 1472: 1468: 1464: 1460: 1456: 1452: 1448: 1444: 1437: 1435: 1433: 1431: 1429: 1427: 1425: 1423: 1421: 1419: 1417: 1415: 1406: 1402: 1398: 1394: 1390: 1386: 1379: 1377: 1375: 1373: 1371: 1369: 1367: 1365: 1356: 1352: 1348: 1344: 1340: 1336: 1332: 1328: 1321: 1313: 1309: 1305: 1301: 1297: 1293: 1289: 1285: 1278: 1276: 1274: 1272: 1270: 1268: 1266: 1264: 1262: 1260: 1258: 1256: 1254: 1252: 1250: 1248: 1246: 1237: 1230: 1228: 1226: 1224: 1215: 1211: 1207: 1203: 1199: 1195: 1192:(10): 846–9. 
1191: 1187: 1180: 1172: 1168: 1164: 1160: 1156: 1152: 1148: 1144: 1137: 1129: 1125: 1121: 1117: 1112: 1107: 1103: 1099: 1095: 1091: 1084: 1082: 1073: 1069: 1065: 1061: 1057: 1053: 1049: 1045: 1041: 1037: 1030: 1028: 1023: 1014: 1012: 1006: 1003: 999: 994: 991: 987: 982: 979: 975: 973: 967: 963: 953: 951: 947: 942: 940: 936: 932: 928: 924: 920: 910: 908: 904: 899: 892: 889: 886: 883: 879: 871: 869: 864: 863:alarm systems 859: 854: 852: 850: 843: 840: 838: 833: 829: 825: 820: 816: 812: 811:deontological 808: 803: 795: 793: 789: 785: 781: 771: 768: 764: 760: 759:Robert Wright 756: 754: 750: 749:Immanuel Kant 746: 745:Moral Tribes, 742: 738: 729: 727: 723: 716: 711: 709: 704: 700: 697: 691: 683: 679: 677: 673: 669: 659: 657: 651: 648: 647:trolley-cases 643: 640: 639: 634: 633: 622: 620: 616: 612: 606: 602: 599: 589: 586: 580: 577: 573: 571: 567: 563: 557: 555: 549: 547: 536: 534: 530: 529:limbic system 526: 521: 513: 509: 500: 496: 493: 489: 484: 483:deontological 480: 475: 472: 468: 464: 459: 457: 452: 449: 445: 442: 438: 434: 430: 425: 423: 419: 409: 406: 395: 393: 389: 385: 384:David Edmonds 381: 380: 374: 372: 368: 364: 360: 355: 352: 348: 344: 338:Brain lesions 335: 333: 328: 324: 320: 318: 313: 309: 305: 304:Parietal lobe 301: 297: 293: 289: 285: 281: 277: 273: 263: 259: 257: 252: 250: 245: 242: 240: 236: 227: 213: 211: 206: 201: 199: 195: 189: 185: 176: 172: 170: 165: 162: 152: 150: 146: 145:deontological 142: 137: 135: 134:Joshua Greene 131: 127: 118: 114: 111: 107: 103: 102: 101: 88: 84: 82: 77: 75: 71: 67: 66: 61: 57: 53: 49: 45: 41: 40:Joshua Greene 37: 33: 29: 25: 21: 3451: 3430:. Retrieved 3426: 3416: 3399: 3395: 3389: 3380: 3370: 3341: 3333: 3308: 3304: 3298: 3286:. Retrieved 3283:The Atlantic 3282: 3255:. Retrieved 3252:New Republic 3251: 3241: 3213: 3206: 3191: 3156: 3152: 3142: 3117: 3113: 3107: 3082: 3078: 3072: 3047: 3043: 3033: 3022:the original 2999: 2986: 2945: 2941: 2873: 2869: 2841: 2828: 2815: 2805:20 September 2803:. Retrieved 2789: 2780: 2774: 2739: 2735: 2725: 2711:(2): 26–43. 2708: 2704: 2698: 2676:(8): 323–4. 2673: 2669: 2663: 2631:(4): 951–6. 2628: 2624: 2614: 2589: 2585: 2579: 2538: 2534: 2528: 2483: 2479: 2469: 2436: 2432: 2426: 2391: 2387: 2377: 2352: 2348: 2342: 2333: 2301: 2253: 2249: 2243: 2200: 2196: 2186: 2149: 2145: 2135: 2110: 2106: 2100: 2070:(2): 59–63. 2067: 2063: 2050: 2025: 2021: 1963: 1959: 1953: 1942: 1892: 1888: 1878: 1840:(1): 28–38. 1837: 1833: 1827: 1800: 1794: 1761: 1757: 1719: 1715: 1671: 1667: 1621: 1617: 1611: 1581:(1): 49–80. 
1578: 1574: 1568: 1535: 1531: 1525: 1500: 1496: 1483: 1450: 1446: 1388: 1384: 1330: 1326: 1320: 1287: 1283: 1235: 1189: 1185: 1179: 1146: 1142: 1136: 1093: 1089: 1039: 1035: 1010: 1008: 995: 989: 983: 977: 971: 968:, author of 959: 943: 935:human nature 926: 916: 906: 902: 897: 894: 890: 887: 884: 881: 877: 862: 857: 855: 846: 844: 835: 827: 823: 806: 804: 801: 784:Selim Berker 777: 757: 744: 741:Thomas Nagel 739: 735: 718: 713: 707: 705: 701: 692: 689: 680: 672:is–ought gap 665: 655: 652: 644: 637: 631: 628: 618: 614: 610: 607: 603: 595: 584: 581: 578: 574: 569: 565: 561: 558: 550: 542: 517: 497: 476: 471:Peter Singer 460: 446: 426: 415: 401: 391: 387: 377: 376:In his book 375: 371:neuroimaging 363:Phineas Gage 356: 341: 329: 325: 321: 316: 268: 254: 253: 247: 246: 243: 233:Greene uses 232: 221:Neuroimaging 209: 205:dual process 202: 197: 193: 190: 186: 182: 173: 168: 166: 158: 140: 138: 130:Trolley Case 123: 116: 105: 99: 78: 63: 19: 18: 3288:24 November 3257:24 November 2107:Neuroethics 1497:Neuroethics 1111:10983/15961 919:yuck factor 241:in ethics. 161:digital SLR 149:utilitarian 44:John Darley 3482:Categories 3460:1027761018 1947:. Penguin. 1018:References 966:Paul Bloom 962:Jess Prinz 849:Heuristics 792:fallacious 767:Paul Bloom 753:John Rawls 546:Paul Bloom 533:brain stem 58:, such as 56:psychology 3377:"Emotion" 3325:0048-3915 3134:146547149 3099:148678250 3064:151940241 2962:1747-0919 2900:0027-8424 2433:Cognition 2388:Cognition 2127:143640307 2042:199486714 1988:0036-8075 1895:(4): 76. 1854:1745-6916 1778:2044-5911 1638:2044-5911 1603:148524895 1595:1354-6783 1467:0048-3915 1355:143579026 1347:0014-1704 1304:0014-1704 1149:: 66–77. 1143:Cognition 923:Leon Kass 715:miracles. 668:normative 617:", where 429:Darwinian 280:Precuneus 198:recognise 120:features. 110:intuitive 70:normative 3468:29630194 3432:24 April 3362:38989719 3183:23335831 2970:21942995 2918:36215511 2766:22427714 2690:54374285 2655:17251437 2606:18746213 2571:44258054 2563:18400930 2520:25333876 2480:PLOS ONE 2461:41664054 2453:22698994 2418:18158145 2369:49914215 2270:17625951 2235:17377536 2178:22136635 2092:13498984 2084:16406760 1996:11557895 1921:37103261 1912:10146599 1870:32261626 1862:26168420 1786:13751886 1738:13744641 1690:17148259 1646:13751886 1560:17294896 1552:17201791 1517:55999301 1405:49914215 1214:14438498 1206:14523384 1171:13948078 1163:28343626 1120:15473975 1064:11557895 1011:modulate 1002:oxytocin 788:rhetoric 598:Cosmides 570:combined 566:combined 448:Darwin's 441:morality 308:Amygdala 302:and the 292:Amygdala 290:and the 91:dilemma. 3174:3546390 2978:9947014 2909:9586309 2878:Bibcode 2757:3303120 2646:2490711 2543:Bibcode 2511:4198114 2488:Bibcode 2409:2429958 2278:7035116 2226:2244801 2205:Bibcode 2169:3265436 2152:: 151. 
2004:1437941 1968:Bibcode 1960:Science 1475:5952062 1312:9063016 1128:9061712 1072:1437941 1044:Bibcode 1036:Science 950:empathy 939:dignity 780:Harvard 763:ingroup 696:culture 676:premise 467:justice 427:In pre- 367:Vermont 128:famous 3466:  3458:  3360:  3350:  3323:  3229:  3181:  3171:  3132:  3097:  3062:  3044:Ethics 3014:  2976:  2968:  2960:  2916:  2906:  2898:  2764:  2754:  2688:  2653:  2643:  2604:  2569:  2561:  2518:  2508:  2459:  2451:  2416:  2406:  2367:  2310:  2276:  2268:  2233:  2223:  2197:Nature 2176:  2166:  2125:  2090:  2082:  2040:  2002:  1994:  1986:  1919:  1909:  1868:  1860:  1852:  1815:  1784:  1776:  1736:  1688:  1644:  1636:  1601:  1593:  1558:  1550:  1515:  1473:  1465:  1403:  1353:  1345:  1327:Ethics 1310:  1302:  1284:Ethics 1212:  1204:  1169:  1161:  1126:  1118:  1090:Neuron 1070:  1062:  492:incest 433:Hume's 210:hybrid 74:ethics 3130:S2CID 3095:S2CID 3060:S2CID 3025:(PDF) 2996:(PDF) 2974:S2CID 2799:(PDF) 2686:S2CID 2602:S2CID 2567:S2CID 2457:S2CID 2365:S2CID 2274:S2CID 2123:S2CID 2088:S2CID 2060:(PDF) 2038:S2CID 2000:S2CID 1866:S2CID 1782:S2CID 1734:S2CID 1686:S2CID 1642:S2CID 1599:S2CID 1556:S2CID 1513:S2CID 1493:(PDF) 1471:S2CID 1401:S2CID 1351:S2CID 1308:S2CID 1210:S2CID 1167:S2CID 1124:S2CID 1068:S2CID 990:known 946:guilt 903:moral 520:model 392:think 298:(the 3464:PMID 3456:OCLC 3434:2019 3358:OCLC 3348:ISBN 3321:ISSN 3290:2013 3259:2013 3227:ISBN 3179:PMID 3012:ISBN 2966:PMID 2958:ISSN 2914:PMID 2896:ISSN 2807:2024 2762:PMID 2651:PMID 2559:PMID 2539:1124 2516:PMID 2449:PMID 2414:PMID 2308:ISBN 2266:PMID 2231:PMID 2174:PMID 2080:PMID 1992:PMID 1984:ISSN 1917:PMID 1858:PMID 1850:ISSN 1813:ISBN 1774:ISSN 1634:ISSN 1591:ISSN 1548:PMID 1463:ISSN 1343:ISSN 1300:ISSN 1202:PMID 1159:PMID 1116:PMID 1060:PMID 937:and 907:also 828:only 824:only 751:and 585:both 562:each 531:and 463:norm 456:Hume 416:The 388:feel 235:fMRI 115:The 104:The 3404:doi 3313:doi 3219:doi 3169:PMC 3161:doi 3122:doi 3087:doi 3052:doi 3048:127 3004:doi 2950:doi 2904:PMC 2886:doi 2874:119 2752:PMC 2744:doi 2713:doi 2678:doi 2641:PMC 2633:doi 2594:doi 2551:doi 2506:PMC 2496:doi 2441:doi 2437:124 2404:PMC 2396:doi 2392:107 2357:doi 2258:doi 2221:PMC 2213:doi 2201:446 2164:PMC 2154:doi 2115:doi 2072:doi 2030:doi 1976:doi 1964:293 1907:PMC 1897:doi 1842:doi 1805:doi 1766:doi 1724:doi 1676:doi 1626:doi 1583:doi 1540:doi 1505:doi 1455:doi 1393:doi 1335:doi 1331:124 1292:doi 1288:124 1194:doi 1151:doi 1147:167 1106:hdl 1098:doi 1052:doi 1040:293 948:or 925:'s 710:": 465:is 424:). 194:and 54:in 3484:: 3462:. 3442:^ 3425:. 3400:49 3398:. 3356:. 3319:. 3309:37 3307:. 3281:. 3267:^ 3250:. 3225:. 3177:. 3167:. 3157:27 3155:. 3151:. 3128:. 3118:28 3116:. 3093:. 3083:31 3081:. 3058:. 3046:. 3042:. 3010:. 2972:. 2964:. 2956:. 2944:. 2940:. 2926:^ 2912:. 2902:. 2894:. 2884:. 2872:. 2868:. 2854:^ 2760:. 2750:. 2740:25 2738:. 2734:. 2709:99 2707:. 2684:. 2674:11 2672:. 2649:. 2639:. 2629:27 2627:. 2623:. 2600:. 2588:. 2565:. 2557:. 2549:. 2537:. 2514:. 2504:. 2494:. 2482:. 2478:. 2455:. 2447:. 2435:. 2412:. 2402:. 2390:. 2386:. 2363:. 2351:. 2322:^ 2286:^ 2272:. 2264:. 2254:11 2252:. 2229:. 2219:. 2211:. 2199:. 2195:. 2172:. 2162:. 2150:11 2148:. 2144:. 2121:. 2109:. 2086:. 2078:. 2068:10 2066:. 2062:. 2036:. 2024:. 2012:^ 1998:. 1990:. 1982:. 1974:. 1962:. 1929:^ 1915:. 1905:. 1893:11 1891:. 1887:. 1864:. 1856:. 1848:. 1836:. 1811:. 1780:. 1772:. 1762:28 1760:. 1746:^ 1732:. 1720:12 1718:. 1714:. 1698:^ 1684:. 1670:. 1666:. 1654:^ 1640:. 1632:. 1622:28 1620:. 1597:. 1589:. 1579:23 1577:. 
1554:. 1546:. 1536:17 1534:. 1511:. 1499:. 1495:. 1469:. 1461:. 1451:37 1449:. 1445:. 1413:^ 1399:. 1387:. 1363:^ 1349:. 1341:. 1329:. 1306:. 1298:. 1286:. 1244:^ 1222:^ 1208:. 1200:. 1188:. 1165:. 1157:. 1145:. 1122:. 1114:. 1104:. 1094:44 1092:. 1080:^ 1066:. 1058:. 1050:. 1038:. 1026:^ 870:. 794:. 724:, 535:. 458:. 334:. 76:. 46:, 3470:. 3436:. 3410:. 3406:: 3383:. 3364:. 3327:. 3315:: 3292:. 3261:. 3235:. 3221:: 3200:. 3185:. 3163:: 3136:. 3124:: 3101:. 3089:: 3066:. 3054:: 3006:: 2980:. 2952:: 2946:7 2920:. 2888:: 2880:: 2809:. 2768:. 2746:: 2719:. 2715:: 2692:. 2680:: 2657:. 2635:: 2608:. 2596:: 2590:5 2573:. 2553:: 2545:: 2522:. 2498:: 2490:: 2484:9 2463:. 2443:: 2420:. 2398:: 2371:. 2359:: 2353:9 2316:. 2280:. 2260:: 2237:. 2215:: 2207:: 2180:. 2156:: 2129:. 2117:: 2111:4 2094:. 2074:: 2044:. 2032:: 2026:9 2006:. 1978:: 1970:: 1923:. 1899:: 1872:. 1844:: 1838:7 1821:. 1807:: 1788:. 1768:: 1740:. 1726:: 1692:. 1678:: 1672:8 1648:. 1628:: 1605:. 1585:: 1562:. 1542:: 1519:. 1507:: 1501:7 1477:. 1457:: 1407:. 1395:: 1389:9 1357:. 1337:: 1314:. 1294:: 1216:. 1196:: 1190:4 1173:. 1153:: 1130:. 1108:: 1100:: 1074:. 1054:: 1046:: 974:' 851:" 839:- 619:N 615:y 611:x 435:' 286:/ 278:/
